#BigData #QuantifiedSelf

Governance refers to the process of governing – the way in which norms, laws, and actions are structured, sustained, and held accountable, whether undertaken by the government, society, or the market economy. Essentially, governance involves the practice by which societies are organized – the logic or language of regulation. Hence governance also implies a way of exercising power over someone or something. [1] #AlgorithmicGovernance explores the formal and informal rules of organizing the living through #Algorithms (see key area #Encoding). Algorithmic governance refers to a form of »soft power« that interrupts habits and reorients potentials for action. It is a productive force that generates the particular behavior that comes to the surface next – a force that acts before the behavior takes shape. [2] As such, algorithmic governance offers a radically different way of managing all aspects of human life, whether social, political, economic, or environmental. It raises pressing questions of how algorithmic processing should be regulated and legislated.

Underlying new forms of governance is the way in which data is gathered and analyzed in order to ascribe value. The last decade has seen an explosion in the amount of data captured and processed in real time. Our environment is increasingly encoded (see key area #Encoding) – rendered machine-readable, uniquely indexical, and identifiable by a vast assemblage of connected devices and sensors. Daily life is ever more mediated by digital devices and facilitated by computational infrastructure. The #BigData undertaking strives to capture society as a whole – the entire population and its activities. [3] The endeavor of data collection and the quantification of the self (#QuantifiedSelf) is underpinned by the intention to produce sophisticated statistical models that characterize, simulate, and predict human life. The key to assembling all this data lies in how information is correlated – the processing of data through various kinds of statistical analysis and #MachineLearning algorithms that detect patterns and connections between pieces of data. Correlations of data thus become sources of knowledge and information.
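The statistical operation at the heart of this passage – detecting a connection between two streams of behavioral data – can be illustrated with a minimal sketch. The data below is entirely hypothetical (invented screen-time and purchase figures), and the Pearson coefficient is only one of many correlation measures such analysis might employ:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical behavioral traces for one person, tracked over six weeks:
# daily hours of device use vs. online purchases per week.
screen_time = [2.0, 3.5, 1.0, 4.0, 5.5, 2.5]
purchases   = [1,   2,   0,   3,   4,   1]

# A coefficient near 1.0 would be read as a strong positive correlation --
# and, in the logic described above, treated as a source of knowledge.
print(round(pearson(screen_time, purchases), 3))
```

The point of the sketch is how little is needed: a few lines of arithmetic turn raw traces into a "pattern," which profiling systems then treat as actionable information.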

In consequence, governance seems to have turned into a struggle over »how« data is evaluated and »by whom«. Essentially, the correlation of data allows for the assemblage of profiles of individuals and groups of people in order to determine so-called normal behavior and distinguish the abnormal. Individuals are thereby turned into »dividuals« – numerical bodies of code comprised of data assemblages. [4] On the basis of these profiles, governments and businesses implement their agendas. Whereas the latter adopt strategies of capital accumulation designed to produce significant profits, the concern of the former is state security. With increasingly invasive means of profiling, companies seek on the one hand to shape consumer behavior through the micromarketing of products. On the other side stands the state, which uses new technology to gather information that is supposed to prevent crime but can also attempt to influence how the electorate votes through microtargeting. In both cases, powerful algorithms combined with predictive analytics are employed to condition life's nextness. Control is exercised subtly, making it seem as if the dividual acts autonomously, while in fact it lacks the ability to make decisions of its own volition.
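The operation of determining "normal" behavior and flagging the "abnormal" can likewise be sketched in a few lines. This is a deliberately crude illustration, not any system's actual method: it assumes a hypothetical population of activity scores and defines "abnormal" purely statistically, as deviation from the population mean:

```python
from statistics import mean, stdev

def flag_abnormal(profile, threshold=2.0):
    """Flag values lying more than `threshold` standard deviations from
    the population mean -- a purely statistical notion of the 'normal'."""
    mu, sigma = mean(profile), stdev(profile)
    return [abs(v - mu) / sigma > threshold for v in profile]

# Hypothetical weekly activity scores for eight 'dividuals'.
scores = [10, 12, 11, 9, 10, 45, 11, 10]

# Only the outlying sixth score is flagged as abnormal.
print(flag_abnormal(scores))
```

What the sketch makes visible is the political weight hidden in the parameters: who sets the threshold, and over which population the mean is computed, decides who counts as deviant.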


[1] See Isabell Lorey, »States of Insecurity: Government of the Precarious,« Verso, London, New York, 2015, pp. 23ff.

[2] See Luciana Parisi, »Contagious Architecture: Computation, Aesthetics, and Space,« The MIT Press, Cambridge (MA), London, 2013, pp. 169ff.

[3] See Rob Kitchin, »The Data Revolution: Big Data, Open Data, Data Infrastructures and Their Consequences,« Sage Publications, Los Angeles, London, 2014, pp. 67ff.

[4] Gilles Deleuze, »Postscript on the Societies of Control,« in: »October,« vol. 59, Winter 1992, pp. 3–7.