
In “Algorithmic Bias and the ‘Fragment on Machines’” I quoted Marx’s Grundrisse, which describes how the development of fixed capital (specifically tools, machinery, and automation) embodies the built-up intellectual development of a given society. Marx writes that “the development of fixed capital indicates to what degree general social knowledge has become a direct force of production, and to what degree, hence, the conditions of the process of social life itself have come under the control of the general intellect and been transformed in accordance with it” (Marx, Grundrisse, 706). The phase of social life dominated by the general intellect has often been equated with a liberation of labour from the exploitative conditions of capitalism, mainly by theorists of cognitive capitalism or “fully automated luxury communism”. However, earlier in the text, Marx describes a situation in which the human worker has been almost entirely replaced by the machine but remains caught within the web of oppressive relations of commodity production. Essentially, Marx sees the constant drive to replace living labour with dead labour as giving capital a vision of a future society in which profit continues to be produced while the worker is all but eliminated. However, this “capitalist utopia”, as Nick Dyer-Witheford calls it, contains within it the seeds of its own transformation into a capitalist nightmare. The less labour is needed to produce surplus value, the freer labour becomes from the domination of the mode of production: the more the necessary work of reproducing human society can take place without us, the freer we are to pursue our maximum self-development.

The dream of instantaneous profit-production independent of human labour is the holy grail of computerized neoliberalism. High-speed finance, the virtuality of securities and derivatives, and robotics and algorithmically-driven supply chains are all part of capital’s attempt to keep accumulating and expanding entirely independently of the material world. The transition from Fordism-Taylorism to the new regime of production was an attempt to move beyond the productive capacity of the human body; capital has now turned its sights on the human mind.

For Marx, the individual productivity of a given worker was immaterial; what mattered was the aggregate, average labour-power expended throughout the mode of production. We can think of this as a statistical understanding of material work, rather than an individual or phenomenological one. Given this way of thinking about aggregate labour, can we also think of the rapid expansion of machine learning, with its intimate connection to statistics, as the equivalent of aggregate, abstract labour in cognitive, intellectual terms?

Justin Joque, in his excellent article on the Bayesian revolution, argues that this is precisely what is happening:

We may look back at this revolution in statistical methodology as being equally important as the Taylorist and Fordist revolutions in production at the turn of the 20th century. While those earlier revolutions created the conditions for the ‘automatic’ production of material goods, the Bayesian revolution is cementing the conditions for the automatic production of knowledge.

The Bayesian revolution consists of a change in the way of thinking about statistics. Joque traces this transition away from the “frequentist” methodology of hypothesis-testing, to the Bayesian one of modulating beliefs as more evidence becomes available. We can read this transition as the move - equivalent to Marx’s with respect to material labour - from individual, subjective belief, to an aggregate belief:

Though people start from different assumptions, beliefs… should converge with additional evidence, becoming less subjective and more objective. In this way, Bayesian thinking provides a guide for how to think - or more specifically, how to turn our subjective beliefs into increasingly objective statements about the world. Using Bayes’s theorem to update our beliefs based on evidence, we arrive ever closer to the same conclusions that others with the same evidence arrive at, slowly moving away from our starting assumptions.
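The convergence Joque describes can be sketched numerically. In the toy model below — a conjugate Beta-Bernoulli setup of my own choosing, not anything drawn from the article — two observers start from opposed subjective priors about a coin’s bias and update, via Bayes’s theorem, on the same shared evidence; their posterior beliefs draw together as the data accumulates.

```python
# Illustrative sketch: Bayesian updating pulls divergent subjective priors
# toward agreement as shared evidence accumulates. All numbers here are
# invented for illustration.

def posterior_mean(alpha, beta, heads, tails):
    """Posterior mean belief about a coin's bias under a Beta(alpha, beta) prior.

    With a Bernoulli likelihood the Beta prior is conjugate: the posterior
    is Beta(alpha + heads, beta + tails), whose mean is computed below.
    """
    return (alpha + heads) / (alpha + beta + heads + tails)

# Two subjective starting points: an optimist and a sceptic.
optimist = (8, 2)   # prior mean 0.8
sceptic = (2, 8)    # prior mean 0.2

# Shared evidence: the coin actually comes up heads 70% of the time.
for flips in (0, 10, 100):
    heads = flips * 70 // 100
    tails = flips - heads
    a = posterior_mean(*optimist, heads, tails)
    b = posterior_mean(*sceptic, heads, tails)
    print(f"after {flips:3d} flips: optimist={a:.3f} sceptic={b:.3f} gap={abs(a - b):.3f}")
```

After zero flips the two beliefs differ by 0.6; after 100 shared flips the gap shrinks to about 0.05 — the “slowly moving away from our starting assumptions” of the quotation, made mechanical.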

The increased availability of computing power and the widening application of the Bayesian method have led to the embodiment of this abstract, aggregate system of beliefs within the machine. And since, for Marx, the machine is the embodiment of prior intellectual labour, we can understand the widespread deployment of Bayesian statistics in automated form as the inauguration of the “general intellect” predicted by Marx.

It seems clear, then, that the Bayesian revolution is precisely the site of a (class) struggle. On the one hand, there is the prediction of the liberation of labour and human subjectivity by both cognitive capitalism theorists and those at the forefront of machine learning, who believe that a rationalization of human decision-making and human action will result in a technological utopia. On the other hand, there is the prediction that the embodiment of the general intellect in algorithms and machines will lead to the complete subjugation of human life. Joque writes:

Bayesian thinking and its valorization of a science of doing rather than knowing has allowed a whole host of human activities to be predicted rather than theorized. Read in its most dystopian light, this revolution has allowed algorithms to treat subjective knowledge as though it were objective, calculable, and ultimately predictable. […] The Bayesian automation of knowledge production has now concentrated wealth and information, threatening to circumscribe both the world we see and the way the world sees us.

For many years, the two visions contained within Marx’s Grundrisse have been presented as a chronological process: first the capitalist utopia, then the victory of the general intellect and the emancipation of labour. However, this strikes me as just as deterministic as the “productive force determinism” of the Second International. Instead, what Marx identifies is the site of antagonism within the phase of general intellect, of artificial intelligence, and of Bayesianism. Joque concludes that “the task ahead is to make modes of knowing that are more just, that do not simply serve to reproduce social injustices or concentrate wealth and knowledge in the hands of the few”. By seeing the Bayesian revolution as precisely the moment of “general intellect” described by Marx, I think we can gain a perspective on “the stakes and implications of the Bayesian revolution” that might help us to achieve this goal.


Sam Popowich

Discovery and Web Services Librarian, University of Alberta
