In reading Chris Bourg’s great talk “What happens to libraries and librarians when machines can read all the books?” I kept coming back to a couple of things that are outside its scope, but relevant to the question that she asks.
Bourg is arguing, I think, that - given the fundamental changes in AI and machine learning - we need to drastically re-vision our services and the goals and outcomes that guide them.
I think we would be wise to start thinking now about machines and algorithms as a new kind of patron — a patron that doesn’t replace human patrons, but has some different needs and might require a different set of skills and a different way of thinking about how our resources could be used.
Libraries and librarians have long been part of the broader world of big data, analytics, text mining, machine learning, and automation, but we have simply looked at those things as tools within our traditional workflows and models rather than as fundamentally informing changes to what we do and how we do it.
But if this switch - from individuals reading books and articles one at a time in print to individuals reading books and articles one at a time on their own digital devices - is all we get from the digital revolution, then it won’t have been much of a revolution.
The idea of a “revolution in libraries” brings me to the question of culture. In my experience, speaking as an academic librarian, academic libraries continue to suffer from three problems:

a) a liaison model mired in traditional concepts of research and teaching - concepts that have less and less validity and are predicated on too much wasted effort;

b) systems and infrastructure that are slow to change. With some exemplary exceptions, we resist changes to our tools, processes, and workflows, which leaves us unable to absorb and adopt newer technologies, and unable to be flexible and broad-minded enough to engage with, for example, machine learning at the library-system level. (This is partly due, of course, to our completely screwed-up vendor ecosystem, which squats like a vampire on the possibility of innovation and advancement in library systems.)

c) perhaps most damningly, an ultra-hierarchical decision-making structure that owes its allegiance to the socio-political dynamics of university administrations rather than to libraries and librarians.

Sites that could be used to expand our understanding and use of, for example, machine learning - poorly conceived “digital scholarship centres,” say - are exercises in branding and self-promotion rather than honest attempts to engage with new technologies and ideas.
Bourg alludes to some of the reasons for resistance against the full adoption of machine learning (and its consequent culture shift), but I’m not sure she gives enough weight to the inertia of library culture and tradition.
The other thing that struck me was in the discussion of jobs:
Robots will take our jobs – In an article in Library Journal in April 2016, Steven Bell writes about the promise and peril of AI for academic librarians – and he asks, “Could artificially intelligent machines eliminate library jobs?”
One reason people argue that AI will not replace library or other jobs is that machines can’t replace the deeply human skills of creativity and interaction, which may mean that those skills become more valuable - or could mean that AI will usher in an era in which creativity and empathy are devalued and rare.
Another fear is that AI will eliminate the relationships between people and books, and between librarians and their community members.
I’ve been reading Nick Dyer-Witheford’s two books Cyber-Marx (1999) and Cyber-Proletariat (2015), in which he discusses the rise of vast, high-speed networks of automation - predicated on the constantly falling cost of computing power as well as capital’s need to replace human labour with machines - which drives, amongst other things, the rise of machine learning and practical AI. For Dyer-Witheford, these developments are absolutely part of a strategy by capital for “robots to take our jobs” - indeed, this has increasingly been the case since the 1980s, when robots began replacing automotive workers, making the Fordist assembly line obsolete. Until recently, intellectual and cultural workers seemed immune to this process, on the argument that “the deeply human skills of creativity and interaction” could not be replaced by machines; in fact, capital was simply picking off the low-hanging fruit until algorithmic sophistication and computing power reached the stage where “immaterial labour” too came under the sway of automation. This is the position we are currently in.
The image of the stock market traders on the floor of the NYSE, crying their trades in the crowd, is a fiction from another age: those traders too have been replaced by high-speed networks of machine learning algorithms.
And we mustn’t be too sanguine about the replacement of labour by machine. There remains, and there may always remain, a category of human worker cheaper and more expendable than the cheapest machine. Currently those workers are in zones of the global south and east, at the bottom of the global supply chains that lead to the technologies of machine learning and artificial intelligence used in North America. We have to remember that all the technologies we use are, at bottom, the products of murderous, hyper-exploitative, deregulated zones of free trade and cheap labour far from our comfortable universities.
Finally, we have to be aware that the arrival of ultrasophisticated machine learning systems for intellectual and cultural work heralds neither a monolithic dystopia nor a post-scarcity (eu|u)topia, but simply the next phase of the subsumption of human life to the implacable logic of capital accumulation. However, this awareness opens up precisely the possibility of class struggle: the struggle to ensure that the adoption of AI does not erode “the values we care about (inclusion, privacy, democracy, social justice, authority, etc.)”, the struggle to harness and capture the tools of capitalist exploitation for our own needs. The strategy of capital from the beginning has been to “eliminate the relationships between people”. Insisting on these relationships is, I think, the core of class struggle - one which library workers are going to be inaugurated into via AI and machine learning whether they like it or not. This might sound bad, but I agree with Chris Bourg that the advent of AI in libraries actually opens up a broader scope for change, which can only be a good thing.