
‘Seriality for All’: The Role of Protocols and Standards in Critical Theory

By Geert Lovink and Ned Rossiter

Recommended music track to play while reading this:
http://www.youtube.com/watch?v=z6Xae9jsqxU

Tomorrow the world. ‘Whoever sets the standard has the power’. Strangely enough, this view has few disciples. If we talk about power, and dare to think that we can take over and be in charge, we rarely take Voltaire’s advice to focus all our attention on victory; instead we indulge in self-criticism over how, time and again, we fail. Mention the word power and we will almost intuitively think of the political class and our revulsion for this profession. We prefer to believe that media-savvy opinion makers control the political agenda. It is tempting to think that content, and not form, determines our lives. Those of us who publicly discuss protocols are easily dismissed as cynical techno-determinists or boring bureaucrats. The standard height of a computer table is 72 cm. But who gets bothered about that? Isn’t it the quality of the work that comes out of the computer on that very table which counts? An easy-on-the-eye font for a novel is nice enough, but what really counts is the writer’s gift for entertaining us.

For many years, philosophers have been casting doubt on the common assumption that meaning and signification are the primary human response mechanisms to the world. If we wish to understand anything about how our complex technical society is made up, we must pay attention to the underlying structures that surround us, from industry norms to building regulations, software icons and internet protocols. Yet our ordinary understanding of the world resists this very idea. If we call for another society, with more equality and style, it is not enough to think differently; the very framework of that thinking must be negated and overturned. Or implode, vaporize, fade away (if you are in Baudrillard mode). A ‘true’ revolution in today’s technological society is not one where politicians are replaced but one where the very standards and protocols of the system as such are overthrown, or at least put into question. Media experts are already aware of this. If you want to make a lasting contribution that makes a substantial difference, you will have to design the standards for communication. It is not enough to unleash a Twitter revolution: you have to develop, and own, the next Twitter platform yourself. This is the politics of the standard: those who are able to determine the outline of the form will, like no others, determine tomorrow’s world.

‘Protocol Now!’ We cannot deal with the unbearable truth that techno-determinism confronts us with. Reducing the world to rules that rule us shuts down the imagination and turns the otherwise docile and routine-minded Western subject into a rebel. Protocols, so we fear, cannot be questioned and are looked down upon as religious rituals that have lost their original social context. What remains are empty, meaningless gestures. Why is it that only a few attach any belief to the essence that hides behind our technical infrastructures? Guided by the real-time attention economy we are so easily distracted, moving from one surface to the next. The very idea that, when it actually comes down to it, a closed company of technocrats decides our window on the world should be cause for concern. It is not supposed to be the HD camera or the animation program that makes a film good or bad but the creative skills of the filmmaker to tell the story in such a way that we immediately forget the technical details. At least this is how we are repeatedly inclined to think in this Age of the Amateur.

Who really understands the degree to which the browser decides what we get to see on the internet?1 Who will finally map the influence that giants like Google, Apple, Facebook and Microsoft have on our visual culture? Marshall McLuhan’s 1960s statement that ‘the medium is the message’ remains a misunderstood speculation, which has proved not untrue but rather irritating in its banality. The attention paid in the media to background standards and protocols is minimal. Instead, we gaze starry-eyed at the whirlwind lives of celebrities and the micro-opinions of presenters, bloggers and commentators. It is this sort of interference that reassures us. But when will the discomfort with the artist as an ‘eye candy maker’ actually emerge?

In his 2004 book Protocol, New York media theorist Alexander Galloway introduced a critical theory and humanities reading of the protocol concept. The book addressed a young audience of geeks, artists, scholars and activists who were not primarily interested in legal issues, or the bureaucratic side of technology for that matter.2 Even though only part of this reworked PhD thesis deals with the topic itself, the book quickly became popular, the main reason being that the author asked the often-heard question of how control can exist in distributed, decentralized environments. This interest coincided with the popularity of the ‘network of networks’ image of the internet, with its self-correcting automated processes and autopoietic machines that never stop producing and are so efficient precisely because there is no top-down interference from a Big Brother. The trick here is to read the liberating prose of the engineers and IT management gurus against the grain: the autonomous nodes are the new manifestation of power.

For Galloway, ‘protocol refers to the technology of organization and control operating in distributed networks’.3 Ever since Galloway’s book made the rounds in new media theory and arts circles, there has been increased talk about ‘protocological’ power inside network cultures. The term is used to open up a dialogue over forms of power where we see no power. Protocol is introduced not so much as a fixed and static set of rules but rather to emphasize processual flows. In this view protocols create the shape of channels and define how information is embedded in social-technical systems.

The protocolization of society is part of a wider transition from domination to hegemony, as Jean Baudrillard explains in The Agony of Power.4 In the old situation domination could only be reversed from the outside, whereas under the current regime hegemony can only be inverted from the inside. ‘Hegemony works through general masquerade, it relies on the excessive use of every sign and obscenity, the way it mocks its own values, and challenges the rest of the world by its cynicism’.5 Protocols are the invisible servants of the new Soft Power. In order to resist and overcome the ruling protocols, or should we say the protocological regime, Baudrillard states that we cannot go back to the negative and should instead plunge into the ‘vertigo of the denial and artifice’. Leaving behind the theatrics of refusal, we can start exploring ‘total ambivalence’ as a strategy for overcoming the protocological determination of resistance.

If we look at contemporary internet culture we quickly notice that ‘Web 2.0’ and social media are run not by artistic windbags but by average IT engineers whose job it is to implement software according to the given protocols. The internet is part of a popular culture that centres its values on ‘the crowds’ and is not the least bit interested in the early 20th-century activities of the avant-garde. The ‘cool’ image of the creative IT clusters should not confuse us. The unprecedented exercise of power by these start-ups is no longer part of a conspiracy. What is being executed here are MBA scenarios in which venture capitalists have the final say. Control 2.0 is no longer centred around individuals and their ideologies; it is decentralized and machine-driven; some would even call it a topological design of continual variation. This makes it more difficult to identify who really calls the shots. It seems that power is no longer in the hands of people, but manifests itself in software-generated social relationships, surveillance cameras and invisible microchips. So what can we do with this insight? Does it make us depressed because we do not know where to start, or rather joyful because we can simply hack into their leaking wikis and delete data, without harming humans, in case we want to take over? Or would we rather dismantle power as such? In that case, would it be possible to sabotage the very principle of ‘protocol’ itself? Is it sufficient to openly display the dysfunctionality of the system, or should we also be expected to come up with a blueprint of viable alternatives before we attack?

‘Power to the Protocol’. Working out who defines and manages technological standards has become a new method of power analysis. ‘Protocol’ once referred to the leaf bearing verification and date glued to a papyrus roll. Now ‘protocol’ has been promoted to a decisive collection of ambivalent and implicit rules around which today’s complex societies revolve. The protocol meme has turned cool and indicates that you understand how the shop is run ‘after control’. How can we get a grip on the invisible techno-class that prescribes these rules? Is it sufficient to urge participation? Demonstrating the undemocratic character of closed consultation is one thing, but are alternative models available? Is it sufficient to discover the holes and bugs in protocols? What do we do with our acquired insight into the architecture of search engines, mobile telephone aesthetics and network cultures? It is one thing to become aware of the omnipresence of protocols at work. But what to do with all these insights? These are also questions about the politics of knowledge production, and the production of new subjectivities.

Complex problems (human rights violations, climate change, border disputes, migration control, labour management, informatization of knowledge) hold the capacity to produce trans-institutional relations that move across geo-cultural scales, and this often results in conflicts around the status of knowledge and the legitimacy of expression. A key reason for such conflicts has to do with the spatio-temporal dynamics specific to sites – both institutional and non-institutional – of knowledge production. Depending on the geo-cultural scale of distribution and the temporality of production, knowledge will be coded with specific social-technical protocols that give rise to the problem of translation across milieus of knowledge. This is not a question of some kind of impasse in the form of disciplinary borders, but a conflict that is ‘protocological’.

The most frequently used example here is the internet and its TCP/IP protocol, but in their book The Exploit: A Theory of Networks Alexander Galloway and Eugene Thacker also include DNA and biopolitics in their analysis of protocological control.6 Another example could be the global logistics industry, whose primary task is to manage the movement of people and things in the interests of communication, transport and economic efficiencies. One of the key ways in which logistics undertakes such work is through the application of technologies of measure, the database and the spreadsheet being two of the most common instruments of managerial practice. In the case of cognitive labour, the political-economic architecture of intellectual property regimes has prevailed as the definitive instrument of regulation and served as the standard upon which the productivity of intellectual labour is understood. This is especially the case within the sciences and increasingly within the creative industries, which in Australia and the UK have replaced arts and humanities faculties at certain universities.

There are, however, emergent technologies of both labour management and economic generation that mark a substantial departure from the rapidly fading power of IPRs, a power predicated on state systems enforcing the WTO’s TRIPS Agreement – something that doesn’t function terribly well in places like China, with its superb economies of piracy, or in many countries in Africa, where generic drugs are subtracting profits from the pharmaceutical industry and its patent economy.7 Intellectual property rights are no longer the site of real struggle for informational labour, although they continue to play a determining role in academic research and publishing when connected to systems of measure, such as global university and journal rankings, ‘quality assurance’ audits of ‘teaching performance’, numbers of international students, etc. In the age of cognitive capitalism, new sites of struggle are emerging around standards and protocols associated with information mobility and population management in the logistics industries. Key, here, is the return of materiality to computational and informatized life.

Like protocols, standards are everywhere. Their capacity to interlock with one another and to adapt to change over time and circumstance is key to their power as non-state agents of governance in culture, society and the economy.8 Standards require a combination of consensus and institutional inter-connection (or hegemony) in order to be implemented through the rule of protocols. In this way, one can speak of environmental standards, health and safety standards, building standards, computational standards and manufacturing standards whose inter-institutional or technical status is made possible through the work of protocols. The capacity of standards to hold traction depends upon protocological control, which is a governing system whose technics of organization shape how value is extracted and divorced from those engaged in variational modes of production.

But there can also be standards for protocols. As mentioned above, the TCP/IP model for internet communications is a protocol that has become a technical standard for internet-based communications. Christopher Kelty notes the following on the relation between protocols, implementation and standards for computational processes: ‘The distinction between a protocol, an implementation and a standard is important: Protocols are descriptions of the precise terms by which two computers can communicate (i.e., a dictionary and a handbook for communicating). An implementation is the creation of software that uses a protocol (i.e., actually does the communicating; thus two implementations using the same protocol should be able to share data). A standard defines which protocol should be used by which computers, for what purposes. It may or may not define the protocol, but will set limits on changes to that protocol’.9
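Kelty’s triad can be made concrete with a deliberately toy sketch of our own (nothing below comes from Kelty or from TCP/IP itself; the line-uppercasing rule, the port number and the function names are invented for illustration). The protocol is the agreed wire rule – send one UTF-8 line ending in ‘\n’ and receive the same line back in upper case – while the two Python functions are merely one possible implementation of that rule; a standard would be the further, external decision that, say, every service on a given port must speak the rule.

```python
# Toy wire rule (the 'protocol'): client sends one UTF-8 line terminated
# by "\n"; the server replies with that line uppercased. The code below is
# one 'implementation' of the rule; a 'standard' would be an external
# decision mandating the rule, e.g. "all services on port 7777 speak it".
# The rule, the port and all names here are hypothetical, for this sketch.
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 7777

def serve_once() -> None:
    """Server-side implementation: answer a single connection, then exit."""
    with socket.create_server((HOST, PORT)) as srv:
        conn, _ = srv.accept()
        with conn:
            data = b""
            while not data.endswith(b"\n"):   # read until the line terminator
                chunk = conn.recv(1024)
                if not chunk:
                    break
                data += chunk
            # the transformation the protocol prescribes
            conn.sendall(data.decode("utf-8").upper().encode("utf-8"))

def client(line: str) -> str:
    """Client-side implementation of the same wire rule."""
    with socket.create_connection((HOST, PORT)) as sock:
        sock.sendall((line + "\n").encode("utf-8"))
        return sock.recv(1024).decode("utf-8").strip()

if __name__ == "__main__":
    threading.Thread(target=serve_once, daemon=True).start()
    time.sleep(0.2)  # crude pause so the server is listening before we connect
    print(client("protocols are rules, implementations are code"))
    # -> PROTOCOLS ARE RULES, IMPLEMENTATIONS ARE CODE
```

Swap either function for an independent implementation in another language and, so long as the wire rule is honoured, the two sides still interoperate: the power sits in the rule and in whatever standard mandates it, not in any single codebase – which is precisely why Kelty stresses that a standard sets limits on changes to the protocol rather than to the code.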

A curious tension emerges here between the idea of protocols as new systems of control and standards as holding the capacity to limit that control. Without the formality of ‘cold’ standards, the ‘warm’ and implicit, indirect power of protocols is severely diminished. Standards, in other words, are the key site of the politics of adoption. Herein lies the political potential for the Revenge of the Masses. Social media serve as a good example: their strength is only as good as their capacity to maintain a hegemony of users. Think of what happened to MySpace once Facebook and Twitter took off in 2006 as the preferred social media apps. Having paid a crazy $580 million in 2005, Murdoch’s News Corp dumped its toxic asset for a paltry $35 million in June 2011, contributing to a 22% fall in profits in the quarter to June. The effective collapse of MySpace signals that while masses might not build standards at a technical level, they certainly hold a powerful shaping effect that determines whether a standard becomes adopted or not.

The next step is to decide on, if not invent, our own standards and protocols in the world of social media. Let’s move beyond the dependency on Google, Facebook and Twitter for political organization. Recently we saw what happened during the London student protests against government funding cuts to education – numerous activist groups were deactivated and user accounts suspended by Facebook administrators without advance warning. There are distributed social media alternatives out there, and activists do need to be aware of the political implications of assuming communication protocols established by corporate media. Once the protocological layer of rules is established, the political work of building affiliations around standards begins. How to organize the distributed and often conflicting interests of users around the work of creating robust standards is a key challenge for the next decade, in which we will see major battles between large-scale monopolies, increased state control and decentralized, networked initiatives.

When our research projects and knowledge production become fragmentary exercises that cleave open intermittent gaps in time both orchestrated and unforeseen, we require a catalogue of standards to help maximize research outputs and amplify the experience of intellectual and social intensity. Various models have been tested in recent years, including the enormously popular speed-dating phenomenon of Pecha Kucha, which celebrates intellectual emptiness in design, fashion and architecture circles with 20 slides shown for 20 seconds each. At the high end of the tech-design scale, the TED Talks aim to populate the world with Silicon Valley inspiration. Critical research on network societies and information economies also needs to generate its own standards.

The design concept and practice of ‘seriality’ offers one technique and strategy of ‘total ambivalence’ with respect to organizing networks in ways that establish autonomous standards in this protocological society of control. As a term, seriality suggests some kind of correspondence with standardization as critiqued by Adorno and Horkheimer in their essay on the ‘culture industry’.10 The standardized distribution and production techniques in the film industry were seen by Adorno and Horkheimer as an industrial rationale to address the organization and management of consumer needs and the desire for uniformity – or what in the education industry today is referred to as ‘quality assurance’ – in the experience of cultural consumption. But we need to distinguish seriality from the Frankfurt School critique of standardization and industrial production, which is typically seen as synonymous with the Fordist assembly line and the production of undifferentiated docile subjects.

Seriality is a line to future possibility. McLuhan knew this well with his concept of probes. The line for us is not about orthodoxy or conforming to dominant protocological rule. When we deploy the term we do not invoke the well-known phrase ‘toe the line…’. Instead, the line is a connecting device that facilitates production across otherwise disaggregated networks, which are so often compelled to start from scratch when they begin a new venture or undertaking. The line enables the production of standards from a core, which is not equivalent to network metaphors of a hub with nodes or a parent company with subsidiary corporations.

A design core is a distributed accumulation of practices, skills, lessons, capacities, connections, concepts, strategies and tactics that build on collective experience over time. The design challenge is to bring these variational elements into relation in such a way that they can be communicated and develop into standards. This is where organization enters, in order for seriality to find its line of continual differentiation. Computer engineers are often tasked with this challenge, but typically they step into a project, tinker with the code and then leave for the next job. For some weird reason they are not seen often enough as crucial to the continuum of network cultures.

Serial design is not system design or industrial design. System design defines the parameters, standards and modularity necessary for product development. Industrial design is primarily about scale, inasmuch as aesthetics and usability are integrated in order to maximize extension. Serial design, on the other hand, also brings aesthetics and usability into a constituent relation, but it is not so much about scale itself. We do not need to expand our networks infinitely or extend them across space. We are not even talking about global networks. Instead, it is the problem of time that seriality engages in the time-starved universe of informational economies and cultures. We are all so pressed for time that we download the next widget in a hopeless gesture toward time-saving devices.

Hacklabs, barcamps, unconferencing, book sprints, mobile research platforms – these are all formats that, through the work of seriality, have become standards for network cultures. Combining online and offline dimensions, they are designed to maximize techno-social intensity and to collectively develop products and accumulate experiences in a delimited period of time. Their hit-and-run quality might give the appearance of some kind of spontaneous flash-mob style raid, but in fact they are carefully planned weeks, months and sometimes years in advance. Despite the extended planning duration and intensive meeting space of these formats, they are notable for the way in which they occupy the vanguard of knowledge production. Only five or so years down the track do we see the concepts, models and phenomena of these formats discussed in academic journals, by which time they have been drained of all life. A key reason for this has to do with the flexibility these formats retain in terms of building relations and taking off in unexpected directions as a result of the unruliness of collective desires. We might see this as the anti-protocological element of network cultures.

Nevertheless, the problem of sustainability still plagues the work of organization across network cultures. Certainly, the practice of seriality goes a long way to addressing this problem. But we need to advance the discussion and practice of seriality by connecting it to the collective development of standards. Only then will we begin to formalize seriality as a distributive practice that can sustain network cultures as new institutional forms. Once that happens, the conflict surrounding the hegemony of protocological regimes will come into full swing.

The hard reality once preached by the historic avant-garde is still valid, no matter how disastrous the implementation of utopian programmes may have been. There is an increasing number of artists who have the ambition to sketch the framework of society. They design new rules and do not simply produce cool design. What we must look for are the contemporary variants of Google. This media giant, with internet pioneer and former ICANN chairman Vint Cerf (jointly) ruling in the background, is a perfect example of how economic, political and cultural power can be built up using technical laws (algorithms). We can do that as well. The connected multitudes, now in their billions, have reached the end of a long period in which the workings of power first had to be understood and subsequently dismantled. What we are designing now are new spaces of action. Before we concentrate on open standards for living, work and play, we should open a public debate about this matter. Can the loose networks of today organize themselves in such a way that they set the rules for tomorrow’s communication? Yes We Can: Set the Standard.

* This essay will be published in Pieter Wisse (ed.), Interoperabel Nederland, The Hague: Forum Standaardisatie, 2011.

  1. See Konrad Becker and Felix Stalder (eds), Deep Search: The Politics of Search Beyond Google, Innsbruck: Studien Verlag, 2009.
  2. Alexander R. Galloway, Protocol: How Control Exists after Decentralization, Cambridge, Mass.: MIT Press, 2004. A related title, often read in contrast to Galloway, is Wendy Chun, Control and Freedom: Power and Paranoia in the Age of Fiber Optics, Cambridge, Mass.: MIT Press, 2006 (as Amazon.com says, ‘frequently bought together’). An earlier, sociological approach to the global politics of internet governance, and domain names in particular, is given by Milton Mueller in Ruling the Root: Internet Governance and the Taming of Cyberspace, Cambridge, Mass.: MIT Press, 2002.
  3. Alexander Galloway, ‘Protocol’, Theory, Culture & Society 23.2-3 (2006): 317.
  4. Jean Baudrillard, The Agony of Power, trans. Ames Hodges, Los Angeles: Semiotext(e), 2010.
  5. Ibid., p. 35.
  6. Alexander R. Galloway and Eugene Thacker, The Exploit: A Theory of Networks, Minneapolis: University of Minnesota Press, 2007.
  7. See the fascinating work of Melinda Cooper, who has been studying the economy and geopolitics of clinical labour trials within the pharmaceutical industries – the rise of which can partly be seen as a way of offsetting profits lost from the diminishing returns availed through IPRs as a result of the increasing availability of generic drugs, which in turn can be understood as a sort of pirate economy that even intersects with aspects of open source cultures. Melinda Cooper, ‘Experimental Labour – Offshoring Clinical Trials to China’, East Asian Science, Technology and Society: An International Journal 2.1 (March 2008): 73-92.
  8. See Martha Lampland and Susan Leigh Star (eds), Standards and their Stories: How Quantifying, Classifying and Formalizing Practices Shape Everyday Life, Ithaca: Cornell University Press, 2009.
  9. Christopher M. Kelty, Two Bits: The Cultural Significance of Free Software, Durham: Duke University Press, 2008, p. 330n28. Available at: http://twobits.net
  10. Theodor Adorno and Max Horkheimer, ‘The Culture Industry: Enlightenment as Mass Deception’, in Dialectic of Enlightenment, trans. John Cumming, London: Verso, 1979.