
Mediations of Labor: Algorithmic Architectures, Logistical Media, and the Rise of Black Box Politics

By Soenke Zehle and Ned Rossiter

Logistical Media and the Second Machine Age
In a recent panel on living and dead labor at a conference in New York City, respondent Doug Henwood delivered a series of salvos on why he finds cultural theorists so deficient in their comprehension of contemporary labor struggles.1 Declaring himself a “vulgar Marxist” interested in “the world that actually exists” as distinct from “people lost in the idea of artisanal labor and mental labor,” Henwood proceeded to invoke a barrage of statistics to assert the centrality of production and manual labor to the economy and political thought: “there are something like 10-12 million manufacturing workers still in the United States, manufacturing production is up something like 25% since the depths of the recession in this country. Things are still very, very important. There are more truck drivers in the United States than there are computer programmers, there are something like 2 million people working for Walmart doing very physical things. Their bodies are ruined by the job very often.” Drawing on the US Bureau of Labor Statistics, Henwood went on to note that the top ten jobs in the US over the next decade are expected to be in the service industries (cashiers, food service workers, home health aides). “For the first time in history the majority of the population consists of wage earners. The world has become entirely proletarianized even if we think we are working for Google.” By Henwood’s reckoning, only two companies are making money off “post-material” activities—Google and Facebook.

Clearly incensed by this thinly veiled attack, fellow panelist McKenzie Wark charged Henwood with not being vulgar enough. He wondered why Henwood did not want to explore how material and “post-material” registers of work mesh to turn manual into mediated labor: “the number one company making money out of data is not Google, it’s Walmart. Walmart is a data company…. Logistics determines where the humans move. That’s where all the power is.” What Wark found missing from Henwood’s account was a comprehension of how the incorporation of “proletarian” labor into logistical infrastructures rests upon the work of computer programmers who design the algorithms that make infrastructure operational, and thus direct workers on where to go and what to do.

Historically, unions have organized around the trade-off between wage increases and technology-based productivity gains. This classical approach to modern labor organization has been under assault since the advent of neoliberal forms of governance and the globalization of manufacturing and service industries. It is further undermined by the delinking of productivity gains from algorithmically managed forms of automation, which registers the disappearance of the organizational form of the factory and the rise of a “second machine age.”2 This is not to say factories disappear (obviously they haven’t), so much as to note that the terrain for labor organization has become highly obscure as a result of the incorporation of the algorithmic “black box” into the infrastructures of production and distribution. We know that a plethora of apps and interfaces (Uber, TaskRabbit, Amazon Mechanical Turk) for on-demand services enable forms of work and consumption in highly flexible and low-cost ways.

This dispute over the boundaries of labor and the effects of work on the laboring body is symptomatic of the difficulty of establishing a shared vision of what the focus of a politics of labor could and should be. This is not merely a matter of academic debate, of course; reflecting the emotional commitments that sustain a wide range of conceptual, empirical, and organizational activities, the intensity of the exchange also reminds us that the idioms we employ become part of how we relate, and perhaps more importantly, what modes of relation, cooperation, and solidarity we are able to envision. We invoke the conflictual subject of labor here, however, to highlight the need to explore the machinic capture of value through logistical infrastructures, cooling the anger of miscommunication to fuel the cold wrath of a different politics of labor.

Walmart, for example, is already building application programming interfaces (APIs) for developers contributing to its emerging social media strategy. The retail giant also operates like a software company, offering average salaries to software engineers that are higher than those of Amazon, Facebook, Intel, or Microsoft.3 Through the “Microbusiness Tracker,” a first-of-its-kind survey of businesses with fewer than five employees conducted quarterly in cooperation with Gallup, Walmart’s Sam’s Club “explores the emotional and economic concerns of the country’s smallest businesses,” providing “valuable insight into the mood and mind of microbusiness owners for decision makers and influencers in government and commerce by highlighting the unique concerns and challenges of microbusinesses along their path from startup to maturity.”4 Gallup features this aggregation of “emotional and economic concerns” as part of its behavioral economy strategy: “The Behavioral Economy gives you unrivaled insights into the state of mind of U.S. consumers, workers, and job seekers, and the health of the U.S. economy, based on behavioral economic metrics Gallup tracks and analyzes every day.”5 Big data not only allows Walmart to streamline its operations, but to act as a data-driven business association and target its philanthropic activities.6

Walmart actively instructs its managerial staff to detect and discourage worker organization, and its hostility toward collective action has frequently led to charges of workplace violations being filed with the National Labor Relations Board; it remains to be seen whether a new NLRB ruling will make it easier to organize subgroups of workers at the largest US private employer.7 Benefiting from a weak sustainability discourse that continues to separate ecological and social justice, efforts by Walmart to “green” its supply chains (selling electronics compliant with the European Union’s Restriction of Hazardous Substances (RoHS) directive, for instance) continue to proceed without a fundamental commitment to establishing fair labor conditions among its global suppliers.8

Above and beyond local organizing efforts, we believe that the mediatization of labor calls for more than a critique of automation, outsourcing, or the economies of capture that complement traditional managerialisms of command-and-control.

Semiotic Capitalism / Terminal Subjectivity
Maurizio Lazzarato suggests that we look at the trader rather than the truck driver as exemplary of contemporary labor subjectivity.9 Labor in finance and retail is enmeshed with the subjectivity of the machine in ways that facilitate our comprehension of the role of informatization in the structural transformation of labor more generally. Lazzarato conceptualizes the machine as something other than a tool, “which makes the machine an extension and projection of the human being.”10 Machines are neither characterized by structural or vitalist unity, nor do they turn us away from Being. Instead, machines are assemblages, operating below and above our levels of cognition and perception: “In a machine-centric world, in order to speak, smell, and act, we are of a piece with machines and asignifying semiotics.”11 Thus, within the machinic subjectivity of the financial trader, “each threshold it crosses to make a decision, to express an evaluation, and to indicate a price, subjectivity has no choice but to rely on machines, asignifying writing systems, and information codified and produced by mathematical instruments.”12 The signs in these languages of infrastructures are not simply directed at the trader, but also at other machines.13 While economic thought accounts for the agency of the trader in terms of mimetic behavior, mimesis cannot be reduced to a linguistic, communicative, or cognitive rationality: “Mimetic communication occurs through contagion and not through cognition.”14 The machinic subjectivity of the trader operates like an interface: “Instead of a rational subject who controls information and his choices, homo economicus is a mere terminal of asignifying, symbolic, and signifying semiotics and of non-linguistic constituents which for the most part escape his awareness.”15

To attend to the diagrammatic operations of machines neither returns us to a romantic vision of pre-individual subjectivity nor to the analytical horizon of cognitive capitalism. Instead, such a focus brings into view the role of asignifying semiotics in the organization of labor, left largely unexplored in sociologies of work and industrial psychologies. In Lazzarato’s diagrammatic pragmatics, even spoken injunctions become operational through the support of asignifying semiotics: “They do not first address the ‘I’ of the ‘salaried’ individual. They set off operations while bypassing consciousness and representation.”16 Rather than “move to ever-greater abstraction” to describe the role of managerial languages, “we would do better to analyze the way in which asignifying semiotics are increasingly used” in a mixed semiotics (asignifying, symbolic, signifying) to secure the control of labor as one of the elements in a process.17 This helps us understand the plans being hatched in Walmart’s Bentonville headquarters where “social physics” and behaviorism are inscribed in the company’s vision of the Internet of Things (IoT) and its attendant economy of data analytics.18

Machinic Arrangements and the Internet of Things
Much like other players in the IT sector such as Google, IBM, and electric vehicle manufacturer Tesla Motors, the move by Walmart toward an “open source” patent policy in the name of innovation is also a strategy to guard against potential future litigation such as patent infringement claims.19 As legal scholar Dennis Crouch has observed, “these pledges are intended to assure the market that the pledged patents will not be used to disrupt or hinder the adoption of market-wide interoperability standards or open technology platforms.”20 Positioning itself like a software start-up, Walmart is trying to reconnect to its early days as a technology pioneer: its interest in the efficiency gains of labor-saving technologies made it an early adopter of computerization and logistical media more generally.21 For it is indeed the case that algorithms are machines. The cybernetic feedback of data into algorithmic architectures designed to extract value, orchestrate efficiencies, and optimize productivity is a machinic arrangement specific to logistical media. Less a metaphor than a computational assemblage intersecting the activity of labor with infrastructural settings such as warehouses, data centers, transport systems, and shipping ports, the algorithmic apparatuses of logistical media extend to the IoT.

Once interoperability across sectors has been achieved (what business management understands as vertical integration and what political economists term monopoly control), firms and institutions have the capacity to harvest, aggregate, and analyze data from a range of sources and devices otherwise beyond their reach. Yet the more likely scenario for logistics—since it is already the dominant reality—is not one of increasing interoperability, but rather one of non-interoperability. Whether it is protocological conflicts, labor disputes, infrastructure failures, or supply chain blockages, logistical failures have their own generative effects. This is especially so in capitalist economies where monopoly interests remain a primary motivating force for corporate expansion. The logistical assemblages of global payment systems are examples of an IoT already in operation, and of the new non-interoperabilities IoT actors are likely to create in order to structure and sustain their respective business models.22

Regulation will continue to have a role in encouraging or constraining IoT technologies. Examples can be found in the forging of standards across mobile payment systems. Based on near-field communication standards, the release of Apple Pay coincided with a new financial regulation that requires merchants to support electronic EMV credit cards (Europay, Mastercard, and Visa) or risk liability for fraudulent card activity and identity theft, forcing merchants to invest in new payment infrastructure. Such developments in gesture-based technologies instantiate regimes of “lifestream logistics” whereby labor (as gesture) is modulated through an ever-expansive and integrated network of relations optimized in the interests of economic efficiencies and the extraction of value.23

Central here is the increasing agency of technical objects and algorithmic apparatuses. Within more formalized workplace settings, customer relationship management and enterprise resource planning software systems are installed at enormously high costs for organizations with the promise of increased workplace productivity and more efficient supply chains.24 Across this spectrum of networked software, activity is measured and made productive. Data in effect is put to work. The mediation of technical, social, and corporeal relations undergirded by such a multiplication of actors and agency presents challenging questions for the organization of labor, a key one being how labor movements will struggle against the invisible.

Black Box Politics
This returns us to a politics of the conjuncture in which the algorithmic operations of high frequency trading (HFT), for example, provide an insight into the infrastructural politics of data, which is also a politics of labor. It is in the world of HFT that we move from the mediatization of labor in and through logistical media to an extension of mediation and enmeshment that goes beyond the old automation-of-labor view debated by Henwood and Wark. Moreover, within HFT we see what is perhaps the sharpest operation of algorithmic autonomy, prompting the need for a black box politics.

Popular trade press books such as Michael Lewis’s much-acclaimed Flash Boys, or even movies like Moneyball (2011, based on an earlier book by Lewis), Margin Call (2011), and The Wolf of Wall Street (2013), all highlight a form of computational decisionism.25 In these narratives, code is sovereign. Lewis’s Flash Boys goes a step further than these cinematic portrayals, detailing instead the materiality of infrastructure, manifest in the physical proximity between the computers of HFT firms and the exchanges’ co-location data centers. Within the “dark pools” of speed and anonymity that provide a trading advantage registered in milliseconds, HFT systems operate as agents of interception.26 Listed as tradable stocks, the fortunes of social media and IT companies such as Facebook, Twitter, and Google are not independent of such transactions. In noting this conjuncture, we suggest that analyses of “free labor” and the social production of value must turn attention toward the ways in which the subject of politics (labor made into data) within such machinic arrangements becomes indistinguishable from any number of objects indexed as nothing more than asignifying signs activated for exchange within the apparatus of high frequency trade.

A labor theory of value must adequately address abstractions of algorithmic capitalism to encompass machinic labor that produces data (e.g. stocks in pension funds) whose operative logic and capacity to produce effects is necessarily beyond comprehension due to the complexity and speed of exchange. These are occasions when society is confronted by unintelligible cybernetic operations. As the legendary science fiction writer and theorist of technology, Stanislaw Lem, has proposed: “The designer’s task will be to build a ‘black box’ which performs the necessary regulation.”27 Such work suggests a potential form of intervention into labor processes and financial transactions governed by algorithmic apparatuses.
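Lem’s regulator can be glossed with a toy sketch in code (entirely our illustration, not Lem’s own example; the plant, gain, and setpoint below are invented): a cybernetic regulator steers an opaque system toward a target through error feedback alone, never holding a predictive algorithm for the system it regulates.

```python
def regulate(plant, setpoint, steps=200, gain=0.1):
    """Drive an opaque `plant` toward `setpoint` by pure error feedback.

    The plant is treated as a black box: the regulator never inspects
    its internals, it only observes outputs and nudges its own signal.
    """
    control = 0.0
    output = plant(control)
    for _ in range(steps):
        error = setpoint - output
        control += gain * error  # feedback, not prediction
        output = plant(control)
    return output

# A stand-in for the system whose inner workings the regulator never sees.
def opaque_plant(u):
    return 3.0 * u - 1.0

result = regulate(opaque_plant, setpoint=10.0)
print(round(result, 3))  # settles at 10.0 without any model of the plant
```

The regulation succeeds, but nothing in `regulate` encodes what the plant is or what it will do; in Lem’s terms, the necessary regulation is performed without the “ready-made program that predicts everything in advance.”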

Yet Lem is much more circumspect when it comes to complex systems such as the brain and society, whose non-repeatable actions are beyond symbolic representation (in the form of an algorithm) and therefore elude technologies of prediction and preemption. For Lem, the algorithm exists twice: first on paper, as the technologist’s theory, and then in real life, as a course of action. The movement from algorithm to action is the work of translation from the plan “into a series of material activities.”28 The algorithm (as black box) is thus rendered visible as a material trace. One might then propose that all material acts can be reengineered to reveal the black box of their making. But this would be to assume a linear process rather than the complex non-linear dynamics of feedback inherent to the concept of cybernetics.

In the case of repeatable events such as the buying and selling of shares on stock markets, the predictive technology of the algorithm in conjunction with infrastructures of speed foregrounds the invisibility of time registered within the instance of action.29 Depending on the scale of measure, events such as the 2010 “flash crash” can be made to disappear.30 As Eric Scott Hunsader’s analysis has demonstrated, if one accounts for the activity of HFT in a minute-by-minute aggregation of financial data rather than in milliseconds, then it can plausibly be said that the crash never happened (unlike the much more gradual unfolding of the stock market crash of 1929).31 If even the brightest computer programmers and developers of algorithms for financial markets are unable to account for the cause of the 2010 crash, where does this leave analyses of the structural transformation of the technical object as a site of political struggle?32 This is the problematic facing a politics of the black box.
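Hunsader’s point about temporal scale can be replayed as a toy resampling exercise (the price series below is fabricated for illustration and bears no relation to the actual Nanex data): a collapse lasting a handful of ticks dominates the fine-grained record, yet nearly vanishes once the same series is averaged into coarse buckets.

```python
def aggregate(prices, bucket):
    """Average consecutive non-overlapping `bucket`-sized windows."""
    return [sum(prices[i:i + bucket]) / bucket
            for i in range(0, len(prices) - bucket + 1, bucket)]

# 600 ticks of a stable price with a brief six-tick crash in the middle.
ticks = [100.0] * 600
for i in range(270, 276):
    ticks[i] = 40.0  # a 60% plunge lasting only a few ticks

fine = aggregate(ticks, 1)     # tick-level view: the crash is unmissable
coarse = aggregate(ticks, 60)  # 60-tick buckets: the crash almost vanishes

print(min(fine))    # 40.0: the fine-grained record bottoms out at the crash price
print(min(coarse))  # 94.0: the worst coarse bucket dips a mere 6%
```

Whether the crash “happened” in the record is thus a function of the bucket size chosen by the analyst, which is what makes the scale of measure a political rather than a neutral parameter.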

From Anonymity to Parametric Politics
When loyalty cards proliferate in our virtual wallets, when coupon systems and location-based services are coupled with payment apps that track our patterns of consumption, we begin to get a sense of how shopping experiences are designed around economies of capture. To refuse is perhaps to miss out on that sweet feel of the discount, but at least we get a fleeting sense of having preserved our anonymity. Indeed, anonymity becomes a key algorithmic gesture, conceptual figure, and technical mechanism through which we might begin to design a black box politics within the horizon of logistical media.33 For to be anonymous renders the black box inoperable.

Beyond gestures of anonymity, a more engaged form of resistance might be found in parametric politics, which addresses the role of the designer and acknowledges the battle lines erected by technical standards more generally. The notion of parametric design relates, for example, to the distributed practices of maker movements; the maker enthusiasm sustains more than a hands-on version of the digital-humanities prettification of machinism (allowing us to respond with a near-infantile faith in the scalability of our fabrication exercises, as represented in the 2014 film The Lego Movie). We also see a re-elaboration of concept-making coupled with strategies of concept-distribution. An Arduino board (an inexpensive microcontroller for designing interactive devices) contains so much media theory (and increasingly DIY financial technology) that it makes perfect sense to use it to facilitate design-thinking inspired policy debates across institutional divides, for instance. And if craft indeed takes command (beyond algo-storification start-ups such as Narrative Science, following in the footsteps of content farms such as Demand Media, as Morozov has it), there is more at stake than a depoliticized app-fetishism by humanities actors unable to reframe their individual and institutional agency beyond an assumed indispensability of the liberal arts.

Instead, maker-cultures offer a way of comprehending the structural transformation of the technical object, keeping in mind, to paraphrase Spinoza, that we don’t fully know what an object can do.34 As noted in a recent algorithmic accountability report, for example, “What we generally lack as a public is clarity about how algorithms exercise their power over us.”35 We should attend to their agency in terms of operations such as “prioritization, classification, association, and filtering”—as exemplified in an inquiry of a teachers’ union into the models used to calculate output scores, based on the documentation of rankings and scores obtained through a Freedom of Information Law (FOIL) request.36 Another example is the entity analytics in IBM’s InfoSphere Identity Insight software, which is used by governmental social service management agencies to allocate “human” resources. In the report’s sober assessment of corporate and governmental transparency (disclosure) guidelines, watchdogging and whistleblowing remain as important as ever. But while US freedom of information laws also theoretically cover source code, the legal protection extended to trade secrets limits access: “Exemption 4 to FOIA covers trade secrets and allows the federal government to deny requests for transparency concerning any third-party software integrated into its systems.”37 Even if code is obtained, it may be too old to read without appropriate hardware—bringing both maker approaches and media-archaeological archival sensibilities into parametric politics. Similarly, the publication of source data makes sense quite literally only in combination with the algorithmic relations used to arrive at the interpretation of this data.38

Also understood as a new register of existing forms of inquiry and mobilization, the potential scope of a parametric politics is delimited directly by the restriction of rights to reverse-engineer algorithms in end-user license agreements or international treaties.39 As indicated by the case studies cited in the report—auto-complete algorithms designed to ignore specific search terms, price discrimination (different prices for different people) on e-commerce platforms, the presets in executive stock trading plans—this still seems rather subtle in terms of its analytical reach and political promise, assuming comprehensive computational literacies to access and translate transparency information. But if “minor politics is about engagement with the social relations that traverse us,” parametric politics will remain a minor politics: it intervenes in the protocols of self-constitution.40

Conclusion: Mediations of Labor
We have examined the mediation of media labor via infrastructure > algorithmic architectures, code > parametric politics, and content > semiotic capitalism. Our core argument is that media should not only be understood in terms of communication systems, but as a constellation that organizes the production of life and labor. Any analytical separation of infrastructure, code, and content necessitates a more differentiated sense of their interdependencies and multiple materialities via labor and logistics.41

To prioritize mediation is to acknowledge the structural transformation of media technology or the technical object, which “has itself become…problematic, if not obsolete … in the age of ubiquitous computing, ubiquitous media, intelligent environments, and so on.”42 “[M]edia studies can and should,” W. J. T. Mitchell and Mark B. N. Hansen contend, “designate the study of our fundamental relationality, of the irreducible role of mediation in the history of human being.”43 Mediation and the production of subjectivity are conjoined processes that undergird the operational logics of algorithmic capitalism.

To reflect on the logistical dimensions (and effects) of mediation involves attention to the study of infrastructures as “pervasive enabling resources in network form” that have to be understood in historical terms: “Understanding the nature of infrastructural work involves unfolding the political, ethical, and social choices that have been made throughout its development,” making infrastructures “a fundamentally relational concept.”44 To the extent that the mediation of labor involves databases, for instance, it remains linked to the history of the metrics of governance: “We are living the epoch of the database founded in the era of governmentality (late eighteenth century)—and all the claims that we see today about speed, time, and distribution have been with us since that epoch.”45 Understood in epistemological and ontological terms, infrastructures are part of the “matrices of experience” that structure and sustain the complex relationships involving others in the constitution of self-relations.46

In the case of labor, mediation includes more than the rise of automation, the informatization of work, or the expansion of logistical networks facilitating (and facilitated by) the structural transformation of the technical object. The idea of mediation resonates powerfully with the question of antagonism that has traditionally been at the heart of labor politics because it stresses the need to identify the sites where conflicts actually take place in societies in which the semiotization of labor calls into question the very distinctiveness of a politics of labor. Above and beyond critiques of unionism as a privileged political form to conceptualize and organize collective interventions to restructure labor relations, what is at stake is the epistemological and in fact ontological privilege we accord labor in relation to the singularity of human experience. By extension, this demands an interrogation of the aesthetic, economic, and political models we have built on this privilege.

The task will be to organize labor within algorithmic modulations of social relations and economic transactions, a computational paradigm that extracts value from diverse acts of non-labor within networked settings. Across this world of pervasive computing we find the twin challenge of the disappearance of the subject (of labor) coupled with the invisibility of the technical-institutional (machinic) object of study, politics, and intervention. Sites of manufacturing, production, and bodies in pain continue to exist, and have multiplied with the globalization of capital following the economic crises and social struggles since the late sixties. Yet how the operations of capital (production, distribution, exchange, labor power) organized through computational architectures and logistical models can move into the focus of political analyses and studies of contemporary labor is, it seems, significantly less well understood, as tried and tested mechanisms of making-visible don’t generate the representations on which such analysis could be built.

Following Deleuze and Guattari’s antisociological stance, Lazzarato suggests that the distinction between “dead” and “living” labor “is appropriate only from the point of view of social subjection” because “[m]achinic enslavement (or processes) precedes the subject and the object and surpasses the personological distinctions of social subjection.”47 Living labor can no longer be assumed to serve as horizon of emancipation: “Self-realization, identity formation, and social recognition through work have always been at the heart of the capitalist—and socialist—project itself.”48 What remains is the reorganization of the “logic of existentialization” (Guattari), including non-human vectors of subjectivation, through a parametric politics that engages new sites of struggle within the black box of algorithmic capitalism.

Paradoxically (and tragically), the site of struggle is at once ever present and nowhere to be found, in part because we have yet to figure out how to comprehend it as a site of algorithmic production where new forms of agency mesh with the bodies in pain whose labor power has also generated a surplus of (highly conventionalized) images. The challenge for media labor today is twofold: to organize without knowledge of what is inside the boxed world of algorithms, data centers, and parametric design, and to get inside and revolutionize black box politics.

Published in Richard Maxwell (ed.) The Routledge Companion to Labor and Media, New York: Routledge, in press.

Our special thanks to Richard Maxwell for his editorial recomposition of this chapter, and his patience.

  1. Panel with Richard Gilman-Opalsky, Todd Hoffman, Stevphen Shukaitis, McKenzie Wark and Doug Henwood, “Between Living and Dead Labor,” Living Labor: Ante-Conference Events, Department of Performance Studies, New York University, April 9-10, 2014.
  2. See Erik Brynjolfsson and Andrew McAfee, The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies, New York: W. W. Norton & Company, 2014.
  3. @WalmartLabs. See also Lisa Mahapatra, “Companies That Pay The Most For Software Engineers: Walmart Pays More Than Facebook And Microsoft,” International Business Times, October 21, 2013.
  4. Sam’s Club/Gallup Microbusiness Tracker: Quarterly Results, July 2014.
  6. Through a 500,000 USD grant from the Sam’s Club Giving Program, the NGO First Children’s Finance will provide access to capital, training, and management assistance to childcare business owners, hoping to “help businesses overcome these challenges by making operations more efficient and integrated, placing a higher focus on measurable business results, and developing a framework that can be easily replicated by other communities” as well as “shape strategy for the eventual establishment of the FCF National Child Care Technical Assistance Centre.”
  7. Alan Pyke, “Here’s Walmart’s Internal Guide To Fighting Unions and Monitoring Workers,” Think Progress, January 16, 2014; Daniel Fisher, “Did The NLRB Just Make It Easier For Unions To Organize Walmart?,” Forbes, July 26, 2014.
  8. Walmart uses supplier scorecards developed in cooperation with Also see S. Sethi Prakash, “The World of Walmart,” Carnegie Council, May 8, 2013,
  9. Maurizio Lazzarato, Signs and Machines: Capitalism and the Production of Subjectivity, trans. Joshua David Jordan, Los Angeles: Semiotext(e), 2014, 29.
  10. Ibid., 80, 81.
  11. Ibid., 88.
  12. Ibid., 97.
  13. We read Michael Lewis’s Flash Boys, W.W. Norton & Company, 2014 as an attempt to map (and storify) such a machinic assemblage, ranging from the deregulation of financial markets to the hardware of low-latency information networks.
  14. Lazzarato, 99.
  15. Ibid., 99-100.
  16. Ibid., 115.
  17. Ibid., 117.
  18. “How can we create organizations and governments that are cooperative, productive, and creative? These are the questions of social physics, and they are especially important right now, because of global competition, environmental challenges, and government failure. The engine that drives social physics is big data: the newly ubiquitous digital data that is becoming available about all aspects of human life. By using these data to build a predictive, computational theory of human behavior we can hope to engineer better social systems.” Human Dynamics Lab, MIT.
  19. On Tesla’s open patent pledge, see Tyrone Berger, “Where’s the Real Value in Tesla’s Patent Pledge?,” August 20, 2014.
  20. Dennis Crouch, “Tesla Motors and the Rise of Non-ICT Patent Pledges,” Patently-O, June 16, 2014.
  21. Art Carden, “Retail Innovations in American Economic History: The Rise of Mass-Market Merchandisers,” in Randall E. Parker and Robert Whaples (eds), The Handbook of Major Events in Economic History, New York: Routledge, 2013, 402-414.
  22. See Lana Swartz and Bill Maurer, “The Future of Money-Like Things,” The Atlantic, May 22, 2014.
  23. See Soenke Zehle, “The Autonomy of Gesture: Of Lifestream Logistics and Playful Profanations,” Distinktion: Scandinavian Journal of Social Theory 13.3 (2012): 341-354. See also Vilém Flusser, Gestures, trans. Nancy Ann Roth, Minneapolis: University of Minnesota Press, 2014.
  24. See Ned Rossiter, “Locative Media as Logistical Media: Situating Infrastructure and the Governance of Labor in Supply Chain Capitalism,” in Rowan Wilken and Gerard Goggin (eds), Locative Media, New York: Routledge, 2014, 208-223.
  25. Michael Lewis, Flash Boys, New York: W. W. Norton & Company, 2014.
  26. See Andrew Ross, "Flash Boys by Michael Lewis – Review," The Guardian, May 16, 2014. See also Donald MacKenzie, "Be Grateful for Drizzle," London Review of Books 36.17 (2014): 27-30.
  27. Stanislaw Lem, Summa Technologiae, trans. Joanna Zylinska, Minneapolis: University of Minnesota Press, 2013, 97.
  28. Lem, Summa Technologiae, 97.
  29. Lem: "A 'black box' cannot be programmed with an algorithm. An algorithm is a ready-made program that predicts everything in advance" (97).
  30. Making reference to Lewis’s Flash Boys and the plight of those traders excluded from these infrastructural advantages, Donald MacKenzie notes one of the most “vehement” criticisms of HFT: “it leads to a vanishing market, one that disappears as soon as you attempt a trade.” MacKenzie, 28.
  31. Nanex, "Flash Crash Summary Report," September 27, 2010.
  32. See The Wall Street Code, dir. Marije Meerman, 2013.
  33. See Ned Rossiter and Soenke Zehle, "Privacy is Theft: On Anonymous Experiences, Infrastructural Politics and Accidental Encounters," in Martin Fredriksson and James Arvanitakis (eds), Piracy: Leakages from Modernity, Sacramento: Litwin Books, 2014. See also Ned Rossiter and Soenke Zehle, "Toward a Politics of Anonymity: Algorithmic Actors in the Constitution of Collective Agency and the Implications for Global Economic Justice Movements," in Martin Parker, George Cheney, Valérie Fournier and Chris Land (eds), Routledge Companion to Alternative Organization, London and New York: Routledge, 2014, 151-162.
  34. Yet if the dispersal of technical objects is understood in terms of the constitution of an agency “no longer focused on or assigned to the working-meaning subject,” we need to attend to depletion, destruction, exhaustion – not as a rekindled romance of the ruins, but as a limit of constructive reappropriationisms, of multilayered networks of dispersal so vast that we have yet to imagine vectors of reaggregation. See Erich Hoerl, “A Thousand Ecologies: The Process of Cyberneticization and General Ecology,” trans. James Burton, Jeffrey Kirkwood and Maria Vlotides, in Diedrich Diederichsen and Anselm Franke (eds), The Whole Earth: California and the Disappearance of the Outside, Berlin: Sternberg Press, 2013, 124. The question of dispersal and reaggregation also cuts across the pragmatic vectors of inquiry of free software philosophies. See also Soenke Zehle, “Documenting Depletion: Of Algorithmic Machines, Experience Streams, and Plastic People,” in Richard Maxwell, Jon Raundalen and Nina Lager Vesterberg (eds), Media and the Ecological Crisis, London: Routledge, 2014.
  35. See, for instance, Nikolas Diakopoulos, Algorithmic Accountability Reporting: On the Investigation of Black Boxes, Tow Center for Digital Journalism, Columbia Journalism School, 2014.
  36. Ibid., 3.
  37. Ibid., 12.
  38. Even Edward Snowden did not simply publish his material but decided to make it available to professional journalistic organizations and media outlets for analysis, interpretation, and visualization.
  39. Supporters of the Transatlantic Trade and Investment Partnership (aimed at the creation of a Euro-American free trade area, under negotiation at the time of this writing) want investor-state dispute settlement included: it "grants foreign corporations the right to go before private trade tribunals and directly challenge government policies and actions that corporations allege reduce the value of their investments"; corporations have already used this mechanism to bring cases against minimum wage laws, anti-smoking legislation, and toxics control. It seems likely that these and related provisions (including the strong protection of trade secrets in these and other agreements) would severely limit the scope of any software-related transparency effort. See "Open letter of civil society against investor privileges in TTIP," 2013.
  40. Nick Thoburn, “Minor Politics, Territory and Occupy,” Mute, April 17, 2012.
  41. See Lessig's discussion of Yochai Benkler's model in Lawrence Lessig, The Future of Ideas: The Fate of the Commons in a Connected World, New York: Random House, 2001: "Following the technique of network architects, Benkler suggests that we understand a communications system by dividing it into three distinct 'layers.' At the bottom is a 'physical' layer, across which communication travels. This is the computer, or wires, that link computers on the Internet. In the middle is a 'logical' or 'code' layer – the code that makes the hardware run. Here we might include the protocols that define the Internet and the software upon which those protocols run. At the top is a 'content' layer – the actual stuff that gets said or transmitted across these wires. Here we include digital images, texts, on-line movies, and the like. These three layers function together to define any particular communications system" (23).
  42. Hoerl, 121-130, 124.
  43. W. J. T. Mitchell and Mark B. N. Hansen, “Introduction,” in W. J. T. Mitchell and Mark. B. N. Hansen (eds), Critical Terms for Media Studies, Chicago: University of Chicago Press, 2010, vii-xxii, xii.
  44. Geoffrey C. Bowker, Karen Baker, Florence Millerand and David Ribes, “Toward Information Infrastructure Studies: Ways of Knowing in a Networked Environment,” in Jeremy Hunsinger, Lisbeth Klastrup and Matthew M. Allen (eds), International Handbook of Internet Research, Berlin: Springer Science+Business Media B.V., 2010, 97-117, 98, 99. Italics in original.
  45. Ibid., 103, n2.
  46. Michel Foucault, The Government of Self and Others: Lectures at the Collège de France, 1982-1983, ed. Arnold I. Davidson, trans. Graham Burchell, New York: Palgrave Macmillan, 2010, 41.
  47. Lazzarato, 120.
  48. Ibid., 121.
Organized Networks by Ned Rossiter is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.