Panopticon or Oligopticon: Neoliberalism, Capitalism and Algorithmic Governmentality in Commercial Platforms
- SoCient STS
- Jun 29
The rapid expansion of the platform economy is said to exemplify a neoliberal way of working[1, 2, 3]. Digital service platforms, such as those for food delivery and transportation sharing, promote a flexible and individualized working environment and rely on a contract-based, non-unionized workforce. These features constitute a "deindustrialization" process that contrasts with the traditional work regime of the industrial revolution[2]. Rather than acting as a collective entity for their workforce, the platforms position themselves merely as mediators between consumers and service workers and claim no deeper affiliation with the employed[4]. This shifts the burden of responsibility and decision-making onto the individual workers themselves. This insecure and commercialized form of labour is seen as a continuation of the spirit of neoliberalism[1].
The core of the platform economy is algorithmic control[5]. The digital infrastructure of these commercial platforms has been said to pursue algorithmic governmentality, i.e. applying real-time surveillance and control to achieve fully rational operation and to go beyond prediction toward forming a new reality[6]. It might be argued that these algorithms are engineered to spread capitalism and to raise its profit-making process to a new level[7, 8].
According to Fisher[9], the algorithm-centric platform economy traps individuals in a way that echoes Max Weber's "iron cage". The "iron cage" describes a situation in modern society in which individuals lose control under a highly rationalized, calculative, efficiency-optimized societal infrastructure[10]. As illustrated above, the contemporary platform economy has built such an infrastructure via algorithms. To what extent is our autonomy limited in such mega-structures? This paper explores this question by examining algorithmic control in the platform economy through a micro lens, i.e. an "individual's everyday life experience" approach[11].
Our Social Existence under Algorithmic Power: Panopticon or Oligopticon?
Bentham's "Panopticon", as taken up by Foucault[12], has been used to describe the all-seeing yet obscure character of digital technology. In the case of the platform economy, an important argument concerns how technology enables digital platforms to exercise real-time, invisible control and surveillance over their workers[1]. This invisibility creates an illusion of freedom for the employees, since, unlike in a factory, they are not subjected to direct disciplinary action by a "spectacle authority"[13]. However, it has been pointed out that this freedom is ultimately illusory: workers cannot challenge the data held by the platforms, and so they remain trapped in the algorithmic panopticon[13].
As a counter-image to the "Panopticon", Bruno Latour introduced the concept of the "Oligopticon" in his Actor-Network Theory (ANT). Firstly, in contrast with the utopian character of a Panopticon, which is isolated from the normal sites of human daily life, an Oligopticon is enacted within our ordinary social environment[14]. An Oligopticon is not universally transparent like a Panopticon, at least not to a level that would satisfy "the megalomania of the inspector or the paranoia of the inspected"[14]. While limited to "see much too little", the observers in an Oligopticon do have deep and precise access to ordinary people through these "extremely narrow views"[14]. Unlike the robustness a Panopticon derives from absolute transparency, an Oligopticon is fragile, as it relies solely on the connections among the elements that form the whole. If any of the links or elements fails to hold, the view collapses: "the tiniest bug can blind oligoptica"[14].
When it comes to algorithmic power, the Panopticon is the usual metaphor, given the exposure of our personal selves to components of the digital infrastructure that try to capture every part of our lives. In such a view, the side that exercises power seems to have absolute control over the subject. However, I argue that this view places too heavy a focus on the power holder and assumes the subject is a passive entity. This is not surprising, considering that the original goal of a Panopticon was to enable absolute control over a prison's population. The initial purpose of algorithms, by contrast, is rather general: to automate human work processes[15]. Their users might include states, commercial enterprises or individual consumers. I do not deny that there will be power imbalances among these users. But I argue that such imbalances are fluid, not as fixed and absolute as in a Panopticon, and that the target of the power does not passively submit to the authority[7]. Therefore, when dealing with digital infrastructure, we should view power relationships in a dual way: on one side, an authority uses algorithms as tools to control its targets; on the other, the targets have the ability to resist and challenge the powers that be.
I would like to argue that the Oligopticon can be a more realistic frame of analysis, especially when the subjects of power can use fissures inherent in the digital infrastructure to pursue their counter-goals and alter the dynamics of the power relationship. As described above, an Oligopticon has three signature features: it is enacted within our actual sociality; it offers narrow but sturdy views; and it is inherently fragile, being fully dependent on the connections between its nodes. Firstly, algorithms are deeply embedded in our lives, assisting everyday human action through rationalization and automation[15]. They are not built for a single disciplinary purpose, as a prison is, but apply generally to many aspects of social life: work, entertainment, management, policy and so on. Secondly, how much of a panoramic view of individual life and human society do algorithms actually capture? Take software development as an example. When defining requirements at the start, we may have very ambitious yet ambiguous goals, e.g. to automate panoramic surveillance. Yet to implement the desired features, we need to narrow the scope down to a practical level: what exactly do we want to inspect? Perhaps in the end we can only achieve, for instance, real-time tracking and analysis of users' geographic locations. When designing the actual algorithm, work starts from a further narrowed aspect of user behaviour, such as defining a movement as the connection of one point on the map to another. Moreover, the algorithm's design is tied to a very specific location, be it a government office or the software development team's office, each of which relates to only a specific part of human life. Yet, narrow as it is, what the algorithm captures is a high-fidelity, precise picture, e.g. the second-by-second movement of a user on the map, or the complete everyday routine footprints of every individual in a place. This differs from the Panopticon, where the inspector sees everything but cannot develop a deeper view of each prisoner, e.g. how prisoner A folds his clothes. Thirdly, algorithms are highly connective: pieces of code connect with each other, with the people who design them, and with the physical machines on which they run, so a bug in one specific part can blow up a whole program. Algorithmic power is therefore fragile.
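The second and third features can be made concrete with a minimal, entirely hypothetical Python sketch of such a location-tracking routine. All names and data here are invented for illustration: the function reduces a user's day to timestamped map points and per-second movements (a narrow but high-fidelity gaze), and a single inconsistent record is enough to halt the whole analysis, echoing "the tiniest bug can blind oligoptica".

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Ping:
    t: float  # timestamp in seconds
    x: float  # map coordinates (arbitrary units)
    y: float

def movements(pings):
    """Reduce a stream of location pings to point-to-point movements.

    The routine 'sees' nothing of the user's life except these narrow,
    precise segments: an oligoptic rather than panoptic view.
    """
    pings = sorted(pings, key=lambda p: p.t)
    segments = []
    for a, b in zip(pings, pings[1:]):
        dt = b.t - a.t
        if dt <= 0:
            # One malformed record collapses the entire analysis,
            # illustrating the fragility of the oligoptic view.
            raise ValueError(f"non-increasing timestamp at t={b.t}")
        segments.append({"from": (a.x, a.y), "to": (b.x, b.y),
                         "speed": hypot(b.x - a.x, b.y - a.y) / dt})
    return segments

track = [Ping(0, 0.0, 0.0), Ping(1, 3.0, 4.0), Ping(2, 3.0, 4.0)]
for seg in movements(track):
    print(seg)  # precise per-second segments, and nothing more
```

The sketch captures only movement segments; everything else about the user is invisible to it, which is exactly the "extremely narrow view" described above.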
Accordingly, because algorithmic power is spread across wide aspects of life, because its gaze is narrow, and because it is fragile, when an algorithm focuses on one part of a thing its subjects can find other parts to work against its design. In other words, oligoptic algorithmic power allows fissures to open in its infrastructure, which in turn enables the targets of that power to counteract it. In their study of digital labour platforms, Ferrari and Graham conclude that while algorithms favour capitalists and fuel hypercapitalism, further exploiting labour with new tricks that make people work more voluntarily for less welfare, algorithms are not hegemonic: there are "fissures" where algorithms do not govern as intended[4]. From empirical data, Ferrari and Graham identify three main types of fissures and labour counteractions on digital platforms: manipulation (breaking rules), subversion (working around rules), and disruption (forming collective entities to rebel against and rewrite rules)[4].
Using food delivery platforms as representative of the digital platform business[16], the following part of the paper describes how fissures in algorithms enable labourers to manipulate, subvert and disrupt the platforms' surveillance and control systems.
Manipulation. At the UK food delivery platform Deliveroo, for instance, some workers rent out their accounts for profit to people who have not yet passed the platform's right-to-work checks[17, 4].
Subversion. More of the cases fall into the category of subversion. Empirical studies confirm that workers try to figure out and make sense of the algorithms, and play around the rules for their own convenience[4, 7]. According to Sun, in China's case, groups of workers have been able to re-create their own "algorithms" on top of the actually existing one[7]. For example, during rush hours they help each other accomplish tasks faster by transferring orders to peers nearby. Workers also take note when the algorithm makes mistakes and work out their own "quasi-algorithms" based on their own knowledge, for instance to select the fastest or shortest route. Furthermore, workers learn to take advantage of the competition between platforms to obtain higher income and bonuses: they overcome the random locations of algorithmic order assignment by switching between platform apps, and workers on different platforms cooperate by ordering for each other during inter-platform bonus wars.
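The riders' informal route selection can be read as an approximation of classic shortest-path computation, worked out from local knowledge rather than from the platform's code. A toy sketch of that idea, with a street network and all travel times invented for illustration:

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's shortest path over estimated travel times (minutes)."""
    dist = {start: 0.0}
    prev = {}
    queue = [(0.0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, minutes in graph.get(node, {}).items():
            nd = d + minutes
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(queue, (nd, nbr))
    # Walk back from the goal to reconstruct the route.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[goal]

# A rider's mental map of a neighbourhood (hypothetical travel times).
streets = {
    "restaurant": {"main_road": 4, "back_alley": 2},
    "main_road": {"customer": 6},
    "back_alley": {"customer": 5},
}
path, minutes = shortest_route(streets, "restaurant", "customer")
print(path, minutes)  # the back alley beats the official main road
```

The point of the analogy is not that riders run Dijkstra in their heads, but that their accumulated local knowledge feeds a route choice the platform's own algorithm does not capture.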
Disruption. Protest by digital platform workers puts pressure on platform governance. According to Joyce et al.[16], data from 2015–2018 show that, globally, collective action against digital platforms is steadily increasing, with Western Europe the most active region and food delivery the second most affected industry (right behind transport sharing)[16]. Platform rules related to pay, working conditions, employment status and regulatory issues are the ones most challenged by these disruptive actions[16]. Take China as an example. As food delivery has become an integral part of most people's daily life, growing demand has also brought out a struggle on the supply side. The 2020 essay "Delivery Drivers, Stuck in the System"[18] marked the beginning of open debate on the unreasonable calculation of working time and the safety risks of food delivery labour[19]. Yet even before this, unofficial collective organisations had already formed on the labour side as the industry began its rapid growth. For example, the "Delivery Knights Alliance" online group was created in 2019 and has more than 10,000 riders across its multiple WeChat groups[20]. Since the open debate on food delivery labour struggles began in 2020, the group's leader has used social media to reveal pitfalls in the platform system, such as the impossible working-hour rules in holiday bonus schemes (requiring 18-hour workdays to qualify for the bonus)[20]. This pressure has raised awareness and led both the platforms and the state to act on efficiency-optimizing algorithms, for example by loosening delivery time limits so that workers can drive more safely[21].
Summary and take-aways
Using the food delivery sector of the platform economy as an example, an oligoptic image of algorithmic power has been drawn here, contrasting with the common panoptic portrayal of technology. Fissures are described in the power structure that make it possible for the subjects of power to counteract it. In this light, individual labourers may not be total prisoners trapped without control in the platform economy's panoptic digital infrastructure. The fissures of technology indicate that the inspector in the power relationship does not enjoy universal transparency or an absolute grip on the inspected. Even though the inspector is often the stronger side of the relationship, the inspected still have space to confront and even change platform rules, which means this is not merely "an illusion of freedom"[13] in which workers have no power to challenge the core structure. The question of whether we are living in a cage of neoliberalism and capitalism does not have a simple and straightforward answer, as oligoptic power relations are much more dynamic than the panoptic image we are used to being shown.
References
[1] Alex Veen, Tom Barratt, and Caleb Goods. Platform-capital's 'app-etite' for control: A labour process analysis of food-delivery work in Australia. Work, Employment and Society, 34(3):388–406, 2020.
[2] Paul Stewart, Genevieve Shanahan, and Mark Smith. Individualism and collectivism at work in an era of deindustrialization: Work narratives of food delivery couriers in the platform economy. Frontiers in Sociology, 5:49, 2020.
[3] Guy Standing. The precariat-the new dangerous class. Amalgam, 6(6–7):115–119, 2014.
[4] Fabian Ferrari and Mark Graham. Fissures in algorithmic power: platforms, code, and contestation. Cultural Studies, 35(4–5):814–832, 2021.
[5] Hui Huang. Algorithmic management in food-delivery platform economy in China. New Technology, Work and Employment, 2021.
[6] Patrick Crogan. Bernard stiegler on algorithmic governmentality: A new regimen of truth? New Formations, 98(98):48–67, 2019.
[7] Ping Sun. Your order, their labor: An exploration of algorithms and laboring on food delivery platforms in China. Chinese Journal of Communication, 12(3):308–323, 2019.
[8] Nick Dyer-Witheford. Cyber-Marx: Cycles and circuits of struggle in high-technology capitalism. University of Illinois Press, 1999.
[9] Eran Fisher. Media and new capitalism in the digital age: The spirit of networks, volume 3. Springer, 2010.
[10] Terry Maley. Max weber and the iron cage of technology. Bulletin of Science, Technology & Society, 24(1):69–86, 2004.
[11] Rob Kitchin and Martin Dodge. Code/space: Software and everyday life. MIT Press, 2014.
[12] Regine Buschauer. Datavisions–on panoptica, oligoptica, and (big) data. The International Review of Information Ethics, 24, 2016.
[13] Jamie Woodcock. The algorithmic panopticon at deliveroo: Measurement, precarity, and the illusion of control. Ephemera, 20(3):67–95, 2020.
[14] Bruno Latour. Reassembling the social: An introduction to actor-network-theory. Oxford University Press, 2007.
[15] Jean-Luc Chabert, Évelyne Barbin, Jacques Borowczyk, Michel Guillemot, and Anne Michel-Pajus. A history of algorithms: From the pebble to the microchip, volume 23. Springer, 1999.
[16] Simon Joyce, Denis Neumann, Vera Trappmann, and Charles Umney. A global struggle: worker protest in the platform economy. ETUI Research Paper-Policy Brief, 2, 2020.
[17] K. Bryan. Deliveroo and Uber Eats takeaway riders rent jobs to 'illegal immigrants'. Sunday Times, 2019.
[18] People. Delivery drivers, stuck in the system. People Magazine, 2020.
[19] Rita Liao. Viral article puts the brakes on China's food delivery frenzy. TechCrunch, 2020.
[20] Diana Liu. China: Beijing delivery rider and labour activist is detained after denouncing worker exploitation. The Observers, 2020.
[21] Rita Liao. New regulation in China to hit food delivery giants' profit model. TechCrunch, 2022.
