The gig economy is ground zero for algorithmic management – where computer programmes crunch millions of data points to manage, track and discipline workers. But algorithmic bosses, and the obsessive surveillance they feed off, are gaining a foothold in other industries too. Left unchecked, the twin phenomena could end up radically changing the nature of work.
Algorithmic management beyond the gig economy
Alongside gig economy platforms such as Uber, Deliveroo and Handy, e-commerce giant Amazon is a pioneer of algorithmic management techniques. “In quite a similar way to how algorithms direct Uber drivers or delivery riders around a city, algorithms direct workers around a warehouse,” says Alex Wood, lecturer in the sociology of work at Birmingham University. Workers are evaluated by algorithms that rank their performance relative to colleagues and flag those deemed not to be working fast enough for discipline or dismissal.
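To make the mechanism concrete, the sketch below shows roughly what a relative-ranking rule of this kind could look like. The field names, the productivity metric and the bottom-10% threshold are illustrative assumptions, not details of Amazon’s actual system.

```python
# Hypothetical sketch of relative-performance flagging, as described above.
# Metric, threshold and data model are assumptions for illustration only.
from dataclasses import dataclass


@dataclass
class WorkerStats:
    worker_id: str
    units_per_hour: float  # e.g. items picked per hour, derived from scan-gun logs


def flag_slowest(workers: list[WorkerStats], bottom_fraction: float = 0.1) -> list[str]:
    """Rank workers against their colleagues and flag the slowest fraction."""
    ranked = sorted(workers, key=lambda w: w.units_per_hour)  # slowest first
    cutoff = max(1, int(len(ranked) * bottom_fraction))
    return [w.worker_id for w in ranked[:cutoff]]


if __name__ == "__main__":
    shift = [
        WorkerStats("A", 110.0),
        WorkerStats("B", 95.0),
        WorkerStats("C", 130.0),
        WorkerStats("D", 88.0),
    ]
    print(flag_slowest(shift, bottom_fraction=0.25))  # -> ['D']
```

Note that a rule like this is purely relative: someone is always in the bottom fraction, however hard the whole shift works.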
Amazon is easily one of the most surveilled workplaces on earth, says Alessandro Delfanti, associate professor at the University of Toronto and author of the upcoming book The Warehouse: Workers and Robots at Amazon. The first layer of surveillance is the numerous body scanners employees have to walk through while moving around the warehouse, to check they haven’t stolen anything. The primary source of surveillance, however, is the scan gun. Much like a scanner used by cashiers in supermarkets, this device monitors workers’ productivity, bathroom breaks and whether they’ve met targets for picking products.
These devices are also used to measure ideological compliance with Amazon’s corporate culture, workers told Delfanti. “They are constantly polled through the scanner: ‘Are you happy?’ ‘Is your manager working well?’ ‘What are your concerns?’ ‘Do you feel positive today?’” This is combined with surveillance cameras positioned throughout the warehouse – some of which are now equipped with AI – and external policing of employee social media, particularly for unionising activity.
Outside of Amazon, algorithmic management techniques are increasingly popular in the retail and hospitality industries. For retailers, this is driven by the explosion in data that can be used to inform scheduling forecasts. Algorithmic scheduling systems automate the scheduling of workers to best match labour supply to customer demand. Some scheduling systems break schedules down into 15-minute blocks. “This creates a much more fragmented [schedule] and much greater working-time insecurity for workers,” says Wood.
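A toy example of what matching staffing to forecast demand in 15-minute blocks can look like is sketched below; the demand figures and the one-worker-per-eight-customers rule are invented purely for illustration and are not drawn from any real scheduling product.

```python
# Illustrative sketch of demand-matched scheduling in 15-minute blocks.
# The staffing ratio and forecast numbers are assumptions, not real data.
def staff_needed(forecast_customers_per_block: list[int],
                 customers_per_worker: int = 8) -> list[int]:
    """Convert a per-block demand forecast into a per-block headcount."""
    # Ceiling division: always round the required headcount up.
    return [-(-c // customers_per_worker) for c in forecast_customers_per_block]


if __name__ == "__main__":
    # One hour of forecast demand, split into four 15-minute blocks.
    forecast = [12, 30, 45, 9]
    print(staff_needed(forecast))  # -> [2, 4, 6, 2]
```

Because the headcount is recalculated block by block, shifts built this way can swing sharply within a single hour, which is exactly the fragmentation Wood describes.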
The most detailed case studies of algorithmic scheduling are, so far, in the areas of retail, distribution and logistics, but companies ranging from consumer goods manufacturer Unilever to accountancy giant Deloitte are reported to have experimented with algorithmic management. In the UK, the NHS has trialled products from wearable technology supplier Humanyze – which ostensibly analyses how workers communicate with each other, and tracks their physical movements and psychological state.
And the home-working boom has made white-collar work more susceptible to automated monitoring and management as well. “That can be anything from very simple algorithms, such as ones that monitor how much time somebody spends at their desk, to something very complex that’s doing sentiment analysis on the content of their emails to try and assess whether their employees are happy or dissatisfied,” says Patrick Brione, head of policy and research at the Involvement and Participation Association, which provides training, consultancy and research services to organisations across the private and public sectors.
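As a deliberately simple illustration of the kind of email monitoring Brione describes, the sketch below scores messages with a crude keyword count. Real products use far more sophisticated, and far more opaque, models; the word lists and scoring here are assumptions made only to show the basic idea.

```python
# Hypothetical, minimal "sentiment" scoring over email text.
# Word lists and scoring rule are illustrative assumptions only.
import re

POSITIVE = {"thanks", "great", "happy", "glad"}
NEGATIVE = {"frustrated", "overworked", "unhappy", "quit"}


def crude_sentiment(email_text: str) -> int:
    """Return a naive score: count of positive words minus negative words."""
    words = re.findall(r"[a-z']+", email_text.lower())
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)


if __name__ == "__main__":
    print(crude_sentiment("Thanks, happy to help"))       # -> 2
    print(crude_sentiment("I'm overworked and unhappy"))  # -> -2
```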
How algorithmic management hurts workers
Algorithms in the workplace don’t have to erode the quality of work, says Brione. They can be used to streamline processes, provide real-time feedback and take boring, monotonous tasks such as scheduling off managers’ hands. But the way they’re being deployed in many contexts today – particularly the invasive surveillance component – is eroding employee well-being.
One of the most obvious ways this is happening, according to Wood, is work intensification. “The more employers are able to neatly measure how productive someone is being, it creates a situation where workers have no choice but to work at breakneck speed,” he explains. What can look to managers like inefficiencies are in fact “those moments where people are able to catch their breath and get a rest”, and stave off burnout, says Wood. In workplaces like Amazon, this effect can greatly increase the chance of injuries.
The more employers are able to neatly measure how productive someone is being, it creates a situation where workers have no choice but to work at breakneck speed.
Alex Wood, Birmingham University
Constant algorithmic monitoring, combined with surveillance, also increases insecurity, because workers feel like they’re being constantly evaluated. “They don’t know how they’re being assessed, they don’t know what data points are being used,” explains Wood. “They don’t know what the benchmark is that they have to reach. And these algorithms can be changed without any input from them.”
Algorithmic direction (when algorithms instruct workers how to do their jobs) can erode workers’ mastery of their environments, and make them more closely resemble robots. For Uber drivers, a navigational app replaces knowledge and intrinsic human skill. “They’re given directions to follow, and they often feel like they can’t deviate from those directions,” says Wood. “If they do, they might have pay docked from them or be disciplined in some way.”
At Amazon too, where the work is segmented and standardised into discrete tasks, the same effect can result. “There’s a human need to try and use our human capacities,” says Wood. “If you’re just following the directions given to you on the screen, then it becomes very hard for workers to feel like they’re enjoying their job, which obviously has a negative impact on people’s well-being.”
Some of the technologies that Amazon has patented (but not yet rolled out) would take this phenomenon even further. These include a haptic wristband that buzzes to tell employees how to move their hands while doing tasks, and augmented reality headsets.
Ethical algorithmic management
Algorithms will form part of the future of work, but they need to be deployed ethically, stresses Brione. He has developed a set of principles in support of this, which include gaining the consent of the employees – algorithmic management shouldn’t be imposed upon employees, but should be discussed with them and implemented as collaboratively as possible. Transparency about what data algorithms will draw on, and what decisions they’re making, is also paramount, as is accountability. “An algorithm itself can never be morally responsible for a decision that is made,” says Brione. “So then who is responsible?”
An algorithm itself can never be morally responsible for a decision that is made. So then who is responsible?
Patrick Brione, IPA
Unions have been called upon to negotiate what are known as ‘new technology agreements’, whereby if a new technology is going to be introduced into the workplace, it becomes part of the collective bargaining process. Some European countries have co-determination laws, whereby workers can help shape the way in which new technologies are implemented through representation on works councils or company boards, and veto technologies that are considered potentially damaging to workplace well-being.
But there are other ways employees may reject algorithmic management. Following a legal challenge from the App Drivers & Couriers Union, an Amsterdam court ordered ride-hailing companies Uber and Ola to disclose the data used to assign jobs, deduct earnings and suspend drivers, and to provide more information on how their driver surveillance systems – Uber’s Real Time ID and Ola’s Guardian – work.
In more secure work, where employees are afforded more rights, employees may reject algorithmic management more directly. When British newspaper The Telegraph attempted to implement OccupEye technology that would track the amount of time people spent at their desks, employees pushed back so forcefully the company was forced to backtrack.
Even in more precarious platform work, workers can find ways to subvert algorithmic management systems. Wood has studied platform workers who were monitored by technology that took screenshots of workers’ screens. Soon, workers realised they could set up a separate monitor displaying the task they were meant to be working on and get the programme to take pictures of that instead. “I think it’s important not to underplay human ingenuity in overcoming these systems and pushing back against them,” says Wood. “It could also lead to new forms of resistance – algorithmic resistance against the systems.”