The algorithmic management practices of gig economy platforms dictate the lives of their workers but are typically opaque. Activists are using methods including lawsuits, apps, and data trusts to access the data that powers these practices, and harness it to push for better working conditions. As the algorithmic management techniques pioneered by the likes of Uber become widespread, data is shaping up to be a crucial frontier in the relationship between employers and their workers.
James Farrar is a former Uber driver and co-founder of the App Drivers & Couriers Union. At Uber, he explains, the system that decides how work should be allocated draws on “all sorts of things like your work performance, your relationship with the staff, your previous interactions with customer service, your customer ratings, your cancellation rate, your completion rate, your earnings profile, and also a fraud probability score”.
The ‘fraud probability’ score is one of the most troubling measures for taxi platform drivers, Farrar says. It doesn’t refer to traditional fraud, of the criminal variety, but is closer to “a moral calculation of your propensity to obey or break the rules [of the app]”. Gig platforms choose to use the term ‘fraud’ rather than the language of performance management, Farrar argues, because the latter would signal an employment relationship – something they have traditionally resisted (Uber last week acknowledged that its UK drivers are ‘workers’, affording them some rights, but not ‘employees’).
These algorithmic management systems can even result in workers being sacked. In one of three cases that the ADCU brought against Uber and another taxi app, Ola, in Amsterdam, two Uber drivers were dismissed for ‘fraudulent activity’ without the company substantiating the basis for the dismissals, something that Farrar calls “psychologically devastating”.
The court’s rulings on these cases, delivered on 12 March, vindicated many of the criticisms of algorithmic management. In one case, the court found that Ola had used a completely automated system to decide to deduct wages from one driver’s earnings. This is unlawful under Article 22 of the EU’s GDPR, which states that data subjects “have the right not to be subject to a decision based solely on automated processing… that significantly affects him or her”. Despite the explosion in automated managerial decision-making, this is the first time a court has made such a ruling.
The court ruled that Ola should give drivers access to anonymised performance ratings, data used to create an earnings profile that influences work allocation, and data used to create “fraud probability scores”. And it ruled that Uber should give the two drivers accused of ‘fraudulent activity’ access to the data those decisions were based on, in addition to anonymised individual ratings on their performance, instead of a rolling average of trip ratings.
But it wasn’t all wins for the ADCU. Unlike in the Ola case, the court ruled that Uber had not dismissed the drivers without human oversight. It did not order Uber to compensate the claimants, or to provide more information on how prices were calculated or on notes added to drivers’ profiles.
In the latter case, it said claimants had to specify more clearly which information they wanted that Uber had not already provided. This is something the lawyer representing the drivers, Anton Ekker, says is “problematic, because the court shifts the burden of proof to the driver”.
“The driver has to demonstrate what he did not receive [which] is very complicated if you look at the amounts of data we’re talking about,” says Ekker. “I understand the court might be struggling with the complexity of this, but in the end, it should be the responsibility of the data controller to show that he has given full transparency, not the other way around.”
When asked to comment on the case and Farrar’s remarks, Uber said: “The court has confirmed that Uber’s dispatch system does not equate to automated decision making, and that we provided drivers with the data they are entitled to. The court also confirmed that Uber’s processes have meaningful human involvement. Safety is the number one priority on the Uber platform, so any account deactivation decision is taken extremely seriously with manual reviews by our specialist team.”
The rise of gig workers’ data trusts
Another ruling could prove the most impactful of all. The Amsterdam court confirmed that the ADCU can make subject access requests (SARs) – a provision of GDPR that obliges organisations to share the data they hold about an individual – on behalf of drivers. This could be crucial in allowing workers to wrest control from the platform operators.
The ADCU intends to use these subject access requests to populate a data trust, an independent body that looks after data on behalf of third parties, called the Worker Info Exchange. The data will be aggregated and used by drivers to negotiate for better working conditions. “What we’re trying to do is build a bigger picture of these things and a richer picture,” says Farrar. “Then we can understand, how much did you earn per hour after costs? […] What was your utilisation?”
‘Utilisation’ refers to the percentage of time that drivers spend carrying passengers relative to the time spent logged into the app. Uber has typically only paid drivers for the time they have passengers in their cars, despite drivers being logged into the app for far longer. Farrar says that his utilisation over the past two years was 50% – meaning he was only paid for 50% of the time that he was on the job.
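The arithmetic behind utilisation is simple, and a minimal sketch makes the stakes concrete. This is an illustrative calculation only – the function name and inputs are invented, not taken from any platform's actual systems:

```python
# Illustrative sketch (not any platform's actual schema): a driver's
# utilisation rate is time carrying passengers as a share of logged-in time.

def utilisation(passenger_minutes: float, logged_in_minutes: float) -> float:
    """Percentage of logged-in time spent carrying passengers."""
    if logged_in_minutes <= 0:
        raise ValueError("logged-in time must be positive")
    return 100.0 * passenger_minutes / logged_in_minutes

# A driver logged in for 10 hours who carried passengers for 5 of them
# has 50% utilisation -- the situation Farrar describes. Under a pay-per-trip
# model, the other 5 hours on the job are unpaid.
rate = utilisation(passenger_minutes=300, logged_in_minutes=600)
print(f"Utilisation: {rate:.0f}%")  # Utilisation: 50%
```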
Now that it can make SARs on behalf of other individuals, Worker Info Exchange’s efforts will take the burden off drivers attempting to navigate Uber’s labyrinthine systems. “You wouldn’t believe how they frustrate the process if you go online and you try to make a subject access request. It’s just deliberately convoluted and confused,” says Farrar.
Right now, Worker Info Exchange has a target of making requests on behalf of more than a thousand drivers. Farrar believes that data could be the force that unites gig workers. “Everybody’s hungry for insight around how they worked, and why they earned and why they didn’t earn, and why somebody else earned,” he says.
Gig workers’ data trusts in the US
For US gig workers, the scope for making subject access requests is limited. But activists are building tools that allow workers to collect data about their pay and working conditions independently, and compare it with their peers.
Dan Calacci, a PhD researcher at MIT, specialises in building frameworks and tools that allow worker collectives to collate data. Calacci was inspired by his experiences working for the delivery service Postmates. “I’ve been on the side of being at the mercy of these decision-making systems [choosing] when you’re going to be able to make your next buck,” he says. “If you’re suspicious that the processes that give you work or pay are unfair… you have no recourse in even understanding if that’s the case, because it takes a lot of work to collect that information in the first place if you’re just an independent person trying to figure out if your pay has changed.”
Calacci has developed an app for employees of grocery delivery service Shipt that analyses how its payment algorithm shifted over time. “We made it easier for workers to extract data from their working apps by submitting screenshots of their pay history,” says Calacci. “Those screenshots would be parsed by the tool that we built, and the data was made available back to workers.”
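The parsing step Calacci describes – turning screenshots into structured records – can be sketched roughly as follows. The tool's real pipeline is not public, so the text format, field names, and regular expression here are entirely hypothetical, standing in for whatever the OCR stage actually produces:

```python
# Hypothetical sketch: assume OCR has already converted a pay-history
# screenshot into text lines like "Mar 12 2021  Order #123  $14.50  Tip $4.00".
# The format and fields are invented for illustration.
import re

PAY_LINE = re.compile(
    r"(?P<date>\w{3} \d{1,2} \d{4})\s+Order #(?P<order>\d+)\s+"
    r"\$(?P<base>\d+\.\d{2})\s+Tip \$(?P<tip>\d+\.\d{2})"
)

def parse_pay_lines(lines):
    """Turn OCR'd screenshot text into structured pay records."""
    records = []
    for line in lines:
        m = PAY_LINE.search(line)
        if m:  # skip lines that aren't pay entries (headers, UI text, etc.)
            records.append({
                "date": m.group("date"),
                "order": m.group("order"),
                "base": float(m.group("base")),
                "tip": float(m.group("tip")),
            })
    return records

sample = ["Mar 12 2021  Order #123  $14.50  Tip $4.00"]
print(parse_pay_lines(sample))
```

Once parsed like this, records from many workers can be pooled and compared, which is what makes the aggregate analysis possible.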
The tool also tracked the changes in payment over six to nine months to measure shifts in the payment algorithm. During this time, Shipt announced that its payments had changed, but that on average they would stay the same or improve. “But in the data that we collected, we saw that there was a group of workers who were consistently making less than what they’d been making under their previous payment arrangement,” says Calacci. “That was hidden in the data and the numbers that Shipt had released.” When invited to comment, Shipt directed Tech Monitor to an October blog post it published on questions of pay.
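The effect Calacci describes – a stable fleet-wide average concealing a group of losers – is easy to demonstrate with toy numbers. The data below is invented purely to illustrate the statistical point, not drawn from the Shipt study:

```python
# Illustrative sketch with invented data: a stable average can hide a group
# of workers earning less after a payment-algorithm change. Compare each
# worker's mean pay per order before and after the change.
from statistics import mean

before = {"w1": [10, 10], "w2": [10, 10], "w3": [10, 10]}
after  = {"w1": [14, 14], "w2": [10, 10], "w3": [6, 6]}   # w3 loses out

# The fleet-wide average is identical before and after (10.00)...
avg_before = mean(p for pays in before.values() for p in pays)
avg_after  = mean(p for pays in after.values() for p in pays)
print(f"Fleet average: {avg_before:.2f} -> {avg_after:.2f}")

# ...but a per-worker comparison reveals who is consistently worse off.
worse_off = [w for w in before if mean(after[w]) < mean(before[w])]
print("Workers earning less:", worse_off)  # ['w3']
```

This is why pooled per-worker data matters: the headline average Shipt released could be accurate while a subgroup still lost income.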
Calacci’s app is one of a crop of tools that help achieve similar aims. There’s Gig Compare, programmed by ex-Googler Charles Kemp, which calculates and compares the estimated hourly earnings of gig workers; UberCheats, a Google Chrome extension that helps spot pay discrepancies, developed by Uber Eats driver and software engineer Armin Samii; and Driver’s Seat, which helps workers collect and analyse their data from ride-hail and delivery apps including Uber, Lyft and DoorDash.
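The core metric these tools estimate – earnings per hour net of expenses – can be sketched in a few lines. How Gig Compare or Driver's Seat actually compute it is not public; the function below is a plausible simplification with invented names:

```python
# Hedged sketch (not any tool's actual method): estimated hourly earnings
# after deducting costs such as fuel and vehicle wear.

def net_hourly_earnings(gross_pay: float, costs: float, hours: float) -> float:
    """Earnings per hour worked, after expenses."""
    if hours <= 0:
        raise ValueError("hours must be positive")
    return (gross_pay - costs) / hours

# e.g. $180 gross over a 10-hour shift with $40 of costs -> $14/hour,
# a figure a worker can compare against a local minimum wage or peers.
print(net_hourly_earnings(gross_pay=180, costs=40, hours=10))  # 14.0
```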
Demystifying algorithmic management
A recent International Labour Organisation (ILO) report charted a fivefold increase in the number of digital labour platforms over the past decade. As a result, it found that the practices of these companies are ossifying into new labour norms, one of which is algorithmic management.
“Algorithmic management and the use of data to augment or possibly replace traditional managerial functions is something that started in the gig economy, but the underlying practices have started to come to workplaces across the socioeconomic spectrum,” says Jeremias Adams-Prassl, a professor of law at Oxford University who focuses on the gig economy and the future of work.
Data makes these practices possible – but it is increasingly being used to counteract their power over workers. “A lot of these technological systems monitoring our work to exercise control can also be used to ensure that working conditions are respected,” said Janine Berg, a senior economist at the ILO, speaking at a recent online event.
These companies could one day be forced to hand this data to labour inspectors or administrative authorities, Berg predicted. “Or if workers have access to their own data, could they then turn over this data to some sort of third party, whether it be a trade union or the labour inspectorate to ensure that their work conditions are being complied with?”
She cites the example of the New York City Taxi and Limousine Commission, which made transport network companies hand over four weeks’ worth of data. With this information, the commission hired researchers who were able to calculate an adequate minimum hourly wage for these drivers, and made the companies agree to it. “This is an example about how once the authorities actually have the data, they can use that data to set adequate wages,” says Berg.
Although a large share of digital platforms’ data is generated by users and workers, it is typically considered to be the property of the platforms – something legislation like GDPR seeks to redress. But the ILO report highlights the growing importance of collective user rights over community data, arguing it “should not translate into a monetary sum”, but “a collective stake in the resulting products or services of a company” and “at the very least, the resulting products or services should not be used in a way that is harmful to platform workers”.
“Part of the issue here is that our legal and policy and regulatory framework around data is highly individualistic,” says Farrar. “It’s about individual rights, and we don’t really have a good conceptual understanding of digital labour rights.” That’s something worker data trusts could soon hope to change.