January 27, 2021 (updated 1 August 2022)

Here’s how algorithms can harm consumers and damage competition

The UK's competition watchdog has mapped out the various ways that algorithms can harm consumers and interfere with fair markets.

By Laurie Clarke

Algorithms mediate all manner of commercial transactions. As a result, entire markets are shaped by the hidden rules embedded in apps, websites and other digital platforms. Competition watchdogs have taken note and are increasingly scrutinising the impact that algorithms have on competition and consumers. The UK’s Competition and Markets Authority (CMA) is stepping up its efforts by launching a dedicated Digital Markets Unit (DMU) in April.

“As more and more areas of our lives are affected by algorithmic systems, we have a responsibility, and also an opportunity, to properly investigate them and to make sure that regulation is effective, so that algorithmic systems are not harmful,” said the CMA’s director of data science Kate Brand at a recent virtual event hosted by the regulator.

During the event, Brand and her fellow panellists described how the use of algorithms by businesses can harm consumers and damage competition, intentionally or otherwise. They discussed how regulators might deal with this harmful activity – and the urgency with which they need to do so.

Competition watchdogs are increasingly scrutinising the impact that algorithms have on competition and consumers. (Photo by Peter Macdiarmid/Getty Images for Somerset House)

Direct harms to consumers

The CMA identifies four broad categories of harm that businesses can inflict directly on consumers through sloppy or nefarious use of algorithms. These are “personalised pricing, non-price personalisation, algorithmic discrimination, and unfair ranking and design”.

Personalised pricing – tailoring prices to individuals based on their personal data – is not always harmful but “can be a problem where there is a lack of transparency, or where there is a lack of alternative providers – which is more likely where there is insufficient competition”, Brand explained.

The practice could become more harmful if businesses start to use data to infer the maximum an individual customer is willing to pay for a product or service, she added. And while the CMA has found little evidence of companies doing this so far, they would be unlikely to disclose it if they were, she noted.
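
As a purely illustrative sketch of the mechanism (the features, weights and numbers below are invented for this example, not drawn from the CMA's work), willingness-to-pay pricing might look like this:

```python
# Hypothetical sketch of personalised pricing: a seller quotes each
# customer a price derived from an estimate of what they will tolerate.
# All features and weights here are invented for illustration.

BASE_PRICE = 50.0

def estimate_willingness_to_pay(profile: dict) -> float:
    """Crude, illustrative proxy: richer signals imply a higher tolerated price."""
    multiplier = 1.0
    if profile.get("device") == "high_end_phone":
        multiplier += 0.15          # device as a wealth proxy
    if profile.get("prior_purchases", 0) > 10:
        multiplier += 0.10          # loyal customers rarely comparison-shop
    if profile.get("compared_prices_recently"):
        multiplier -= 0.20          # price-sensitive shoppers get discounts
    return BASE_PRICE * multiplier

print(estimate_willingness_to_pay({"device": "high_end_phone", "prior_purchases": 12}))
# 62.5 -- two otherwise identical customers silently see different prices
print(estimate_willingness_to_pay({"compared_prices_recently": True}))
# 40.0
```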

‘Non-price personalisation’ describes tailoring dimensions of the online customer experience other than price to a particular user. For example, a website operator might manipulate the point at which it invites a customer to review its services, Brand explained, a practice the CMA believes has led to ratings inflation.
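
A minimal, invented sketch of how such a review-timing rule could work (the event names and threshold are assumptions for illustration, not anything described at the event):

```python
# Illustrative sketch of non-price personalisation: only invite a review
# when recent signals suggest the customer is happy, inflating ratings.
# Event names and the threshold are invented for this example.

POSITIVE_EVENTS = {"fast_delivery", "repeat_purchase", "support_ticket_resolved"}

def should_prompt_for_review(recent_events: list) -> bool:
    positive = sum(e in POSITIVE_EVENTS for e in recent_events)
    return positive >= 2   # withhold the prompt from likely critics

print(should_prompt_for_review(["fast_delivery", "repeat_purchase"]))   # True
print(should_prompt_for_review(["late_delivery", "refund_requested"]))  # False
```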

Algorithmic personalisation is the area of greatest concern to panellist Christo Wilson, an associate professor at the Khoury College of Computer Sciences at Northeastern University, whose research focuses on auditing algorithms. He said several businesses have told him they plan to introduce personalised pricing.

He added that non-price dimensions of the online experience will be harder to regulate, as there is no pre-existing framework with which to do so. “If it’s deceptive, how do you quantify the harm in a precise way, such that you can build a case to try and influence this behaviour or stop it?”

The other potential sources of consumer harm are algorithmic discrimination and unfair ranking and design. In the former, an algorithm makes a decision that affects a customer on the basis of a protected characteristic, such as their race or medical history. The latter occurs when a platform orders a list of options not by their relevance to the customer but by commercial considerations, without disclosing this.
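
A toy example of undisclosed commercial ranking, with invented data: the same results appear in a very different order depending on whether the platform sorts by relevance or by its own commission.

```python
# Illustrative sketch of unfair ranking: results ordered by the commission
# the platform earns rather than by relevance, with no disclosure to the
# customer. The listings and weights are invented for this example.

results = [
    {"name": "Hotel A", "relevance": 0.95, "commission": 0.10},
    {"name": "Hotel B", "relevance": 0.60, "commission": 0.30},
    {"name": "Hotel C", "relevance": 0.85, "commission": 0.15},
]

honest = sorted(results, key=lambda r: r["relevance"], reverse=True)
undisclosed = sorted(results, key=lambda r: r["commission"], reverse=True)

print([r["name"] for r in honest])       # ['Hotel A', 'Hotel C', 'Hotel B']
print([r["name"] for r in undisclosed])  # ['Hotel B', 'Hotel C', 'Hotel A']
```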

How algorithms damage competition

Ungoverned algorithms can impair competition in markets, making it harder for new businesses to break through and indirectly harming consumers by limiting their choices. The CMA categorises these as exclusionary practices, algorithmic collusion and ineffective platform oversight.

“We define [exclusionary practices] as those in which algorithmic systems are used by dominant firms to prevent competitors from challenging their market position,” Brand explained. An example is self-preferencing, in which a company privileges its own products and services on its platform. This practice is at the heart of many antitrust cases, including the EU’s suit against Google.
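
Self-preferencing can be illustrated with a small ranking sketch; the boost value and listings below are invented assumptions, not drawn from any actual case.

```python
# Illustrative sketch of self-preferencing: a platform quietly adds a
# score boost to its own listings before ranking. All values invented.

OWN_BRAND_BOOST = 0.25  # hypothetical bump applied only to first-party items

listings = [
    {"name": "ThirdPartyWidget", "quality": 0.90, "first_party": False},
    {"name": "PlatformWidget",   "quality": 0.70, "first_party": True},
]

def score(item: dict) -> float:
    return item["quality"] + (OWN_BRAND_BOOST if item["first_party"] else 0.0)

for item in sorted(listings, key=score, reverse=True):
    print(item["name"], round(score(item), 2))
# PlatformWidget 0.95  -- outranks a higher-quality rival
# ThirdPartyWidget 0.9
```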

“In the UK and EU, dominant firms have a special responsibility to ensure that their algorithmic systems do not harm competition,” said Brand. “However, they can be technically challenging to investigate due to the lack of transparency of the algorithmic systems that are involved.”

Algorithmic collusion describes automated price-fixing among competing companies. “This might not sound like a serious harm, but it has the potential to raise prices and undermine the fundamental way that competition works,” said Brand. She explained there is evidence that “algorithms can learn over time to coordinate passively with competitors to achieve higher prices”. However, more empirical evidence of this happening in the wild is desperately needed, Brand said.

Algorithms can learn over time to coordinate passively with competitors to achieve higher prices.
Kate Brand, Competition and Markets Authority

Ariel Ezrachi, director of the University of Oxford Centre for Competition Law and Policy, highlighted the potential for unintentional collusion between businesses that outsource their pricing to third-party providers using the same algorithm as their competitors. “Some of those providers actually promise the users that they will not only provide them with dynamic personalised pricing, but they will also guarantee that the likelihood of price wars will be reduced,” he said. For consumers, this results in less competitive pricing.
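
A small, invented simulation shows how two sellers running the same "never stay undercut, otherwise nudge upward" rule can ratchet prices up without any explicit agreement; the rule and numbers are illustrative assumptions, not a description of any real pricing provider.

```python
# Illustrative simulation of passive coordination: two sellers run the
# same third-party pricing rule, and prices drift upward with no price
# war ever breaking out. All numbers are invented.

def shared_pricing_rule(own_price: float, rival_price: float) -> float:
    if own_price > rival_price:
        return rival_price          # match the rival; never stay undercut
    return own_price * 1.02         # when level or cheaper, test a small rise

price_a, price_b = 100.0, 100.0
for _ in range(10):                 # ten pricing rounds
    price_a = shared_pricing_rule(price_a, price_b)
    price_b = shared_pricing_rule(price_b, price_a)

print(round(price_a, 2), round(price_b, 2))
# 121.9 121.9 -- both well above the starting price, with no agreement made
```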

Ezrachi added that potential damage to competition is all the greater when harmful algorithms are deployed by digital platforms, which serve as gatekeepers between buyers and sellers. “Once you move that into a platform environment, where you […] have companies that control the totality of the ecosystem… they don’t just benefit from simple market power; they actually design all the parameters of competition.”

What should regulators do?

Despite this litany of potential harms, regulators have been slow to take action, according to William Kovacic, non-executive director of the CMA and former chair of the US Federal Trade Commission. He believes that the Google antitrust cases currently under way in the US “tend to be wonderful opportunities to peel back the curtain and understand exactly what takes place inside the enterprises”.

Given the slow pace of legal interventions, however, it is likely that “the world races by like a Formula One racer, and we’re riding a bicycle trying to catch up through our traditional tools”, he added. “It makes one wonder whether or not the traditional mechanisms of litigation really are fit for purposes, and it focuses attention on the efforts to develop new regulatory tools and investigative tools to understand what’s taking place in a more timely manner.”

It makes one wonder whether or not the traditional mechanisms of litigation really are fit for purposes.
William Kovacic, formerly US Federal Trade Commission

Panellist Cathy O’Neil, author of Weapons of Math Destruction and founder of algorithmic auditing company ORCAA, said that examining individual algorithms isn’t sufficient for regulators. Regulators should also consider “how [algorithms] create their own dynamical systems”. She said the primary focus should be on “what goes in and what comes out”, instead of deep exploration into a particular company’s algorithm.
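
A minimal sketch of that input/output approach, using an invented stand-in for the system under audit: probe the black box with paired inputs that differ in a single attribute and compare what comes out.

```python
# Illustrative black-box audit in the spirit of "what goes in and what
# comes out": vary one attribute at a time and compare the outputs.
# The probed function below is a hypothetical stand-in, not a real system.

def audit_pairwise(system, base_input: dict, attribute: str, values: list) -> dict:
    """Return the system's output for each value of one varied attribute."""
    outcomes = {}
    for value in values:
        probe = {**base_input, attribute: value}
        outcomes[value] = system(probe)
    return outcomes

# Invented stand-in for the opaque system under audit.
def opaque_quote(profile: dict) -> float:
    return 50.0 * (1.2 if profile["postcode"] == "wealthy_area" else 1.0)

print(audit_pairwise(opaque_quote, {"age": 40}, "postcode",
                     ["wealthy_area", "average_area"]))
# {'wealthy_area': 60.0, 'average_area': 50.0} -- a disparity worth probing
```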

O’Neil advocated for regulatory sandboxes, where companies can test their algorithms on data to see whether they break any rules, and for a more proactive approach to regulating algorithms. Regulators should inform businesses that “we’re going to proactively monitor the companies that are using algorithms like this, in contexts like this,” O’Neil argued, “and say, ‘you have to monitor your system to make sure that you’re not breaking this rule’.” This proactive approach resembles the ex ante regime proposed in the EU’s Digital Markets Act.

Even relatively straightforward measures, such as mandating a basic level of transparency, could prove transformational, Wilson argued. “You could imagine forcing firms just to reveal when algorithms are in play or when A/B testing is being conducted. Even that basic level of information is often not available.”

Kovacic called for public regulatory authorities to build the capacity to carry out algorithmic auditing, praising the CMA’s plans to set up the DMU. “From what I’ve seen around the world, it is plainly state of the art,” he said. “There’s not another competition authority in the world that has made this kind of commitment to building the knowledge necessary to do good work in the area.”

O’Neil heartily endorsed regulators building their own in-house capability, as opposed to relying on third-party auditors. She worked in the finance sector during the credit crunch, and witnessed the “corruption” of ratings agencies that were meant to hold financial institutions to account. “We can’t let that happen to algorithmic auditing.”
