The EU’s landmark Digital Services Act (DSA) – billed as a tectonic shift in the bloc’s treatment of Big Tech – is on the horizon. Due to be announced in early December, the DSA is poised to legislate on various issues relating to technology platforms, including competition, data sharing and content moderation. To examine what is at stake, Tech Monitor spoke to Marietje Schaake, a pioneer of the digital agenda in Brussels.
During her time as a Member of the European Parliament for the Dutch party Democrats 66 (D66) between 2009 and 2019, Schaake founded the Digital Agenda Intergroup, a network of interested MEPs. Now she is international director of policy at Stanford’s Cyber Policy Center, where she researches and advises on digital policy.
The EU prides itself on taking the global lead in confronting Big Tech’s supremacy. Competition cases against Google have resulted in multi-billion-euro fines, while investigations into Facebook, Amazon and Apple are underway. But for critics, the EU’s efforts have underwhelmed. For a trillion-dollar company, fines can easily be absorbed into the cost of doing business. Some of the proposed interventions have backfired too: search engine operators recently complained that EU measures intended to make the search market more equitable have actually shored up Google’s dominance.
“The EU so often gets labelled as the super-regulator that it’s now using that label as a self-identified strength,” says Schaake. “But at the same time, there’s so much that still needs to happen.”
She says the EU is somewhat coasting on its reputation for introducing measures such as GDPR. Instead, she believes, it needs to complement regulation with a positive vision and concrete plans to grow the European tech market.
“The way I’ve looked at this is never to say that we need to rein in Big Tech, but we do need to make sure that democracy is not disrupted,” says Schaake. This includes ensuring electoral integrity and the protection of fundamental rights online, while clarifying who decides what constitutes harmful speech and how public safety and health should be balanced with freedom of expression, she says.
Responsibility for content
The DSA is expected to tackle the question of online speech, and how much responsibility platforms should have for what their users publish. Big Tech platforms have implored the EU not to hold them liable for illegal content, which would saddle them with intensive content moderation responsibilities.
The same debate is raging in the US. Section 230 of the 1996 Communications Decency Act enshrines the right of internet platforms not to be held legally liable for the content published on them, but President-elect Joe Biden has suggested he would revoke the provision once in office.
Schaake believes that it is time for Section 230 to be adapted. “The writing has been on the wall for Section 230… mainly because of the companies themselves, because they are not acting as neutral platforms in any way, shape, or form,” she says. Although they might once have credibly claimed neutrality, Schaake argues that interventions by the likes of Facebook and Twitter to tackle misinformation around Covid-19 and the 2020 US election demonstrate that they are no longer mere platforms.
Tech platforms might argue they are in a Catch-22 situation. Politicians have exhorted them to take more action on illegal content and ‘harmful’ speech, under the threat of reforming Section 230. Now, that action is being used as a justification for reforming Section 230 anyway, on the basis that they are already acting like publishers.
There are more nuanced options than scrapping Section 230 outright or keeping it as it stands. “This discussion often falls into a black and white stereotype… but I think tweaks to it, which would clarify harms and what kind of liability… I think that that’s all very well imaginable and also legitimate,” she says.
Furthermore, the size of a given company could be used to determine what responsibilities it should have, she says. If not, such legislation could actually end up entrenching the power of Big Tech.
Schaake points out that other legislation such as GDPR has had this effect. She says that GDPR was sold as a “hammer to hit tech executives”. Now we know that the opposite happened. “The big tech companies were fine in managing GDPR requirements, it is, in fact, smaller companies that have a much more difficult process and relatively higher costs, to comply,” says Schaake.
New rules on competition
The DSA is expected to be accompanied by a Digital Markets Act, designed to address the concentration of power in digital markets. Schaake believes that new thinking on competition is overdue: “Antitrust rules as we know them need to be updated for the digital world.”
In particular, using high prices as the sole indicator of anti-competitive activity is no longer fit for purpose, she says. “If you get a service for free, except you pay with your data, then how do you actually measure harm? You can never say, ‘the consumer paid too much’, because the company can always say, ‘well, in fact, they pay zero, so what are you talking about?’.”
In her research, Schaake has proposed the creation of ‘middleware’ companies that would act as an intermediary layer between tech platforms and their users, “which would give the internet user more agency, for example, to curate news, not to see ads or to have higher privacy protections”.
“The idea would be that you create another party in between the big platforms and the internet user, and that would be a way to at least diffuse profits, because these middleware companies will also, in principle, be able to make money.”
Thinking bigger than Big Tech
Whatever the details of the Digital Services Act are revealed to be next month, Schaake predicts that it will be just the start of a process of debate and revision. “If any of my experiences in the European Parliament serve as a compass, then it will still take a while before the dust has settled, and it may actually get watered down between different interests,” says Schaake.
In particular, she expects the Big Tech companies to apply the full force of their lobbying efforts, perhaps even more than they did for GDPR. In October, French newspaper Le Point published details of a leaked document describing aggressive efforts by Google to remove “unreasonable constraints” on its business model and “reset the political narrative”.
Whatever impact the DSA may have, Europe’s digital agenda must be broader than simply taming tech giants. She rejects the framing of the politics of technology as a “race” between the US and China. “I think it doesn’t serve this much bigger question of how do you uphold the rule of law online? And what kind of values and norms do we want to build our governance model on top of?”
At the same time, the EU needs to step up “without naivety about where the rest of the world is” in its use of technology for geopolitical aims. She was surprised that the AI White Paper from the European Commission, published in February, explicitly avoided the topic of the military use of AI. She believes that if there is to be a geopolitical EU, something that she says is long overdue, “this is the language of power that you need to start speaking”.
As for the increasing threat of an internet divided along geopolitical lines, as evidenced in the US’s move to force a sale of TikTok, Schaake believes that we are simply seeing “the inevitable catching up of geopolitics with the open internet.” The idea that the internet could operate above and beyond global divisions “was always very much an articulation of an ideal, not of a feasible reality”.
Still, the EU should aim to cultivate “a positive, enabling environment and not a restrictive, quasi-nationalistic approach,” not least because Europe has yet to build a domestic digital industry of its own. “If you’re very dependent, you may not want to close off the borders.”