Explainer: How two big new EU laws will try to rein in Big Tech

It doesn’t sound like the kind of thing to get pulses racing – but two new laws will change a lot about our digital lives.

FOR THE FIRST time in twenty years, Europe is trying to regulate big tech. Two major new EU laws should pass in the next few months, if not weeks: the Digital Markets Act and Digital Services Act.

The European Commission, which proposed them, says the laws “aim to create a safer digital space where the fundamental rights of users are protected and to establish a level playing field for businesses”.

It’s not the kind of thing calculated to set pulses racing unless you’re a tech lobbyist or European Parliament rapporteur. But the laws will eventually change lots of different areas of digital life. If you download apps on your smartphone, the Digital Markets Act affects you. If you publish content online and would like it to stay up, start learning about the Digital Services Act.

There are, very broadly, four main things that the laws aim to do:

  • Stop online platforms from rinsing businesses that use them to reach customers
  • Help smaller platforms compete with the big, established ones
  • Get the biggest platforms to make plans to reduce the amount of dodgy content on them
  • Give internet users more rights to request that content be taken down or challenge their own stuff being taken down

The main targets are the five mega-corporations that control a good chunk of online commerce: Apple, Microsoft, Amazon, Alphabet (Google), and Meta (Facebook). They aren’t particularly happy about it, but the laws are moving ahead with – for the EU – lightning speed.

The Digital Markets Act

Big tech has traditionally been regulated by competition laws (what Americans call anti-trust). Competition law is partly about stopping dominant companies from taking unfair advantage of consumers or other businesses. In 2017, for example, the European Commission fined Google €2.4 billion for giving Google Shopping more prominence in its search results and reducing the prominence of competing websites.

But competition investigations take ages. By the time officials have put together their analysis of how the rules are being broken and defended it through the courts, the market could be completely different, or the technology in question irrelevant.

The Google Shopping case started life in 2010, saw the fine levied in 2017, and is still under appeal at the EU Court of Justice. “It’s not really directly relevant what the court decides because few consumers use comparison shopping services any more”, says Zach Meyers of the Centre for European Reform, a think tank.

Similarly, the Commission spent years pursuing Microsoft for pushing its own Internet Explorer at the expense of competing browsers, even as use of Explorer in practice was plummeting.

Regulators now want to set the rules on digital competition in advance, rather than trying to react to situations as they arise. Those rules would apply to “gatekeeper” firms of a certain size that run a “core platform service”: Meta’s Facebook social network, Alphabet’s Google search or Apple’s iOS operating system, for example. 

What these platforms have in common is that they stand between other businesses and potential customers. The Digital Markets Act tries to make sure that gatekeepers don’t make it too hard or expensive for other businesses to reach customers via Google search or the App Store or whatever it might be.

For example, instead of having to work out whether Google’s search practices are in breach of competition law in a given situation, the Digital Markets Act would simply say that a gatekeeper can’t rank its own products or services “more favourably” than similar offerings from a “third party”.

Other specific rules include:

  • Allowing phone users to uninstall and change default apps
  • Giving app developers “fair, reasonable and non-discriminatory” access to app stores
  • Stopping gatekeepers from combining customer data from different services without explicit consent, or from using that data to compete with businesses that rely on the platform

A second objective of the Digital Markets Act is to encourage entirely new platforms to spring up – rather than just regulating the ones that exist. Google would have to provide data to competing search engines so that they can improve their results. Operating systems like Apple iOS and Android would have to allow mobile phone apps that aren’t sold on the App Store/Google Play. Amazon would have to let sellers flog their wares cheaper on other sites.

Meyers reckons that the law is likely to have more of an impact in terms of creating a level playing field on existing platforms than in fostering plucky competitors. “Despite all of these rules, it’s still incredibly difficult to go out there into the market and create entirely new platforms without having huge amounts of data, a lot of capital behind you and good relationships with other companies in the ecosystem”, he tells The Journal.

“But I don’t think we should only judge success by whether there are smaller companies that suddenly become really big and compete with, say, Facebook. Another metric of success is whether the companies that are already big start innovating more quickly or spending more money on innovation because otherwise they would lose market share.

“If they became a little bit more afraid of competition, that would also be a good outcome – even if you don’t see market shares dramatically changing”.

The full list of dos and don’ts is in Articles 6 and 7 of the draft law. They will definitely apply to the big five US tech companies: Alphabet, Amazon, Apple, Meta and Microsoft. About a dozen more firms are in the frame as well, but whether they are covered will depend on the final version of the law, which will be settled over the next few months.

These include household names like Airbnb, Zoom, Yahoo and Booking.com, plus more business-focused firms like SAP, Oracle and Salesforce, according to the think tank Bruegel. Off the hook are Netflix, Spotify, Twitter, Slack and Uber.

The first draft of the legislation was unveiled in December 2020. Things have moved unusually fast since then, in European Union terms, reflecting what the law firm Clifford Chance calls “remarkable consensus” on what the law should do. The EU Council, Parliament and Commission are now holding “trilogue” talks: a standard, if murky, way of getting EU laws finalised.

The French government, which holds the rotating presidency of the Council and is facing elections in April and June, wants to get it through in the next few weeks.

The Digital Services Act

The second draft law is the Digital Services Act. This is also about regulating online platforms, but it deals with the content they host rather than with how they make their money. Although it covers the online world in general, not just the biggest players, some of the most important rules apply only to the giants.

In that sense, it complements the Digital Markets Act. Meyers says “it’s impossible to look at the economic power of big tech companies, which is what the DMA tries to tackle, without also looking at their social and political power”.

The basic issue is that there’s a lot of bad stuff on the internet. The European Commission says that “citizens are exposed to increasing risks and harms online… These issues are widespread across the online ecosystem, but they are most impactful where very large online platforms are concerned”.

The Digital Services Act addresses this in two main ways. One is to get those very large companies to take stock of what’s happening on their platforms and come up with strategies for addressing what the law calls “systemic risks”. The other is to give internet users more rights, both to get content taken down and to get an explanation if their own content is removed.

The risk assessment bit is addressed to “very large online platforms”: those with 45 million monthly users or more. They would have to do a sort of annual audit, or “risk assessment”, of their platforms. This would check not only for “the dissemination of illegal content”, but also for “intentional manipulation” with a “negative effect” on society. The risk assessment would also have to cover “any negative effects” on human rights.

That’s all very well when it comes to illegal content like child sexual abuse images. But the law ranges much more widely than that, and leaves a lot of room for interpretation.

“The language in the DSA is very vague about what exactly the problems are,” says Ronan Fahy of the University of Amsterdam’s Institute for Information Law. “It talks about things the platform does that may impact on freedom of expression: that is incredibly broad. It talks about ‘inauthentic’ use of a platform’s service. It isn’t very concrete”.

Having done their risk assessment, firms then have to put in place “mitigation measures” to tackle those risks. These could include things like better content moderation, changing the algorithms that decide what posts people see on a social network, or tweaking their terms and conditions.

Again, though, lawmakers haven’t been all that precise about what companies are obliged to do. “There’s an awful lot of discretion around what the platform has to do to mitigate those risks,” Fahy reckons.

A lot depends on who checks up on these plans. The Council and the Parliament have amended the draft to give the European Commission an enforcement role, rather than leaving firms to self-regulate. That should give the law considerably more teeth – but it also increases the risk that companies will err on the side of censorship.

“You could imagine that if the Commission is concerned about disinformation, then the platform kind of over-moderates, when disinformation isn’t in itself illegal”, Fahy tells The Journal. “You can see that there could be knock-on consequences for freedom of expression”.

The second big theme is to do with taking down individual pieces of content. Online platforms would have to provide a system for people to get in touch asking for illegal content to be pulled, and prioritise requests from organisations certified as “trusted flaggers” by national regulators.

It would also make it easier for people whose content is taken down to make their case to the platform. For example, if YouTube were to pull someone’s video for breaching its terms and conditions, it would have to give reasons for its decision. Companies would also need a complaints procedure so that people can challenge moderation decisions.

“These user rights are going to be quite important for people in terms of when their content is taken down from the platform, or if they want to get content taken down”, Fahy says. “They’ll hopefully have a big impact. At the moment, you can’t really contest what the platforms do”.

This set of obligations on platforms is pretty specific. How exactly people will be able to enforce these rights if a company doesn’t play ball is less obvious at the moment. The legislation envisages each EU country having a “digital services coordinator”, similar to a data protection commissioner for privacy complaints.

“It’s not quite clear whether you can complain about all the provisions in the DSA to those digital service coordinators, or if you first have to complain to the platform and go through their process of appeal”, Fahy says.

That might be firmed up when the final version of the law emerges from EU negotiations. A lot of the detail is still up for grabs: the European Parliament made 457 amendments to the original draft. Like the Digital Markets Act, the legislation is now being hammered out in talks between Parliament, Commission and governments, but has progressed rapidly compared to most major EU laws and should be ready before the French elections.

This work is co-funded by Journal Media and a grant programme from the European Parliament. Any opinions or conclusions expressed in this work are the author’s own. The European Parliament has no involvement in nor responsibility for the editorial content published by the project. For more information, see here.
