
(File image) Elon Musk said his company was "working hard" to comply with the new Digital Services Act. Alamy Stock Photo
Digital Services Act

The European Union's tough new laws on social media content have officially come into force

Platforms with at least 45 million monthly active users will be required to fulfil every new obligation.

THE EUROPEAN UNION’S Digital Services Act, which will crack down on the availability of illegal content and enforce stronger moderation on the largest social media platforms across the continent, has become “legally enforceable” today.

The law aims to make social media companies more transparent, with a particular focus on banning or limiting targeting practices and introducing tighter controls on content.

Nineteen online platforms will have to comply with the Digital Services Act (DSA) and all will be subject to an annual audit, with those found in breach of the new rules facing fines of up to 6% of their annual global turnover.

Platforms with at least 45 million monthly active users, which the EU has described as “very large”, will be subject to every aspect of the DSA, including new standardised codes of conduct and data-sharing and monetisation regulations which the EU plans to fully develop and implement by the beginning of 2025.

Search engines and online stores such as Alibaba AliExpress, Amazon Store, Apple AppStore, Booking.com, Wikipedia, Zalando, Bing and Google Search are subject to every aspect of the new obligations brought in by the law.

All major social media platforms, such as Facebook, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, Twitter and YouTube, are also required to fulfil the new obligations.

Companies must provide easy-to-use reporting systems for illegal content, stronger content moderation and opt-out options on certain content for all European users of their apps.

The DSA states that the platforms should “serve to remove or disable access to the specific items of information considered to constitute illegal content, without unduly affecting the freedom of expression and of information of recipients of the service”.

European Commissioner for the Internal Market Thierry Breton announced that the law has now become “legally enforceable”, adding that he and his services will be “very rigorous” in checking that the platforms comply with the DSA.

“We will be investigating and sanctioning them, if not the case.”

Breton said on Twitter today: “These systemic platforms play a very, very important role in our daily life and it is really the time now for Europe, for us, to set our own rules.”

Industry reaction

Twitter was among five social media platforms that undertook a “stress test” this summer to gauge whether they were compliant. After the billionaire’s takeover of the platform, Musk unleashed a wave of firings, and Breton warned him that he needed more resources to moderate dangerous content.

Twitter owner Elon Musk said his company was “working hard” to comply with the DSA, in a direct reply to Breton’s post.

Musk recently hinted that he intends to remove the ability to block other users on his platform. While it is unclear from the legislation whether this move would be a breach, the law does require that users be notified of the reasons why a platform has decided to restrict their access or “shadow ban” them.

The law also allows companies to ban or terminate accounts that are run by bots or are spam accounts – an issue which caused a major stir between Twitter and Musk when he pulled out of his initial purchase of the platform in May 2022, claiming the website was overrun with fake accounts.

In response to the law coming into force, Microsoft vowed this afternoon to provide more information on targeted adverts and to protect users against any new risks from artificial intelligence.

Microsoft-owned LinkedIn said in a blog post that it had implemented these changes on both the desktop and mobile versions of the network.

AI has also dominated headlines with its dizzying advances since chatbot ChatGPT, developed by OpenAI, in which Microsoft has invested, burst onto the scene last year.

The EU is racing to approve the world’s first law regulating AI by the end of the year.

In a blog post, Courtney Gregoire, Microsoft’s chief digital safety officer, vowed that Microsoft would “implement additional safeguards to protect against new risks related to AI as they arise and will continue to be transparent about our approach”.

Gregoire said other measures taken by Microsoft to comply with the DSA include creating an “Ad Library”, giving European users access to information about the adverts they see on the platform. LinkedIn has taken a similar step.

Gregoire added that Microsoft would also “better explain to users how Bing search works, including its ranking principles, moderation policies, and user controls”.

Amazon and German-owned online retailer Zalando have already contested the EU’s designation of them as “very large” platforms.

Snapchat, owned by Snap, also unveiled changes to its app this week, including giving users control over the content they see and restricting targeted advertising aimed at users aged 13 to 17, both in the EU and in Britain.

Breton said that he hopes the DSA will make the online environment “safer for everyone in Europe”.

Google’s services – Maps, Play and Shopping – must also meet the new requirements.

Sector-wide regulation

The DSA will regulate four categories of the tech industry: intermediary services offering network infrastructure; hosting services such as cloud and web-hosting services; online platforms, including marketplaces, app stores, collaborative-economy platforms and social media platforms; and very large online platforms.

While intermediary services must meet basic consumer-protection and transparency-reporting requirements, online social media and marketplace platforms must fulfil the majority of the content moderation and online advertising transparency obligations.

Member states must now appoint Digital Services Coordinators by 17 February 2024 so that the correct enforcement procedures can begin.

Users of the platforms will have the right to lodge a complaint against the websites through their respective country’s coordinator, and both parties will have a right to be heard and to receive “appropriate information about the status of the complaint”.

If a platform is found to be in breach of the regulation on foot of a complaint, recipients of the service will have the right to seek compensation from the platform “in respect of any damage or loss suffered due to an infringement by those providers”.

Many inside and outside of the EU hope the DSA will be a beacon for other countries to take similar action and bring more regulatory oversight of big tech worldwide.

Additional reporting from © AFP 2023
