Almost one month after the political agreement on the Digital Markets Act (DMA), the EU institutions and the Member States have now also agreed on a common Digital Services Act (DSA). The DSA focuses on a uniform legal framework for the digital single market that makes online marketplaces and social media platforms more accountable. The majority of the new obligations will apply to very large online platforms with more than 45 million users in the EU, such as Facebook, Amazon and Google. To avoid placing too great a burden on small enterprises and startups, those companies will be exempted from these requirements.

State Secretary at the Federal Ministry for Economic Affairs and Climate Action Sven Giegold said: "By adopting the Digital Services Act, Europe will set the world’s strictest standards for a free and democratic internet. Not least in view of the war in Ukraine and the related disinformation campaigns, this will considerably strengthen the internet. The DSA will protect consumers and their fundamental rights on the internet. In combination with the Digital Markets Act, Europe will thus also strengthen international rules. One thing is clear: the principle is not that ‘might is right’.

"The DSA prohibits, for example, dark patterns, i.e. misleading user interfaces that trick users into taking unintended actions. In future, personalised advertising that targets children or young people or uses sensitive data, such as sexual orientation, will be forbidden. And illegal content will become easier to identify and remove quickly, as every platform will have to establish a reporting mechanism."

In detail:

On 22 April 2022, the Council, the European Parliament and the European Commission agreed on the Digital Services Act (DSA) – the regulation on digital services – in the fifth trilogue. The aim is to make the provisions of the E-Commerce Directive of 2000 fit for the future. For the first time, the DSA lays down uniform rules on the duties and responsibilities of online platforms: the larger the platform, the more obligations it must fulfil. This creates new opportunities to offer services across borders – with a high level of protection for users regardless of where they live in the EU.

More specifically, this means:

  • Measures to counter illegal content: a mandatory reporting mechanism will allow users to flag such content, and platforms will work together with "trusted flaggers".
  • New rules on traceability of business users in online marketplaces to facilitate the identification of sellers of illegal goods.
  • An effective complaint mechanism, including the possibility for users to challenge platforms’ content moderation decisions.
  • More transparency by online platforms, especially on the algorithms used for the platforms’ recommendation systems.
  • Obligations for very large platforms to analyse risks related to the misuse of their systems and to take countermeasures. The platforms’ risk management will be reviewed by an independent body.
  • Access for researchers to key data of the largest platforms so that they can independently examine how the algorithms used work and what risks may evolve for society and democracy.
  • Enforcement provisions that take account of the complexity of the online space: in this context, the Member States will play the key role, supported by a new body, the European Board for Digital Services. As regards very large platforms, the European Commission will be responsible for supervision and enforcement.