• Dr. Anna-Kristine Wipper, Partner

Highlights

  • Messenger services to be regulated at European level by the Digital Services Act (DSA)

  • General adherence to the exemption from liability of messenger services for third-party content

  • Tightening of transparency obligations for messenger services

  • Notice-and-takedown procedure to develop into a notice-and-action procedure

  • Messenger providers will have to adapt their general terms and conditions/community standards accordingly and implement reporting and deletion procedures

Hate, lies and contempt for humanity are omnipresent on the internet. The Network Enforcement Act, passed at federal level, represents one effort to curb their spread. And since this spread does not stop at national borders and affects all EU member states, the European Commission is seeking regulation within the framework of the Digital Services Act, the draft of which is now available.

Since 2018, the Network Enforcement Act ("NetzDG") has obliged the major networks to react quickly to user notices and to delete obviously criminal content within 24 hours. The Act to Combat Right-Wing Extremism and Hate Crime also amends the NetzDG to the effect that networks will in future be obliged to report death threats and incitement to hatred to the Federal Criminal Police Office. Messenger services such as Telegram, however, are not covered by the NetzDG, in either its current or its amended form, although their importance for the exchange of opinions, especially among youths and young adults, is clear:

[Chart: Survey of young people on political exchange via online platforms, 2019]

Draft of the Digital Services Act (DSA)

The EU Commission wants to address this regulatory deficit at European level with the draft Digital Services Act (COM(2020) 825 final, "DSA-E"): According to the proposal, platform providers must adopt clear, unambiguous rules of communication and apply those rules objectively and proportionately. Furthermore, transparency obligations will provide insight into whether the algorithms continue to reward hate and aggression with more attention or whether the "social" networks will counteract this in the future.

The overarching goal of the DSA-E is to contribute to a safe, predictable and trustworthy online environment in which fundamental rights are protected (Art. 1 No. 2 lit. b)). As before, intermediaries will generally not be liable for third-party content. A ban on general monitoring obligations is planned (Art. 7), but authorities and courts will be able to issue orders requiring action against specific infringing content (Art. 3 III, Art. 4 II, Art. 5 IV).

Transparency obligations of the DSA-E

Furthermore, the DSA-E tightens transparency obligations for all intermediaries, who must establish a central contact point for electronic communications (Art. 10) and, if they do not have an EU branch (e.g. the messenger provider Telegram), must name a legal representative in the EU who is liable for DSA violations (Art. 11). Requirements that regulate the handling of content (restrictions on use and moderation of content) must be included in the terms of use in a publicly available form.

Notice-and-action system

The DSA-E develops the notice-and-takedown procedure applicable to host providers into a notice-and-action system. Under the new system, intermediaries are liable (as before) for illegal content if they become aware of it and do not remove or block it immediately. Host providers, however, must now additionally provide easily accessible mechanisms for specific and substantiated reports of illegal content by private individuals, confirm the receipt of such reports, process them promptly, carefully and objectively, and provide substantiated feedback on the outcome. They must also provide information on the use of automated systems and on legal protection options (Art. 14). If content is blocked or removed, uploaders must be informed in a comprehensible and reasoned manner (Art. 15). The notice-and-action system resembles the provisions of the NetzDG and also shares its inherent overblocking risk, in that providers receive additional incentives to delete content and block user accounts. It is not yet apparent whether the notice-and-action system will also include stay-down obligations.

Stricter provisions of the DSA-E not applicable to messenger services

It remains unclear whether the stricter provisions for online platforms (Art. 16 et seq.), which provide for (among other things) an internal complaints management system, out-of-court dispute resolution before arbitration bodies and an obligation to report certain serious criminal offences to state prosecution authorities, are applicable to messenger services such as Telegram. This is because these stricter provisions are to apply to online platforms that not only store content on behalf of users, but also publicly disseminate it. Since public dissemination within the meaning of the DSA-E means making information available to a potentially unlimited number of third parties, the exchange of content via closed (Telegram) groups cannot be subsumed under public dissemination. Messengers (at least their closed groups) would thus typically not constitute online platforms within the meaning of the DSA-E.

It follows that the strictest regulatory requirements of the DSA-E, those for systemically relevant platforms, do not apply to messenger services either, because these requirements likewise apply only where content is publicly disseminated.

Assessment and outlook

In my opinion, the regulation of messenger services via the DSA-E represents a cautious further development of the existing rules. The notice-and-action system, for example, maintains the basic liability rule that intermediaries are not liable for third-party content. This preserves the development possibilities of existing and new provider services (an expression of the freedom of property) as well as the interest of users in uploading and transmitting content without prior review by providers (freedom of expression). For messenger providers, the adoption of the DSA will primarily mean adjusting their general terms and conditions and community standards and implementing reporting and deletion procedures.

To the extent that fake news and disinformation do not constitute illegal information, they remain unregulated under the DSA-E. In this respect, it remains to be seen how the discussion will evolve and whether there is the desire, and the capacity, to address these topics in regulatory terms.

  • Dr. Anna-Kristine Wipper

    Partner, Head of IP-Law, KPMG Law Rechtsanwaltsgesellschaft mbH
