The German Federal Court of Justice recently clarified online platforms' liability for user-generated content, ruling that platforms are not directly liable for content they have no knowledge of but must act swiftly upon notification, a decision with significant implications for digital service providers across Europe.
Australia’s eSafety Commissioner ordered Telegram to pay AUD 1 million for ignoring transparency obligations. Officials had requested details of the platform's measures against terrorist and child sexual exploitation material, but Telegram delayed its response for months, triggering enforcement under the Online Safety Act.
On 28 February 2025, Japan’s Cabinet announced plans to introduce a Bill to promote the research, development, and practical application of artificial intelligence technologies. The legislation focuses on transparency, the protection of rights, and international cooperation.
European Commission Supports Revised Code to Counter Illegal Hate Speech on Online Platforms
The European Commission and the European Board for Digital Services support a revised Code of Conduct to counter illegal hate speech online. Integrated with the Digital Services Act, it mandates stricter reporting, monitoring, and transparency from online platforms.
Revised EU code of conduct sets new standards for combating online hate speech
On 20 January 2025, the European Commission and the European Board for Digital Services announced their support for the revised Code of Conduct on Countering Illegal Hate Speech Online, now aligned with the Digital Services Act (DSA).
This update builds upon the original 2016 Code, aiming to enhance how online platforms manage content deemed illegal hate speech under EU and national laws.
Key Commitments of the Revised Code
The updated Code introduces several commitments for signatory platforms:
External Monitoring: Platforms will undergo monitoring by qualified non-profit or public entities, referred to as "monitoring reporters." These entities are responsible for assessing the platforms' handling of hate speech reports.
Timely Review of Reports: Signatories commit to reviewing at least two-thirds of hate speech reports submitted by monitoring reporters within 24 hours. This rapid response aims to ensure swift action against illegal content.
Transparency and Collaboration: The Code emphasises the importance of transparency in content moderation processes and encourages collaboration among stakeholders to effectively address hate speech.
User Awareness: Platforms are encouraged to raise user awareness about what constitutes hate speech and how to report it, empowering users to play an active role in maintaining a respectful online environment.
Enhancing Compliance with the Digital Services Act
By aligning the Code with the DSA, the European Commission aims to bolster compliance and enforcement mechanisms.
The DSA provides a comprehensive framework for regulating digital services, and the revised Code serves as a tool to support platforms in meeting these obligations.
The European Commission and the European Board for Digital Services also recommend that signatories provide data on the measures taken to address hate speech, along with country-level information on how flagged content is classified internally.
This data-driven approach aims to enhance transparency and accountability in how platforms handle illegal content.
Major tech companies, including Meta (Facebook), X (formerly Twitter), and Google's YouTube, have agreed to strengthen their efforts against online hate speech under this updated Code.
These companies have pledged to cooperate with non-profit and public organisations to monitor and review hate speech reports, aiming to address at least two-thirds of such notices within 24 hours.
The European Commission recently submitted a proposal for an EU Blueprint on cybersecurity crisis management. The recommendation outlines response mechanisms, promotes coordination at Union level, and calls for collaboration between civilian authorities and military partners.
China's new rules on military content sharing impose tighter guidelines on what can be posted online. The rules require platforms to rely on official sources, banning misinformation while promoting government-approved perspectives on national defence, history, and military achievements.