European Data Protection Board Explores How Pseudonymisation Enhances GDPR Compliance and Privacy Standards
The European Data Protection Board has introduced draft Guidelines on pseudonymisation, highlighting its role in reducing privacy risks under GDPR. It outlines technical measures, safeguards, and the need to maintain compliance through robust security practices and proactive implementation.
On 17 January 2025, the European Data Protection Board (EDPB) unveiled its draft "Guidelines 01/2025 on Pseudonymisation" and launched a consultation open until 28 February 2025.
This initiative aims to clarify pseudonymisation’s role under the General Data Protection Regulation (GDPR) and its practical application for data controllers and processors.
Here’s a closer look at the key elements and why they matter.
A core message of the Guidelines is that pseudonymised data remains classified as personal data under GDPR. This stems from the fact that, even with direct identifiers removed, the possibility of re-identifying individuals through additional information still exists.
Pseudonymisation, therefore, does not exempt organisations from GDPR compliance but serves as a risk-reduction tool.
For organisations processing vast amounts of data, pseudonymisation can create a middle ground. While it does not anonymise data entirely, it enhances privacy and reduces the likelihood of harmful breaches.
Substituting identifying information with unique identifiers or pseudonyms mitigates the risk to individuals’ privacy, provided that sufficient safeguards are implemented to prevent re-identification.
The EDPB highlights the importance of technical and organisational measures to reinforce pseudonymisation’s effectiveness. For instance, keeping key-coding data separate from pseudonymised datasets is essential to prevent unauthorised access or identification.
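The separation the EDPB describes can be illustrated with a minimal sketch. This is a hypothetical example, not a technique prescribed by the Guidelines: direct identifiers are replaced with random tokens, and the key-coding table that maps tokens back to identifiers is returned as a distinct object that would be stored separately, under restricted access.

```python
import secrets

# Hypothetical illustration: pseudonymise a dataset and keep the
# key-coding table (pseudonym -> real identifier) separate from the data.
def pseudonymise(records, id_field="email"):
    key_table = {}            # to be stored apart, with restricted access
    pseudonymised = []
    for record in records:
        token = secrets.token_hex(8)   # random pseudonym, not derivable from the identifier
        key_table[token] = record[id_field]
        safe = dict(record)
        safe[id_field] = token
        pseudonymised.append(safe)
    return pseudonymised, key_table

customers = [{"email": "anna@example.com", "score": 7}]
data, keys = pseudonymise(customers)
# `data` can be shared with analysts; `keys` stays with a separate custodian.
```

Because the tokens are random rather than derived from the identifier, the pseudonymised dataset alone cannot be reversed; re-identification requires access to the separately held key table.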
Supporting Legitimate Interests and Privacy by Design
The Guidelines delve into how pseudonymisation aligns with GDPR principles, particularly privacy by design and data minimisation. Controllers using pseudonymisation can strengthen their case for invoking legitimate interests as a lawful basis for data processing.
By reducing identifiability risks, organisations may process personal data while demonstrating a proactive approach to privacy protection.
The EDPB identifies pseudonymisation as a practical way to achieve data protection by design and default. It ensures that personal data are adequately secured throughout processing activities, reducing risks for both organisations and individuals.
For example, organisations implementing pseudonymisation for customer data analytics can still derive insights while ensuring that sensitive identifiers remain protected.
Additionally, pseudonymisation can enhance security. In cases of data breaches, pseudonymised information is less likely to expose individuals directly.
For organisations, this can mean less reputational damage and reduced legal liability, provided that safeguards such as encryption and key management are robustly in place.
Technical Measures: Getting It Right
A major focus of the EDPB’s Guidelines is the technical side of pseudonymisation. The document analyses specific measures and best practices to maximise the confidentiality and security of pseudonymised data.
Techniques such as tokenisation, encryption, and hashing are examined for their roles in ensuring that re-identification risks remain negligible.
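As a hedged sketch of one such technique: keyed hashing (HMAC) is sometimes used for deterministic pseudonymisation. A plain, unkeyed hash of a low-entropy identifier such as a phone number can often be reversed by brute force, so the secret key, stored separately and access-controlled, is what keeps re-identification risk low. The key value below is a placeholder assumption.

```python
import hmac
import hashlib

# Assumption for illustration: in practice the key would live in a
# separate key-management system, not in the source code.
SECRET_KEY = b"replace-with-a-key-from-a-separate-key-store"

def pseudonym(identifier: str) -> str:
    """Deterministic keyed pseudonym: same input -> same token."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()
```

Determinism is the point of this variant: the same identifier always maps to the same pseudonym, so records can still be linked across datasets without exposing the identifier itself, and rotating the key severs that linkability if needed.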
The Guidelines stress that pseudonymisation should not be a one-time process but a dynamic one. As technology evolves, so do the risks of re-identification. Organisations are advised to regularly review and update their pseudonymisation practices, incorporating the latest tools and methodologies.
Furthermore, the Guidelines advise implementing access controls and ensuring that the information needed for re-identification, such as decryption keys or lookup tables, is stored separately with restricted access. This separation of roles and resources is vital to prevent breaches or misuse.
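The separation of roles can be sketched as follows. This is an illustrative design, not a pattern mandated by the Guidelines: the lookup table lives inside a dedicated component, and re-identification is only performed for callers in a hypothetical authorised-role list.

```python
# Assumption for illustration: the set of roles permitted to re-identify.
AUTHORISED_ROLES = {"data-protection-officer"}

class KeyStore:
    """Holds the pseudonym -> identifier table apart from the main dataset."""

    def __init__(self, lookup_table):
        self._table = lookup_table

    def reidentify(self, pseudonym, role):
        # Access control enforced at the point of re-identification.
        if role not in AUTHORISED_ROLES:
            raise PermissionError("role not authorised for re-identification")
        return self._table[pseudonym]

store = KeyStore({"a1b2": "anna@example.com"})
```

In a real deployment the check would rely on the organisation’s identity and access management system rather than a hard-coded set, but the principle is the same: analysts working with pseudonymised data never hold the means of re-identification themselves.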
For organisations operating across borders or sharing data between entities, the EDPB offers guidance on ensuring that pseudonymisation measures comply with GDPR while accommodating varying levels of technological sophistication.