Much of today’s internet exploits attention & vulnerability instead of supporting free, informed choice. The Digital Fairness Act (DFA) can change that by banning manipulative design & making fairness-by-design the norm. EDRi has submitted its response to the European Commission's call for evidence, urging:
🔴 Bans on manipulative design
🔴 Fair defaults & real consent
🔴 Behavioural design impact checks
🔴 Strong, coordinated enforcement
Read more ➡️
The @European Commission called out TikTok & Meta for breaking key transparency & accountability rules under the #DSA. The two tech giants made it unnecessarily difficult to access data and, in Meta’s case, used deceptive interface design that confuses & discourages users from flagging illegal content. After this preliminary decision, we urge the Commission to move swiftly toward final remedies & fines, because justice delayed is justice denied.
Read the Commission's PR ➡️
In a nutshell: genuine simplification should make rules clearer, enforcement stronger & justice accessible. Simplification means:
✅ Clearer rules and stronger enforcement
✅ Stronger protections for people, not weaker ones
✅ Equal access to rights for all communities, especially those most affected by technology harms
Anything less risks rolling back years of progress, exposing 500 million people to breaches of their privacy, equality, and security. (6/n)
But that’s not all: the process itself is deeply flawed. Short consultation timelines, industry-dominated “reality checks,” and optimistic impact assessments leave little room for alternative views. That’s not meaningful participation: it creates legal uncertainty and reduces scrutiny, undermining democratic oversight while leaving room for corporate wish-lists. Something we have previously raised together with 470 other CSOs & trade union groups. (5/n)
The Omnibus also targets the EU’s new AI Act before it even fully applies. While the Act could go further, it’s a major step for accountability, transparency and fairness in AI. Cutting or delaying it now would create legal uncertainty & make it harder to enforce rights. Meanwhile, the Commission’s other plans, like “AI Continent” and “Apply AI”, push for massive AI adoption at the risk of trampling privacy, environmental protections & public accountability. (4/n)
Then there are cookie banners, or rather Article 5(3) of the ePrivacy Directive, a cornerstone of EU digital rights. Under the pretext of “cookie annoyance”, the Commission is ready to cut back the rules that protect the privacy and confidentiality of our communications. Weakening them means more silent surveillance by companies and governments. The fix isn’t deregulation; it’s better enforcement and privacy-by-design tools that make choices simple. (3/n)
Data protection is under threat. Part one of the Omnibus tackles data flows. But “simplifying” could blur the lines between what’s personal and what’s not. The #DataGovernanceAct already left cracks: fuzzy consent rules, weak post-sharing protections, and shaky anonymisation standards. If the Commission “simplifies” without fixing these problems, the GDPR will crumble. (2/n)
The EU Commission has been running “reality checks” on 5 core areas of EU digital law: data, privacy, cybersecurity, AI, & eID. Here’s the paradox: these laws are being “simplified” before they’ve even been properly implemented or enforced. These cuts are part of a broader “digital simplification” agenda that could:
❌ Undermine the EU’s hard-won data protection framework
❌ Weaken net neutrality
❌ Follow the withdrawal of the ePrivacy Regulation & the AI Liability Directive
(1/n)
Within the next month, the European Commission plans to drop the so-called #DigitalOmnibus, a deregulation package that could quietly rewrite digital rights in the EU. EDRi responded to the Commission’s call for evidence, warning it’s not an administrative tidy-up but a political step that treats rights as “red tape,” weakening protections and boosting corporate influence ➡️
Let’s unpack it 🧵
🚨 From Big Tech to Big Regulator? That’s the case for Ireland’s 🇮🇪 newly appointed Data Protection Commissioner, who has Meta on their CV. This raises serious doubts about impartiality and trust in GDPR enforcement. With 40 CSOs, we’re calling on the European Commission to step up and protect the independence of data protection authorities across Europe. Because only with truly independent regulators can we have credible rights enforcement.
Read the full open letter 👉