1/5 🚨 On 5 November, MEPs in the LIBE Committee of the European Parliament will vote on a new #Europol Regulation, part of the EU’s so-called “Facilitators Package.” Behind the rhetoric of fighting “smuggling,” this reform expands Europol’s #MassSurveillance powers & threatens #FundamentalRights. Here’s how 👇🏾
📨 The latest edition of your favourite #DigitalRights newsletter is out! This time, we're grappling with changing seasons 🍂 🌧️ and much more. In this #EDRigram, read about: 💰 Budget cuts impairing Austria's #DataProtection Authority, and what NGOs are doing about it 📚 A new book on #spyware reiterating the call for a ban ⚖️ EDRi's take on #DigitalFairness and what we'd like to see in the upcoming #DFA ... and more!
Much of today’s internet exploits attention & vulnerability instead of supporting free, informed choice. The Digital Fairness Act (DFA) can change that by banning manipulative design & making fairness-by-design the norm. EDRi has submitted its response to the European Commission's call for evidence, urging: 🔴 Bans on manipulative design 🔴 Fair defaults & real consent 🔴 Behavioural design impact checks 🔴 Strong, coordinated enforcement Read more ➡️
The @European Commission called out TikTok & Meta for breaking key transparency & accountability rules under the #DSA. The two tech giants made it unnecessarily difficult to access data and, in Meta’s case, used deceptive interface design that confuses & discourages users from flagging illegal content. After this preliminary decision, we urge the Commission to move swiftly toward final remedies & fines — because justice delayed is justice denied. Read the Commission's PR ➡️
In a nutshell: genuine simplification should make rules clearer, enforcement stronger & justice accessible. Simplification means: ✅ Clearer rules and stronger enforcement ✅ Stronger protections for people, not weaker ones ✅ Equal access to rights for all communities, especially those most affected by technology harms. Anything less risks rolling back years of progress, putting the privacy, equality, and security of 500 million people at risk. (6/n)
But that’s not all: the process itself is deeply flawed. Short consultation timelines, industry-dominated “reality checks,” and overly optimistic impact assessments leave little room for alternative views. That’s not meaningful participation: it creates legal uncertainty and reduces scrutiny, undermining democratic oversight while leaving room for corporate wish-lists. We have previously raised this together with 470 other CSOs & trade union groups. (5/n)
The Omnibus also targets the EU’s new AI Act before it’s even in force. While the Act could go further, it’s a major step for accountability, transparency and fairness in AI. Cutting or delaying it now would create legal uncertainty & make it harder to enforce rights. Meanwhile, the Commission’s other plans, like “AI Continent” and “Apply AI”, push for massive AI adoption at the risk of trampling privacy, environmental protections & public accountability. (4/n)
Then we have cookie banners: Article 5(3) of the ePrivacy Directive, a cornerstone of EU digital rights. Under the pretext of “cookie annoyance”, the Commission is ready to cut back the rules that protect the privacy and confidentiality of our communications. Weakening them means more silent surveillance by companies and governments. The fix isn’t deregulation: it’s better enforcement and privacy-by-design tools that make choices simple. (3/n)