toad.social is one of the many independent Mastodon servers you can use to participate in the fediverse.
Mastodon server operated by David Troy, a tech pioneer and investigative journalist addressing threats to democracy. Thoughtful participation and discussion welcome.

Server stats: 295 active users

#digitalrights

16 posts · 16 participants · 0 posts today

THE INCREDIBLE REASON HOUSE STATION LIVE CHANGED SERVERS
Or a survival guide for e-businesses facing bans or legal disputes with a platform

For nearly two years, we watched our entire platform become invisible. Not because of bad content, policy violations, or lack of effort... but because of a silent algorithmic shadowban. We had no warnings, no appeals, and no answers. Worse: under YouTube’s terms of service, any legal dispute must be handled in a U.S. court (even if you're based in France and pay taxes there). This is how global platforms sidestep national laws... and why creators are left legally unprotected in their own countries.
PART ONE – WHEN A SHADOWBAN SHUTS DOWN YOUR BUSINESS
Two years ago, we left Dedibox, a French hosting company we judged incapable of meeting even our most basic expectations in customer service. In a field as critical as data hosting, the professionalism of the technical support team cannot be optional... it must be the company's showcase, the reassuring human face you turn to when something goes wrong. That pursuit of reliability led us to GoDaddy, based in Arizona, whose configuration tools, WordPress diagnostics, interface design, and above all technically skilled support team had earned our trust... far beyond the empty promises of typical commercial discourse.

But everything collapsed suddenly, swept away by a digital catastrophe we didn't see coming. A brutal, invisible blow: the shadowban. House Station Live was ghosted (to use the terminology of our virtual assistant, GPT): gone from search results, ignored by YouTube recommendations, erased from the Android Play Store. For eighteen months, despite heavy investment and extensive testing of formats, lengths, languages, thumbnails, titles, even hosts, nothing changed. Every video was locked between 20 and 30 views. We were trapped in that narrow range, with no human contact, no way to file a complaint, and no hope of improvement.

Facing this algorithmic wall, we made the only logical decision: open an investigation and build a legal case. Not to prove a "perfect crime," but to demonstrate that even the most opaque algorithms leave traces. During this inquiry, we came across a particularly disturbing fact: according to YouTube's terms of use, any dispute must be brought before a U.S. judge. It doesn't matter that you are based in France and targeting a French audience, or that French law requires foreign companies to have a legal presence in the country... Google circumvents this by distinguishing between headquarters, local offices, and legal jurisdiction. The result is clear: you are automatically excluded from the protection of your own legal system. The system is so airtight that very few individuals or businesses attempt legal action against Google. The GAFAM companies are protected by a lethal triad: algorithmic opacity, extraterritorial legal shielding, and the complicity of a U.S. government that views its tech giants as a point of national pride (even as strategic weapons in the global information war). While France leaves its citizens exposed and helpless against digital abuse, the United States has conquered the Internet on a global scale by imposing its law as if it were sovereign territory.
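
An aside for creators in the same situation: the "traces" an opaque algorithm leaves are often just numbers you can log yourself. As a minimal sketch (this is not HSL's actual tooling; the API key and video IDs below are placeholders), daily view counts can be snapshotted through the public YouTube Data API v3, so that a plateau like the 20-to-30-view ceiling described above ends up documented in a dated CSV:

```python
# Hypothetical evidence-gathering script: snapshot daily view counts so an
# abnormal plateau can be documented over time. Requires the third-party
# "requests" package and a valid YouTube Data API v3 key (placeholder below).
import csv
import datetime

import requests

API_KEY = "YOUR_API_KEY"          # placeholder: your own API key
VIDEO_IDS = ["abc123", "def456"]  # placeholder: the video IDs to monitor

def fetch_view_counts(video_ids):
    """Return {video_id: view_count} from the public videos.list endpoint."""
    resp = requests.get(
        "https://www.googleapis.com/youtube/v3/videos",
        params={"part": "statistics", "id": ",".join(video_ids), "key": API_KEY},
        timeout=30,
    )
    resp.raise_for_status()
    return {item["id"]: int(item["statistics"]["viewCount"])
            for item in resp.json().get("items", [])}

def append_snapshot(path="view_log.csv"):
    """Append today's counts to a CSV, building a dated record of the curve."""
    today = datetime.date.today().isoformat()
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for vid, views in fetch_view_counts(VIDEO_IDS).items():
            writer.writerow([today, vid, views])

if __name__ == "__main__":
    append_snapshot()  # run once a day (e.g. via cron) to log the plateau
```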

To illustrate just how absurd and dangerous this has become, take the example of music licensing. Every month, House Station Live pays royalties to SACEM, France's music royalty collection society. In return, we are legally authorized to broadcast commercial works, provided we submit monthly playlists so that royalties can be fairly distributed to artists. In theory, everything is legal and in order. But the United States has its own system: the DMCA. And if you stream House Station Live through any platform based in the U.S. (like GoDaddy, YouTube, etc.), you are automatically subject to U.S. law, even if your legal entity is based in France. France, in turn, declines jurisdiction in such cases because the "crime scene" is legally located on American soil, where the servers are hosted. So the SACEM fee we pay offers zero protection, neither domestically nor abroad... where we're treated like pirates. Imagine buying a product from a foreign website: you pay the foreign VAT, a currency conversion fee, and then the French customs tax. Three layers of taxation, and a 30 € item ends up costing you 150 €. That's digital over-taxation, and the same applies to our royalties.

Worse still, the U.S. considers you to be operating on their soil the moment your server is physically located there... regardless of where you are based, where your company is registered, or what contracts you’ve signed with your local rights agencies. Even if your SACEM contract is supposedly international, it offers you no protection in this skewed legal context. The U.S. has simply annexed the Internet, claimed it as their jurisdiction, and imposed their extraterritorial laws on the rest of the world (with institutional blessing).
#HSLdiary #HSLpartners

Regulators and lawmakers around the world are finally targeting organisations that use #darkpatterns to manipulate consumers into using products or services.

Chandni Gupta, Deputy CEO of the Consumer Policy Research Centre (and friend of EFA) has wrapped up some fantastic research on this issue. Chandni met with scores of regulators, enforcement agencies, consumer advocacy groups and choice architecture experts in the US, UK, Singapore and India about dark patterns and how they regulate them.

A case in point is Amazon's Project Iliad, a process designed to make it harder for customers to cancel their Prime membership. How? By making the process so needlessly complex that many users would give up and abandon the form.
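
To see why the tactic works, a bit of made-up arithmetic is enough. Assuming, purely for illustration, that each extra confirmation screen loses 15% of the users who reach it:

```python
# Hypothetical illustration of the mechanic (not Amazon's actual code): each
# extra confirmation screen sheds a fraction of users, so stacking screens
# converts "wants to cancel" into "gave up" by design.
def expected_completions(users: int, screens: int, drop_per_screen: float = 0.15) -> int:
    """Users who survive every screen, assuming each loses 15% (made-up rate)."""
    remaining = float(users)
    for _ in range(screens):
        remaining *= (1.0 - drop_per_screen)
    return round(remaining)

# With the made-up 15% drop-off, a 1-screen flow lets 850 of 1,000 would-be
# cancellers through; a 6-screen maze lets only ~377 through.
print(expected_completions(1000, 1))   # 850
print(expected_completions(1000, 6))   # 377
```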

Chandni writes that Australia is falling behind on protecting citizens from dark patterns. Legislating consumer protections against dark patterns is vital to protect us from manipulation of our behaviour and choices online.

Read the CPRC report: cprc.org.au/report/made-to-man


"The EU is once again trying to erode citizens' right to digital privacy" -
but the proposal is open to feedbackfrom us European citizens here:
ec.europa.eu/info/law/better-r

So far, 2,808 people have added their comments.
Go ahead, leave yours!

"Impact assessment on retention of data by service providers for criminal proceedings"
Feedback period:
21 May 2025 - 18 June 2025

(You may use your own native language.)

Thanks to @Devijand

European Commission - Have your say

"Syria was once described as “one of the most dangerous places to use the internet in the world.” Under the rule of Bashar al-Assad, every online step carried risk: deep-packet inspection to facilitate surveillance and “analyze and control the activities of Syrian Internet users,” blocked websites to control the flow of information, and detention of activists, journalists, and even ordinary citizens for their online speech and activities.

With the regime’s fall in late 2024, Syrians see an opportunity for a free, safe, and open web: a virtual space for civil society, entrepreneurs, students, and families long torn apart by war. Yet there are significant challenges ahead. Half of Syria’s infrastructure is “destroyed or rendered dysfunctional,” including its communication networks. This destruction is compounded by decades-long stifling sanctions that not only choke Syrians and hinder humanitarian assistance, but also bar reconstruction and economic recovery, including the export of telecom and dual-use equipment.

Last week, the new U.S. administration suddenly announced its intention to lift its sanctions on Syria, and earlier this week, the EU followed suit, announcing that it would remove all economic sanctions. These welcome developments are critical first steps toward letting Syrians reclaim their digital future. Connectivity underpins humanitarian assistance, open and connected spaces, economy, governance, and people’s ability to enjoy their human, social, and economic rights.
(...)
However, rebuilding Syria’s internet requires not only hardware restoration and sanctions relief, but also policy overhaul. In this blog post, we outline the legacy of Syria’s digital repression and highlight some of the technical and legal challenges the transitional government must overcome to cut ties with the past, and deliver a free and open internet for all Syrians."

accessnow.org/syria-sanctions-

Access Now · Syria sanctions: reclaiming Syria's digital future
Lifting sanctions on Syria is a critical first step forward, but work remains for Syrians to reclaim their digital future.

Data Retention and Democratic Resilience
A critical policy analysis of security, legality and fundamental rights within the European Union.

https://scribe.disroot.org/post/2914542

#EU #Surveillance #DigitalRights #Privacy #AI #DataRetention


scribe.disroot.org · Does Data Retention Prevent Crime? - D•Scribe

Does Data Retention Prevent Crime? A Critical Analysis in Light of Union Law, Fundamental Rights, and Alternative Policy Models

Executive Summary
This study provides a critical analysis of the effectiveness of general and indiscriminate data retention within the European Union (EU) regarding crime prevention. Particularly in the context of criminal proceedings, it becomes evident that data retention does not deliver the expected security gains, while simultaneously posing significant threats to EU fundamental rights such as the right to privacy, freedom of expression, and the protection of private life.

Key Findings:
- Weak legal foundation: comprehensive data retention contradicts primary EU law, as confirmed by the jurisprudence of the Court of Justice of the European Union (CJEU) in Digital Rights Ireland and Tele2 Sverige.
- Limited effectiveness: independent research indicates that it does not improve the crime clearance rate.
- Interference with fundamental rights: journalism, activism, and political opposition are affected by chilling effects.
- Increasing economic burden: smaller providers bear excessive costs, and cybersecurity risks are on the rise.
- Technological inadequacy: in the age of IoT, 5G, and artificial intelligence (AI), data volumes are exploding, making these storage models increasingly intrusive and uncontrollable.

Key Recommendations to the Commission:
- General forms of data retention should be strictly avoided.
- Member States should promote targeted models with time limits and independent judicial authorization.
- The right to encryption and anonymity should be respected, and users' digital privacy strengthened.
- Supportive infrastructures and policies should be developed to alleviate the burden on small and medium-sized enterprises, with particular consideration for economic impacts.

1. Introduction
With ongoing digitalization, methods of combating crime have also evolved. One such method is data retention, which is widespread under the pretext of counter-terrorism. Nevertheless, these general and indiscriminate practices pose significant risks to individual freedoms and often fail to achieve the intended security objectives. This study, submitted as part of the European Commission's Impact Assessment process, examines the necessity, proportionality, fundamental-rights impact, utility for criminal justice, and economic costs of current data retention policies, and analyzes their future viability in the age of new technologies.

2. Methodology
The study is based on qualitative content analysis. Key sources include:
- CJEU jurisprudence: the Digital Rights Ireland and Tele2 Sverige decisions.
- European Court of Human Rights (ECtHR): the Big Brother Watch v. United Kingdom decision.
- Independent reports, such as those from ENISA (the European Union Agency for Cybersecurity) and the European Parliament.
- National case studies, such as the Turkish ByLock case.
Statistical data and economic cost analyses were also included to evaluate both the effectiveness and the social and technical consequences of data retention.

3. Literature Review
Research on data retention can be broadly categorized into three areas:
3.1 Legal approaches: Binns (2018) analyzes the incompatibility of these practices with the right to privacy; De Hert & Poullet (2013) address the legitimacy of such measures in light of EU fundamental rights.
3.2 Effectiveness assessment: Hoofnagle et al. (2012) demonstrate that retention measures introduced under the Patriot Act in the USA showed no measurable effect; ENISA (2020) highlights the technical and financial burdens faced by small providers.
3.3 Political and societal impacts: Lyon (2018) links these policies to the emergence of a "surveillance society," and Zuboff (2019) exposes how platforms commercially exploit personal data, a phenomenon she calls "surveillance capitalism."

4. Data Retention and EU Law: Necessity and Proportionality
4.1 Necessity test: in the view of the CJEU, general data retention does not pass the necessity test; clear evidence of its suitability against terrorism or serious crime has yet to be provided.
4.2 Proportionality test: the principle of proportionality is violated because all citizens are indiscriminately affected, no suspicion is required, the retention period is excessive (up to two years), and no prior authorization by independent courts is mandated.

5. Utility for Criminal Justice
5.1 Presumption of innocence: the policy encourages "fishing expeditions," which undermine the presumption of innocence.
5.2 Example, the Turkish ByLock case: millions of individuals were suspected without concrete evidence, based solely on the use of an app (ByLock); mere presence in metadata was sufficient.

6. Economic and Technical Costs
6.1 Impact on service providers: ENISA (2020) notes that small providers in particular face disproportionate financial pressure.
6.2 Cybersecurity risks: the inability to securely store such volumes of sensitive data leads to massive data breaches, increased risks to public safety, and loss of trust in digital systems.

7. Future Outlook: IoT, 5G, and Artificial Intelligence
The explosive increase in data volumes due to IoT, 5G, and AI renders traditional storage models unsuitable. AI today goes beyond mere analysis: it can derive new correlations that further endanger EU fundamental rights.
7.1 New risks posed by AI and mass data analysis:
- Automated profiling and discrimination: AI models learn from historical data. If those data contain systematic biases (for example, association with a criminal offense based solely on use of an app like ByLock, which can lead to collective stigmatization), the biases can be automatically reproduced and discriminatory practices intensified, unjustly targeting groups or individuals.
- False positives and weakening of the presumption of innocence: statistically relevant correlations can be misleading or oversimplified. A model might falsely identify a user group as statistically linked to "suspicious" activity based on use of a particular app, even without concrete individual evidence.
- Opacity and lack of transparency: AI systems often operate as black boxes whose decision-making processes are not explicit or easily traceable, making it difficult for affected individuals to learn the reasons for surveillance measures or to defend themselves against them, thereby impairing the right to an effective remedy.
7.2 Lack of adaptation to new technologies: existing retention rules are not equipped for the rapidly increasing data streams of IoT, the speed of 5G, or the predictive capabilities of AI. General storage models become unmanageable while misuse risks grow; without independent judicial control, massive risks of abuse threaten social justice and EU fundamental rights.
Conclusion to Chapter 7: future AI-supported methods of combating crime must subject data collection and analysis not only to technical but also to strict ethical and legal limits. Otherwise, general retention combined with AI could produce a structure that contradicts the values of democratic societies, violates human dignity, and opens the door to arbitrary interventions.

8. Conclusions and Recommendations
General data retention possesses neither a stable legal basis nor demonstrable effectiveness, and it directly attacks the most fundamental EU rights. Specific recommendations to the Commission:
- Determine that these practices are incompatible with primary Union law, as confirmed by CJEU jurisprudence.
- Give Member States clear guidelines promoting targeted, proportionate models limited to serious crimes, under independent judicial control.
- Strengthen frameworks for the protection of encryption, anonymity, and digital privacy.
- Relieve small and medium-sized service providers burdened by the requirements through technical and financial support.

Bibliography
Binns, R. (2018). Algorithmic Accountability and Transparency in the EU GDPR. Philosophy & Technology, 31(2), 211–233.
De Hert, P., & Poullet, Y. (2013). The Data Retention Directive: The Ghost that Should Not Walk. Computer Law & Security Review, 29(6), 673–683.
Hoofnagle, C. J., et al. (2012). How Different is Privacy Law in Europe vs. the US? Berkeley Technology Law Journal, 28(2), 411–454.
Lyon, D. (2018). The Culture of Surveillance: Watching as a Way of Life. Polity Press.
Zuboff, S. (2019). The Age of Surveillance Capitalism. PublicAffairs.
European Court of Human Rights (ECtHR) (2021). Big Brother Watch and Others v. the United Kingdom, Application no. 58170/13.
Court of Justice of the European Union (CJEU) (2014). Digital Rights Ireland Ltd v. Minister for Communications, Marine and Natural Resources and Others, Case C-293/12.
Court of Justice of the European Union (CJEU) (2016). Tele2 Sverige AB v. Post- och telestyrelsen and Secretary of State for the Home Department v. Tom Watson and Others, Joined Cases C-203/15 and C-698/15.
ENISA (2020). Data Retention Practices in Europe. European Union Agency for Cybersecurity.
European Parliament (2019). Privacy and Data Protection in Law Enforcement.
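
To make the study's central recommendation concrete, here is an illustrative sketch (not from the report; every name and the 90-day ceiling are hypothetical) of what "targeted retention with time limits and independent judicial authorization" could look like as a data model, where the default is deletion:

```python
# Illustrative sketch only: targeted, time-limited, court-authorized retention
# as a data model, in contrast to blanket retention of everyone's metadata.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class RetentionOrder:
    case_id: str            # the specific criminal case the order is tied to
    subject_id: str         # the individual named in the order
    authorized_by: str      # the independent court that signed off
    issued_at: datetime
    max_days: int = 90      # hypothetical statutory ceiling, far below 2 years

    def covers(self, subject_id: str, at: datetime) -> bool:
        """Retention is lawful only for the named subject, within the window."""
        return (subject_id == self.subject_id
                and at <= self.issued_at + timedelta(days=self.max_days))

def may_retain(orders: list[RetentionOrder], subject_id: str) -> bool:
    """The default is deletion: data is kept only if a live order covers it."""
    now = datetime.now()
    return any(o.covers(subject_id, now) for o in orders)
```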


Vendor lock-in via apps, or "software tethering" as the FTC officially calls it, is not a fantasy but a present reality we already live in (a simplified sketch of the shared pattern follows the list). Examples include:
🔸 Printers that disable basic USB printing unless you install the vendor’s “mobile print” app
🔸 Smart scales that won’t show your weight on their built-in display without pairing to the companion app
🔸 Wi-Fi plugs/outlets locked to on/off control via the manufacturer’s smartphone app only
🔸 Electric scooters or bikes that refuse to unlock or change settings without the provider’s app login
🔸 Digital thermometers whose readings remain hidden until you sync via their proprietary app
🔸 Wireless headphones that lock equalizer or firmware updates behind an app download
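
What all of these examples share is one small piece of policy logic: a capability the hardware already has is refused until an app-mediated check succeeds. A deliberately simplified, hypothetical sketch of that pattern (no vendor's real firmware):

```python
# Deliberately simplified sketch of software tethering: the refusal below is
# pure policy, not a hardware limitation. All names here are hypothetical.
class TetheredPrinter:
    def __init__(self):
        self.app_paired = False   # set only after the vendor app "activates" us

    def pair_with_vendor_app(self, token: str) -> None:
        # A real device would phone home here to validate the token.
        self.app_paired = bool(token)

    def print_via_usb(self, document: bytes) -> None:
        # The USB path works fine mechanically; firmware simply refuses.
        if not self.app_paired:
            raise PermissionError("Install the companion app to enable printing")
        print(f"printing {len(document)} bytes over USB")
```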

#righttorepair #smarthome #digitalrights #bigtech #bigtechboycott #surveillancecapitalism #iot #enshittification #capitalism

New #wjds paper published:

Author Xin Ye explores how online platforms use manipulative, attention-grabbing tactics that undermine user autonomy. Current #EU laws fall short - this study offers 3 key policy fixes.

Read more: doi.org/10.34669/wi.wjds/5.3.2

The paper is part of a special issue on "Well-being in the Digital World"

@dougaparry #DarkPatterns #AddictiveDesign #TechEthics #DigitalRights #PlatformRegulation #GDPR #DSA #research #law #socialscience #computerscience

EU’s GDPR reform is supposed to speed up enforcement — but the current proposal might do the opposite.

📍 Longer deadlines
📍 More complexity
📍 Structural imbalance between users & companies
📍 Breach of fundamental rights

Possible annulment challenge upcoming. Worth following.

More on the draft law and context: linkedin.com/feed/update/urn:l

linkedin.com · Aleksandra Samonek

Legal Update: Key Developments in EU GDPR Enforcement Reform – What's at Stake?

The EU's ongoing trilogue negotiations on the proposed Procedural Regulation for the GDPR may significantly change how privacy law and fundamental rights are implemented. The regulation, intended to harmonize and accelerate enforcement across Member States, is now in its final stages. However, watchdog organizations such as noyb.eu, Fundacja Panoptykon, European Digital Rights, and many others have raised significant concerns about the draft law's implications. Among them:

🚫 Extended Timelines: Despite initial goals of streamlining procedures, the proposed regulation introduces complex, multi-phase processes. According to noyb's breakdown, deadlines for even basic procedural steps could already exceed 12 months, with the total time to decision potentially stretching to 2–3 years. The regulation itself won't apply until 2026 or 2027, meaning cases may not reach enforceable deadlines until 2029.

🚫 Increased Administrative Complexity: Rather than centralizing documentation in a single digital system, the proposal foresees a fragmented structure in which documents must be duplicated and manually shared among over 40 national authorities. This could significantly increase administrative workload and costs across the EU.

🚫 Procedural Imbalance: The regulation introduces structural disparities between the rights afforded to companies and to individuals. For example:
👉🏽 Companies may have oral hearings; individuals typically do not.
👉🏽 Companies receive documents through their lead authority; users must request them across borders.
👉🏽 Companies are granted a "right to be heard"; users are given a more limited "opportunity to make views known."

These provisions may conflict with fundamental rights under the EU Charter, including:
👉🏽 Right to Data Protection (Article 8)
👉🏽 Equal Treatment (Article 20)
👉🏽 Right to Good Administration (Article 41)
👉🏽 Right to a Fair Hearing and Timely Procedure (Article 47)

The implications of the draft law underscore how procedural design directly affects rights enforcement. For professionals in data protection, legal compliance, and digital rights, this regulation could significantly shape the future of GDPR application and accountability in Europe.

🔗 Links to the draft law and other resources in the comments.

#GDPR #DataProtection #PrivacyLaw #EUlaw #FundamentalRights #DigitalRights #noyb #RegulatoryReform #Compliance #DueProcess #LegalNews #LegalUpdate #DataLaw #PrivacyRights #EDRi #HumanitarianAtWork

You know what? As a society, I don't think we have a data privacy problem, actually.

What we truly have is a consent-respecting problem.

If, as a society and as individuals, we truly respected people's informed and freely given consent, we would not have problems with data privacy at all.

If we culturally improve our consent-respecting practices, we will inevitably understand better how to improve our privacy-respecting practices.

Ask first.
Give options.
Respect people's choices.
Never share without prior permission.
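
For readers who build software, those four rules translate almost literally into code. A toy sketch (all names illustrative), in which the absence of a recorded "yes" is always treated as "no":

```python
# Toy model of the four rules above: ask first, give options, respect the
# answer, never share without prior permission. Names are illustrative only.
class ConsentRegistry:
    def __init__(self):
        self._grants: dict[tuple[str, str], bool] = {}  # (person, purpose) -> choice

    def ask(self, person: str, purpose: str, choice: bool) -> None:
        """Ask first, offer a real yes/no option, and record the answer."""
        self._grants[(person, purpose)] = choice

    def may_share(self, person: str, purpose: str) -> bool:
        """No recorded 'yes' means no sharing: absence of consent is a 'no'."""
        return self._grants.get((person, purpose), False)

registry = ConsentRegistry()
registry.ask("alice", "newsletter", choice=True)
registry.ask("alice", "ad_targeting", choice=False)
assert registry.may_share("alice", "newsletter")
assert not registry.may_share("alice", "ad_targeting")   # choice respected
assert not registry.may_share("bob", "newsletter")       # never asked, never shared
```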