Getting Digital Fairness Right: EFF's Recommendations for the EU's Digital Fairness Act

Christoph Schmon

Digital Fairness in the EU

The next few years will be decisive for EU digital policymaking. With major laws like the Digital Services Act, the Digital Markets Act, and the AI Act now in place, the EU is entering an enforcement era that will show whether these rules are rights-respecting or drift toward overreach and corporate control. With the EU's proposed Digital Fairness Act (DFA), the Commission is now turning to increasingly visible risks for users, such as dark patterns and exploitative personalization. Its "Digital Fairness Fitness Check" makes clear that existing consumer rules need updating to reflect how digital markets operate today.

But not all proposed solutions point in the right direction. Regulators are already flirting with measures that rely on expanded surveillance, such as age verification mandates—surface-level fixes that risk undermining fundamental rights while offering little more than a false sense of protection. 

For EFF, digital fairness means addressing the root causes of harm, not requiring platforms to exert more control over their users. It means safeguarding privacy, freedom of expression, and the rights of users and developers.

If the DFA is to make a real difference, it must tackle structural imbalances. Lawmakers should focus on two interlocking principles. First, prioritize privacy. Reforms should address harms driven by surveillance-based business models, alongside deceptive design practices that impair informed choices. Second, strengthen user sovereignty, which is also a necessary precondition for European digital sovereignty more broadly. That means tackling user lock-in, coercive contract terms, and manipulative defaults that limit users' ability to choose freely how they use digital products and services.

Together, these principles would support the EU's objectives of consistent consumer protection, fair markets, and a more coherent legal framework. Implemented properly, the DFA could address power imbalances and build trust in Europe's digital economy.

Ban Dark Patterns  

Dark patterns are practices that impair users’ ability to make informed and autonomous decisions. Many companies deploy these tactics through interface design to steer choices and influence behavior. Their impact goes beyond poor consumer decisions. Dark patterns push users to share personal data they would not otherwise disclose and undermine autonomy by making alternatives harder to access. 
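
To make these tactics concrete, here is a deliberately simplified TypeScript sketch contrasting a deceptive consent prompt with a neutral one. The interface and values are invented for illustration and do not describe any specific product.

```typescript
// Hypothetical model of a consent prompt, reduced to the design choices
// that dark-pattern research focuses on. All names are illustrative.
interface ConsentPrompt {
  acceptLabel: string;
  rejectLabel: string;
  trackingPreChecked: boolean; // pre-ticked boxes nudge users into sharing
  clicksToReject: number;      // extra steps placed in front of "reject"
}

// Deceptive design: accepting is one prominent click, rejecting is a maze.
const darkPattern: ConsentPrompt = {
  acceptLabel: "Accept all and continue",
  rejectLabel: "Manage options", // rejection hidden behind a settings page
  trackingPreChecked: true,
  clicksToReject: 3,
};

// Neutral design: symmetric choices and no pre-selected tracking.
const fairDesign: ConsentPrompt = {
  acceptLabel: "Accept",
  rejectLabel: "Reject",
  trackingPreChecked: false,
  clicksToReject: 1,
};
```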

The DFA should address this by clearly prohibiting misleading interfaces that distort user choice in commercial contexts. While the Digital Services Act introduced a definition, it only partially bans such practices and leaves gaps across existing consumer law rules. The DFA should close these gaps by, at the very least, introducing explicit prohibitions and clearer enforcement rules, without resorting to design mandates. 

Tackle Commercial Surveillance 

At the core of digital unfairness lies the pervasive collection and use of personal data. Surveillance and profiling drive many of the harms regulators are trying to address, from dark patterns to exploitative personalization. The DFA should tackle these incentives directly by reducing reliance on surveillance-based business models. These practices are fundamentally incompatible with privacy and fairness, and they distort digital markets by rewarding data exploitation rather than quality of service.

At a minimum, the DFA should address unfair profiling and surveillance advertising by strengthening privacy rights and banning pay-for-privacy schemes. Users should not have to trade their data or pay extra to avoid being tracked. Accordingly, the DFA should support the recognition of automated privacy signals by web browsers and mobile operating systems, which give users a better way to reject tracking and exercise their rights. Practices that override such signals through banners or interface design should be considered unfair.
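
One existing example of such a signal is Global Privacy Control (GPC), which supporting browsers expose to pages as navigator.globalPrivacyControl and send to servers as the Sec-GPC request header. The sketch below shows how a service could detect the signal under those assumptions; the consent-handling helper it mentions is hypothetical.

```typescript
// A minimal sketch of detecting the Global Privacy Control (GPC) signal.
// The signal itself is real in supporting browsers; everything around it
// here is an illustrative assumption, not any particular site's code.

// Client side: globalPrivacyControl is not yet in the standard TypeScript
// DOM typings, so read it defensively.
function userOptedOutViaGPC(): boolean {
  const nav = navigator as Navigator & { globalPrivacyControl?: boolean };
  return nav.globalPrivacyControl === true;
}

// Server side: the same preference arrives as the Sec-GPC request header.
function requestCarriesGPC(headers: Headers): boolean {
  return headers.get("Sec-GPC") === "1";
}

// Honoring the signal means treating it as a valid opt-out up front,
// not overriding it with yet another consent banner.
if (userOptedOutViaGPC()) {
  // disableTracking() is a hypothetical helper standing in for whatever
  // consent machinery a real site uses.
  // disableTracking();
}
```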

Addressing surveillance and profiling also protects children, since many online harms are tied to the collection and exploitation of their data. Systems that serve ads or curate content often rely on intrusive profiling practices, raising concerns about privacy and fairness, particularly when applied to minors. Rather than turning to invasive age verification, the focus should be on limiting data use by default.

Strengthen User Sovereignty  

There is a major gap in how EU law addresses user autonomy in digital markets: Many digital products and services still restrict what people can do with what they pay for through opaque or one-sided licensing terms, technical protection measures, and remote controls. These mechanisms increasingly limit lawful use, modification, or access after purchase, allowing providers to revoke access, disable functionalities, or degrade performance over time. In practice, this turns ownership into a conditional rental.  

Consumers must be able to use and resell digital goods without hidden limitations and with clear licensing terms. Too often, technical and contractual lock-ins, including remote lockouts and unilateral restrictions on functionality, erode that control. Recent legal reforms show that progress is possible. Rules such as those under the Digital Markets Act have begun to curb technical and contractual barriers and promote user choice. However, many restrictions persist.

The DFA must address these practices by targeting unfair post-sale restrictions and strengthening users’ ability to control and switch services. This means setting clear limits on unfair terms and misleading practices, alongside robust transparency on how digital services function over time. It should also strengthen interoperability and support user control, allowing people to access third-party applications and to let trusted applications act on their behalf, reducing lock-in and expanding meaningful choice in how users interact with digital services. 
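
To illustrate what letting "trusted applications act on their behalf" can mean in practice, here is a minimal TypeScript sketch of scoped, revocable delegation in the spirit of OAuth-style grants. The data shapes, names, and scope strings are hypothetical, not a description of any mandated mechanism.

```typescript
// Hypothetical model of a user-granted delegation: the user authorizes a
// third-party client for specific scopes, and the service checks the grant
// instead of blocking third-party clients outright.
interface DelegationGrant {
  clientId: string; // the third-party app acting for the user
  scopes: string[]; // exactly what the user allowed, nothing more
  expiresAt: Date;  // grants are time-limited and revocable
}

function mayAct(grant: DelegationGrant, scope: string, now = new Date()): boolean {
  return now < grant.expiresAt && grant.scopes.includes(scope);
}

// Example: a user lets an alternative client read their own messages.
const grant: DelegationGrant = {
  clientId: "example-alternative-client",
  scopes: ["messages:read"],
  expiresAt: new Date(Date.now() + 24 * 60 * 60 * 1000), // valid for one day
};

mayAct(grant, "messages:read"); // true: the trusted app may act for the user
mayAct(grant, "messages:send"); // false: outside what the user granted
```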
