Article by Adela Nuță, Managing Associate, BACIU PARTNERS
Complying with Europe's data privacy rules, defined by the General Data Protection Regulation (GDPR), is becoming increasingly challenging as controllers, particularly those operating large-scale digital platforms such as social networks, e-commerce sites, and streaming services, seek innovative ways to monetize user engagement while remaining compliant.
One approach that has drawn significant attention is the 'Consent or Pay' model, where users must either agree to the use of their personal data (usually for advertising) or pay a fee to access the service without data collection.
This model has fueled extensive debate, including at the IAPP Europe Data Protection Congress 2024 in Brussels, where industry leaders and experts examined its implications. Discussions highlighted concerns that such models may exacerbate socio-economic inequalities, as privacy rights increasingly become a luxury afforded only by those who can pay, challenging GDPR’s goal of universal protection.
At first glance, this approach might seem to strike a fair balance between providing free access to services and respecting users’ privacy preferences. But the European Data Protection Board (EDPB) has challenged this notion, underscoring significant concerns around the freedom of consent and the power dynamics at play in such models. The issue is not merely theoretical: it cuts to the core of what valid consent under the GDPR looks like, and a choice between enjoying one’s private life and paying for the privilege is a reminder of an age when the rule of law was not quite in place.
This dynamic raises critical questions about the sustainability of practices that prioritize economic goals over GDPR’s foundational principles, particularly the universality of data protection as a right.
A test of free will: can consent really be freely given?
At the center of the issue is a key principle of the GDPR: consent must be freely given. In theory, giving users the option to pay instead of consenting to data processing should maintain this freedom. But in practice, the model may backfire. According to the GDPR, consent can only be valid if individuals have genuine and real choices, free from any imbalance of power, undue influence, or disadvantage resulting from a refusal to consent.
In the EDPB’s Opinion 08/2024, the board makes it clear: if users feel compelled to consent because the alternative (paying for the service) is not feasible or reasonable, then consent is not truly “freely given.” For platforms that dominate their markets, especially social networks, dating apps, or job platforms where participation can feel essential, the stakes are even higher. Users may feel they have little choice but to surrender their data if the fee for non-consent is prohibitively high or the service is simply irreplaceable.
The EDPB further highlighted that offering a choice that disproportionately burdens economically disadvantaged users undermines the principle of fairness, as it conditions privacy rights on financial capacity rather than treating them as universal protections.
Case study: Meta's 'Consent or Pay' model
Meta’s implementation of a “Consent or Pay” model in Europe exemplifies the complexities and regulatory scrutiny tied to such practices under GDPR. Under this approach, users of platforms like Facebook and Instagram must either consent to personalized advertising or pay a subscription fee, estimated at €13 per month, for an ad-free experience. While this setup appears to provide a choice, many argue it unfairly penalizes users unable to afford the fee, creating a system where privacy rights are disproportionately accessible to wealthier individuals, thus challenging GDPR’s principle of universal and equitable data protection.
The European Commission and privacy advocates have expressed concerns that this model leverages Meta’s dominant market position, placing its business interests above user rights. Given the critical role platforms like Meta play in communication, networking, and professional development, users may feel they have no real alternative but to comply, further exacerbating the imbalance of power between the user and the platform.
Moreover, linking user autonomy to financial incentives raises significant questions about the validity of consent under GDPR. The EDPB pointed out that consent cannot be considered “freely given” if tied to a detriment, such as high subscription fees. This forces users into a difficult choice: pay for privacy or give up control over their data. Such a trade-off turns personal information into a commodity, challenging the fairness of consent and reinforcing the idea that users must "pay" in one form or another for basic rights. This issue is further amplified by the Digital Markets Act (DMA), which imposes heightened responsibilities on gatekeepers like Meta to ensure their practices do not exploit their dominance or distort competition. Consequently, Meta’s approach could face intensified scrutiny, both for potential GDPR violations and anti-competitive behavior under the DMA.
To address these concerns, the EDPB has recommended exploring less intrusive monetization strategies, such as contextual advertising, which does not rely on behavioral profiling. Such alternatives could help platforms monetize their services without heavily infringing on user privacy or creating economic barriers. Additionally, Opinion 08/2024 emphasizes the importance of transparency and user control, urging platforms to design consent mechanisms that uphold the principles of GDPR while balancing commercial and user rights.
Legitimate interest: is it a viable path?
Faced with the challenges of obtaining valid consent in such contexts, some businesses may consider relying on legitimate interest as an alternative legal basis for data processing under Article 6(1)(f) of the GDPR. This rationale is often used to justify practices such as behavioural advertising or data analytics, which are integral to many revenue models and serve clear commercial purposes. However, the GDPR explicitly requires that legitimate interest must always be balanced against the rights and freedoms of data subjects. Where significant power imbalances exist, or if the service is essential to daily life, legitimate interest may not provide a sufficient legal basis, as it risks disproportionately impacting users’ fundamental rights.
It is insufficient for businesses to merely declare their data processing as being in their legitimate interest. They must demonstrate that their interests do not override the fundamental rights of users and that the data processing is necessary and proportionate to achieve their objectives. This places particular scrutiny on companies operating in sectors where their services are indispensable, such as social networks or digital platforms. Regulators, including the Court of Justice of the European Union (CJEU), have repeatedly stressed the need to protect users from practices that indirectly pressure them into giving up their privacy rights under the pretense of legitimate business interests.
Moreover, the EDPB’s Opinion 08/2024 underscores that legitimate interest is particularly unsuitable for practices like personalized advertising when users lack meaningful awareness or cannot easily object. Businesses must therefore assess whether reliance on legitimate interest genuinely respects user autonomy, particularly in cases involving essential services or platforms where users may feel compelled to participate. Otherwise, it may give the impression that businesses prioritize profits over the fundamental right to privacy, undermining the balance that the GDPR aims to uphold.
Transparency and granularity: a new standard of accountability
With both consent and legitimate interest presenting risks, controllers face growing pressure to ensure transparency in their dealings with users. The EDPB has recommended adopting clear, interactive consent mechanisms that allow users to manage their privacy preferences dynamically, promoting continuous compliance. For instance, granular consent enables users to selectively agree to data processing types, enhancing control over how their data is used. Platforms that rely on behavioural advertising should prioritize non-intrusive methods like contextual advertising, balancing monetization with user rights.
For example, a user should be able to decline consent for their data to be used in advertising and still retain access to the service’s basic functionality. This allows users to keep more control over how their data is used, reinforcing the GDPR’s principle of data minimization. If controllers bundle various types of data processing into one blanket consent, they risk violating the very foundation of the GDPR’s consent requirements, as such practices force users into a dilemma that echoes the concept of “Your (private) life or your money,” compelling them to either sacrifice their privacy for access to services or accept financial costs to maintain their rights.
Additionally, platforms should incorporate user-friendly interfaces that allow dynamic changes to privacy settings, ensuring continuous alignment with GDPR’s emphasis on user control and transparency.
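To make the idea of granular, dynamically adjustable consent more concrete, the sketch below (in TypeScript, purely illustrative) models per-purpose consent decisions that a user can change at any time, with non-essential processing defaulting to "no consent". The purpose labels and helpers (ProcessingPurpose, ConsentRecord, setConsent, hasConsent) are assumptions made for this example, not terms prescribed by the GDPR or the EDPB.

```typescript
// Illustrative sketch only: a granular, per-purpose consent record that can be
// updated at any time. Purpose names and field names are assumptions, not a
// schema mandated by the GDPR or the EDPB.

type ProcessingPurpose =
  | "strictly_necessary"       // needed for basic functionality; not consent-based
  | "analytics"
  | "personalised_advertising"
  | "contextual_advertising";

interface ConsentDecision {
  purpose: ProcessingPurpose;
  granted: boolean;
  decidedAt: string;           // ISO 8601 timestamp, kept as evidence of the choice
}

interface ConsentRecord {
  userId: string;
  decisions: ConsentDecision[];
}

// Record the user's choice for a single purpose without touching the others,
// i.e. no bundled, all-or-nothing consent.
function setConsent(
  record: ConsentRecord,
  purpose: ProcessingPurpose,
  granted: boolean
): ConsentRecord {
  const others = record.decisions.filter(d => d.purpose !== purpose);
  return {
    ...record,
    decisions: [...others, { purpose, granted, decidedAt: new Date().toISOString() }],
  };
}

// Check consent before any non-essential processing; undecided defaults to "no".
function hasConsent(record: ConsentRecord, purpose: ProcessingPurpose): boolean {
  if (purpose === "strictly_necessary") return true;
  return record.decisions.find(d => d.purpose === purpose)?.granted ?? false;
}

// Example: the user declines personalised advertising yet keeps using the service.
let record: ConsentRecord = { userId: "user-123", decisions: [] };
record = setConsent(record, "personalised_advertising", false);
console.log(hasConsent(record, "personalised_advertising")); // false
console.log(hasConsent(record, "strictly_necessary"));       // true
```

Keeping each purpose as a separate, timestamped decision also supports accountability: a controller can show what the user agreed to and when, and that withdrawing consent for advertising did not cut off access to basic functionality.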
This level of transparency and granularity puts an additional burden on businesses to rethink their privacy notices and consent mechanisms. It is not enough to provide users with dense, jargon-filled legal documents; the GDPR requires a clear and concise explanation of data processing purposes and of the consequences of withholding consent. This shift toward user empowerment is a fundamental part of GDPR compliance, particularly for large platforms engaging in behavioural advertising or data monetization.
Conclusion: a cautionary tale for data controllers
The EDPB’s Opinion 08/2024 serves as a clear reminder that consent under the GDPR must be genuine, fair, and accessible to all. While ‘Consent or Pay’ models may seem to offer choice, they risk undermining these principles if not carefully structured. By designing consent mechanisms that prioritize transparency, reduce economic disparities, and minimize coercion, controllers can uphold GDPR’s principles while building user trust. Moreover, aligning these practices with ethical data handling not only ensures compliance but also fosters long-term sustainability and competitiveness in an increasingly privacy-conscious marketplace, thus easing the dilemma evoked in the title of this piece.
“Your (private) life or your money” remains, indeed, a fitting picture to paint when discussing the risks and realities of ‘Consent or Pay’ models under GDPR.