‘If the product is free, you are the product’
A vision of humanity in the Digital Services and Digital Markets Acts
Is the EU’s regulatory regime, including the Commission’s recent proposals in the form of the DSA and the DMA, fit to meet the demands of our era of Big Tech and ‘big data’? I think the challenges posed by new technological developments necessitate a rethinking of the foundations of the regulatory system.
The vision(s) of humanity in the EU’s constitutional framework
What we find at the foundations of all our forms of expression and social ordering, including our legal systems, is a vision of humanity and a worldview: the way we think about what drives people and what they are capable of determines the way we envision rights and allocate responsibility. This is what hermeneutic philosophers, such as Paul Ricoeur or James Boyd White, teach us.
The dominant view in EU law – in what we could call the constitutional framework of the EU – is, rather unsurprisingly, the Enlightenment idea of the human as a rational, self-interested individual. The EU’s origin as a project of economic integration means that this rational individual has sometimes been referred to as the ‘homo economicus’, or even as a ‘self-entrepreneur’, responsible for running their life as if they were running a business. This assumes a rather high level of personal capabilities, as well as a high level of control and responsibility.
Have we, as users of these digital services, transformed from consumers (who ‘pay’ with their data), into commodities?
In addition, the rational individual in the role of the ‘average consumer’ has also been of central importance. This benchmark assumes that individuals are reasonably well-informed and careful, and that they generally act in their own self-interest.
A tension, however, is also present in the EU’s vision of humanity, because the EU’s constitutional framework also envisions us as citizens of the Union, with accompanying rights and (to a lesser extent) duties. Moreover, the EU increasingly focuses on non-market interests and fundamental rights. Fundamental rights do not necessarily have the rational, successful individual as their benchmark, as they are generally based on a vision of humanity that is more vulnerable and more social.
Regulating Big Tech on the Digital Single Market: consumers, citizens, or …?
What kind of vision of humanity is at play in the DSA and DMA? Predominantly, they focus on the consumer (or ‘user’ or ‘recipient’). The centrality of the consumer is perhaps unsurprising, since the DSA is complementary to, and a kind of ‘follow-up’ to, the e-Commerce Directive, which shares this focus.
However, as noted by Viktorija Morozovaite, the hypernudging techniques used by a large number of Big Tech companies are ‘designed to surpass users’ rationality’ and they can be employed to ‘subvert autonomous choice and potentially manipulate consumers into unwarranted outcomes’. Her research challenges the consumer paradigm that underlies these rules.
The DSA and the DMA promise to ‘rebalance’ the responsibilities of ‘users, platforms, and public authorities’ ‘according to European values, placing citizens at the centre.’ They promise citizens more choice at lower prices, less exposure to illegal content, and an enhanced protection of fundamental rights. As noted by several authors of blogs in this series (Gerbrandy, Lalikova, and De Vries), the power of very large digital platforms means that the boundaries between public and private parties are increasingly blurred. The notion of ‘consumer’ therefore needs to be reconciled with that of ‘citizen’.
Balance in the vision of humanity?
However, I see no clear sign that the legal language and the overarching structure of the DSA and the DMA in combination with the rest of the EU’s legal regime governing ‘Big Tech’ adopt a more comprehensive vision of humanity.
The data protection regime currently offered by the GDPR also seems to view the data subject predominantly as a consumer of a service. However, the GDPR acknowledges the vulnerability of the data subject and is geared towards increasing his or her control. The lack of control in the asymmetric relationship between data subjects and ‘Big Tech’ is, however, also evidenced by the lack of responsibility that the users of digital technologies seem to have. In my forthcoming publication, I show that in the legislative framework, as well as in the CJEU’s interpretation, the data subject is viewed as entirely passive and vulnerable, with an uncommonly high level of protection of their fundamental rights. This stands in sharp contrast to the level of responsibility accorded to the individual as self-entrepreneur in free movement and EU citizenship case law.
Am I a consumer paying with data or a commodity?
An ongoing concern is the notion of consumers paying with data, whether they are aware of that or not. Accepting such a model treats personal data as a mere economic asset and does not sufficiently take the data subject’s fundamental rights into account. This concern is not fully addressed in the DSA and DMA proposals.
But I wonder: is the conversation here still about a responsible versus a vulnerable consumer, or have we, as users of these digital services, transformed from consumers (who ‘pay’ with their data) into commodities? And if that is so, what does that mean for our use of these services as citizens, in our democratic processes? How these matters should be resolved remains to be seen. However, it is necessary to achieve coherence of the legal regime at a deeper level than merely stating that one piece of legislation (the DSA and/or the DMA) is ‘without prejudice’ to another (for instance, the GDPR).
Pauline Phoa, Assistant professor of European law and post-doctoral researcher in the ERC-funded 'Modern Bigness’ project, Utrecht University.