Economist Mariana Mazzucato has recently been writing about the governance of data. I have some problems with her proposals.

Mazzucato's theses (primarily found in her books The Entrepreneurial State[1] and The Value of Everything[2]) have generated recent public interest. Her work inherits much from innovation studies, a compelling literature, particularly from the 1990s onwards, which has not received as much attention as it perhaps should have. In short, Mazzucato argues that a core assumption is broken: that those who receive an income from economic activity must have made a corresponding contribution to the creation of that value. Innovative firms, for example, take the credit and reap the economic rewards from high-risk state financing at the early stages of innovation, or from investment in infrastructure that is a prerequisite to a business model. If we rethink the way value creation is understood, Mazzucato argues, it is only legitimate that value-creators receive corresponding benefits from the value they create.

I don't have any fundamental problem with that thesis. I do, however, think that framing the governance of data and the governance of technology giants in this manner leads us to some concerning conclusions.

Linking the legitimacy of interventions to the provenance of value marginalises the most vulnerable

Mazzucato highlights inequality as one of her core concerns around the use of 'big data'. This topic has many angles, but needs no introduction here.[3] She writes:

The privatization of data to serve corporate profits rather than the common good produces a new form of inequality — the skewed access to the profits generated from big data.[2:1]

This is a concern shared by many critical scholars who have long studied the issues around data and surveillance. It is also a concern of legislators and of courts. Yet it is what Mazzucato suggests next that divides her view from other critics, and which makes it a problematic frame to think about this challenge.

Mazzucato does not see the legitimacy to govern data as stemming from fundamental rights to privacy or the autonomy of individuals or communities. Instead, she writes that the right to intervene in data processing operations should stem directly from the provenance of the investment and value-creation activities behind the data infrastructure and the business models that exploit it.

the data that one generates when using Google Maps or Citymapper – or any other platform that relies on taxpayer-funded technologies – should be used to improve public transportation and other services, rather than simply becoming private profits.[4]

Which taxpayers are these, who have some economic (or potentially moral) claim over the downstream use of technologies? Not all states can afford high-risk investments in digital infrastructure, or in other technologies. And indeed, not all states have spent money on them in the past. Mazzucato highlights the public sector development of the Internet, which emerged, like many of the technologies she highlights in The Entrepreneurial State, through US military investment.

Given the immense risks the taxpayer takes when the government invests in visionary new areas like the Internet, couldn't we construct ways for rewards from innovation to be just as social as the risks taken?

A world where value-creation is rewarded more directly with the power to determine where the fruits of that value are used might decrease the relative power of technology giants, or telecoms firms, but would do so at the expense of a new - or old - kind of global dominance by the usual Western powers. Some would call it neo-colonialism.

New powers benefit from this logic as well as incumbents. Sweeping investments under China's Belt and Road Initiative treat a range of countries as investment locations in China's effort to establish an 'Information Silk Road'.[5] The idea that investors should be rewarded with an ongoing stake in the use of these systems would make these already heavily geopolitical efforts even more valuable.

Similar dynamics have already been seen in the field. Linnet Taylor and Dennis Broeders, based on 60 interviews and 2 years of research, study the ways in which data relating to individuals in the Global South, held by a range of companies such as telecoms firms, are regularly 'generated, collected and processed under the auspices of private-sector corporations and are shared, often on a pro-bono basis, at the level of international academic research institutions or development actors such as the UN'.[6] They call attention to the power asymmetries, entrenched power relations and new kinds of geography emerging through this corporate-led 'Data for Development'. Telecoms companies provide data to international organisations and consultancies, bypassing and disempowering national statistical agencies in favour of working with UN bodies.

Commodifying personal data is not the answer

There is a counterpoint to this: that perhaps if we consider the individuals to whom the personal data relate as providers of 'value' in this system, we can arrive at a more just outcome. The argument broadly runs that once companies have to internalise the societal costs associated with using personal data, they will no longer choose to process data in the often invasive and societally damaging manners in which they currently can. This argument is very old. Mazzucato presents it as follows:

The infrastructure that companies like Amazon rely on is not only publicly financed (as discussed, the Internet was paid for by tax dollars), but it feeds off network effects which are collectively produced.[2:2]

A main objection to assigning a value to data is: 'value to whom?' Following Helen Nissenbaum, privacy makes little sense outside of context: if you were trying to place a 'value' on personal data and data privacy, it would be contingent on who is gathering the information, who is analysing it, to whom it is disclosed, the nature of the information, the relationships between the parties processing the data, and the broader social context in which it is processed.[7] The contextual nature of valuing matters of identity and personality makes valuation an unworkably and unmeasurably contingent frame on which to build meaningful societal safeguards.

Added to this, there is the regularly trodden question of inalienability and exploitability. Individuals in financial need, marginalised by society, may not want to sell their privacy, but they may have to in order to keep themselves and their loved ones above water. There is a reason why it is difficult to contract away fundamental rights, and authors proposing this would do well to examine the long literature in this area carefully.[8]

Perverse privacy incentives and public-private surveillance partnerships

Mazzucato's argument creates an array of perverse incentives in relation to fundamental rights. Technology giants currently collect much, much more data than they require to deliver a service. Consider Facebook: protocols to message each other, join groups and share media can function well without the hidden third-party trackers in place on 43% of mobile apps[9] or nearly 30% of all webpage loads.[10] It is easy to drink the Kool-Aid and imagine that 'more data' somehow results in better societal outcomes, but there is scant evidence for this being true in a general sense, and persuasive arguments against it in the form of issues such as bias and the inherent limitations of secondary re-use of data for rigorous analysis.[11] Privacy-enhancing technologists have been showing, using techniques such as differential privacy, secure multiparty computation, homomorphic encryption and zero-knowledge proofs, that business models often do not require the centralisation of large datasets at all to do what they have already been doing.[12]
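To make one of these techniques concrete, here is a minimal sketch of differential privacy's Laplace mechanism for a counting query. Everything in it (the dataset, the query, the epsilon values) is a hypothetical illustration, not anyone's production system: the point is simply that a service can answer aggregate questions with mathematically bounded leakage about any individual, rather than releasing or hoarding the raw records.

```python
import math
import random

def laplace_noise(scale):
    # Sample from Laplace(0, scale) via the inverse CDF:
    # X = -scale * sgn(u) * ln(1 - 2|u|), u uniform on (-0.5, 0.5)
    u = random.random() - 0.5
    return -scale * math.copysign(1, u) * math.log(1 - 2 * abs(u))

def dp_count(records, predicate, epsilon=1.0):
    """Release a count satisfying epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one
    person changes the count by at most 1), so Laplace noise with
    scale 1/epsilon is sufficient.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical query: how many users opened the app after 22:00?
hours = [23, 9, 22, 7, 23, 8, 23]
noisy = dp_count(hours, lambda h: h >= 22, epsilon=0.5)
```

A smaller epsilon means more noise and stronger privacy for each individual in `hours`; the analyst still learns roughly how many late-night users there are, but nothing reliable about any one of them.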

Providing greater control to the builders of infrastructure and those generating ideas upstream creates perverse incentives for those actors to promote technologies that maximise the informational value they can extract from downstream citizens. There is no incentive to produce technologies that do more with less: if the informational pie is to be shared, why not make it as large as possible? The way in which social media companies have already been co-opted by national security is no longer a secret. The correct societal response would be to change the nature of their activities and the form of their data processing. Mazzucato's logic does not help us greatly there.

Legitimate governance of optimisation systems does not need an economic excuse

Thankfully, both privacy and data protection are fundamental rights in Europe. European law requires that only the minimal data required for a given purpose be processed. An individual cannot sell her data inalienably: she retains rights over it that she cannot be coerced out of by financial need. Creating a market in which an individual's data is valued creates a market in which that individual can 'sell' their identity and their autonomy: and those who do so will be disproportionately marginalised and poor.

I am not suggesting Mazzucato is arguing we do away with existing frameworks and replace them with her own ideas. However, her proposals come at a time when, for example, a potential US Presidential candidate is running on a platform of 'Data as a Property Right',[13] and narratively compelling 'solutions' are not often presented with crucial nuance, context and counterpoints. We certainly need to govern data use, and societally we need to be more in control of data infrastructures. But let's keep our eyes on human rights, and the rights of communities, not on new logics of economic desert.

  1. Mariana Mazzucato, The Entrepreneurial State (Anthem Press 2013) ↩︎

  2. Mariana Mazzucato, The Value of Everything (Penguin 2018) ↩︎ ↩︎ ↩︎

  3. It can mean a few things, however. It might mean inequality in the sense of equality, or non-discrimination law, see Solon Barocas and Andrew D Selbst, ‘Big Data’s Disparate Impact’ (2016) 104 California Law Review 671. It might mean inequality in the sense of competition law, see e.g. Jacques Crémer and others, Competition Policy for the Digital Era (European Commission, DG COMP 2019). It might also mean the relationship between technological infrastructures and other forms of systemic oppression, see generally Seeta Peña Gangadharan and Jędrzej Niklas, ‘Decentering Technology in Discourse on Discrimination’ (2019) 22 Information, Communication & Society 882; Oscar H Gandy, ‘Engaging Rational Discrimination: Exploring Reasons for Placing Regulatory Constraints on Decision Support Systems’ (2010) 12 Ethics and Information Technology 29. ↩︎

  4. Mariana Mazzucato, 'Preventing Digital Feudalism' (Oct 2 2019) ↩︎

  5. Sen Gong and others, The Impact of the Belt and Road Initiative Investment in Digital Connectivity and Information and Communication Technologies on Achieving the SDGs (Institute of Development Studies 2019) ↩︎

  6. Linnet Taylor and Dennis Broeders, ‘In the Name of Development: Power, Profit and the Datafication of the Global South’ (2015) 64 Geoforum 229. ↩︎

  7. Helen Nissenbaum, Privacy in Context: Technology, Policy, and the Integrity of Social Life (Stanford University Press 2010). ↩︎

  8. See generally Corien Prins, ‘Property and Privacy: European Perspectives and the Commodification of Our Identity’ in L Guibault and PB Hugenholtz (eds), The Future of the Public Domain, Identifying the Commons in Information Law (Kluwer Law International 2006). ↩︎

  9. Reuben Binns and others, ‘Third Party Tracking in the Mobile Ecosystem’ in Proceedings of the 10th ACM Conference on Web Science (WebSci ’18, New York, NY, USA, ACM 2018). ↩︎

  10. Arjaldo Karaj and others, ‘WhoTracks.Me: Shedding Light on the Opaque World of Online Tracking’ [2018] arXiv:180408959. ↩︎

  11. See e.g. Mireille Hildebrandt, ‘Privacy as Protection of the Incomputable Self: From Agnostic to Agonistic Machine Learning’ (2019) 20 Theoretical Inquiries in Law 83. ↩︎

  12. The Royal Society, Protecting Privacy in Practice: The Current Use, Development and Limits of Privacy Enhancing Technologies in Data Analysis (The Royal Society 2019). ↩︎

  13. ↩︎