To me there seems to be something fundamentally wrong with the plans I come across in which companies would pay people for access to their personal data. This is not a well-articulated objection yet; it just feels like the entire framing of the issue is off, so the next paragraphs are a first attempt to jot down a few notions.

To me it looks very much like companies projecting onto people what companies themselves would do: treating data as an asset you own outright and then charging for access to it, so that those companies can keep doing what they were already doing with data about you. It doesn’t strike me as taking the person behind that data, or their interests, as the starting point. The starting point of any line of reasoning needs to be the person the data is about, not the entity intending to use the data.

Those plans make data release, or consent for its use, fully transactional. Several things are intuitively wrong with this.

One thing it does is put everything in the context of single transactions between individuals, like you and me, and the company wanting to use data about you. That seems an active attempt to distract from the notion that there’s power in numbers. Reducing it to me dealing with a company, and you dealing with them separately, makes it less likely that groups of people will act in concert. It also distracts from the huge power difference between me selling some data attributes to some corp on one side, and that corp amassing those attributes over wide swaths of the population on the other.

Another thing is that it implies the value is in the data you likely think of as yours: your date of birth, residence, some conscious preferences, the type of car you drive, health care issues, finances etc. But a lot of value is in data you don’t actually have about yourself but create all the time: your behaviour over time, clicks on a site, reading speed and pauses in an e-book, minutes watched in a movie, engagement with online videos, the cell towers your phone pinged, the logs your car’s computer keeps about your driving style, likes etc. It’s not that the data you think of as your own is without value, but it feels like the magician wants you to focus on the flower in his left hand, so you don’t notice what he does with his right hand.
On top of that it also means that whatever they offer to pay you will be too low: your data is never worth much by itself, only in aggregate. Offering to pay on an individual transaction basis is an escape for companies, not an emancipation of citizens.

One more element is the suggestion that once such a transaction has taken place everything is ok, all rights have been transferred (even if limited to a specific context and use case) and all obligations have been met. It strikes me as extremely reductionist. When it comes to copyright, authors can transfer some rights, but usually not the moral rights to their work. I feel something similar is at play here: moral rights attached to data that describes a person, which can’t be transferred when the data is transacted. Is it ok to manipulate you into a specific bubble and influence how you vote, if they paid you first for the type of stuff they needed to be able to do that to you? The EU GDPR, I think, takes that approach too, taking moral rights into account. It’s not about ownership of data per se, but about the rights I have when data held by you describes me, regardless of whether it was collected with consent.

The whole ownership notion is difficult to me in itself. As stated above, a lot of data about me is not necessarily data I am aware of creating or ‘having’, and likely don’t see a need to collect about myself. Unless paying me is meant as an incentive to start collecting stuff about me for the sole purpose of selling it to a company, which then doesn’t need my consent nor has to make the effort to collect it about me itself.

There are other instances where me being the only one able to decide to share or withhold some data means risks or negative impacts for others. It’s why cadastral records and company beneficial ownership records are public: so you can verify that the house or company I’m trying to sell you is mine to sell, who else has a stake or claim on the same asset, and to what amount. Similar cases might be made for new and closely guarded data, such as DNA profiles. Is it your sole individual right to keep those data closed, or does society have a reasonable claim to them, for instance in the search for the cure for cancer?

All that to say that seeing data as a mere commodity is a very limited take, and that ownership of data isn’t a clear-cut thing. Because of its content, as well as its provenance. And because it is digital data, meaning it has non-rivalrous and non-excludable characteristics, making it akin to a public good. There is definitely a communal and network side to holding, sharing and processing data, currently conveniently ignored in discussions about data ownership.

In short, talking about paying for personal data and about data lockers under my control seems to be a framing that presents data issues as straightforward, but it doesn’t solve any of data’s ethical aspects; it just pretends they’re taken care of, so that things may continue as usual. And that’s even before looking into the potential unintended consequences of payments.

7 reactions on “On Selling Access to Your Data and Ownership of Data”

  1. Replied to On Selling Access to Your Data and Ownership of Data by Ton Zijlstra (Interdependent Thoughts)
    This is a brief test. I’ve been reading this in Aaron Parecki’s Monocle social reader, and am now commenting inline from within the reader.

  2. Hi Ton,

    Yes, this is an interesting area for thought and research. It made me think of an HBR article I read many moons ago, when I wrote an update to our privacy/Open Web/etc. dinner:

    After a lot of searching, I found it – it’s from 1997 and is called (brace yourself) “The Coming Battle for Customer Information”. A lot in the article is still very relevant today.

    What always appealed to me is the notion of an entity (a company, a cooperative, etc.) that would act as a go-between for you and someone else or some company. You could compare the exchange of data a bit to how IRMA is meant to work: I give you just the information that is enough to grant me access. Or even better: my go-between assures a company that it negotiates for a group of people with certain characteristics, but won’t reveal who exactly is in that group.
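    The go-between idea above can be sketched in a few lines of code. This is only an illustrative toy, not the actual IRMA protocol (which uses attribute-based credentials and cryptographic proofs); the class and attribute names are invented for the example. The point it shows is selective disclosure: the broker answers a verifier’s yes/no question, while the underlying attributes never leave it.

    ```python
    # Toy sketch of a data "go-between" (NOT the real IRMA protocol):
    # the broker holds a person's attributes privately and discloses
    # only the outcome of a predicate, not the attributes themselves.

    from dataclasses import dataclass


    @dataclass
    class Attributes:
        birth_year: int
        city: str


    class GoBetween:
        def __init__(self, attributes: Attributes):
            # Held privately; never returned to a verifier.
            self._attributes = attributes

        def over_18(self, current_year: int) -> bool:
            # Disclose a single boolean, not the birth year itself.
            return current_year - self._attributes.birth_year >= 18


    broker = GoBetween(Attributes(birth_year=1990, city="Enschede"))
    print(broker.over_18(2024))  # prints True, without revealing the birth date
    ```

    A real system would replace the boolean method with a cryptographic proof, so the verifier need not even trust the broker; the sketch only captures the interaction pattern.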

    Plus, I hope that the Public Space initiative will help to speed up the discussion about personal data in a public area. I dream of a time when my virtual behaviour is as visible as my physical one: I don’t object to being observed, but that’s about it: if someone wants more info about me beyond my outward appearance, then they need to have an exchange with me. Comparable, in other words, with seeing someone in the street.

    Wouldn’t it be wonderful if our default status were that we are all in public and therefore not followed?

  • That chimes very much with how I approach privacy generally: it is an intrinsic part of being in public, of the commons. Privacy is a gift from others to me, while I interact in the commons. In a physical public space there usually is immediate retribution, a social cost, for not giving someone their privacy. We have a whole spectrum of words for it, from impoliteness and rudeness all the way to harassment, abuse, and assault. If I do not give you your privacy it will at the very least tarnish my reputation with others who witness it. The online equivalent is slightly different. There the cost of not respecting someone’s privacy can be zero, due to the big social distance between the people involved. The lack of immediate social feedback introduces a strong asymmetry: it is much less costly to be a troll than it is to defend against being trolled.

      It is telling in that regard that the EU GDPR is not a privacy law but a data protection law. It says under which conditions personal data can be used in which ways. It deliberately goes nowhere near actual privacy considerations, as the social dynamics of that are likely impossible to formulate in legalese.
