To me there seems to be something fundamentally wrong with plans I come across where companies would pay people for access to their personal data. This is not a well-articulated thing; it just feels like the entire framing of the issue is off, so the next paragraphs are a first attempt to jot down a few notions.
To me it looks very much like companies projecting onto people what companies themselves would do: treat data as an asset you own outright and then charge for access, so that those companies can keep doing what they were already doing with data about you. It doesn’t strike me as taking the person behind that data as the starting point, nor their interests. The starting point of any line of reasoning needs to be the person the data is about, not the entity intending to use the data.
Those plans make data release, or consent for using it, fully transactional. There are several things intuitively wrong with this.
One thing it does is put everything in the context of single transactions between individuals like you and me, and the company wanting to use data about you. That seems to be an active attempt to distract from the notion that there’s power in numbers. Reducing it to me dealing with a company, and you dealing with them separately, makes it less likely groups of people will act in concert. It also distracts from the huge power difference between me selling some data attributes to some corp on one side, and that corp amassing those attributes over wide swaths of the population on the other.
Another thing is it implies that the value is in the data you likely think of as yours: your date of birth, residence, some conscious preferences, the type of car you drive, health care issues, finances etc. But a lot of the value is in data you don’t actually have about yourself but create all the time: your behaviour over time, clicks on a site, reading speed and pauses in an e-book, minutes watched in a movie, engagement with online videos, the cell towers your phone pinged, your car computer’s logs about your driving style, likes etc. It’s not that the data you think of as your own is without value, but it feels like the magician wants you to focus on the flower in his left hand, so you don’t notice what he does with his right.
On top of that it also means that whatever they offer to pay you will be too cheap: your data is never worth much in itself, only in aggregate. Offering to pay on an individual transaction basis is an escape for companies, not an emancipation of citizens.
One more element is the suggestion that once such a transaction has taken place everything is ok, all rights have been transferred (even if limited to a specific context and use case) and all obligations have been met. It strikes me as extremely reductionist. When it comes to copyright, authors can transfer some rights, but usually not the moral rights to their work. I feel something similar is at play here: moral rights attached to data that describes a person, which can’t be transferred when the data is transacted. Is it ok to manipulate you into a specific bubble and influence how you vote, if they paid you first for the type of stuff they needed to be able to do that to you? The EU GDPR I think takes that approach too, taking moral rights into account. It’s not about ownership of data per se, but the rights I have if your data describes me, regardless of whether it was collected with consent.
The whole ownership notion is difficult to me in itself. As stated above, a lot of data about me is not necessarily data I am aware of creating or ‘having’, and likely don’t see a need to collect about myself. Unless paying me is meant as an incentive to start collecting stuff about me for the sole purpose of selling it to a company, which then doesn’t need my consent, nor has to make the effort to collect it about me itself. There are other instances where me being the only one able to determine whether to share some data or withhold it means risks or negative impact for others. It’s why cadastral records and company beneficial ownership records are public: so you can verify that the house or company I’m trying to sell you is mine to sell, who else has a stake or claim on the same asset, and to what amount. Similar cases might be made for new and closely guarded data, such as DNA profiles. Is it your sole individual right to keep those data closed, or does society have a reasonable claim to them, for instance in the search for a cure for cancer? All that to say that seeing data as a mere commodity is a very limited take, and that ownership of data isn’t a clear-cut thing. Because of its content, as well as its provenance. And because it is digital data, meaning it has non-rivalrous and non-excludable characteristics, making it akin to a public good. There is definitely a communal and network side to holding, sharing and processing data, currently conveniently ignored in discussions about data ownership.
In short, talking about paying for personal data and data lockers under my control seems to be a framing that presents data issues as straightforward but doesn’t solve any of data’s ethical aspects; it just pretends they’re taken care of, so that things may continue as usual. And that’s even before looking into the potential unintended consequences of payments.
Replied to On Selling Access to Your Data and Ownership of Data by Ton Zijlstra (Interdependent Thoughts)
This is a brief test. I’ve been reading this in Aaron Parecki’s Monocle social reader, and am now commenting inline from within the reader.
Hi Ton,
Yes, this is an interesting area for thought and research. I had to think of an HBR article I read many moons ago, when I wrote an update to our privacy/Open Web/etc. dinner: https://www.onedaycompany.nl/digital-marketing-manager/het-veranderende-internet-update/.
After a lot of searching I found it – it’s from 1997 and is called (brace yourself) “The Coming Battle for Customer Information” (https://hbr.org/1997/01/the-coming-battle-for-customer-information). A lot in the article is still very relevant today.
What always appealed to me is the notion of an entity (company/cooperation/etc) that would act as a go-between for you and someone else or some company. You could compare the exchange of data a bit to how IRMA is supposed to work: I give you just the information that is enough to grant me access. Or even better: my go-between assures a company that it negotiates for a group of people with certain characteristics, but won’t reveal who exactly is in that group.
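The data-minimisation idea behind IRMA-style attribute credentials can be sketched very roughly. This is only a toy illustration, not how IRMA actually works (the real system uses cryptographic Idemix credentials); the attribute names and the age predicate are my own illustrative choices. The point it shows: the verifier learns a single yes/no answer, never the underlying attribute.

```python
# Toy sketch of selective disclosure: answer a predicate ("over 18?")
# without revealing the attribute (the birth date) it is derived from.
# Real attribute-based credential systems like IRMA do this with
# cryptography; this is only the data-minimisation idea.
from datetime import date

# Hypothetical full attribute set held by the person, never shared as-is.
my_attributes = {
    "name": "A. Person",
    "date_of_birth": date(1990, 5, 17),
    "city": "Amsterdam",
}

def prove_over_18(attributes: dict, today: date) -> bool:
    """Disclose only a yes/no answer, not the birth date itself."""
    dob = attributes["date_of_birth"]
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return age >= 18

# The verifier learns one bit ("over 18: yes"), nothing else.
print(prove_over_18(my_attributes, date(2019, 9, 1)))  # True
```

The same shape extends to the go-between idea: the intermediary could run such predicates over a whole group and only report aggregate answers outward.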
Plus, I hope that the PublicSpaces initiative (https://publicspaces.net) will help to speed up the discussion about personal data in a public area. I dream of a time when my virtual behaviour is as visible as my physical one: I don’t object to being observed, but that’s about it. If someone wants more info about me than my outside appearance, then they need to have an exchange with me. Comparable to seeing someone in the street.
Wouldn’t it be wonderful if our standard status would be that we are all in public and therefore: not followed?
That chimes very much with how I approach privacy generally: it is an intrinsic part of being in public, of the commons. Privacy is a gift from others to me, while I interact in the commons. In a physical public space there usually is an immediate repercussion, a social cost, for not giving someone their privacy. We have a whole spectrum of words for it, from impoliteness and rudeness all the way to harassment, abuse, and assault. If I do not give you your privacy it will at the very least tarnish my reputation with others who witness it. The online equivalent is slightly different. There the cost of not respecting someone’s privacy can be zero, due to the big social distance between the people involved. The lack of immediate social feedback introduces a strong asymmetry: it is much less costly to be a troll than it is to defend against being trolled.
It is telling in that regard that the EU GDPR is not a privacy law but a data protection law. It says under which conditions personal data can be used in which ways. It deliberately goes nowhere near actual privacy considerations, as the social dynamics of that are likely impossible to formulate in legalese.
Last week I attended Techfestival in Copenhagen, where I participated in a day-long Public Data Summit. This post contains thoughts and notes based on some of what was discussed there.
Group work at the Public Data Summit
Martin von Haller Groenbaek (we go back in open data a long time) provided an interesting lightning talk at the start of the Public Data Summit. He posited that in order to realise the full potential of open (government) data, we probably need to be more relaxed in sharing personal data as well.
There is a case to be made, I agree. In the energy transition, for instance, your (real-time) personal electricity use is likely key information. The aggregated yearly usage of you and at least 10 neighbours that e.g. the Dutch electricity networks make available is far from useless, but it lacks the granularity, the direct connection to real people’s daily lives, to be truly valuable for anything of practical use.
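The kind of threshold aggregation described above can be sketched in a few lines. This is a minimal, illustrative sketch: the threshold of 10, the function name, and the usage figures are my assumptions, not the Dutch network operators' actual rules or data. It shows the trade-off: individual households stay hidden, but the granularity useful for the energy transition is lost.

```python
# Minimal sketch of threshold aggregation: yearly usage (kWh) is only
# released as a sum over groups of at least K households, the pattern
# described above for Dutch electricity network data. K and the data
# here are illustrative, not the operators' actual rules.
from typing import Optional

K = 10  # minimum group size before an aggregate may be published

def aggregate_usage(readings: dict) -> Optional[float]:
    """Return total yearly usage for the group, or None if the group
    is too small to publish without exposing individual households."""
    if len(readings) < K:
        return None  # suppress: too few households to hide among
    return sum(readings.values())

street = {f"household_{i}": 2500.0 + i * 100 for i in range(12)}
print(aggregate_usage(street))          # published: 12 >= K households
print(aggregate_usage({"h1": 3100.0}))  # None: below threshold, suppressed
```

A single yearly sum per ten-plus households protects individuals, but is exactly what makes the data too coarse for real-time, per-household applications.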
I agree with the notion that more person-related data needs to come into play. Martin talked about it in terms of balancing priorities: if you want to fix climate change, or another key societal issue, personal data protection can’t be absolute.
Now this sounds a bit like “we need your personal data to fight terrorism”, which then gets translated into “give me your fingerprints or your safety and security is compromised”. Yet that is both an edge case and an example of the type of discussion needed to find the balancing point, to avoid false dilemmas, or better yet to prevent reduction to ineffective simplicity (as with terrorism, where all sense of proportionality is abandoned). The balancing point is the sweet spot where the right level of complexity is addressed. In the case of terrorism the personal data sharing discussion is framed as “you’re either with us, or with the terrorists”, to quote Bush jr.: a framing in absolutes, inviting a cascade of scope creep.
To me this is a valuable discussion to be had, to determine when and where sharing your personal data is a useful contribution to the common good, or even should be part of a public good. Currently that ‘for the common good’ part is mostly not in play. We’re leaking personal data all over the tech platforms we use, without much awareness of its extent or how it is being used. We do know these data are not being used for the common good, as that is in no-one’s business model. This public good / common good thinking was central to our group work during the rest of the Public Data Summit too.
Martin mentioned the GDPR as a good thing, certainly for his business as a lawyer, but also as a problematic one. Specifically he criticised the notion of owning personal data, and being able to trade it as a commodity based on that ownership. I agree, for multiple reasons. One is that a huge amount of our personal data is not directly created or held by us, as it is data about behavioural patterns: where my phone has been, where I used my debit card, the things I click, the time I spent on pages, the thumbprint of my specific browser and plugins setup, etc. The footsteps we leave on a trail in the forest aren’t personal data, but our digital footsteps are: because those tracks persist, they can be followed back to their source far more easily than in the woods, and can be compared to the other tracks we leave behind.
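That linkability of digital footsteps can be made concrete with a toy sketch. This is only an illustration of the idea behind browser fingerprinting, with attribute names and values I made up; real fingerprinting uses many more signals. The point: a handful of ordinary, individually innocuous browser attributes combine into a stable pseudo-identifier that ties separate visits back to the same person, no cookie or login needed.

```python
# Toy sketch of why digital footsteps are so linkable: a few ordinary
# browser attributes, hashed together, form a stable pseudo-identifier
# that re-identifies the same visitor across sites. Attribute names and
# values are illustrative only.
import hashlib

def fingerprint(attrs: dict) -> str:
    """Hash a canonical ordering of the attributes into a short identifier."""
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

visit_on_site_a = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/68.0",
    "timezone": "Europe/Amsterdam",
    "screen": "1920x1080",
    "fonts": "DejaVu,Liberation,Noto",
}
visit_on_site_b = dict(visit_on_site_a)  # same browser, visiting a different site

# The same setup yields the same identifier: the two visits are linked
# without any name, cookie, or login involved.
print(fingerprint(visit_on_site_a) == fingerprint(visit_on_site_b))  # True
```

Unlike footsteps in the forest, these tracks never wash away, which is what makes comparing them across time and sites so effective.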
Currently those footsteps in the digital woods are contractualised away into the possession of the private owners of the woods we walk in, i.e. the tech platforms. But there’s a strong communal aspect to your and my digital footsteps as personal data. We need to determine how we can use that collectively, and how to govern that use. Talking about the ownership of data, especially the data we create by being out in the (semi-)public sphere (e.g. tech platforms), and the ability to trade it (as Solid suggests), has two effects. First, it bakes in the acceptance that me allowing FB to work with my data is a contract between equal parties (the GDPR rightly tries to address this asymmetry). Aza Raskin in his keynote mentioned this too, saying tech platforms should be seen and regulated more as fiduciaries, to acknowledge the power asymmetry. Second, it takes the communal part of what we might do with data completely out of the equation. I can easily imagine when and where I’d be ok with my neighbours, my local government, a local NGO, or specific topical/sectoral communities etc. having access to data about me, where that same use by FB et al. would not be ok at all under any circumstance.
In the intros to the Public Data Summit, however, civil society was very much absent: there was talk about government and its data, and how it needed the private sector to do something valuable with it. Whereas to me open (e-)government, and opening data, is very much about allowing the emergence and co-creation of novel public services by government/governance working together with citizens. Things we may not now regard as part of public institutions, structures or the role of government, but that in our digitised world very well could, or even should, be.
Elizabeth Renieris and Dazza Greenwood give different words to my previously expressed concerns about the narrative frame of personal ownership of data and selling it as a tool to counteract the data krakens like Facebook. The key difference is in tying it to different regulatory frameworks, and when each of those comes into play. Property law versus human rights law.
I feel the human rights angle will also serve us better in coming to terms with the geopolitical character of data (and it is one the EU is baking into its geopolitical proposition concerning data). In the final paragraph they point to the ‘basic social compact’ that needs explicit support. That I connect to my notion of how so much personal data is also more like communal data, not immediately created or left by me as an individual, but the traces I leave while acting in public. At Techfestival Aza Raskin pointed to fiduciary roles for those holding data on such publicly left personal data traces, and Martin von Haller mentioned how those traces can also serve communal purposes and create communal value, placing them in yet another legal setting (that of weighing privacy against the public interest).
Read Do we really want to “sell” ourselves? The risks of a property law paradigm for personal data ownership. (Medium)