Two bookmarks concerning GDPR enforcement. The GDPR is an EU law with global reach, and as such it ‘exports’ the current European approach to data protection as a key citizen right. National Data Protection Agencies (DPAs) are tasked with enforcing the GDPR against companies not complying with its rules. The potential fines for non-compliance are very steep, but much depends on DPAs being active. Various DPAs at this point, two years after GDPR enforcement commenced, seem understaffed, indecisive, or dragging their feet.

Now the DPAs are being sued by citizens to force them to do their job properly. The Luxembourg DPA is being sued over its surprising ruling that the GDPR is basically unenforceable outside the EU (which isn’t true, as the EU could block services into the EU, seize assets, etc.). And there’s a case before the CJEU, based on the Irish DPA being extremely slow in starting investigations of the Big Tech companies registered within its jurisdiction, that would allow other national DPAs to start their own cases against these companies. (Normally the DPA of the country where a company is registered is responsible, but in certain cases the DPAs of the countries of residence of the complaining citizens can get involved too.)

The DPAs are the main factor in whether the GDPR is an actual force for data protection or an empty gesture. And it seems various EU citizens are running out of patience waiting for DPAs to take up their defined role. Rightly so.

Read Do we really want to “sell” ourselves? The risks of a property law paradigm for personal data ownership.
…viewing this data as property that is capable of being bought, sold, and owned by others is in large part how we ended up with a broken internet funded by advertising — or the “ad tech model” of the Internet. A property law-based, ownership model of our data risks extending this broken ad tech model of the Internet to all other facets of our digital identity and digital lives expressed through data. While new technology solutions are emerging to address the use of our data online, the threat is not solved with technology alone. Rather, it is time for our attitudes and legal frameworks to catch up. The basic social compact should be explicitly supported and reflected by our business models, legal frameworks and technology architectures, not silently eroded and replaced by them.

Elizabeth Renieris and Dazza Greenwood give different words to my previously expressed concerns about the narrative frame of personal ownership of data, and of selling it as a tool to counteract the data krakens like Facebook. The key difference lies in tying the issue to different regulatory frameworks, property law versus human rights law, and in when each of those comes into play.

I feel the human rights angle will also serve us better in coming to terms with the geopolitical character of data (a character the EU is baking into its geopolitical proposition concerning data). In the final paragraph they point to the ‘basic social compact’ that needs explicit support. That I connect to my notion that so much personal data is really more like communal data: not immediately created or left by me as an individual, but the traces I leave while acting in public. At Techfestival, Aza Raskin pointed to fiduciary roles for those holding data on such publicly left personal data traces, and Martin von Haller mentioned how those traces can also serve communal purposes and create communal value, placing them in yet another legal setting (that of weighing privacy against public interest).

After California, now the Washington State Senate has adopted a data protection and privacy act that takes the EU General Data Protection Regulation (GDPR) as an example to emulate.

This is definitely a hoped-for effect of the GDPR when it was launched. European environmental and food safety standards have had a similar global norm-setting impact. This is because for businesses it is generally more expensive to comply with multiple standards than to comply only with the strictest one. We saw this earlier when companies took GDPR demands and applied them to themselves globally. That the GDPR might have this impact is an intentional part of how the EC is developing a third proposition in data geopolitics, between the surveillance capitalism of the US data lakes and the data-driven authoritarianism of China.

To me the GDPR is a quality assurance instrument, with its demands increasing over time. So it is encouraging to see other government entities outside the EU taking a cue from the GDPR. California and Washington State have now adopted similar laws. Five other US states have introduced similar laws for debate in the past two months: Hawaii, Massachusetts, New Mexico, Rhode Island, and Maryland.

Does the New York Times see the irony? This article argues that the US Congress should look much less at the privacy terms of Big Tech, and much more at their actual business practices.

Yet it calls upon me to disable my ad blocker. The ad blocker that blocks 28 ads in a single article, all served by a Google advertisement tracker — a tracker that one of my browsers flags as working the same way cross-site scripting attacks do.

If, as you say, adverts are at the core of your business model, making journalism possible, why do you outsource them?
I’m ok with advertising, New York Times, but not with adtech. There’s a marked difference between the two. It’s adtech, not advertising, that does the things you write about, like “how companies can use our data to invisibly shunt us in directions” that don’t benefit us. And adtech is the reason that, as you say, the “problem is unfettered data exploitation and its potential deleterious consequences.” I’m ok with a newspaper running its own ads. I’m not ok with the New York Times behaving like a Trojan horse, pretending to be a newspaper while actually being a vehicle for, in your own words, the “surveillance economy”.

Until then my ad blocker stays.

My browser blocking 28 ads (see the address bar) on a single article, all from a single Google ad tracker.