Since the start of this year I have been actively tracking the suite of new European laws being proposed on digitisation and data. Together they are the expression in law of the geopolitical position the EU is taking on everything digital and data, and all the proposed laws follow the same logic and reasoning. Taken together they shape how Europe wants to use the potential and benefits of digitisation and data use, specifically including for a range of societal challenges, while defending and strengthening citizen rights. Of course other EU legal initiatives sometimes point in different directions in parallel (e.g. EU copyright regulations leading to upload filters, and the attempts at backdooring end-to-end encryption in messaging apps for mass surveillance), but that is precisely why this suite of regulations stands out to me. Where other legal initiatives often seem to stand on their own, and bear the marks of lobbying and singular industry interests, this group of measures builds on a shared logic and reads as internally consistent, as well as the expression of an actual vision.

My work is to help translate the proposed legal framework into how it will impact and provide opportunities to large Dutch government data holders and policy departments, and to build connections and networks between all kinds of stakeholders around relevant societal issues and related use cases. This is to shape the transition from the data-provision oriented INSPIRE program (sharing and harmonising geo-data across the EU) to a use-needs and benefits oriented approach (reasoning from a societal issue to be solved, with a network of relevant parties, towards the data that can provide agency for reaching a solution). My work follows directly from the research I did last year to establish a list of EU-wide high value data sets to be opened, in which I dived deeply into all government data and its governance concerning earth observation, environment and meteorology, while other team members did the same for geo-data, statistics, company registers, and mobility.

All the elements in the proposed legal framework will be decided upon in the coming year or so, and will enter into force probably after a 2-year grace period. So by 2025 this should be in place. In the meantime many organisations, as well as public funding, will focus on implementing elements of it already, even while nothing is mandatory yet. As with the GDPR, the legal framework, once in place, will also be an export mechanism for the notions and values expressed in it to the rest of the world, because compliance is tied to EU market access and to having EU citizens as clients wherever they are.

One element of the framework, the GDPR, is already in place. The newly proposed elements mimic the GDPR’s fine structure for non-compliance.
The new elements take the EU Digital Compass and the EU Digital Rights and Principles, for which a public consultation is open until 2 September, as their starting point.

The new proposed laws are:

Digital Markets Act (download), which applies to all dominant market parties, both platform providers and physical network providers, that are de facto gatekeepers to access for both citizens and market entities. It aims for a digital unified market, and sets requirements for interoperability and ‘service neutrality’ of platforms, and for preventing lock-in. Proposed in November 2020.

Digital Services Act (download), which applies to both gatekeepers (see previous point) and other digital service providers that act as intermediaries. Aims for a level playing field, diversity of service providers, and protection of citizen rights, and requires transparency and accountability mechanisms. Proposed in November 2020.

AI Regulatory Proposal (download), which does not regulate AI technology as such, but the EU market access of AI applications and their usage. Market access is based on an assessment of risk to citizen rights and to safety (think of use in vehicles etc.). It’s a CE mark for AI. The proposal maintains a periodically updated list of technologies considered within scope, and a list of areas that count as high risk. With increasing risk come more stringent requirements on transparency, accountability and explainability. It creates GDPR-style national and European authorities for complaints and enforcement. Responsibilities are assigned to the producer of an application, as well as to its distributors and users. It’s the world’s first attempt at regulating AI, and I think it is rather elegant in tying market access to citizen rights. Proposed in April 2021.

Data Governance Act (download), which makes government-held data that isn’t available under open data regulations available for use (but not for sharing), introduces the European dataspace (created from multiple sectoral data spaces), mandates EU-wide interoperable infrastructure around which data governance and standardisation practices are positioned, and coins the concept of data altruism (meaning you can securely share your personal data or company-confidential data for specific temporary use cases). This law aims at making more data available for usage, if not for (public) sharing. Proposed in November 2020.

Data Act, currently open for public consultation until 2 September 2021. It will introduce rules around the possibilities the Data Governance Act creates, set conditions and requirements for B2B cross-border and cross-sectoral data sharing and for B2G data sharing in the context of societal challenges, and set transparency and accountability requirements for all of these. To be proposed towards the end of 2021.

Open Data Directive, which sets the conditions and requirements for open government data (these build on the national access to information regulations in the member states, which is why the Data Governance Act exists alongside it and does not build on national access regimes). The Open Data Directive was proposed in 2018 and decided in 2019, as the new iteration of the preceding Public Sector Information directives. It should have been transposed into national law by 1 July 2021, but not all member states have done so (in fact the Netherlands has just recently started the work). An important element in this Directive is the EU High Value Data list, which will make publication of the listed data as open data, through APIs and as machine-readable bulk downloads, mandatory for all EU member states. As mentioned above, last year I was part of the research team that did the impact assessments and proposed the policy options for that list (I led the research for earth observation, environment and meteorology). The implementing act for the EU High Value Data list will be published in September, and I expect it to e.g. add an open data requirement to most of the INSPIRE themes.

Most of the elements in this list are proposed as Acts, meaning they will have the power of law across the EU as soon as they are agreed between the European Parliament, the Council of the EU and the European Commission, and don’t require transposition into national law first. Also of note is that currently ongoing revisions and evaluations of connected EU directives (INSPIRE, ITS etc.) are being shaped along the lines of the Acts mentioned above. This means that more specific data-oriented regulations closer to specific policy domains are already being changed in this direction. Similarly, policy proposals such as the European Green Deal very clearly build on the EU digital and data strategies to achieve and monitor those policy ambitions. All in all it will be a very interesting few years in which this legal framework develops and gets applied, as it is a new fundamental wave of changes after the role the initial PSI Directive and the INSPIRE Directive played 15 to 20 years ago, with a much wider scope and much more at stake.

The geopolitics of digitisation and data. Image ‘Risk Board Game’ by Rob Bertholf, license CC BY

On a first reading of the yet-to-be-published EU Regulation on the European Approach for Artificial Intelligence, based on a leaked version, I find it pretty good. It takes a logical approach, laid out in the 92 recitals preceding the articles, based on risk assessment, where erosion of human and citizen rights, or risk to key infrastructure, key services and product safety, is deemed high risk by definition. High risk means stricter conditions, following some of the building blocks of the GDPR, also when it comes to governance and penalties. Those conditions are tied to being allowed to put a product on the market, and to how products perform in practice (not just how they are intended to perform). I find that an elegant combination: risk assessment based on citizen rights and critical systems, connected to well-worn mechanisms of market access and market monitoring. The conditions apply to producers and users, as well as to other parties involved along the supply chain. The EU approaches to data and AI seem to align well this way, and express the European geopolitical proposition concerning data and AI, centered on civic rights, in codified law. That codification, like the GDPR, is how the EU exports its norms elsewhere.

The text should be published soon by the EC, and I’ll try a write-up in more detail then.

The municipality of Amsterdam wants a mandatory registration for sensors in public space. From this autumn, every organisation that places sensors in outdoor space would have to report where those sensors are located. This is useful for several reasons. First of all for transparency, and to be able to have an in-depth discussion about the usefulness and necessity of all those sensors around us. And to see which of the data now collected by private organisations could potentially also be used for a shared public interest.

Amsterdam earlier set an example that deserves to be followed with the start of an algorithm register, and this sensor register seems to me an excellent addition.

(Broken) advertising display at Utrecht Centraal station that I photographed in 2018, with an ill-considered camera in public space to measure attention for the advertisement. Civic resistance taped the camera over.

It’s odd to see how conspiracy fantasies, suspect sources, disinformation and deliberately emotionally provocative or even antagonistic wording are on the rise on my LinkedIn timeline.

I first encountered a QAnon account in a comments section last August, but that person was still many steps away in my network. Now I see things popping up from direct connections and their connections. I had assumed that LinkedIn being tied to your professional reputation would go a long way to prevent such things, but apparently not any longer. In some instances, it’s almost as if people don’t realise they’re doing it, a boiling-a-frog effect of sorts.

One person, called out for some under-informed reactionary content with the observation that their employer has the capabilities and resources to prove them wrong, even responded “leave my employer out of it”. That’s not really possible though, as your employer is in your byline and accompanies your avatar with every post and comment you make. Seven months after first encountering something like this on my LinkedIn timeline, it is now a daily part of it, all coming from my Dutch network and their connections.

LinkedIn is starting to feel as icky as Facebook did three years ago. It makes me wonder how long LinkedIn will remain a viable tool. I don’t think I will be spending much, or any, attention on my timeline moving forward, until the moment LinkedIn is as much a failed social platform as the others and it’s time to let go of it completely. That doesn’t mean disengaging with the people in my network, obviously, but it is not at all my responsibility to help LinkedIn reach a certain level of quality of discourse by trying to counteract the muck. I was an early user of LinkedIn, in the spring of 2003 (nr. 8730; look at the source of your profile page and search it for ‘member:’ to find your number). I know there’s already a trickle of people leaving the platform, and I wonder when (not if) I’ll fully join them.
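As an aside, that ‘member:’ lookup is easy to script. A minimal sketch in Python, assuming you’ve saved your profile page’s source locally as profile.html (an assumed filename, not anything LinkedIn provides):

```python
import re
from pathlib import Path

# Read the saved page source of your LinkedIn profile.
# "profile.html" is an assumed filename: save the page source there first.
html = Path("profile.html").read_text(encoding="utf-8", errors="ignore")

# Look for the first "member:<digits>" occurrence, the same string
# the post suggests searching for by hand in the page source.
match = re.search(r"member:(\d+)", html)

if match:
    print(f"LinkedIn member number: {match.group(1)}")
else:
    print("No 'member:' marker found in this page source.")
```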

The Open State Foundation and SETUP are launching the SOS Tech Awards, focused on transparency and accountability in the digital society.

The Glass & Black Box Awards are about openness and transparency. The Dode & Levende Mussen Award (‘dead and living sparrows’ award) is about taking responsibility after technological failures, and about the degree to which companies and governments not only apologise but actually change how they act. The nominees will be announced in the coming weeks. The SOS Tech Awards will be presented on Tuesday 23 March 2021 via a livestream from the central Bibliotheek Utrecht.

(In the interest of transparency: I am a board member of the Open State Foundation)