A little over a decade ago I was at a small conference, where I happened to share the stage with a British lawyer, Polly Higgins, who was seeking to internationally criminalise ‘ecocide’, alongside various other speakers. One of those others was a self-declared rationalist running a data-driven research start-up with billionaire funding. He believed the trickle-down innovation trope, the kind that usually ends in pulling up the ladder behind oneself, and that can readily be found around all things tech-singularity. And he called himself a futurist. After the talks we as speakers stood on and in front of the stage chatting about the things that had been presented. The futurist, addressing me and one other speaker, chuckled that ‘that eco-lady’ had a nice idea but a naive, unrealistic and irrational one that obviously had zero probability of happening. At the time I found it jerkish and jarring, not least given the guy’s absence of expertise in the fields concerned (environment and international law). It’s one of the key moments I remember from that conference, as the condescending remark clashed so strongly with the rest of the event and its atmosphere.

Meanwhile we’re some ten years into the future of that conference. The futurist’s efforts seem to have collapsed soon after the conference, and there are no recent online traces of him. Polly Higgins is no longer alive, but her cause has very much outlived her. On 26 March the final step in the legislative path of a renewed Directive on the protection of the environment through criminal law was taken, when the Council of the EU formally approved the text agreed (last November) with the European Parliament. In that new ecocrimes directive, recital 21 of the preamble uses the phrase ecocide to describe specific crimes covered in the Directive (PDF).

Criminal offences relating to intentional conduct listed in this Directive can lead to catastrophic results, such as widespread pollution, industrial accidents with severe effects on the environment or large-scale forest fires. Where such offences cause the destruction of, or widespread and substantial damage which is either irreversible or long-lasting to, an ecosystem of considerable size or environmental value or a habitat within a protected site, or cause widespread and substantial damage which is either irreversible or long-lasting to the quality of air, soil, or water, such offences, leading to such catastrophic results, should constitute qualified criminal offences and, consequently, be punished with more severe penalties than those applicable in the event of other criminal offences defined in this Directive. Those qualified criminal offences can encompass conduct comparable to ‘ecocide’, which is already covered by the law of certain Member States and which is being discussed in international fora.

Good work barrister Higgins, and the Stop Ecocide organisation.


A photo taken by Polly Higgins of me as we had fun together driving an all-electric ‘motor bike’ around the venue’s hallways at that conference in 2013.

Polly Higgins about to take the e-chopper for a spin through the venue.

Bookmarked Commission opens non-compliance investigations against Alphabet, Apple and Meta under the Digital Markets Act (by European Commission)

With the large horizontal legal framework for the single digital market and the single market for data mostly in force and applicable, the EC is initiating its first actions. This announcement focuses on app store aspects: on steering (third parties being able to offer users other ways of paying for services than e.g. Apple’s app store), on (un-)installing any app and the freedom to change settings, as well as on providers preferencing their own services over those of others. Five investigations for suspected non-compliance, involving Google (Alphabet), Apple, and Meta (Facebook), have been announced. Amazon and Microsoft are also being investigated, to clarify aspects that may give rise to suspicions of non-compliance.

The investigation into Facebook concerns their ‘pay or consent’ model, Facebook’s latest attempt to circumvent their GDPR obligation that consent should be freely given. It was clear that this move, even if it lets them steer clear of the GDPR (which is still very uncertain), would create issues under the Digital Markets Act (DMA).

In the same press release the EC announces that Facebook Messenger is getting a six-month extension of the period in which to comply with interoperability demands.

The Commission suspects that the measures put in place by these gatekeepers fall short of effective compliance of their obligations under the DMA. … The Commission has also adopted five retention orders addressed to Alphabet, Amazon, Apple, Meta, and Microsoft, asking them to retain documents which might be used to assess their compliance with the DMA obligations, so as to preserve available evidence and ensure effective enforcement.

European Commission

Bookmarked US lawmakers vote 50-0 to force sale of TikTok despite angry calls from users (by Jon Brodkin at Ars Technica)

Apple may be misinterpreting what the EU Digital Markets Act and Digital Services Act are about, so perhaps this example of how the US is applying its own antitrust laws, here w.r.t. TikTok, helps them realise what’s at stake.

If an application is determined to be operated by a company controlled by a foreign adversary—like ByteDance, Ltd., which is controlled by the People’s Republic of China—the application must be divested from foreign adversary control within 180 days.

Jon Brodkin at Ars Technica

A final draft of the European AI Regulation is circulating (here’s an almost 900-page PDF). In the coming days I will read it with curiosity.

With this, the ambitious legal framework for everything digital and data that the European Commission set out to create in 2020 has been completed within this Commission period. That’s pretty impressive.
In 2020 there was no Digital Markets Act, Digital Services Act, AI Regulation, Data Governance Act, Data Act, nor an Open Data Directive/High Value Data implementing regulation.
Before the European elections coming spring, they are all in place. I’ve closely followed the process (and helped create a very small part of it), and I think the result is remarkably consistent and level-headed. DG CNECT has done well here in my opinion. It’s a set of laws that are very useful in themselves and that simultaneously form a geo-political proposition.

The coming years will be dedicated to implementing these novel instruments.

Bookmarked China Seeks Stricter Oversight of Generative AI with Proposed Data and Model Regulations (by Chris McKay at Maginative)

Need to read this more closely. A few things stand out at first glance:

  • This is an addition to the geo-political stances that the EU, US, and China put forth w.r.t. everything digital and data. A comparison with the EU AI Regulation that is under negotiation is of interest.
  • It seems focused solely on generative AI. Are there other (planned) acts covering other AI applications and development? Why is generative AI singled out here, is it because it has a more directly population-facing character?
  • It seems to mostly front-load the responsibilities towards the companies producing generative AI applications, i.e. towards the models used and the pre-release phase. In comparison, the EU regulation incorporates responsibilities for distributors, buyers, users and even users of output only, and spans the full lifetime of any application.
  • It lists specific risks in several categories. How specifically are those worded, and might that have an impact on how future-proof the regulation is? Are there thresholds introduced for such risks?

Let’s see if I can put some AI to work to translate the original Chinese proposed text (PDF).

Via Stephen Downes, who is also my source for the link to the original proposal in PDF.

By emphasizing corpus safety, model security, and rigorous assessment, the regulation intends to ensure that the rise of [generative] AI in China is both innovative and secure — all while upholding its socialist principles.

Chris McKay at Maginative

ODRL, the Open Digital Rights Language, popped up twice this week for me, and I don’t think I was aware of it before. Some notes for me to start exploring.

Rights Expression Languages

Rights Expression Languages, RELs, provide a machine-readable way to convey or transfer usage conditions, rights and restraints, granularly w.r.t. both actions and actors. This can then be added as metadata to something. ODRL is a rights expression language, and seems to be a de facto standard.

ODRL has been a W3C recommendation since 2018, and is thus part of the open web standards. ODRL has its roots in the ’00s and Digital Rights Management (DRM): the abhorred protections media companies added to music and movies, and now e-books, in ways that restrain what people can do with media they bought to well below what was possible before and what was commonly thought to be part of having bought something.

ODRL can be expressed in JSON-LD, RDF and XML. A basic example from Wikipedia looks like this:


{
  "@context": "http://www.w3.org/ns/odrl.jsonld",
  "uid": "http://example.com/policy:001",
  "permission": [{
    "target": "http://example.com/mysong.mp3",
    "assignee": "John Doe",
    "action": "play"
  }]
}

In this JSON example a policy describes that example.com grants John permission to play mysong.
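The same building blocks allow for more granular conditions, specifying who grants what to whom and under which constraints. Below is a rough sketch of my own, following the patterns in the W3C ODRL Information Model; the URIs and parties are invented for illustration. It grants John the right to play the song until a given date, while explicitly prohibiting him from distributing it.

{
  "@context": "http://www.w3.org/ns/odrl.jsonld",
  "uid": "http://example.com/policy:002",
  "permission": [{
    "target": "http://example.com/mysong.mp3",
    "assigner": "http://example.com/label:big-music",
    "assignee": "http://example.com/person:john-doe",
    "action": "play",
    "constraint": [{
      "leftOperand": "dateTime",
      "operator": "lt",
      "rightOperand": { "@value": "2025-12-31", "@type": "xsd:date" }
    }]
  }],
  "prohibition": [{
    "target": "http://example.com/mysong.mp3",
    "assignee": "http://example.com/person:john-doe",
    "action": "distribute"
  }]
}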

ODRL in the EU Data Space

In the shaping of the EU common market for data, aka the European common data space, it is important to be able to trace provenance and usage conditions not just for data sets, but for singular pieces of data, as they flow through use cases, through applications and their output back into the data space.
This week I participated in a webinar by the EU Data Space Support Center (DSSC) about their first blueprint of data space building blocks, and for federation of such data spaces.

They propose ODRL as the standard to describe usage conditions throughout data spaces.
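To get a feel for what such usage conditions could look like, here is a rough sketch of my own (not taken from the DSSC blueprint): an offer that permits using a dataset only for a stated purpose, using the ‘purpose’ constraint from the ODRL vocabulary. All URIs are invented.

{
  "@context": "http://www.w3.org/ns/odrl.jsonld",
  "@type": "Offer",
  "uid": "http://example.com/policy:101",
  "permission": [{
    "target": "http://example.com/dataset:sensor-readings",
    "assigner": "http://example.com/party:data-provider",
    "action": "use",
    "constraint": [{
      "leftOperand": "purpose",
      "operator": "eq",
      "rightOperand": "http://example.com/purpose:research"
    }]
  }]
}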

The question of enactment

It wasn’t the first time I talked about ODRL this week. I had a conversation with Pieter Colpaert. I reached out to get some input on his current view of the landscape of civic organisations active around the EU data spaces. We also touched upon his current work at the University of Gent. His research interest is currently ODRL, specifically enactment. ODRL is a REL, a rights expression language: describing rights is one thing, enacting them in practice, in technology, processes etc., is a different thing. Next to that, how do you demonstrate that you adhere to the conditions expressed and that you qualify for using the things described?

For the EU data space(s) this part sounds key to me, as none of the data involved is merely part of a single clear interaction like in the song example above. It’s part of a variety of flows in which actors likely don’t directly interact, and where many different data elements come together. This includes flows through applications that tap into a data space for inputs and outputs but are otherwise outside of it. Such applications include digital twins, even federated systems of digital twins, meaning a confluence of many different data and conditions across multiple domains (and thus data spaces). All this puts a piece of data light-years away from the neat situation where two actors share it between them in a clearly described transaction within a single-faceted use case.

Expressing the commons

It’s one thing to express restrictions or usage conditions. The DSSC in their webinar talked a lot about business models around use cases, and about ODRL as a means for a data source to stay in control throughout a piece of data’s life cycle. Luckily they stopped using the phrase ‘data ownership’ as they realised it’s not meaningful (and confusing on top of it), and focused instead on control and on an actor maintaining a say.
An open question for me is how you would express openness and the commons in ODRL. A shallow search surfaces some examples of trying to express Creative Commons or other licenses this way, but none of them recent.

Openness can mean an absence of certain conditions, although there may be some (like requiring the same absence of conditions for re-shared material or derivative works), and that is not the same as setting explicit permissions. If I e.g. dedicate something to the public domain, an image for instance, then there are no permissions for me to grant, as I’ve removed myself from the role of being able to give permission. Yet you still want to express that, to ensure it is clear to all that that is what happened, and especially that it remains that way.
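A naive sketch of my own of what such a dedication might look like in ODRL (not an established mapping): a policy that simply permits the broad ‘use’ action on the asset, for anyone, without constraints.

{
  "@context": "http://www.w3.org/ns/odrl.jsonld",
  "uid": "http://example.com/policy:public-domain-ish",
  "permission": [{
    "target": "http://example.com/my-image.jpg",
    "action": "use"
  }]
}

That still frames me as someone granting permissions, rather than expressing that no permission is needed at all, and nothing in it obliges downstream users to keep it that way, which is exactly the gap I mean.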

Part of that question is about the overlap and distinction between rights expressed in ODRL and authorship rights. You can obviously have many conditions outside of copyright, and there can be copyright elements that fall outside of what can be expressed in RELs. I wonder how, for instance, moral authorship rights (which an author in some, or all, European jurisdictions cannot do away with) can be expressed after an author has transferred or sold the copyrights to something? Or maybe expressing authorship rights / copyrights is not what RELs are primarily for, as those are generic, and RELs may be meant for expressing conditions around a specific asset in a specific transaction. There have been various attempts to map all kinds of licenses to RELs though, so I need to explore more.

This is relevant for the EU common data spaces as my government clients will be actors in them, bringing in open data, closed and unsharable but re-usable data, and several different shades in between. A range of new obligations and possibilities w.r.t. data use by government is created in the EU data strategy laws, and the data spaces are where those become actualised. Meaning it should be possible to express the corresponding usage conditions in ODRL.

ODRL gaps?

Are there gaps in the ODRL standard w.r.t. what it can cover? Or things that are hard to express in it?
I came across one paper, ‘A critical reflection on ODRL’ (PDF; Kebede, Sileno, Van Engers 2020), that I have yet to read, which describes some of those potential weaknesses based on use cases in healthcare and logistics. Looking forward to digging out their specific critique.