“But that’s politics!” One of the other participants in our group discussing progress in tech said this to me during our work as the Copenhagen 150. “You sound like a politician.” I was making a second attempt at summarising our discussion, trying to formulate our key points, after a first summary by someone else in the group.

The remark stood out for me for two reasons.

First, it surprised me that it seemed a new notion to the other person, as I think tech is inherently political. Tech shapes society, and society in turn shapes tech development. Tech, in the way it creates or diminishes agency, creates affordances, deals with access, power (a)symmetries and the externalisation of costs, and in the way it gets deployed, is all about ethics. And ethics, as the practical expression of values and morals, is deeply political. Maybe less in a party-political way, the politics-as-horse-race we see play out daily in the news. That comparison might have been the source of surprise for the other participant. But definitely in the shape of a societal debate about desirability, impacts, and intended and unintended consequences. At the start we had a great conversation with Denmark’s ambassador to the tech industry (photo), a post which is a very clear expression of the political weight of tech, and just before the Copenhagen 150 I listened to a very good conversation between Casper Klynge, the tech ambassador, and former Dutch MEP / now Stanford international policy director Marietje Schaake, which rightly and firmly put tech discussions in the geopolitical arena.

Second, the “you make it sound like politics” bit stood out for me because it gave me a jolt, realising that I should behave more purposefully on a political level. Some 25 years ago I briefly engaged in local city politics, but soon realised such games weren’t for me, and that a faster way to change is to start creating the little impacts you want to see. Not of the ‘move fast and break things’ type, but out of the belief that if you create new effective behaviours those will be contagious, and in aggregate lead to culture changes. It is how I ended up in open data, for instance: I was already working on open government but not particularly getting anywhere, and then realised that open data, as a newly emerging topic, provided a much better inroad to changing governance. It was seen as a tech-only topic by most politicians and thus unthreatening to the status quo, whereas it was clear to me that if you start pulling the strings of how data gets shared, you soon start pulling on the entire processes that lead to the creation and usage of that data, as its publication creates new realities that generate responses. Politics obviously always plays a role in the internal relations within a client organisation. A large part of my international work is about diplomacy and cultural sensitivity too. But treating it as a political endeavour in its own right is different. I realise I may be at a place in my work where that deserves a much more deliberate role.

During our work on shaping the Tech Pledge last week, we looked into progress as it is mentioned in the Copenhagen Letter: as something different from just innovation. The Copenhagen Letter was written in 2017 as output of the same process that has now delivered the Tech Pledge.


Thomas, Techfestival’s initiator, reading the Copenhagen Letter
at the start of this year’s CPH150

Progress is not a fixed point on the horizon, we said. What counts as progress shifts, with public opinion and new ideas of what is good, just, true and beautiful emerging, and with the movement of time itself. When the first steam engines appeared, their plumes of smoke heralded a new age that drove industrialisation and nation forming and, through railroads, changed what cities were for and how city and countryside relate to each other. Steam turbines still sit at the heart of most electricity plants in the world, but progress has moved on from the steam engine.
We also realised that progress does not have a fixed and static definition, and so we are free to fill it with whatever definition we think fits the context we are looking at.

In terms of technology then, progress is a motion, a process, and in our group we defined it as (new) technology plus direction/sense of purpose. Technology here, to me at least, being not just ‘hard tech’, but also ‘soft tech’. Our methods, processes, organisational structures are technology just as much as fountain pens, smart phones and particle accelerators.

So we named a number of elements that fit into this understanding of progress as a process and a search for direction.

  • It is part of human nature to strive for change and progress, even if not every single individual in every context and time will be so inclined. This desire to progress is more about setting a direction than about a fixed end state. Hence our use of “(new) technology with intended direction” as a working definition.
  • We need to be accountable for how anything we make fits the intended direction, and additionally for whether it externalises human or ecological costs, or conflicts with our natural systems, as these are often ignored consequences.
  • We recognise that direction may get lost or end up in need of change; in fact, we realise that anything we make is largely out of our control once released into the world.
  • So we pledge to continuous reflection on the direction our tech is taking us in practice. Not just during its development or launch, but throughout its use.
  • Whether we want to use the tech we created ourselves, or see our loved ones use it, is a smell test: if it doesn’t pass that emotional test, something is off.
  • What doesn’t pass the smell test needs to be explored and debated.
  • We have a civic duty to organise public debate about the value and direction of our tech, right alongside our tech. Not just at the start of making it, but throughout the life cycle of what we make. If you make something, you also need to organise the continuous debate around it, to keep a check on its societal value and utility, and to actively identify unintended consequences.
  • If our tech is no longer fit for purpose or takes an unintended turn, we have a duty to actively adapt it and/or publicly denounce the aspects of our tech that have turned detrimental.

Working on the pledge

Regardless of what the Copenhagen Pledge says in addition to this, this reflective practice is something worthwhile in itself for me to do: continuously stimulating the debate around what you make, as part and parcel of the artefacts you create. This is not a new thing to me; it’s at the core of the unconferences we organise, where lived practice, reflection and community-based exploration are the basic ingredients.

To me, what is key in the discussions we had is that this isn’t about all tech in general, though anyone is welcome to join any debate. This is about having the moral and civic duty to actively create public debate around the technology you make and have made. You need to feel responsible for what you make from inception to obsolescence, just as you always remain a parent to your children, regardless of their age and choices as adults. The connection to self, to an emotional anchoring of this responsibility, is the crucial thing here.

So there I was on a rainy Copenhagen evening, in a room with 149 colleagues for 24 hours, nearing midnight, passionately arguing that technologists need to internalise and own the reflection on the role and purpose of their tech, and not leave it as an exercise to the academics in the philosophy of technology departments. A duty to organise the public debate about your tech alongside the existence of the tech itself. If your own tech no longer passes your own smell test, then actively denounce it. I definitely felt that emotional anchoring I’m after in myself right there and then.

Before Techfestival‘s speakers and event partners’ dinner on Thursday, Marie Louise Gørvild, Techfestival’s Director, and Thomas Madsen-Mygdal, its initiator, said a few words. Thomas cited the Copenhagen Letter from 2017, singling out how our tech needs to be embedded in the context of our democratic structures, and how innovation can’t be a substitute for our sense of progress and impact. The Copenhagen Letter, and the entire Techfestival, emphasise humanity not only as the source and context for technology and its use, but as the ultimate yardstick for the constructive use and impact of technology. This may sound obvious, it certainly does to me, but in practice it needs to be repeated to ensure it is actually used as such a yardstick from the very first design stage of any new technology.

At Techfestival Copenhagen 2019

Technology is always about humans to me. Technology is an extension of our bodies, an extension of reach and an extension of human agency. A soup spoon is an extension of our hand so we don’t burn ourselves when we stir the soup. A particle accelerator is an extension of our ears and eyes to better understand the particles and atoms we’re made of. With technology we extend our reach across the globe by communicating instantaneously, extend it into the air, into the deep sea, towards the atomic level, and into interstellar space. Tech is there to deepen and augment our humanity. In my daily routines it’s how I approach technology too, both in personal matters such as blogging and in client projects, and apparently such an approach stands out. It’s what Kicks Condor recently remarked upon and Neil Mather pointed to in conversations about our blogging practices, what Heinz Wittenbrink referenced when he said “they talk about their own lives when they talk about these things” about our unconference, and what clients say about my change management work around open data.

Techfestival in Copenhagen takes humanity as the starting point for tech, and as the litmus test for the usefulness and ethicality of tech. It is therefore somewhat grating to come across people talking about how to create a community for their tech to help it scale. Hearing that a few times last week in Copenhagen felt very much out of tune. Worse, I think it is an insulting way to talk about people you say you want to create value for.

Yes, some newly launched apps and platforms really are new places where communities can form that otherwise wouldn’t, because of geographic spread, shame, taboo or the danger of making yourself visible in your local environment, or because you’re exploring things you’re still uncertain about yourself. All the (niche) interests, the crazy ones, those who can’t fully express their own personality in their immediate environment, benefit from the new spaces for interaction that online tools have created. My own personal blog-based peer network started like that: I was lonely in my role as a knowledge manager at the start of the ’00s, and online interaction and blogging brought me the global professional peer network I needed, which wasn’t otherwise possible in the Netherlands at the time.

Techfestival’s central stage in Kødbyen, during an evening keynote

Otherwise, however, every single one of us already is part of communities: our sports teams, neighbourhoods, extended families, work contexts, causes, peer networks, alumni clubs, etc. Why doesn’t tech usually focus on letting me use it for my communities as they are, rather than presenting itself as having me join a made-up community whose raison d’être is exploiting our attention for profit? That’s not community building, that’s extraction, instrumentalising your users while dehumanising them along the way. To me it’s in those communities everyone is already part of that the scaling for technology is to be found. “Scaling does not scale”, said Aza Raskin in his Techfestival keynote, and that resonates. I talked about the invisible hand of networks in response to demands for scaling when I talked about technology ‘smaller than us’ and networked agency at SOTN18, and this probably is me saying the same again in a slightly different way. Scaling is in our human structures. Artists don’t scale, road building doesn’t scale, but art and road networks are at scale. Communities don’t scale, they’re fine as they are, but they are the grain of scale, resulting in society, which is at scale. Don’t seek to scale your tech, seek to let your tech reinforce societal scaling, our overlapping communities, our cultures. Let your tech be scaffolding for a richer expression of society.

Techfestival fits very much into that, and I hope it is what I brought to the work on the CPH150 pledge: the notion of human (group) agency, and the realisation that tech is not something on its own, but needs to be used in combination with methods and processes, in which you cannot ever ignore societal context. One of those processes is continuous reflection on your tech, right alongside the creation and implementation of your tech, for as long as it endures.

Our group of 150 working 24 hours on writing the TechPledge

As part of the Techfestival last week, the Copenhagen 150, which this time included me, came together to write a pledge for individual technologists and makers to commit their professional lives to. A bit like a Hippocratic oath, but for creators of all tech. Following up on the Copenhagen Letter, which was a signal, a letter of intent, and the Copenhagen Catalog, which provides ‘white patterns’ for tech makers, this year’s Tech Pledge makes it even more personal.

I pledge

  • to take responsibility for what I create.
  • to only help create things I would want my loved ones to use.
  • to pause to consider all consequences of my work, intended as well as unintended.
  • to invite and act on criticism, even when it is painful.
  • to ask for help when I am uncertain if my work serves my community.
  • to always put humans before business, and to stand up against pressure to do otherwise, even at my own risk.
  • to never tolerate design for addiction, deception or control.
  • to help others understand and discuss the power and challenges of technology.
  • to participate in the democratic process of regulating technology, even though it is difficult.
  • to fight for democracy and human rights, and to improve the institutions that protect them.
  • to work towards a more equal, inclusive and sustainable future for us all, following the United Nations global goals.
  • to always take care of what I start, and to fix what we break.

I signed the pledge. I hope you will do so too. If you have questions about what this means in practical ways, I’m happy to help you translate it to your practice. A first step is likely figuring out which questions to ask yourself at the start of something new. In the coming days I plan to blog more from my notes on Techfestival, and those postings will say more about various aspects of this. You are also still welcome to sign the Copenhagen Letter, as well as individual elements of the Copenhagen Catalog.

Last week I attended Techfestival in Copenhagen, where I participated in a day-long Public Data Summit. This posting contains thoughts and notes based on some of what was discussed during that Public Data Summit.

Group work at the Public Data Summit

Martin von Haller Groenbaek (we go back a long time in open data) gave an interesting lightning talk at the start of the Public Data Summit. He posited that in order to realise the full potential of open (government) data, we probably need to be more relaxed about sharing personal data as well.

There is a case to be made, I agree. In the energy transition, for instance, your (real-time) personal electricity use is likely key information. The aggregated yearly usage of you and at least 10 neighbours that e.g. the Dutch electricity network operators make available is far from useless, but it lacks the granularity, the direct connection to real people’s daily lives, to be truly valuable for anything of practical use.
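
To make that granularity gap concrete, here is a minimal Python sketch of what such an aggregation rule could look like; the household names, numbers and exact threshold are hypothetical assumptions for illustration, not the actual implementation the Dutch network operators use.

    from statistics import mean

    def publishable_yearly_aggregate(yearly_usage_kwh, k=10):
        # Only publish a figure if it covers at least k households,
        # mimicking a '10 neighbours' style aggregation rule (assumed, simplified).
        if len(yearly_usage_kwh) < k:
            return None  # too few households: suppress to avoid exposing individuals
        return {
            "households": len(yearly_usage_kwh),
            "total_kwh": sum(yearly_usage_kwh.values()),
            "mean_kwh": mean(yearly_usage_kwh.values()),
        }

    # Hypothetical yearly totals (kWh) for three households: nothing gets published here,
    # and even a compliant aggregate over 10+ households yields one coarse number per year,
    # far removed from the real-time, per-household detail that would be practically useful.
    print(publishable_yearly_aggregate({"home_a": 2900, "home_b": 3400, "home_c": 2100}))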

I agree with the notion that more person-related data needs to come into play. Martin talked about it in terms of balancing priorities: if you want to fix climate change, or another key societal issue, personal data protection can’t be absolute.

Now this sounds a bit like “we need your personal data to fight terrorism”, which then gets translated into “give me your fingerprints or your safety and security is compromised”. Yet that is both an edge case and an example of the types of discussions needed to find the balancing point, to avoid false dilemmas or, better yet, prevent reductionism towards ineffective simplicity (as is the case with terrorism, where all sense of proportionality is abandoned). The balancing point is the sweet spot where the right level of complexity is addressed. In the case of terrorism the personal data sharing discussion is framed as “you’re either with us, or with the terrorists”, to quote Bush jr., a framing in absolutes that invites a cascade of scope creep.

To me this is a valuable discussion to be had, to determine when and where sharing your personal data is a useful contribution to the common good, or should even be part of a public good. Currently that ‘for the common good’ part is mostly not in play. We’re leaking personal data all over the tech platforms we use, without much awareness of its extent or how it is being used. We do know these data are not being used for the common good, as that is in no-one’s business model. This public good / common good thinking was central to our group work during the rest of the Public Data Summit too.

Martin mentioned the GDPR as a good thing, certainly for his business as a lawyer, but also as a problematic one. Specifically he criticised the notion of owning personal data, and of being able to trade it as a commodity based on that ownership. I agree, for multiple reasons. One being that a huge amount of my personal data is not directly created or held by me, as it is data about behavioural patterns: where my phone has been, where I used my debit card, the things I click, the time I spent on pages, the fingerprint of my specific browser and plugin setup, etc. The footsteps we leave on a trail in the forest aren’t personal data, but our digital footsteps are, because thanks to the persistence of those tracks the traces can be followed back to their source much more easily than in the woods, and can be compared with other tracks you leave behind.

Currently those footsteps in the digital woods are contractualised away into the possession of the private owners of the woods we walk in, i.e. the tech platforms. But there’s a strong communal aspect to your and my digital footsteps as personal data. We need to determine how we can use that collectively, and how to govern that use. Talking about the ownership of data, especially the data we create by being out in the (semi-)public sphere (e.g. on tech platforms), and about the ability to trade it (like Solid suggests), has two effects. It bakes in the acceptance that me allowing FB to work with my data is a contract between equal parties (the GDPR rightly tries to address this asymmetry); Aza Raskin mentioned this too in his keynote, saying tech platforms should be seen and regulated more as fiduciaries, to acknowledge that power asymmetry. And it takes the communal part of what we might do with data completely out of the equation. I can easily imagine when and where I’d be ok with my neighbours, my local government, a local NGO, or specific topical/sectoral communities etc. having access to data about me, where that same use by FB et al. would not be ok at all under any circumstance.

In the intros to the Public Data Summit, however, civil society was very much absent: there was talk about government and its data, and how it needed the private sector to do something valuable with it. Whereas to me open (e-)government, and opening up data, is very much about allowing the emergence and co-creation of novel public services by government/governance working together with citizens. Things we maybe don’t now regard as part of public institutions, structures or the role of government, but that in our digitised world very well could, or even should, be.

Had a very good first day at Copenhagen Techfestival Thursday. After bumping into Thomas right at the start, I joined the full day Public Data Summit, focusing on the use of open data in climate change response, co-hosted by Christian. Lots of things to mention / write about more, but need to work out some of my notes first.

Then I met up with Cathrine Lippert, a long-time Danish open data colleague whom I hadn’t seen for a few years and who now works at DTI. Just before dinner I ended up walking next to Nadja Pass, whom I think I last met a decade ago, and over excellent food we talked about the things that have happened to us, and the things we currently do and care about. I listened to an excellent and very well designed talk (sticky soundbites and all) by Aza Raskin, about the hackability of human minds, and what that spells out for the impact of tech on society. While leaving the central festival area, Nadia El Imam, co-founder of the great Edgeryders network, and I crossed paths, and over wine and some food at Pate Pate we talked about a wide variety of things. I arranged a bicycle from the hotel, which makes it much easier to get around. Cycling, I found that even though I was last in Copenhagen years ago, I still know my way around without having to check where I’m going.

This afternoon, after catching up with Henriette Weber to hear about her latest adventures, I will take part in the Copenhagen 150 think-tank, which is a 24-hour event. However, I will need to limit my presence to 20 hours, as I need to be back in Amsterdam by tomorrow mid-afternoon to make a birthday dinner with dear friends in the very south of the country.