“But that’s politics!” one of the other participants in our group discussing progress in tech said to me during our work as the Copenhagen 150. “You sound like a politician.” I was making a second attempt at summarising our discussion, trying to formulate our key points after a first summary by someone else in the group.

The remark stood out for me, for two reasons.

First, it surprised me that this seemed a new notion to the other person, as I think tech is inherently political. Tech shapes society, and society in turn shapes tech development. Tech, in the way it creates or diminishes agency, creates affordances, deals with aspects like access, power (a)symmetries and externalisation of costs, and in the way it gets deployed, is all about ethics. And ethics, as the practical expression of values and morals, is deeply political. Maybe less in a party-political way, the politics-as-horse-race we see play out daily in the news. That comparison might have been the source of surprise for the other participant. But definitely in the shape of a societal debate about desirability, impacts, and intended and unintended consequences. At the start we had a great conversation with Denmark’s ambassador to the tech industry (photo), which is a very clear expression of the political weight of tech, and just before the Copenhagen 150 I listened to a very good conversation between Casper Klynge, the tech ambassador, and former Dutch MEP / now Stanford international policy director Marietje Schaake, which rightly and firmly put tech discussions in the geopolitical arena.

Second, the “you make it sound like politics” bit stood out for me because it gave me a jolt: I realised I should behave more purposefully on a political level. Some 25 years ago I briefly engaged in local city politics, but soon realised such games weren’t for me, and that a faster way to change is to start creating the little impacts you want to see. Not of the ‘move fast and break things’ type, but out of the belief that if you create new effective behaviours, those will be contagious and in aggregate lead to culture change. It is how I ended up in open data, for instance: I was already working on open government but not particularly getting anywhere, and then realised that open data, as a newly emerging topic, provided a much better inroad to changing governance. It was seen as a tech-only topic by most politicians and thus unthreatening to the status quo, whereas it was clear to me that if you start pulling on the strings of how data gets shared, you soon start pulling over the entire processes that lead to the creation and usage of that data, as its publication creates new realities that generate responses. Politics obviously always plays a role in the internal relations within a client organisation. A large part of my international work is about diplomacy and cultural sensitivity too. But treating my work as a political endeavour in its own right is different. I realise I may be at a place in my work where that deserves a much more deliberate role.

Last week danah boyd was presented with an EFF award. She gave a great acceptance speech, titled Facing the Great Reckoning Head-On, that contains a plethora of quotes to highlight. In it she explores how to make sense of the entire context and dynamics in which the MIT Media Lab scandal of funding from a badly tainted source could take place (which I previously mentioned here, here and here). So it’s best to just go read the entire thing.

In stark contrast, Lawrence Lessig’s ‘exploration’ makes no sense to me and comes across as tone-deaf, spending hundreds of words putting forward a straw man that tainted funding, if accepted, should always be anonymous, while saying he personally wouldn’t accept such funding. That might well be, but it has no real bearing on the case. Instead of putting forward how hard it is to raise funding, he could just as well have argued that higher education should be publicly funded, and funded well, to avoid situations like the one at the MIT Media Lab. A model that works well around the globe. Lessig wrote a book against corruption, meaning the funding focus of US politics, but doesn’t here call out the private funding of higher education on the same terms, even though the negative consequences are the same.

On the other hand, boyd’s speech addresses the multiple layers involved: one’s own role in a specific system and in a specific institute, how privilege plays out, how the deeply personal, the emotional, and the structures and systems we create relate to and mutually impact each other. Acknowledging and sketching out the complexity, and then seeking where meaningful boundaries are, is a much more mature way to take this on than Lessig’s highlighting of a single dimension of the situation which seems minimally pertinent to it, and which, worse, because of its ‘flatness’ is easily perceived as actively denying the emotional strata involved and in dire need of recognition.

As said, go read the entire speech, but I’ll pick out a few quotes nevertheless. They are pertinent to topics I blog about here, such as the recently launched TechPledge, the role of community and the keys to agency, and they resonate with my entire take on technology.

The story of how I got to be standing here is rife with pain and I need to expose part of my story in order to make visible why we need to have a Great Reckoning in the tech industry. This award may be about me, but it’s also not. It should be about all of the women and other minorities who have been excluded from tech by people who thought they were helping.

I am here today in-no-small-part because I benefited from the generosity of men who tolerated and, in effect, enabled unethical, immoral, and criminal men. And because of that privilege, I managed to keep moving forward even as the collateral damage of patriarchy stifled the voices of so many others around me.

What’s happening at the Media Lab right now is emblematic of a broader set of issues plaguing the tech industry and society more generally. Tech prides itself in being better than other sectors. But often it’s not.

If change is going to happen, values and ethics need to have a seat in the boardroom. Corporate governance goes beyond protecting the interests of capitalism. Change also means that the ideas and concerns of all people need to be a part of the design phase and the auditing of systems, even if this slows down the process.

…whether we like it or not, the tech industry is now in the business of global governance.

“Move fast and break things” is an abomination if your goal is to create a healthy society…In a healthy society, we strategically design to increase social cohesion because binaries are machine logic not human logic.

…accountability without transformation is simply spectacle.

The goal shouldn’t be to avoid being evil; it should be to actively do good. But it’s not enough to say that we’re going to do good; we need to collectively define — and hold each other to — shared values and standards.

Human progress needs the tech sector to be actively reflective, and to continuously scrutinise its ethics, the values and morals actually expressed in behaviour.

During our work on shaping the Tech Pledge last week, we looked into progress as it is mentioned in the Copenhagen Letter: as something different from mere innovation. The Copenhagen Letter was written in 2017 as output of the same process that has now delivered the Tech Pledge.


Thomas, Techfestival’s initiator, reading the Copenhagen Letter
at the start of this year’s CPH150

Progress is not a fixed point on the horizon, we said. What counts as progress shifts, with public opinion and new ideas of what is good, just, true and beautiful emerging, and with the movement of time itself. When the first steam engines appeared, their plumes of smoke heralded a new age that drove industrialisation and nation forming and, through railroads, changed what cities were for and how city and countryside relate to each other. Steam engines still exist at the very heart of most electricity plants in the world, but progress has moved on from the steam engine.
We also realised that progress does not have a fixed and static definition, and so we are free to fill it with whatever definition we think fits the context we are looking at.

In terms of technology then, progress is a motion, a process, and in our group we defined it as (new) technology plus direction/sense of purpose. Technology here, to me at least, being not just ‘hard tech’, but also ‘soft tech’. Our methods, processes, organisational structures are technology just as much as fountain pens, smart phones and particle accelerators.

So we named a number of elements that fit into this understanding of progress as a process and a search for direction.

  • It is a part of human nature to strive for change and progress, even if not every single individual in every context and time will be so inclined. This desire to progress is more about setting a direction than a fixed end state. Hence our use of “(new) technology with intended direction” as working definition.
  • We need to be accountable for how anything we make fits the intended direction, and additionally for whether it externalises human or ecological costs or conflicts with our natural systems, as these are often ignored consequences.
  • We recognise that direction may get lost or end up in need of change; in fact, we realise that anything we make is largely out of our control once released into the world.
  • So we pledge to continuous reflection on the direction our tech is taking us in practice. Not just during its development or launch, but throughout its use.
  • Whether we would want to use the tech we created ourselves, or see our loved ones use it, is a smell test: if it doesn’t pass our emotional response, something is off.
  • What doesn’t pass the smell test needs to be explored and debated.
  • We have a civic duty to organise public debate about the value and direction of our tech, right alongside our tech. Not just at the start of making tech, but throughout the life cycle of what you make. If you make something, you also need to organise the continuous debate around it, to keep a check on its societal value and utility, and to actively identify unintended consequences.
  • If our tech is no longer fit for purpose or takes an unintended turn, we have a duty to actively adapt and/or publicly denounce the aspects of our tech that have turned detrimental.

Working on the pledge

Regardless of what the Copenhagen Pledge says in addition to this, this reflective practice is something worthwhile in itself for me to do: continuously stimulate the debate around what you make, as part and parcel of the artefacts you create. This is not a new thing to me; it’s at the core of the unconferences we organise, where lived practice, reflection and community-based exploration are the basic ingredients.

To me, what is key in the discussions we had is that this isn’t about all tech in general, though anyone is welcome to join any debate. This is about having the moral and civic duty to actively create public debate around the technology you make and have made. You need to feel responsible for what you make from inception to obsolescence, just as you always remain a parent to your children, regardless of their age and choices as adults. The connection to self, the emotional anchoring of this responsibility, is the crucial thing here.

So there I was, on a rainy Copenhagen evening, finding myself in a room with 149 colleagues for 24 hours, nearing midnight, passionately arguing that technologists need to internalise and own the reflection on the role and purpose of their tech, and not leave it as an exercise for the academics in the philosophy of technology departments. A duty to organise the public debate about your tech alongside the existence of the tech itself. If your own tech no longer passes your own smell test, then actively denounce it. I definitely felt that emotional anchoring I’m after in myself right there and then.

Before Techfestival’s speakers and event partners’ dinner on Thursday, Marie Louise Gørvild, Techfestival’s Director, and Thomas Madsen-Mygdal, its initiator, said a few words. Thomas cited the Copenhagen Letter from 2017, singling out how our tech needs to be embedded in the context of our democratic structures, and how innovation can’t be a substitute for our sense of progress and impact. The Copenhagen Letter, and the entire Techfestival, emphasise humanity not only as the source and context for technology and its use, but as the ultimate yardstick for the constructive use and impact of technology. This may sound obvious, it certainly does to me, but in practice it needs to be repeated to ensure humanity is used as such a yardstick from the very first design stage of any new technology.

At Techfestival Copenhagen 2019

Technology is always about humans to me. Technology is an extension of our bodies, an extension of reach and an extension of human agency. A soup spoon is an extension of our hand, so we don’t burn our hand when we stir the soup. A particle accelerator is an extension of our ears and eyes, to better understand the particles and atoms we’re made of. With technology we extend our reach across the globe by communicating instantaneously, extend it into the air, into the deep sea, towards the atomic level, and into interstellar space. Tech is there to deepen and augment our humanity. In my daily routines it’s how I approach technology too, both in personal matters such as blogging and in client projects, and apparently such an approach stands out. It’s what Kicks Condor recently remarked upon and Neil Mather pointed to in conversations about our blogging practices, what Heinz Wittenbrink referenced when he said “they talk about their own lives when they talk about these things” about our unconference, and what clients say about my change management work around open data.

Techfestival in Copenhagen takes humanity as the starting point for tech, and as a litmus test for the usefulness and ethicality of tech. It is therefore somewhat grating to come across people talking about how to create a community for their tech, to help it scale. Hearing that a few times last week in Copenhagen felt very much out of tune. Worse, I think it is an insulting way to talk about people you say you want to create value for.

Yes, some newly launched apps and platforms really are new places where communities can form that otherwise wouldn’t, because of geographic spread, because of the shame, taboo or danger of making yourself visible in your local environment, or because you’re exploring things you’re still uncertain about yourself. All (niche) interests, the crazy ones, those who can’t fully express their own personality in their immediate environment benefit from the new spaces for interaction that online tools have created. My own personal blog-based peer network started like that: I was lonely in my role as a knowledge manager at the start of the ’00s, and online interaction and blogging brought me the global professional peer network I needed, which wasn’t otherwise possible in the Netherlands at the time.

Techfestival’s central stage in Kødbyen, during an evening keynote

Otherwise, however, every single one of us already is part of communities: our sports teams, neighbourhoods, extended families, work contexts, causes, peer networks, alumni clubs, et cetera. Why doesn’t tech usually focus on letting me use it for my communities as they are, rather than present itself as having me join a made-up community whose raison d’être is exploiting our attention for profit? That’s not community building, that’s extraction, instrumentalising your users while dehumanising them along the way. To me it’s in those communities everyone is already part of that the scaling for technology is to be found. “Scaling does not scale,” said Aza Raskin in his Techfestival keynote, and that resonates. I talked about the invisible hand of networks in response to demands for scaling when I talked about technology ‘smaller than us’ and networked agency at SOTN18, and this probably is me saying the same again in a slightly different way. Scaling is in our human structures. Artists don’t scale, road building doesn’t scale, but art and road networks are at scale. Communities don’t scale, they’re fine as they are, but they are the grain of scale, resulting in society, which is at scale. Don’t seek to scale your tech; seek to let your tech reinforce societal scaling, our overlapping communities, our cultures. Let your tech be scaffolding for a richer expression of society.

Techfestival fits very much into that, and I hope it is what I brought to the work on the CPH150 pledge: the notion of human (group) agency, and the realisation that tech is not something on its own, but needs to be used in combination with methods and processes, in which you cannot ever ignore societal context. One of those processes is continuous reflection on your tech, right alongside the creation and implementation of your tech, for as long as it endures.

Our group of 150 working 24 hours on writing the TechPledge

As part of the Techfestival last week, the Copenhagen 150, which this time included me, came together to write a pledge for individual technologists and makers to commit their professional lives to. A bit like a Hippocratic oath, but for creators of all tech. Following up on the Copenhagen Letter, which was a signal, a letter of intent, and the Copenhagen Catalog, which provides ‘white patterns’ for tech makers, this year’s Tech Pledge makes it even more personal.

I pledge

  • to take responsibility for what I create.
  • to only help create things I would want my loved ones to use.
  • to pause to consider all consequences of my work, intended as well as unintended.
  • to invite and act on criticism, even when it is painful.
  • to ask for help when I am uncertain if my work serves my community.
  • to always put humans before business, and to stand up against pressure to do otherwise, even at my own risk.
  • to never tolerate design for addiction, deception or control.
  • to help others understand and discuss the power and challenges of technology.
  • to participate in the democratic process of regulating technology, even though it is difficult.
  • to fight for democracy and human rights, and to improve the institutions that protect them.
  • to work towards a more equal, inclusive and sustainable future for us all, following the United Nations global goals.
  • to always take care of what I start, and to fix what we break.

I signed the pledge. I hope you will too. If you have questions about what this means in practical terms, I’m happy to help you translate it to your practice. A first step likely is figuring out which questions to ask of yourself at the start of something new. In the coming days I plan to blog more from my notes on Techfestival, and those postings will say more about various aspects of this. You are also still welcome to sign the Copenhagen Letter, as well as individual elements of the Copenhagen Catalog.