The TechPledge is a pledge for individual tech professionals, modeled on the Hippocratic oath. It aims to promote human-centered technology and to instill reflective practice around technology as a key responsibility of tech professionals.

It was written in 24 hours during the 2019 TechFestival by the ‘Copenhagen 150’, a group of people working in tech in diverse roles, from over 40 countries.

In the days after launch some criticisms were voiced, which are gathered below. This makes it possible to formulate a response, and perhaps adapt the TechPledge where warranted.

  • The pledge doesn’t have teeth. We need laws and regulation.
  • My specific technology (or technology concern) isn’t mentioned (e.g. nanotech, AI, Facebook, nuclear energy, etc.)
  • My specific societal concern isn’t mentioned (e.g. toxic online behaviour, verbal violence, misogyny, bigotry, hate)
  • Issues taken with specific wording (e.g. ‘addiction’, ‘control’)
  • This will not change any company’s behaviour
  • It’s too long / not concise enough. (The TechPledge is 238 words; the English translation of the Hippocratic oath and its modern Geneva version are both over 300 words, and the Hippocratic oath has endured for two and a half millennia.)
  • Will it actually be persuasive enough to prevent creepy (yet lucrative) behaviour?

(to be structured/added)
On addiction:
Person 1 (original tweeter): Yes, manipulation and influence exist. Yes, design shapes behaviour in context. Yes, tech is habit-forming, but more like candy than heroin.

On control:
Me: It needs to be read in the context of the pledge as a whole. In that context I find it clearer what’s meant. For ‘control’ I would have suggested ‘manipulation’. I agree that it maybe hinges too much on current perceptions of dark patterns and is less timeless for it. That said, calling out dark patterns is important. Perhaps the pledge should have said exactly that: not being willing to cooperate in designing or deploying dark patterns.
Person 2: If you are intentionally using “dark patterns”, you are intentionally using psychology to compel or addict.
Person 3: But mostly it does not make much sense. For instance, fighting for democracy may in some cases entail deregulating tech or increasing control.
Me: I think you’re reading ‘control’ differently here. Control and boundaries are needed elements in any complex environment, just as allowing for emergence and experimentation is, for sure. Intentionally and opaquely aiming for compulsion is a different type of control, though.


Ethics by design means adding ethical choices and values to a design process as non-functional requirements, which are then turned into functional specifications.

E.g. when you want to count the size of a group of people by taking a picture of them, adding the value of safeguarding privacy to the requirements might mean the camera intentionally produces a grainy picture. A grainier picture still allows you to count the number of people in it, but their actual faces are never captured or stored.
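The grainy-camera idea can be sketched in code: a sketch (not a real camera pipeline) where the privacy requirement becomes a functional step, coarsening the image at capture time so only a low-resolution summary is ever stored. The `pixelate` function and the toy 16×16 image are illustrative assumptions, not part of any actual product.

```python
def pixelate(image, block=8):
    """Downsample a grayscale image (a list of rows of pixel values)
    by averaging block x block regions. Fine detail such as faces is
    discarded before anything is stored; people remain countable as
    coarse blobs in the low-resolution result."""
    h, w = len(image), len(image[0])
    out = []
    for by in range(0, h, block):
        row = []
        for bx in range(0, w, block):
            vals = [image[y][x]
                    for y in range(by, min(by + block, h))
                    for x in range(bx, min(bx + block, w))]
            row.append(sum(vals) // len(vals))
        out.append(row)
    return out

# A toy 16x16 "photo": only its coarse 2x2 summary ever gets stored.
photo = [[(x * y) % 256 for x in range(16)] for y in range(16)]
stored = pixelate(photo, block=8)
print(len(stored), len(stored[0]))  # 2 2
```

The key design choice is that the ethical value (privacy) is enforced structurally, at the point of capture, rather than by a policy promising to delete detailed images later.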

When it comes to data governance and machine learning, Europe’s stance of safeguarding civic rights and enlightenment values is a unique perspective to take in a geopolitical context. Data is a very valuable resource. In the US, large corporations and intelligence services have created enormous data lakes without much restraint, resulting in a tremendous power asymmetry and an objectification of the individual. This is surveillance capitalism.
China, and others like Russia, have created or are creating large national data spaces in which the individual is made fully transparent: most if not all data sources are connected and made accessible to government, and the resulting data patterns have direct consequences for citizens. This is data-driven authoritarian rule.
Europe cannot compete with either of those two models, but it can provide a competing perspective on data usage by creating a path of responsible innovation, in which data is as much combined and connected as elsewhere in the world, yet with values and ethical boundaries designed into its core. With the GDPR the EU is already setting a new de facto global standard. Doing more along similar lines, not just in terms of regulation but also in terms of infrastructure (Estonia’s X-road for instance), is the opportunity Europe has.

Some pointers:
  • My blogpost Ethics by Design
  • A naive exploration of ethics around networked agency
  • A paper (PDF) on Value Sensitive Design
  • The French report For a Meaningful Artificial Intelligence (PDF), which drives France’s €1.5 billion investment in value-based AI