Last week danah boyd was presented with an EFF award. She gave a great acceptance speech, titled Facing the Great Reckoning Head-On, which contains a plethora of quotes worth highlighting. It explores how to make sense of the entire context and dynamics in which the MIT Media Lab scandal of accepting funding from a badly tainted source could take place (which I previously mentioned here, here and here). It's best to just go read the entire thing.

In stark contrast, Lawrence Lessig's 'exploration' makes no sense to me and comes across as tone deaf. He spends hundreds of words putting forward a straw man, that tainted funding, if accepted, should always be anonymous, while saying he personally wouldn't accept such funding. That might well be, but it has no real bearing on the case. Instead of dwelling on how hard it is to raise funding, he could just as well have argued that higher education should be publicly funded, and funded well, to avoid situations like the one at MIT Media Lab; a model that works well around the globe. Lessig wrote a book against corruption, by which he means the funding focus of US politics, but here he doesn't call out the private funding of higher education on the same terms, even though the negative consequences are the same.

On the other hand boyd’s speech addresses the multiple layers involved. One’s own role in a specific system, and in a specific institute, how privilege plays out. How the deeply personal, the emotional and the structures and systems we create relate to and mutually impact each other. Acknowledging and sketching out the complexity, and then to seek where meaningful boundaries are is much maturer way to take this on than Lessig’s highlighting a single dimension of a situation which seems minimally pertinent to it, and worse because of its ‘flatness’ is easily perceived to be actively denying the emotional strata involved and in dire need of recognition.

As said, go read the entire speech, but I'll pick out a few quotes nevertheless. They are pertinent to topics I blog about here, such as the recently launched TechPledge, the role of community, and the keys to agency, and they resonate with my entire take on technology.

The story of how I got to be standing here is rife with pain and I need to expose part of my story in order to make visible why we need to have a Great Reckoning in the tech industry. This award may be about me, but it’s also not. It should be about all of the women and other minorities who have been excluded from tech by people who thought they were helping.

I am here today in-no-small-part because I benefited from the generosity of men who tolerated and, in effect, enabled unethical, immoral, and criminal men. And because of that privilege, I managed to keep moving forward even as the collateral damage of patriarchy stifled the voices of so many others around me.

What’s happening at the Media Lab right now is emblematic of a broader set of issues plaguing the tech industry and society more generally. Tech prides itself in being better than other sectors. But often it’s not.

If change is going to happen, values and ethics need to have a seat in the boardroom. Corporate governance goes beyond protecting the interests of capitalism. Change also means that the ideas and concerns of all people need to be a part of the design phase and the auditing of systems, even if this slows down the process.

…whether we like it or not, the tech industry is now in the business of global governance.

“Move fast and break things” is an abomination if your goal is to create a healthy society…In a healthy society, we strategically design to increase social cohesion because binaries are machine logic not human logic.

…accountability without transformation is simply spectacle.

The goal shouldn’t be to avoid being evil; it should be to actively do good. But it’s not enough to say that we’re going to do good; we need to collectively define — and hold each other to — shared values and standards.

Human progress needs the tech sector to be actively reflective, and to continuously scrutinise its ethics, the values and morals actually expressed in its behaviour.

Chris Grey's Brexit blog is a very worthwhile read, mustering the energy each week to take a detailed look at what is happening in the UK regarding Brexit, where many others run away screaming in frustration, or deludedly shout 'get it over with' as if there were a simple Gordian-knot solution to the complexity of Brexit, especially now that it rests on mutually exclusive notions, as per the quote below. Getting it over with simply means you return to the exact same issues the next day, after having needlessly created a gaping hole in your legal framework as well as your economy, both of which do nothing but undermine your ability to solve those issues, having merely moved them from 'important to solve' to 'extremely urgent to solve'.

Read Revolution and counter-revolution (chrisgreybrexitblog.blogspot.com)

Brexit makes liars of everyone who tries to enact it, even if they are not by nature as mendacious as Johnson or as destructive as Cummings. For it derives ultimately from the lies within the Vote Leave campaign itself which, at heart, promised that Brexit could be done without negative consequences. This led May into such tortured positions on, for example, maintaining ‘frictionless trade’ whilst leaving the institutions that make that possible. It is still present in Johnson and the Brexiters’ underlying position that there can be an open border in Ireland whilst leaving the institutions that make that possible.

Not sure if it is a separate pattern, but I recognise the general gist of what Bryan Alexander writes. I suspect that part of this is ideas arriving slightly before their time, lying dormant until technological and societal circumstances are more conducive to them. Speech technology comes to mind as an example.

What data would we need to see whether these patterns do occur? And where might that data already be available? I do like Bryan's name for it: Bloom Bust Boom cycles.
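As a rough illustration of what checking for such a shape might entail, here is a minimal sketch in Python. The heuristic, the threshold and the mention counts are all invented assumptions for illustration; real data could come from e.g. publication counts or search-interest series.

```python
# Heuristic sketch: does a time series of yearly 'mentions' of an idea show
# a Bloom Bust Boom shape, i.e. an early peak, a deep trough, then a
# recovery that exceeds the original peak? Purely illustrative.

def bloom_bust_boom(series: list[int], trough_ratio: float = 0.5) -> bool:
    """True if an early peak is followed by a deep trough and a higher boom."""
    early = series[: len(series) // 2]
    peak_i = early.index(max(early))           # the initial 'bloom'
    tail = series[peak_i + 1 :]
    trough = min(tail)                         # the 'bust'
    trough_i = peak_i + 1 + tail.index(trough)
    recovery = max(series[trough_i:])          # the second 'boom'
    return trough < trough_ratio * series[peak_i] and recovery > series[peak_i]

# Invented counts with a speech-technology-like shape: early hype,
# a long winter, then a recent resurgence.
mentions = [2, 8, 15, 12, 6, 3, 2, 2, 4, 9, 20, 35]
print(bloom_bust_boom(mentions))  # True
```

Even a toy check like this makes clear what data would be needed: a long, consistently collected time series per idea, which is exactly what is hard to come by.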

Read When new ideas boom, bust, then come roaring back: around the U-bend (Bryan Alexander)

Lately I’m thinking about the re-emergence of ideas over time.  I’d like to get a better handle on that process, hopefully with a model, and so try to better anticipate when such a thin…

During our work on shaping the Tech Pledge last week, we looked into progress, which the Copenhagen Letter mentions as being different from mere innovation. The Copenhagen Letter was written in 2017 as output of the same process that has now delivered the Tech Pledge.

Thomas, Techfestival's initiator, reading the Copenhagen Letter at the start of this year's CPH150

Progress is not a fixed point on the horizon, we said. What counts as progress shifts, with public opinion and emerging new ideas of what is good, just, true and beautiful, and with the movement of time itself. When the first steam engines appeared, their plumes of smoke heralded a new age that drove industrialisation and nation forming and, through railroads, changed what cities were for and how city and countryside relate to each other. Steam engines still exist at the very heart of every electricity plant in the world, but progress has moved on from the steam engine.
We also realised that progress does not have a fixed and static definition, so we are free to fill it with whatever definition we think fits the context we are looking at.

In terms of technology, then, progress is a motion, a process, and in our group we defined it as (new) technology plus a direction or sense of purpose. Technology here, to me at least, means not just 'hard tech' but also 'soft tech': our methods, processes and organisational structures are technology just as much as fountain pens, smartphones and particle accelerators.

So we named a number of elements that fit into this understanding of progress as a process and a search for direction.

  • It is part of human nature to strive for change and progress, even if not every single individual in every context and time will be so inclined. This desire to progress is more about setting a direction than about a fixed end state. Hence our use of "(new) technology with intended direction" as a working definition.
  • We need to be accountable for how anything we make fits the intended direction, and additionally for whether it externalises human or ecological costs or conflicts with our natural systems, as these consequences are often ignored.
  • We recognise that direction may get lost or end up in need of change; in fact, we realise that anything we make is largely out of our control once released into the world.
  • So we pledge to continuous reflection on the direction our tech is taking us in practice. Not just during its development or launch, but throughout its use.
  • Whether we would want to use the tech we created ourselves, or see our loved ones use it, is a smell test: if it doesn't pass our emotional response, something is off.
  • What doesn't pass the smell test needs to be explored and debated.
  • We have a civic duty to organise public debate about the value and direction of our tech right alongside the tech itself. Not just at the start of making it, but throughout its life cycle: if you make something, you also need to organise the continuous debate around it, to keep a check on its societal value and utility, and to actively identify unintended consequences.
  • If our tech is no longer fit for purpose or takes an unintended turn, we have a duty to actively adapt it and/or publicly denounce the aspects of our tech that have turned detrimental.

Working on the pledge

Regardless of what the Tech Pledge says in addition to this, this reflective practice is worthwhile in itself for me to do: continuously stimulating the debate around what you make, as part and parcel of the artefacts you create. This is not a new thing to me; it's at the core of the unconferences we organise, where lived practice, reflection and community-based exploration are the basic ingredients.

To me, what is key in the discussions we had is that this isn't about all tech in general, though anyone is welcome to join any debate. This is about having the moral and civic duty to actively create public debate around the technology you make and have made. You need to feel responsible for what you make from inception to obsolescence, just as you always remain a parent to your children, regardless of their age and their choices as adults. The connection to self, the emotional anchoring of this responsibility, is the crucial thing here.

So there I was, on a rainy Copenhagen evening, in a room with 149 colleagues for 24 hours, nearing midnight, passionately arguing that technologists need to internalise and own the reflection on the role and purpose of their tech, and not leave it as an exercise to the academics in philosophy of technology departments. A duty to organise the public debate about your tech alongside the existence of the tech itself, and if your own tech no longer passes your own smell test, to actively denounce it. I definitely felt that emotional anchoring I'm after, in myself, right there and then.

As part of Techfestival last week, the Copenhagen 150, which this time included me, came together to write a pledge for individual technologists and makers to commit their professional lives to. A bit like a Hippocratic oath, but for creators of all tech. Following up on the Copenhagen Letter, which was a signal, a letter of intent, and the Copenhagen Catalog, which provides 'white patterns' for tech makers, this year's Tech Pledge makes it even more personal.

I pledge

  • to take responsibility for what I create.
  • to only help create things I would want my loved ones to use.
  • to pause to consider all consequences of my work, intended as well as unintended.
  • to invite and act on criticism, even when it is painful.
  • to ask for help when I am uncertain if my work serves my community.
  • to always put humans before business, and to stand up against pressure to do otherwise, even at my own risk.
  • to never tolerate design for addiction, deception or control.
  • to help others understand and discuss the power and challenges of technology.
  • to participate in the democratic process of regulating technology, even though it is difficult.
  • to fight for democracy and human rights, and to improve the institutions that protect them.
  • to work towards a more equal, inclusive and sustainable future for us all, following the United Nations global goals.
  • to always take care of what I start, and to fix what we break.

I signed the pledge. I hope you will do so too. If you have questions about what this means in practical terms, I'm happy to help you translate it to your practice. A first step is likely figuring out which questions to ask of yourself at the start of something new. In the coming days I plan to blog more from my notes on Techfestival, and those postings will say more about various aspects of this. You are also still welcome to sign the Copenhagen Letter, as well as individual elements of the Copenhagen Catalog.

Last week I attended Techfestival in Copenhagen, where I participated in a day-long Public Data Summit. This posting contains thoughts and notes based on some of what was discussed there.

Group work at the Public Data Summit

Martin von Haller Groenbaek (we go back a long way in open data) provided an interesting lightning talk at the start of the Public Data Summit. He posited that in order to realise the full potential of open (government) data, we probably need to be more relaxed about sharing personal data as well.

There is a case to be made, I agree. In the energy transition, for instance, your (real-time) personal electricity use is likely key information. The aggregated yearly usage of you and at least 10 neighbours that e.g. the Dutch electricity networks make available is far from useless, but it lacks the granularity, the direct connection to real people's daily lives, to be truly valuable for anything of practical use.
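To make that granularity trade-off concrete, here is a minimal sketch of the kind of aggregation rule involved. The 10-household threshold matches the mention above, but the function, names and figures are illustrative assumptions, not the networks' actual publication rules.

```python
# Sketch of a privacy-preserving publication rule: yearly electricity usage
# may only be released as an aggregate over a group of 10+ households.
# All figures are invented for illustration.

from typing import Optional

MIN_GROUP_SIZE = 10  # assumed minimum group size before publication

def publishable_yearly_total(household_kwh: list[float]) -> Optional[float]:
    """Aggregate yearly usage for a group, or None if the group is too small."""
    if len(household_kwh) < MIN_GROUP_SIZE:
        return None  # too few households: the total would be too revealing
    return sum(household_kwh)

# A street of 12 households: the yearly total can be published...
street = [2900.0, 3100.0, 2750.0, 3300.0, 2600.0, 3050.0,
          2980.0, 3120.0, 2840.0, 3210.0, 2700.0, 2950.0]
print(publishable_yearly_total(street))  # 35500.0 kWh for the whole group
```

What never leaves this aggregation step is exactly what an energy-transition use case would need: a single household's real-time profile.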

I agree with the notion that more person-related data needs to come into play. Martin talked about it in terms of balancing priorities: if you want to fix climate change, or another key societal issue, personal data protection can't be absolute.

Now, this sounds a bit like "we need your personal data to fight terrorism", which then gets translated into "give me your fingerprints or your safety and security is compromised". Yet that is both an edge case and an example of the type of discussion needed to find the balancing point, to avoid false dilemmas, or better yet to prevent reduction to ineffective simplicity (as with terrorism, where all sense of proportionality is abandoned). The balancing point is the sweet spot where the right level of complexity is addressed. In the case of terrorism, the personal data sharing discussion is framed as "you're either with us, or with the terrorists", to quote Bush jr.: a framing in absolutes, inviting a cascade of scope creep.

To me this is a valuable discussion to be had, to determine when and where sharing your personal data is a useful contribution to the common good, or even should be part of a public good. Currently that 'for the common good' part is mostly not in play. We're leaking personal data all over the tech platforms we use, without much awareness of its extent or of how it is being used. We do know these data are not being used for the common good, as that is in no-one's business model. This public good / common good thinking was central to our group work during the rest of the Public Data Summit too.

Martin mentioned the GDPR as a good thing, certainly for his business as a lawyer, but also as a problematic one. Specifically, he criticised the notion of owning personal data and being able to trade it as a commodity based on that ownership. I agree, for multiple reasons. One is that a huge amount of my personal data is not directly created or held by me, as it is data about behavioural patterns: where my phone has been, where I used my debit card, the things I click, the time I spend on pages, the thumbprint of my specific browser and plugin setup, etc. The footsteps we leave on a trail in the forest aren't personal data, but our digital footsteps are, because the persistence of those tracks means they can be followed back to their source far more easily than in the woods, and can be compared with other tracks we leave behind.
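As an aside on that browser 'thumbprint': a minimal sketch of the principle behind fingerprinting. The attribute names and values below are hypothetical, and real fingerprinting combines many more signals, but the idea is the same.

```python
# Sketch: a handful of innocuous browser attributes, hashed together, already
# yields an identifier that is stable over time and near-unique per visitor.
# No account or cookie is needed to recognise the same 'footsteps'.

import hashlib

def browser_fingerprint(attributes: dict[str, str]) -> str:
    """Hash a canonical ordering of browser attributes into an identifier."""
    canonical = "|".join(f"{k}={v}" for k, v in sorted(attributes.items()))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

visitor = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/69.0",
    "screen": "1920x1080x24",
    "timezone": "Europe/Amsterdam",
    "language": "nl-NL",
    "plugins": "PDF Viewer;OpenH264",
}

# Every site computing the same hash sees the same identifier, so separate
# tracks can be linked back to one source.
print(browser_fingerprint(visitor))
```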

Currently those footsteps in the digital woods are contractualised away into the possession of the private owners of the woods we walk in, i.e. the tech platforms. But there's a strong communal aspect to your and my digital footsteps as personal data. We need to determine how we can use that collectively, and how to govern that use. Talking about the ownership of data, especially the data we create by being out in the (semi-)public sphere (e.g. on tech platforms), and the ability to trade it (as Solid suggests), has two effects. First, it bakes in the acceptance that my allowing FB to work with my data is a contract between equal parties, an asymmetry the GDPR rightly tries to address; Aza Raskin mentioned this in his keynote too, saying tech platforms should be seen and regulated more as fiduciaries, to acknowledge that power asymmetry. Second, it takes the communal part of what we might do with data completely out of the equation. I can easily imagine when and where I'd be ok with my neighbours, my local government, a local NGO, or specific topical or sectoral communities having access to data about me, where that same use by FB et al. would not be ok at all, under any circumstance.

In the intros to the Public Data Summit, however, civil society was very much absent: there was talk about government and its data, and about how the private sector was needed to do something valuable with it. To me, open (e-)government and opening up data are very much about allowing the emergence and co-creation of novel public services by government/governance working together with citizens; things we may not now regard as part of public institutions, structures or the role of government, but that in our digitised world very well could, or even should, be.