A good presentation I attended this afternoon at World Summit AI 2019. Will blog about it, but bookmarking it here for now.

Read Escaping Skinner’s Box: AI and the New Era of Techno-Superstition (philosophicaldisquisitions.blogspot.com)

One of the things AI will do is re-enchant the world and kickstart a new era of techno-superstition. If not for everyone, then at least for most people who have to work with AI on a daily basis. The catch, however, is that this is not necessarily a good thing. In fact, it is something we should worry about.

Fifteen years ago today Elmine Wijnia published the paper “Understanding Weblogs: a Communicative Perspective” (PDF) for the BlogTalk conference, based on her master’s thesis. In it she discusses weblogs as a communications medium and compares their role and potential, among other things, with Habermas’ philosophical work on communication (which predates the web). I have an ‘on this day in…’ widget in my sidebar, and it showed me I had blogged about the paper back then.
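Such a widget needs little more than a filter over the archive: keep the posts whose publication date matches today’s month and day but has an earlier year. A minimal sketch, assuming a simple list of (date, title) pairs — the function name and sample data are my own illustration, not the actual widget’s code:

```python
from datetime import date

def on_this_day(posts, today):
    """Return posts published on today's month and day in earlier years."""
    return [
        (published, title)
        for published, title in posts
        if (published.month, published.day) == (today.month, today.day)
        and published.year < today.year
    ]

# Illustrative archive, not real post data.
archive = [
    (date(2004, 9, 10), "A post from 2004"),
    (date(2010, 1, 1), "Some other post"),
]

# Looking at the archive on 10 September 2019 surfaces only the 2004 post.
print(on_this_day(archive, date(2019, 9, 10)))
```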

Rereading my posting from then, I feel much is still the same, and much remains as key as it was then to bringing online expression and interaction forward.

In my view Elmine’s work does something very important, which is to firmly place weblogs within communication, rather than foregrounding the fact that they are technology-based.

Having just organised an IndieWebCamp, where technology is very much front and centre even as we struggle to get broader involvement, I find this a very pertinent reminder.

It describes what we actually do, instead of which tools we use to do it.

This is a core element in my thinking about technology in general, unchanged in all these years. It is about what people do and can do. The agency that technology provides.

She positioned weblogs as a new medium because they combine three information patterns that previously stood on their own (e.g. in separate digital tools): consultation, registration, and conversation.

In part it feels like silos such as FB and Twitter break apart that combination of patterns again, after weblogs had joined them, even though these silos themselves in turn emerged from weblogs. The ‘back to the blog‘ urge I’ve felt and lived here in the past two years is an expression of seeking the richness that the combination provides. It also explains my involvement in IndieWeb, which tries to strengthen the ties between those patterns by adding new functionality to our blogging tools.

Because it allows better communication, which is what matters. As Kicks Condor phrased it when he reflected on my information strategies:

It is very focused on just being a human who is attempting to communicate with other humans—that’s it really.

Elmine and Habermas still point us in that direction. We can do better in this, and we should do better in this.

Last week danah boyd was presented with an EFF award. She gave a great acceptance speech, titled Facing the Great Reckoning Head-On, that contains a plethora of quotes worth highlighting. In it she explores how to make sense of the entire context and dynamics in which the MIT Media Lab scandal of funding from a badly tainted source (which I previously mentioned here, here and here) could take place. So it’s best to just go read the entire thing.

In stark contrast, Lawrence Lessig’s ‘exploration’ makes no sense to me, and comes across as tone deaf. He spends hundreds of words putting forward a straw man, that if you accept tainted funding it should always be done anonymously, while saying he personally wouldn’t accept such funding. That might well be, but it has no real bearing on the case. Instead of putting forward how hard it is to raise funding, he could just as well have argued that higher education should be publicly funded, and funded well, to avoid situations like the one at the MIT Media Lab. That is a model that works well around the globe. Lessig wrote a book against corruption, meaning the funding focus of US politics, yet here he doesn’t call out the private funding of higher education on the same terms, even though the negative consequences are the same.

On the other hand, boyd’s speech addresses the multiple layers involved: one’s own role in a specific system and in a specific institute, how privilege plays out, and how the deeply personal, the emotional, and the structures and systems we create relate to and mutually impact each other. Acknowledging and sketching out the complexity, and then seeking where meaningful boundaries lie, is a much maturer way to take this on than Lessig’s highlighting of a single dimension that seems minimally pertinent to the situation. Worse, because of its ‘flatness’ his piece is easily perceived as actively denying the emotional strata involved, which are in dire need of recognition.

As said, go read the entire speech, but I’ll pick out a few quotes nevertheless. They are pertinent to topics I blog about here, such as the recently launched Tech Pledge, the role of community, and the keys to agency, and they resonate with my entire take on technology.

The story of how I got to be standing here is rife with pain and I need to expose part of my story in order to make visible why we need to have a Great Reckoning in the tech industry. This award may be about me, but it’s also not. It should be about all of the women and other minorities who have been excluded from tech by people who thought they were helping.

I am here today in-no-small-part because I benefited from the generosity of men who tolerated and, in effect, enabled unethical, immoral, and criminal men. And because of that privilege, I managed to keep moving forward even as the collateral damage of patriarchy stifled the voices of so many others around me.

What’s happening at the Media Lab right now is emblematic of a broader set of issues plaguing the tech industry and society more generally. Tech prides itself in being better than other sectors. But often it’s not.

If change is going to happen, values and ethics need to have a seat in the boardroom. Corporate governance goes beyond protecting the interests of capitalism. Change also means that the ideas and concerns of all people need to be a part of the design phase and the auditing of systems, even if this slows down the process.

…whether we like it or not, the tech industry is now in the business of global governance.

“Move fast and break things” is an abomination if your goal is to create a healthy society…In a healthy society, we strategically design to increase social cohesion because binaries are machine logic not human logic.

…accountability without transformation is simply spectacle.

The goal shouldn’t be to avoid being evil; it should be to actively do good. But it’s not enough to say that we’re going to do good; we need to collectively define — and hold each other to — shared values and standards.

Human progress needs the tech sector to be actively reflective, and to continuously scrutinise its ethics: the values and morals actually expressed in its behaviour.

During our work on shaping the Tech Pledge last week, we looked into progress as it is mentioned in the Copenhagen Letter, as something different from mere innovation. The Copenhagen Letter was written in 2017 as output of the same process that has now delivered the Tech Pledge.

Thomas, Techfestival’s initiator, reading the Copenhagen Letter
at the start of this year’s CPH150

Progress, we said, is not a fixed point on the horizon. What counts as progress shifts, with public opinion, with emerging new ideas of what is good, just, true and beautiful, and with the movement of time itself. When the first steam engines appeared, their plumes of smoke heralded a new age, one that drove industrialisation and nation forming and, through railroads, changed what cities were for and how city and countryside relate to each other. Steam engines still exist at the heart of most electricity plants in the world, but progress has moved on from the steam engine.
We also realised that progress does not have a fixed and static definition, and so we are free to fill it with whatever definition we think fits the context we are looking at.

In terms of technology, then, progress is a motion, a process, and in our group we defined it as (new) technology plus direction/sense of purpose. Technology here, to me at least, means not just ‘hard tech’ but also ‘soft tech’: our methods, processes and organisational structures are technology just as much as fountain pens, smartphones and particle accelerators.

So we named a number of elements that fit into this understanding of progress as a process and a search for direction.

  • It is a part of human nature to strive for change and progress, even if not every single individual in every context and time will be so inclined. This desire to progress is more about setting a direction than a fixed end state. Hence our use of “(new) technology with intended direction” as working definition.
  • We need to be accountable for how anything we make fits the intended direction, and additionally for whether it externalises human or ecological costs or conflicts with our natural systems, as these are often ignored consequences.
  • We recognise that direction may get lost, or end up in need of change; in fact, we realise that anything we make is largely out of our control once released into the world.
  • So we pledge to continuous reflection on the direction our tech is taking us in practice. Not just during its development or launch, but throughout its use.
  • Whether we want to use the tech we created ourselves, or see our loved ones use it, is a smell test: if it doesn’t pass our emotional response, something is off.
  • What doesn’t pass the smell test needs to be explored and debated.
  • We have a civic duty to organise public debate about the value and direction of our tech, right alongside the tech itself. Not just at the start of making tech, but throughout the life cycle of what we make. If you make something, you also need to organise the continuous debate around it, to keep a check on its societal value and utility and to actively identify unintended consequences.
  • If our tech is no longer fit for purpose or takes an unintended turn, we have a duty to actively adapt it and/or publicly denounce the aspect of our tech that has turned detrimental.

Working on the pledge

Regardless of what the Tech Pledge says in addition to this, this reflective practice is something worthwhile in itself for me: continuously stimulating the debate around what you make, as part and parcel of the artefacts you create. This is not a new thing to me; it’s at the core of the unconferences we organise, where lived practice, reflection and community-based exploration are the basic ingredients.

To me, what is key in the discussions we had is that this isn’t about all tech in general, though anyone is welcome to join any debate. This is about having the moral and civic duty to actively create public debate around the technology you make and have made. You need to feel responsible for what you make from inception to obsolescence, just as you always remain a parent to your children, regardless of their age and choices as adults. The connection to self, the emotional anchoring of this responsibility, is the crucial thing here.

So there I was, on a rainy Copenhagen evening, finding myself in a room with 149 colleagues for 24 hours, nearing midnight, passionately arguing that technologists need to internalise and own the reflection on the role and purpose of their tech, and not leave it as an exercise to the academics in philosophy of technology departments. A duty to organise the public debate about your tech alongside the existence of the tech itself. If your own tech no longer passes your own smell test, actively denounce it. I definitely felt that emotional anchoring I’m after, right there and then.

As part of the Techfestival last week, the Copenhagen 150, which this time included me, came together to write a pledge for individual technologists and makers to commit their professional lives to. A bit like a Hippocratic oath, but for creators of all tech. Following up on the Copenhagen Letter, which was a signal, a letter of intent, and the Copenhagen Catalog, which provides ‘white patterns’ for tech makers, this year’s Tech Pledge makes it even more personal.

I pledge

  • to take responsibility for what I create.
  • to only help create things I would want my loved ones to use.
  • to pause to consider all consequences of my work, intended as well as unintended.
  • to invite and act on criticism, even when it is painful.
  • to ask for help when I am uncertain if my work serves my community.
  • to always put humans before business, and to stand up against pressure to do otherwise, even at my own risk.
  • to never tolerate design for addiction, deception or control.
  • to help others understand and discuss the power and challenges of technology.
  • to participate in the democratic process of regulating technology, even though it is difficult.
  • to fight for democracy and human rights, and to improve the institutions that protect them.
  • to work towards a more equal, inclusive and sustainable future for us all, following the United Nations global goals.
  • to always take care of what I start, and to fix what we break.

I signed the pledge. I hope you will do so too. If you have questions about what this means in practical terms, I’m happy to help you translate it to your practice. A first step is likely figuring out which questions to ask yourself at the start of something new. In the coming days I plan to blog more from my notes on Techfestival, and those postings will say more about various aspects of this. You are also still welcome to sign the Copenhagen Letter, as well as individual elements of the Copenhagen Catalog.

Since the summer I have been holding three related questions. They all concern what role machine learning and AI could fulfil for an individual or in an everyday setting. Everyman’s AI, so to speak.

The first question is a basic one, looking at your house, and immediate surroundings:

1: What autonomous things would be useful in the home, or your immediate neighbourhood?

The second question is a more group- and community-oriented one:

2: What use can machine learning have for civic technology (tech that fosters citizens’ ability to do things together, to engage, participate, and foster community)?

The third question is perhaps more a literary one, an invitation to explore, to fantasise:

3: What would an “AI in the wall” of your home be like? What would it do, want to do? What would you have it do?

(I came across an ‘AI in the wall’ in a book once, but it resided in the walls of a pub. Or rather, it ran the pub. The pub being a public place allowed it to interact in many ways in parallel, so as not to get bored.)