A public sector client announced last week that working from home will be their default until September 1st for certain, and maybe until January 1st. I can imagine why: there is no real way to house their 1600 staff under distancing guidelines, and the staff restaurant (which usually caters to some 1200 people in 90 minutes each day) has no real way of accommodating people for lunch in meaningful numbers. Three similar organisations in a different part of the country announced they would keep working from home until January.

I wonder how this may shift modes of working over time, now that centralised working is replaced by distributed working. When will public sector organisations realise they now have eyes and ears on the ground everywhere in their area, and put that to good use? In our experience, not ‘going outside’ for real stories and feedback from directly involved people often reduces the quality of choices and decisions made, as observations get replaced by assumptions. This is true for any type of larger organisation I think, but now we have all of a sudden turned them into distributed networks.

If you’re in a larger organisation working from home, do you have a notion of where all your people are, and is that geographical spread a potential instrument in your work?

Based on my conversation with Boris Mann about Fission, and a visit to a Decentralised Web Meetup in Amsterdam because of his Fission co-founder Brooklyn Zelenka, I started exploring the technology they work with and are building. The first step was getting access to, and understanding the ideas behind, IPFS.

What makes IPFS interesting

The IPFS about file (if you click that link, you’re visiting a file on IPFS from your browser) says a variety of things, but a few elements are key in my opinion.

First, it is a peer-to-peer system, much like many we’ve seen before. When you download a file to your system it comes in bits and pieces from multiple other computers, somewhere in the network, that have that file available. Whatever is the easiest way to get that file to you is the way followed.
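As an illustration of that retrieval model (a purely conceptual sketch, not the actual IPFS exchange protocol): think of a file as a set of blocks, where each block can come from whichever peer happens to have it.

```python
# Conceptual sketch of peer-to-peer retrieval, not the real IPFS
# wire protocol: a file is a set of blocks, and each block can be
# fetched from any peer that holds a copy of it.
peers = {
    "peer-a": {"block-1": b"Lorem ", "block-3": b"sit amet."},
    "peer-b": {"block-2": b"ipsum dolor "},
    "peer-c": {"block-1": b"Lorem ", "block-2": b"ipsum dolor "},
}

wanted = ["block-1", "block-2", "block-3"]
file_bytes = b""
for block_id in wanted:
    # Take each block from the first ("easiest") peer that has it.
    for blocks in peers.values():
        if block_id in blocks:
            file_bytes += blocks[block_id]
            break

print(file_bytes.decode())  # Lorem ipsum dolor sit amet.
```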

Second, there is a key difference in how file addresses work in IPFS, compared to the web or your local drive. We are used to files having names, and to addresses being a representation of the location of that file. The URL for this blog points to a specific server, where in a specific folder a specific file resides. That file returns the content. Similarly, the address for a file on my drive is based on the folder structure and the name of the file. IPFS addresses files based on their content, and does so with a hash (a cryptographic representation of the content of a file).
Naming things based on a hash of their contents means that if the content of a file changes, the name will change too. For every file the content will match what it says on the tin, and versioning is built in.
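To make that concrete, here is a minimal Python sketch of content-based naming. It uses a plain SHA-256 digest; real IPFS addresses are multihashes, which also encode which hash algorithm was used, but the principle is the same: change the content and the address changes with it.

```python
import hashlib

def content_address(data: bytes) -> str:
    # The address is derived from the bytes themselves,
    # not from where the file happens to be stored.
    return hashlib.sha256(data).hexdigest()

original = b"a picture of a cat"
tampered = b"a picture of a cat, now with malware"

print(content_address(original))  # one address...
print(content_address(tampered))  # ...and a completely different one

# Verifying a download is simply re-hashing what you received:
received = original
assert content_address(received) == content_address(original)
```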

Combine that with a peer-to-peer system, and you have a way of addressing things globally without being tied to location. You also have a way to ensure that whatever you find in a given file is exactly what was originally in the file. https://mydomain.com/catpicture.html may have had a cat picture at the start that later got replaced by malware, but you wouldn’t know. With earlier p2p systems for exchanging files, like Napster or BitTorrent, you always had to be careful about what it was you actually downloaded, because the content might be very different from what the name suggested. With IPFS those issues are done away with.

Currently, (location-based) addressing on the web is centralised (through domain registration and DNS), and decoupling addresses from locations, as IPFS does, allows decentralisation. This decentralisation is important to me, as it helps build agency and makes that agency resilient; decentralisation is much closer to local-first principles.

Getting IPFS set up on my laptop

Boris was helpful in pointing the way for me on how to set up IPFS (and Fission). There is an IPFS desktop client, which makes it very easy to do. I installed that, and then you have a basic browser that shows you which of your own files you are sharing, and which ones you are re-sharing. When you are looking at a file it also shows where it comes from.

I uploaded a PDF as a hello world message. In the screenshot above, the Qm… series of characters you see underneath the local file name helloworld.pdf is the hash that is used to identify the file across the IPFS network. If you ‘pin’ a file (or folder), you prevent it from being deleted from your cache, and it stays available to the wider network with that Qm… string as address. This also points to a drawback of hashed-content addressing: non-human-readable addresses. But they’re usually intended for machines anyway (and otherwise, there’s maybe a use case for QR codes here).
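For reference, the desktop client also comes with a command line tool, and the same add-and-pin steps can be scripted. A rough sketch driving that CLI from Python, assuming the ipfs command is on your path and a local node is running:

```python
import subprocess

# Add a file to the local IPFS node; the CLI prints "added <hash> <name>",
# so the second field is the Qm... content address.
out = subprocess.run(
    ["ipfs", "add", "helloworld.pdf"],
    capture_output=True, text=True, check=True,
).stdout
cid = out.split()[1]
print("content address:", cid)

# Pin it, so it isn't garbage-collected from the local cache and stays
# available to the wider network under that address.
subprocess.run(["ipfs", "pin", "add", cid], check=True)

# Anyone (including ourselves) can now fetch the file by content,
# from whichever peers happen to have it.
with open("helloworld-copy.pdf", "wb") as copy:
    subprocess.run(["ipfs", "cat", cid], stdout=copy, check=True)
```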

With IPFS set up, I started playing with Fission. Fission builds on IPFS to allow you to deploy apps or websites directly from your laptop (“build and go live while on a plane without wifi”). It’s meant as tooling for developers, in other words not me, but I was curious to better understand what it does. More in a next post.

Good catching up with you after too long, Boris. Excited to hear about Fission. Later on I was wondering how IPFS as a starting point plays out with highly dynamic material (e.g. real-time data sets), versus Dat for such data sets. Pleasing to note that our thinking since our joint session at BarCamp Brussels in 2006 has evolved along similar lines, you more on the tech side of things, and me on the change management side of it.

This is a start at more fully describing and exploring a distributed version of digitisation, digitalisation and specifically digital transformation, and at stating why I think bringing distributed / networked thinking into them matters.

Digitising stuff, digitalising routines, the regular way

Over the past decades many more of the things around us became digitised, and in recent years much of what we do, our daily routines and work processes, has become digitalised. Many of those digitalised processes are merely digitised replicas of their paper predecessors. Asking for a government permit, for instance, or online banking. There’s nothing there that wasn’t there in the paper version. Sometimes even small steps in those processes still force you to use paper. At the start of this year I had to apply for a declaration that my company had never been involved in procurement fraud. All the forms I needed for it (30 pages in total!) were digitised and I filled them out online, but when it came to sending them in, I had to print the PDF resulting from those 30 pages and send it through snail mail. I have no doubt that the receiving government office’s first step was to scan it all before processing it.

Online banking similarly is just a digitised paper process. Why don’t all online bank accounts provide nifty visualisation, filtering and financial planning tools (like alerts for due dates, saving towards a goal, maintaining a buffer etc.), now that everything is digital? The reason we laugh at Little Britain’s ‘computer says no’ sketches is that we recognise all too well the frustration of organisations blindly trusting their digitalised processes, never acknowledging or addressing their crappy implementation, or the extra work and route-arounds their indifference inflicts.

Digital transformation, digital societies

Digital transformation is the accumulated societal impact of all those digital artefacts and digitalised processes, even if they’re incomplete or half-baked. Digital transformation is why I have access to all those books in the long tail that never reached the shelves of any of the book shops I visited in decades past, yet now come to my e-reader instantly, resulting in me reading more, and across a wider spectrum, than ever before. Digital transformation is also the impact on elections of data-driven, almost individually targeted Facebook advertising, built on minutely profiling undecided voters.

Digital transformation is often referred to these days, in my work often also in the context of development and the sustainable development goals.
Yet it often feels to me that for most intents and purposes this digital transformation is done to us and about us, but not of us. It’s a bit like the smart city visions corporations like Siemens and Samsung push(ed), that were basically devoid of life and humanity: quality of life reduced to and equated with security only, in sterilised cities, ignoring that people are the key actors, as critiqued by Adam Greenfield in ‘Against the Smart City’ in 2013.

Human digital networks: distributed digital transformation

The Internet is a marvellous thing. At least it is when we use it actively, to assist us in our routines and in our efforts to change, learn and reach out. As social animals, we have always interacted in networked ways, fluently switching between contexts and degrees of trust and disclosure, and routing around undesired connections. In that sense human interaction and the internet’s original design principle closely match up: they’re both distributed. In contrast, most digitalisation and digital transformation happens from the perspective of organisations and silos. Centralised things, where some decide for the many.

To escape that ‘done to us, about us, not of us’, I think we need to approach digitisation, digitalisation and digital transformation from a distributed perspective, matching up our own inherently networked humanity with our (for the past 30 years) newly networked global digital infrastructure. We need to think in terms of distributed digital transformation: making our own digital societal impact, building on distributed digitisation (making our things digital) and on distributed digitalisation (making our routines digital).

Signs of distributed digitisation and digitalisation

Distributed digitisation can already be seen in things like the quantified self movement, where individuals create data about themselves, for their own use. Or in the sensors I have in the garden. Those garden measurements are part of something you could call distributed digitalisation, where a network of similar sensors creates a map of our city that informs climate adaptation efforts by local government. My evolving information strategies, with a few automated parts, and the interplay of different protocols and self-proposed standards that make up the IndieWeb, are also examples of distributed digitalisation. My Networked Agency framework, where small groups of relationships fix something of value with low-threshold digital technology and network/digital based methods and processes, is distributed digitisation and distributed digitalisation combined into a design aid for group action.

Distributed digital transformation needs a macroscope for the new civil society

Distributed digital transformation, distributed societal impact, seems a bit more elusive though.
Civil society is increasingly distributed too, that much is clear to me. New coops, p2p groups, and networks of individual actors emerge all over the world. However, they are largely invisible to, for instance, the classic interaction between government and the incumbent civil society, and they are usually cut off from the scaffolding and support structures that ‘classic’ activities can build on to get started, because they’re not organised ‘the right way’, not clearly representative of a larger whole. Bootstrapping is their only path. As a result these initiatives are only perceived as single elements, and the scale they actually (can) achieve as a network remains invisible, often even in the eyes of those single elements themselves.

Our societies, including the nodes that make up the network of this new type of civil society, lack the perception to recognise the ‘invisible hand of networks’. A few years ago I discussed with a few people, directors of entities in that new civil society fabric, how it is that we can’t seem to make our newly arranged collective voices heard, our collective efforts and results seen, and our collective power of agency recognised and sought out for collaboration. We’re too used, it seems, to aggregating all those things, collapsing them into the single voice of a mouthpiece that has the weight of numbers behind it, in order to be heard. We need to learn to see the cumulative impact of a multitude of efforts, while simultaneously keeping all those efforts visible on their own. There are, I think, so many initiatives that are great examples of how distributed digitalisation leads to transformation, but they are largely invisible outside their own context, and not widely networked and connected enough to reach their own full potential. They are valuable on their own, but would be even more valuable to themselves and others when federated; that federation part is mostly missing.
We need to find a better way to see the big picture, while also seeing all the pixels it consists of. A macroscope: a distributed digital transformation macroscope.

At our birthday unconference STM18 last week, Frank gave a presentation (PDF) on running your own website and social media tools separate from commercial silos like Facebook, Twitter etc. Collected under the name IndieWeb (i.e. the independent web), this is basically what used to be the default before we welcomed the tech companies’ silos into town. The IndieWeb never went away of course; I’ve been blogging in this exact same space for 16 years now, and ran a personal website for just under a decade before that. For broader groups to take their data and their lives out of silos, however, easy ways out and low-threshold replacement tools are required.

One of the silos to replace is Twitter. There are various other tools around, like Mastodon. What they have in common is that they’re not run by a single company: anyone can run a server, and those servers federate, i.e. all work together, so that if I am on server 128 and you are on server 512, our messages still arrive in the right spot.

I’ve been looking at running a Mastodon instance, or similar, myself for a while. Because yes, there are more Mastodon servers (I have accounts on mastodon.cloud and on mastodon.nl), but I know even less about who runs them, and about their tech skills, attitudes or values, than I know about Twitter. I’ve just exchanged a big silo for a smaller one. The obvious logical endpoint of thinking about multiple instances or servers is that instances should be individual, or based on existing groups that have some cohesion. More or less like e-mail, which is also a good analogy to keep in mind when trying to understand Mastodon account names.
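The e-mail analogy is fairly direct: just as alice@example.com is a mailbox at a particular mail server, @alice@mastodon.nl is an account at a particular Mastodon server. A tiny sketch (account names made up for illustration) showing both kinds of addresses split into the same user-at-host parts:

```python
def parse_address(address: str):
    # Works for e-mail (alice@example.com) and Mastodon
    # (@alice@mastodon.nl) addresses alike: strip the leading @,
    # then split into the user part and the host that serves it.
    user, _, host = address.lstrip("@").partition("@")
    return user, host

print(parse_address("alice@example.com"))   # ('alice', 'example.com')
print(parse_address("@alice@mastodon.nl"))  # ('alice', 'mastodon.nl')
```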

Ideally, running a Mastodon instance would be something you do yourself, and which at most has your household members in it. Or maybe you run one for a specific social context. So how easy is it to run Mastodon myself?

Not easy.

I could deploy it on my own VPS. But maintaining a VPS is rather a lot of work, and I would need to find out whether it runs the right type of operating system and other packages to be able to do it. Not something for everyone, nor for me without setting aside some proper time.

Or I could spin up a Mastodon instance at Amazon’s server parks. That seems relatively easy to do, requiring a manageable list of mouse clicks. It doesn’t really fit my criteria though, even if it looks like a relatively quick way to at least have my own instance running. It would take me out of Twitter’s software silo, but not out of Amazon’s hardware silo. Everything would still be centralised on a US server, likely right next to the ones Twitter is using. Meaning I’d have more control over my own data, but not be bringing my stuff ‘home’.

Better already is something like Masto.host, run by a volunteer named Hugo Gameiro who’s based in Portugal. It provides ease of use in terms of running your own instance, which is good, but leaves open issues of control and flexibility.

So I’d like a solution that can either run on a package with my local hosting provider, or run on cheap hardware like a Raspberry Pi connected to my home router. I’d prefer the latter, but for now I am looking to learn how easy the former is.

Mastodon, and other similar tools like Pleroma, require various system components my hosting provider isn’t providing, and isn’t likely to be willing to provide. Like many other hosters they do have a library of scripts you can automatically install with all the right dependencies and settings. The section ‘social media’ doesn’t mention Mastodon or any other ‘modern’ variety, but it does list GnuSocial and its predecessor StatusNet. GnuSocial uses the same protocols as Mastodon, OStatus and ActivityPub, so it should be able to communicate with Mastodon.

I installed it and created an account for myself (with myself as administrator). Then I tried to find ways to federate with Mastodon instances. The interface is rather dreadful, and none of the admin settings seemed to hint at anything that lies beyond the GnuSocial instance itself; no mention of anything like federation.

The interface of GnuSocial

However, in my profile a button labelled “+remote” popped up, and through that I can connect to other people on other instances, such as the people I am connected to on Mastodon already. I did that, and it nicely links to their profiles. But none of their messages show up in my stream. And even though it looks like I can send messages to them from my GnuSocial instance, by doing things like @someotheruser, they don’t seem to arrive. So if I am indeed sending something, there’s no-one listening at the other end.

I did connect to others externally

And I can send messages to them, although they do not seem to arrive
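Under the hood, that “+remote” discovery relies on WebFinger (RFC 7033), the same lookup Mastodon uses: given user@host, you ask that host’s /.well-known/webfinger endpoint where the account’s profile and feeds live. A sketch of such a lookup (the account name is a made-up example):

```python
import json
import urllib.request

def webfinger(account: str) -> dict:
    # Given user@host, ask the host's WebFinger endpoint (RFC 7033)
    # to describe the account: profile page, feeds, subscribe URLs.
    user_host = account.lstrip("@")
    host = user_host.split("@", 1)[1]
    url = (f"https://{host}/.well-known/webfinger"
           f"?resource=acct:{user_host}")
    with urllib.request.urlopen(url) as response:
        return json.load(response)

# A made-up account name, for illustration:
info = webfinger("someotheruser@mastodon.cloud")
for link in info.get("links", []):
    print(link.get("rel"), "->", link.get("href") or link.get("template"))
```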

So that leaves a number of things as next steps to explore. Also, in conversation with Maarten on Mastodon I noticed that I need to express better what I’m after. Something for another posting. To be continued.

Peter Rukavina picks up on my recent blogging about blogging, and my looking back on some of the things I wrote 10 to 15 years ago about it (before the whole commercial web started treating social interaction as an advert-targeting vehicle).

In his blogpost in response he talks about putting back the inter in internet, inter as the between, and as exchanges.

… all the ideas and tools and debates and challenges we hashed out 20 years ago on this front are as relevant today as they were then; indeed they are more vital now that we’ve seen what the alternatives are.

And he asks “how can we continue to evolve it?”

That indeed is an important question, and one that is being asked in multiple corners. By those who were isolated from the web for years and then shocked by what they found upon their return. But also by others, repeatedly, such as Anil Dash and Mike Loukides of O’Reilly Media, when they talk about rebuilding or retaking the web.

Part of it is getting back to seeing blogging as conversations, conversations that are distributed across your and my blogs. This is what made my early blog bloom into a full-blown professional community and network for me. That relationships emerge out of content sharing, and then become more important and more persistent than the content itself, was an important driver for me to keep blogging after I started. These distributed conversations we had back then, and the resulting community forming, were even a key building block of my friend Lilia’s PhD a decade ago.

So I’m pleased that Peter responds to my blogging with a blogpost, creating a distributed conversation again, and like him I wonder what we can do to augment it. Things we had ideas about in the 00s which then weren’t possible, and maybe now are. Can we make our blogs smarter, in ways that make the connections that get woven more tangible, discoverable and followable, so they can become an enriching and integral part of our interaction?