We’re in a time where whatever is presented to us as discourse on Facebook, Twitter or any of the other platforms out there may come from humans, bots, or a person or group with a specific agenda, irrespective of what you say or how you respond. We’ve seen it at the political level, with outside influences on elections, we see it in things like GamerGate, and in critiques of the last Star Wars movie. It creates damage on a societal level, and it damages people individually. To quote Angela Watercutter, the author of the Star Wars article mentioned,

…it gets harder and harder to have an honest discussion […] when some of the speakers are just there to throw kerosene on a flame war. And when that happens, when it’s impossible to know which sentiments are real and what motivates the people sharing them, discourse crumbles. Every discussion […] could turn into a […] fight — if we let it.

Discourse disintegrates, I think, specifically when there’s no meaningful social context in which it takes place, nor social connections between the speakers in that discourse. The effect stems not just from not really knowing who you’re conversing with, but, I think more importantly, from anyone on a general platform being able to bring themselves into the conversation, or worse, force themselves into it. Which is why you should never wade into newspaper comments, even though we all read them at times, because watching discourse crumble from the sidelines has a certain addictive quality. This can happen because participants themselves don’t control the setting of any conversation they are part of, and none of those conversations are limited to a specific (social) context.

Unlike in your living room, over drinks in a pub, or at a party with friends of friends of friends. There you know someone. Or if you don’t, you know them in that setting, you know their behaviour at that event thus far. All have skin in the game, as misbehaviour has immediate social consequences. Social connectedness is a necessary context for discourse, stemming either from personal connections or from the setting of the place or event it takes place in. Online discourse often lacks both, so discourse crumbles and entropy ensues, without consequence for those causing the crumbling. Which makes it fascinating when missing social context is retroactively restored, outing the misbehaving parties, such as in the book I once bought by Tinkebell, in which she matches death threats she received against the senders’ very normal Facebook profiles.

Two elements are therefore needed, I find: one in terms of determining who can be part of which discourse, and one in terms of control over the context of that discourse. They are point 2 and point 6 in my manifesto on networked agency.

  • Our platforms need to mimic human networks much more closely: our networks are never ‘all in one mix’ but a tapestry of overlapping and distinct groups and contexts. Yet centralised platforms put us all in the same space.
  • Our platforms also need to be ‘smaller’ than the group using them, meaning a group can deploy, alter, maintain and administrate a platform for its specific context. Of course you can still be a troll in such a setting, but you can no longer be one without a cost, as your peers can all act, individually and collectively.
  • This is unlike e.g. FB, where by design defending against trollish behaviour takes more effort than being a troll, and never carries a cost for the troll. There must, in short, be a finite social distance between speakers for discourse to be possible. Platforms that dilute that, or allow for infinite social distance, are where discourse can crumble.

    This points to federation (a platform within the control of a specific group, interconnected with other groups doing the same) and decentralisation (individuals running a platform for one, and interconnecting those). Doug Belshaw recently wrote in a post titled ‘Time to ignore and withdraw?’ about how he first saw individuals running their own Mastodon instance as quirky and weird, until he read a blogpost by Laura Kalbag in which she writes about why you should run Mastodon yourself if possible:

    Everything I post is under my control on my server. I can guarantee that my Mastodon instance won’t start profiling me, or posting ads, or inviting Nazis to tea, because I am the boss of my instance. I have access to all my content for all time, and only my web host or Internet Service Provider can block my access (as with any self-hosted site.) And all blocking and filtering rules are under my control—you can block and filter what you want as an individual on another person’s instance, but you have no say in who/what they block and filter for the whole instance.

    Similarly I recently wrote,

    The logical end point of the distributed web and federated services is running your own individual instance. Much as in the way I run my own blog, I want my own Mastodon instance.

    I also do see a place for federation, where a group of people from a single context run an instance of a platform. A group of neighbours, a sports team, a project team, some other association, but always settings where damaging behaviour carries a cost because social distance is finite and context defined, even if temporary or emergent.

    In just over a week I will be joining the Nuremberg IndieWebCamp, together with Frank Meeuwsen. As I said earlier, like Frank, I’m wondering what I could be working on, talking about, or sharing at the event. Especially as the event is set up to not just talk but also build things.

    So I went through my blogpostings of the past months that concerned the IndieWeb, and made a list of potential things. They are of varying feasibility and scope, so I can probably strike off quite a few, and should likely go for the simplest one, which could also be re-used as a building block for some of the less easy options. The list contains 13 things (does a collection of 13 things have a name, like ‘odd dozen’ or something? Yes it does: a baker’s dozen, see the comment by Ric below). They fall into a few categories: webmention related, RSS reader related, more conceptual issues, and hardware/software combinations.

    1. Getting WebMention to display the way I want, within the Sempress theme I’m using here. The creator of the theme, Matthias Pfefferle, may be present at the event. Specifically I want to get some proper quotes displayed underneath my postings, and also understand much better what webmention data is stored and where, and how to manipulate it.
    2. Building a growing list of IndieWeb sites by harvesting successful webmentions from my server logs, and publishing that in a re-usable (micro-)format (so that you could slowly map the IndieWeb over time)
    3. Make it much easier for myself to blog from mobile, or mail to my blog, using the Micropub protocol, e.g. using the micropublish client.
    4. Dive into the TinyTinyRSS data structure to understand it better. First to be able to add tags to feeds (not articles), as per my wishlist for RSS reader functionality.
    5. Make basic visualisation possible on top of the TinyTinyRSS database, as a step towards a reading mode based on pattern detection
    6. Allow better full-text search across TinyTinyRSS, to support the reading mode of searching material around specific questions I hold
    7. Adding machine translation to TinyTinyRSS, so I can diversify my reading, and compare an original to its translation on a post-by-post basis
    8. Visualising conversations across blogs, both for understanding the network dynamics involved and for discovery
    9. Digging up my old postings 2003-2005 about my information strategies and re-formulate them for networked agency and 2018
    10. Find a way of displaying content (not just postings, but parts of postings) limited to a specific audience, using IndieAuth.
    11. Formulate my Networked Agency principles, along the lines of the IndieWeb principles, for ‘indietech’ and ‘indiemethods’
    12. Attempt to run FreedomBone on a Raspberry Pi, as it contains a range of tools, including GnuSocial for social networking. (Don’t forget to bring a Raspberry Pi for it.)
    13. Automatically harvest my Kindle highlights and notes and store them locally in a way I can re-use (a small parsing sketch follows below this list).
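
    To make that last item a bit more concrete, here is a minimal sketch in Python of what harvesting those highlights could look like. It assumes the plain-text My Clippings.txt file that Kindle devices keep in their documents folder, and parses it into a local JSON file; the exact clipping format varies a bit per device and language, so treat this as a starting point rather than a finished tool.

```python
import json
import re

SEPARATOR = "=========="


def parse_clippings(path="My Clippings.txt"):
    """Parse a Kindle 'My Clippings.txt' file into a list of highlight dicts."""
    with open(path, encoding="utf-8-sig") as f:  # Kindle clippings files start with a BOM
        raw = f.read()

    clippings = []
    for block in raw.split(SEPARATOR):
        lines = [line.strip() for line in block.strip().splitlines() if line.strip()]
        if len(lines) < 3:
            continue  # skip bookmarks and malformed blocks (no highlight text)
        title, meta = lines[0], lines[1]
        # meta looks roughly like:
        # "- Your Highlight on Location 1234-1236 | Added on Monday, 1 October 2018 ..."
        added = re.search(r"Added on (.+)$", meta)
        clippings.append({
            "title": title,
            "meta": meta,
            "added": added.group(1) if added else None,
            "text": " ".join(lines[2:]),
        })
    return clippings


if __name__ == "__main__":
    clippings = parse_clippings()
    with open("kindle-clippings.json", "w", encoding="utf-8") as out:
        json.dump(clippings, out, ensure_ascii=False, indent=2)
    print(f"Stored {len(clippings)} clippings locally.")
```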

    These are the options. Now I need to pick something that is actually doable with my limited coding skills, yet also challenges me to learn/do something new.

    Previously I had tried to get GNU Social running on my own hosted domain as a way to interact with Mastodon. I did not get it to work, for reasons unclear to me: I could follow people on Mastodon, but would not receive their messages, nor would they see mine.

    This morning I saw the message below in my Mastodon timeline.

    It originates from Peter Rukavina’s own GNU Social install. So at least he got the ‘sending mentions’ part working. He is also able to receive my replies, as my responses show up underneath his original message. Including, it seems, ones whose visibility I had limited.

    Now I am curious to compare notes. Which version of GNU Social? Any tweaks? Does Peter receive my timeline? How do permissions propagate (I only let people follow me after I approve them)? And more. I notice that his URL structures are different from those in my GNU Social install for instance.

    In the past weeks I’ve enjoyed using a bot that turns my blog’s RSS feed into an ActivityPub stream. I follow that stream from my Mastodon account, and that way I can easily ‘retweet’ any of my postings. You too can follow my blog on Mastodon through the account @ton@bots.tinysubversions.com.

    The bot that turns RSS into ActivityPub was created by Darius Kazemi, a coding artist and art-creating coder. In the context of musing about my ideal RSS reader, I started running my own Tiny Tiny RSS instance. Tt-rss is not only a feed reader but can also create feeds, e.g. from the things you bookmark or like while reading. So I thought that if you mounted the bot Darius created on the back of Tt-rss, you could publish curated feeds of what you read not just as RSS but as ActivityPub streams. I pinged Darius Kazemi to hear if the code is available.
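
    As a rough sketch of that idea: Tt-rss can expose the articles you share or publish as a feed of its own, and that curated feed is what you would point an RSS-to-ActivityPub bot at. The few lines of Python below simply fetch such a feed with the feedparser library and list its entries, as a sanity check of what would end up in the ActivityPub stream; the feed URL is a placeholder to be replaced with the one from your own tt-rss instance.

```python
import feedparser  # pip install feedparser

# Placeholder: the 'published articles' feed URL from your own tt-rss instance,
# which you can copy from the tt-rss web interface.
PUBLISHED_FEED = "https://ttrss.example.org/published-articles-feed"


def list_curated_entries(feed_url=PUBLISHED_FEED):
    """Fetch the curated feed and list what would end up in the ActivityPub stream."""
    feed = feedparser.parse(feed_url)
    for entry in feed.entries:
        print(entry.get("title", "(no title)"), "->", entry.get("link", ""))


if __name__ == "__main__":
    list_curated_entries()
```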

    Screenshot of me resharing a blogpost on Mastodon.

    I was a bit surprised to see a Dutch title above one of Peter’s blog posts. It referred to the blog of Marco Derksen, which I follow. I think Peter may have found it in the list of blogs I follow (in OPML) that I publish.

    Peter read it through machine translation. Reading the posting made me realise I only follow blogs in the languages I can read, but that this limits my awareness of what others across Europe and beyond blog about.

    So I think I need to extend my existing list of demands for an RSS reader with built-in machine translation. As both Tiny Tiny RSS, which I self-host, and Google Translate have APIs, it should be possible to turn that into a script.
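
    As a first sketch of what such a script could look like, the Python below logs in to the tt-rss JSON API, fetches a handful of recent headlines and runs their titles through the Google Cloud Translation API (v2). The instance URL, credentials and API key are placeholders, and the endpoint paths and field names are taken from the respective API docs as I remember them, so they may need adjusting.

```python
import requests

# Placeholders for my own setup: tt-rss instance URL (with the external API
# enabled in its preferences), credentials, and a Google Cloud Translation API key.
TTRSS_API = "https://ttrss.example.org/api/"
TTRSS_USER = "ton"
TTRSS_PASSWORD = "secret"
GOOGLE_API_KEY = "REPLACE_ME"


def ttrss_call(payload):
    """Send one call to the tt-rss JSON API and return its 'content' field."""
    response = requests.post(TTRSS_API, json=payload)
    response.raise_for_status()
    return response.json()["content"]


def translate(text, target="en"):
    """Translate a snippet of text with the Google Translation API (v2)."""
    response = requests.post(
        "https://translation.googleapis.com/language/translate/v2",
        params={"key": GOOGLE_API_KEY, "q": text, "target": target, "format": "text"},
    )
    response.raise_for_status()
    return response.json()["data"]["translations"][0]["translatedText"]


if __name__ == "__main__":
    # Log in, fetch recent headlines across all feeds (feed_id -4), translate their titles.
    session = ttrss_call({"op": "login", "user": TTRSS_USER, "password": TTRSS_PASSWORD})
    headlines = ttrss_call(
        {"op": "getHeadlines", "sid": session["session_id"], "feed_id": -4, "limit": 10}
    )
    for article in headlines:
        print(article["title"], "->", translate(article["title"]))
```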

    This looks like the first potential replacement for Evernote I’ve come across. It is called Standard Notes, is open source and comes with full encryption. It allows the server to be self-hosted to sync stuff across devices (stand-alone, everything is stored locally), and lets you use your self-hosted cloud as the place to store attachments. They sell subscriptions to extended functionality, but you can also self-host those extensions. I’d have to take a closer look at how I might replace some of the key Evernote functionality with it, or arrange some of my additional wishes. See my earlier list of what I’m looking for. At first glance it looks like a thing worth testing. How about you, Peter?

    As I haven’t yet succeeded in getting Mastodon to run on a Raspberry Pi, nor in running a GNU Social instance that actually federates on my hosting package, I’ve opted for an intermediate solution to running my own Mastodon instance.

    Key in all this is satisfying three dimensions: control, flexibility and ease of use. My earlier attempts satisfy the control and flexibility dimensions, but as I have a hard time getting them to work, they do not satisfy the ease of use dimension yet.

    At the same time I did not want to keep using Mastodon on a generic server much longer, as it builds up a history there, and every conversation ups the cost of leaving.

    The logical end point of the distributed web and federated services is running your own individual instance. Much as in the way I run my own blog, I want my own Mastodon instance.

    Such an individual instance needs to be within my own scope of control. This means having it at a domain I own, and being able to move everything to a different server at will.

    There is a hoster, Masto.host, run by Hugo Gameiro, who provides Mastodon hosting as a monthly subscription. As it allows me to use my own domain name, and provides me with admin privileges on the Mastodon instance, this is a workable solution. When I succeed in getting my own instance of Mastodon running on the Raspberry Pi, I can simply move the entire instance from Masto.host to it.

    Working with Hugo at Masto.host was straightforward. After I registered for the service, Hugo got in touch with me to ensure the DNS settings on my own domain were correct, and shortly afterwards everything was up and running.
    Frank Meeuwsen, who started using Masto.host last month, kindly wrote up a ‘moving your Mastodon account’ guide on his blog (in Dutch). I followed most of that, to ensure a smooth transition.

    Using Mastodon? Do follow me at https://m.tzyl.nl/@ton.

    Screenshots of my old Mastodon.cloud account, and my new one on my own domain. And the goodbye and hello world messages from both.

    Today I had a bit of time to try running Mastodon on the Raspberry Pi again. Last week I got stuck, as some of the dependencies mentioned in the Mastodon installation guide could not be installed. As the step where I got stuck deals with a different Linux version, I tried simply skipping to the next step.

    From the linked guide, the steps for the Ubuntu dependencies, the node.js repository and the yarn repository did not work.
    The step after that, for various other dependencies, works again (which actually includes yarn).
    Then a few steps follow that need to be executed as the dedicated Mastodon user. Installing Ruby and node.js works fine, as do almost all steps to install the Ruby and node.js dependencies. The final two dependency steps throw errors (bundle install and yarn install), although at least some parts of the bundle install command do get executed. These are the last two steps before actually getting into configuring the installation, so it feels like being nearly there.

    I’d have to dive deeply into the log files to see what wasn’t installed and what is missing. I’m not sure I will easily find time to do so, or whether I would actually understand what the log files tell me. It is also unclear whether there is a relationship with the three steps I skipped earlier in the process because they didn’t work.

    Saturday I visited the Maker Faire in Eindhoven. Jeroen of the Frysklab team invited me to come along when their mobile FabLab was parked in our courtyard for Smart Stuff That Matters. They had arranged a coach to take a group of librarians and educators to the Maker Faire, and invited me to join the bus ride. So I took a train to Apeldoorn and then a taxi out to a truck stop where the bus was scheduled to stop for a coffee break, and joined them for the rest of the drive down south.

    The Maker Faire was filled with all kinds of makers showing their projects, and there was a track with 30-minute slots for various talks.
    It was fun to walk around and meet up with lots of people I know. However, many of the projects shown seemed to lack a purpose beyond the initial fascination with technological possibilities. There were many education-oriented projects as well, and many kids happily trying their hand at them. From a networked agency point of view there were not that many projects that aimed for collective capabilities.

    Some images, and a line or two of comment.

    Maker Faire Eindhoven
    En-able, a network of volunteers 3d-printing prosthetics, was present. I talked to the volunteer in the image, with his steampunk prosthetic device. They printed 18 hand and arm prosthetics for kids in the Netherlands last year, and 10 so far this year. Children need new prosthetics every 3 to 6 months, and 3d printing them saves a lot of cost and time. You even get to customise them with colours, and your favourite cartoon figure or superhero.

    Maker Faire Eindhoven
    3d printing with concrete, a project in which our local FabLab Amersfoort is involved. Alas, I didn’t get to see the printer working.

    Maker Faire Eindhoven
    Novelty 3d printing of portraits.

    Maker Faire Eindhoven
    Building your own electronic music devices.

    Maker Faire Eindhoven
    Bringing LED-farming to your home, open source. Astroplant is an educational citizen science project, supported by ESA.

    Maker Faire Eindhoven
    A robot football team versus a kids’ team. Quite a few educational projects around robotics were shown. Mostly from a university of applied sciences, but with efforts now branching out to earlier education levels. I chatted to Ronald Scheer, who’s deeply involved in this (and who participated in our Smart Stuff That Matters unconference).

    Maker Faire Eindhoven
    A good way to showcase a wide range of Microbit projects by school children. I can see this mounted on a classroom wall.

    Maker Faire Eindhoven
    An open source, 3d-printed, Arduino-controlled android. But what is it for? Open source robotics in general is of interest, of course. There were also remote-controlled robots, which were quite a lot of fun, as the video shows.

    Maker Faire Eindhoven

    At the fringe of the event there was some steampunk going on.

    Maker Faire Eindhoven
    Building with cardboard boxes for children. Makedo is an Australian brand, and besides their kits, you can find additional tools and elements as 3d-printable designs online.

    Maker Faire Eindhoven
    The Frysklab team presented the new Dutch-language Data Detox kit, which they translated from the English version the Berlin-based Tactical Tech Collective created.

    In the past few days I tried a second experiment in running my own Mastodon instance. Both to actually get a result, and to learn how easy or hard it is to do. In the first round I tried running something on a hosted domain. In this second round I tried to get something running on a Raspberry Pi.

    The Raspberry Pi is a 35 euro computer, making it very useful for stand-alone solutions or as a cheap hardware environment to learn things like programming.

    Installing Debian Linux on the Raspberry Pi

    I found this guide by Wim Vanderbauwhede, which describes installing both Mastodon and Pleroma on a Raspberry Pi 3. I ordered a Raspberry Pi 3 and received it earlier this week. Wim’s guide points to another guide on how to install Ruby on Rails and PostgreSQL on a Raspberry Pi. That link however was dead, and the website offline. Fortunately archive.org had stored several snapshots, which I saved to Evernote.

    Installing Ruby on Rails went fine using that guide, as did installing PostgreSQL. Then I returned to Wim’s guide, which next points to the Mastodon installation guide. This is where the process currently fails for me: I can’t add the Ubuntu package repositories mentioned, nor the node.js one.

    So for now I’m stalled. I’ll try to get back to it later next week.

    Got a new phone (after selecting a new plan with some effort). As I don’t allow my phone to back up everything to the (Google) cloud, it took a few hours to get the new one ready: installing apps, and logging into all the associated accounts (using 1Password). On the upside, it means a lot of unnecessary stuff accumulated over the last 2 years has been left behind on the old device.

    I plan to dedicate some learning time in the coming 12 weeks to better understand the protocols that drive the independent web, or IndieWeb. During our STM18 birthday unconference Frank Meeuwsen presented his experiences with the IndieWeb. In the past months Frank, Peter and I have formed an impromptu triad to explore the IndieWeb. In one of his slides Frank conveniently listed the relevant protocols. I’ll plan for 24 hours to explore 6 protocols. Some of them I already understand better than others, so I’ll start with the ones I feel less knowledgeable about.

    The ones I want to explore in more detail, in planned order, are:

    • ActivityPub / OStatus, a decentralized networking protocol (as this ties into my Mastodon experiments as well, this comes first)
    • Micropub, publish on your own domain with 3rd party tools
    • Microsub, own your feed subscriptions (although I already run my own TinyTinyRSS instance)
    • Microformats, markup for data, text, people, events (already used on my blog, but curious to see how I can extend that to more types of data)
    • IndieAuth, a federated login protocol to sign in with your own domain on other services (already active on my blog, but interested in where else I could use it)
    • Webmention, respond to a blogpost through your own site (already active on my site, but I strongly wish to format and style the mentions better; a minimal sending sketch follows after this list)
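
    To make that last protocol a bit more tangible, here is a bare-bones sketch in Python of what sending a Webmention involves according to the W3C spec: fetch the target page, discover its webmention endpoint (via an HTTP Link header or a link/a element with rel="webmention"), and POST the source and target URLs to that endpoint. The URLs in the example are placeholders and the parsing is deliberately naive; this is an illustration of the protocol, not a replacement for the Webmention support already active on this blog.

```python
import re
import requests


def discover_endpoint(target):
    """Find the Webmention endpoint of a target URL (Link header first, then HTML)."""
    response = requests.get(target)
    response.raise_for_status()

    # HTTP Link header, e.g.: <https://example.org/webmention>; rel="webmention"
    match = re.search(r'<([^>]+)>;\s*rel="?webmention"?', response.headers.get("Link", ""))
    if not match:
        # Fall back to a <link> or <a> element in the HTML (naive regex parsing;
        # a real implementation should use a proper HTML parser).
        match = re.search(
            r'<(?:link|a)[^>]+rel="?webmention"?[^>]*href="([^"]*)"', response.text
        )
    return requests.compat.urljoin(target, match.group(1)) if match else None


def send_webmention(source, target):
    """Notify the target page that the source page links to it."""
    endpoint = discover_endpoint(target)
    if endpoint is None:
        raise RuntimeError("No webmention endpoint found for " + target)
    return requests.post(endpoint, data={"source": source, "target": target})


if __name__ == "__main__":
    # Hypothetical URLs: my reply post, and the post it links to.
    result = send_webmention(
        "https://example.com/my-reply/", "https://example.org/their-post/"
    )
    print(result.status_code, result.text[:200])
```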

    Last week the 2nd annual Techfestival took place in Copenhagen. As part of this there was a 48-hour think tank of 150 people (the ‘Copenhagen 150‘), looking to build the Copenhagen Catalogue as a follow-up to last year’s Copenhagen Letter, of which I am a signatory. Thomas, initiator of the Techfestival, had invited me to join the CPH150, but I had to decline because of previous commitments I could not reschedule. I’d have loved to contribute however, as the event’s, and even more the think tank’s, concerns are right at the heart of my own. My concept of networked agency, and the way I think about how we should shape technology to empower people in different ways, runs parallel to how Thomas described the purpose of the CPH150 48-hour think tank at its start last week.

    For me the unit of agency is the individual and a group of meaningful relationships in a specific context: a networked agency. The power to act towards meaningful results and change lies in that group, not in the individual. The technology and methods that such a group deploys need to be chosen deliberately. And those tools need to be fully within the scope of the group itself: to control, alter, extend, tinker with, maintain, share, etc. Such tools therefore need very low adoption thresholds. Tools also need to be useful on their own, but great when federated with other instances of those tools, so that knowledge and information, learning and experimentation can flow freely, yet can still take place locally in the (temporary) absence of such wider (global) connections. Our current internet silos such as Facebook and Twitter clearly do not match this description. But most other technologies aren’t shaped along those lines either.

    As Heinz remarked earlier, musing about our unconference, effective practices cannot be separated from the relationships in which you live. I added that the tools (both technology and methods) likewise cannot be meaningfully separated from the practices. Just as with relationships, you cannot fully separate the hyperlocal, the local, the regional and the global, due to the many interdependencies and the complexity involved: what you do has wider impact, and what others do, and global issues, express themselves in your local context too.

    So the CPH150 think tank’s effort to create a list of principles that takes a human and her relationships as the starting point for thinking about how to design tools and how to create structures, institutions and networks fits right in with that.

    Our friend Lee Bryant has a good description of how he perceived the CPH150 think tank, and what he shared there. Read the whole thing.

    Meanwhile the results are up: 150 principles called the Copenhagen Catalogue, beautifully presented. You can become signatory to those principles you deem most valuable to stick to.

    Heinz Wittenbrink, who teaches content strategy at the FH Joanneum in Graz, reflected extensively on his participation in our recent Smart Stuff That Matters unconference.
    We go back to 2006 (although I think we read each other’s blogs before that), when we first met at a BarCamp in Vienna. Later Heinz kindly invited me to Graz on several occasions, such as the 2008 Politcamp (a barcamp on web 2.0 and political communication) and the 2012 annual conference of the Austrian association for trainers in basic education for adults.

    He writes in German, and his blogpost contains a lot to unpack (also as it weaves the history of our interaction into his observations), so I thought I’d highlight and translate some quotes here. This as I find it rather compelling to read how someone, who’s been involved in and thinking about online interaction for a long time, views the event we did in the context of his and my work. And that some of what I’m trying to convey as fundamental to thinking about tools and interaction is actually coming across to others. Even if I feel that I’ve not yet hit on the most compelling way to formulate my ideas.

    Heinz starts by saying he sees my approach as a very practice-oriented one.
    “Ton engages on a very practical level with the possibilities of combining the personal and personal relationships with the wider contexts in which one lives, from the local community to global developments. He has a technical, pragmatic and practice-oriented approach. He can also explain to others who are not part of a digital avant-garde what he does.”

    He then places the birthday unconferences we did in that context, as an extension of that practice-oriented approach. Something I realise I hadn’t fully done myself.

    “The unconference of last week is an example of how one can do things from a highly personal motivation – like meeting friends, talking about topics you’re interested in, conversing about how you shape your new daily routines after a move – and make it easy for others to connect to that. What you find or develop you don’t keep for yourself, but is made useful for others, and in turn builds on what those others do. So it’s not about developing an overarching moral claim in a small context , but about shaping and networking one’s personal life in such a way that you collectively expand your capabilities to act. Ton speaks of networked agency. Digital networking is a component of these capabilities to act, but only embedded in networks that combine people, as well as locations and technical objects.”

    Speaking about the unconference he says something that really jumps out at me.

    To list the themes [….of the sessions I attended…] fails to express what was special about the unconference: that you meet people or meet them again, for whom these themes are personal themes, so that they are actually talking about their lives when they talk about them. At an unconference like this one does not try to create results that can be broadcast in abstracted formulations, but through learning about different practices and discussing them, extend your own living practice and view it from new perspectives. These practices or ways of living cannot be separated from the relationships in which and with which you live, and the relationships you create or change at such an event like this.

    Seeing it worded like that, that the topics we discussed, theorised about and experimented around are very much personal topics, set in the context of personal relationships, strikes me as very true. I hadn’t worded it in quite that way myself yet. This is however exactly why, to me, digital networks and human networks are so similar and overlapping, and why I see the immediate context of an issue, you and your meaningful relationships, as the key unit of agency. That’s why you can’t separate how you act from your relationships. And why the layeredness of household, neighbourhood, city, earth is interwoven by default, just often not taken into account, especially not in the design phase of technology and projects.

    Heinz then talks about blogging, and our earlier silent assumption that novel technology would by default create the right results. Frank’s phrasing and Heinz’s mention of the ‘original inspiration’ to blog resonate with me.

    It’s probably not a coincidence that the people I had the most intensive conversations with have been blogging for a long time. They all stuck with the original inspiration to blog. Frank in his presentation called it “to publish your own unedited voice”. The openness but also the individuality expressed in this formulation was clearly visible in the entire unconference.

    For me blogging was a way of thinking out loud, making a lifelong habit of note taking more public. The result was a huge growth in my professional peer network, and I found that learning in this networked manner accelerated enormously. Even if my imagined audience when I write is just 4 or 5 people, and I started blogging as a personal archive/reflection tool, I kept doing it because of the relationships it helped create.

    Continuing about the early techno-optimism, Heinz says about the unconference:

    The atmosphere at the unconference was very different. Of the certainties of the years shortly after 2000 nothing much remains. The impulses behind the fascination of yesteryear do remain however. It’s not about, or even less about technology as it was then, it’s about smart actions in themselves, and life under current conditions. It’s about challenging what is presented as unavoidable more than producing unavoidability yourself.

    Only slowly am I coming to understand that technologies are much more deeply embedded in social practices and can’t be separated from them. Back then I adopted Ton’s concept of ‘people centered navigation’. Through the event last week it became clearer to me what this concept means: not just a ‘right’, efficient way to use tools, but a practice that deliberately selects tools for specific needs and, in doing so, adapts them.

    People centered navigation is not a component of better, more efficient mass media, but of navigating information in reference to the needs and capabilities of people in localised networks. There, above all, the production of media and content in dialogue with a limited number of others is what is relevant, not its reception by the masses. Network literacies are the capabilities to productively contribute to these localised networks.

    Just as practice is inseparable from our relationships, our tools are inseparable from our practices. In networked agency, the selection of tools (both technology and methods) is fully determined by the context of the issue at hand and the group of relationships addressing it. As I tried to convey in 2010 in my Maker Households keynote at SHiFT, and indeed in the earlier mentioned keynote I gave at Heinz’s university on basic literacy in adult learning, networked literacies are tied to your personal networks. And he’s right, the original fascination is as strong as before.

    Heinz finishes by adding the work of Latour to my reading list with his last remark.

    The attempt to shape your local surroundings intelligently and to consider how you can connect them in various dimensions of networks reminds me of the localised politics in fragile networks that Bruno Latour describes in his terrestrial manifesto as an alternative to the utopias and dystopias of globalisation and closed national societies. Latour describes earth as a thin layer where one can live, because one creates the right connections and maintains them. The unconference was an experiment in discovering and developing such connections.

    Thank you Heinz for your reflection, I’m glad you participated in this edition.

    At our birthday unconference STM18 last week, Frank gave a presentation (PDF) on running your own website and social media tools separate from commercial silos like Facebook, Twitter etc. Collected under the name IndieWeb (i.e. the independent web), this is basically what used to be the default before we welcomed the tech companies’ silos into town. The IndieWeb never went away of course: I’ve been blogging in this exact same space for 16 years now, and ran a personal website for just under a decade before that. For broader groups to take their data and their lives out of silos, however, easy ways out and low-threshold replacement tools are required.

    One of the silos to replace is Twitter. There are various other tools around, like Mastodon. What they have in common is that they’re not run by a single company: anyone can run a server, and the servers federate, i.e. all work together, so that if I am on server 128 and you are on server 512, our messages still arrive in the right spot.

    I’ve been looking at running a Mastodon instance, or something similar, myself for a while. Because yes, there are more Mastodon servers (I have accounts on mastodon.cloud and on mastodon.nl), but I know even less about who runs them and their tech skills, attitudes or values than I know about Twitter. I’ve just exchanged a big silo for a smaller one. The obvious logical endpoint of thinking about multiple instances or servers is that instances should be individual, or based on existing groups that have some cohesion. More or less like e-mail, which is also a good analogy when trying to understand Mastodon account names: @user@instance reads much like user@provider does in an e-mail address.

    Ideally, running a Mastodon instance would be something you do yourself, and which at most has your household members in it. Or maybe you run one for a specific social context. So how easy is it to run Mastodon myself?

    Not easy.

    I could deploy it on my own VPS. But maintaining a VPS is rather a lot of work, and I would need to find out whether I’m running the right type of operating system and other packages to be able to do it. Not something for everyone, nor for me without setting aside some proper time.

    Or I could spin up a Mastodon instance at Amazon’s server parks. That seems relatively easy to do, requiring a manageable list of mouse clicks. It doesn’t really fit my criteria though, even if it looks like a relatively quick way to at least have my own instance running. It would take me out of Twitter’s software silo, but not out of Amazon’s hardware silo. Everything would still be centralised on a US server, likely right next to the ones Twitter is using. Meaning I’d have more control over my own data, but not be bringing my stuff ‘home’.

    Better already is something like Masto.host, run by a volunteer named Hugo Gameiro who’s based in Portugal. It provides ease of use in terms of running your own instance, which is good, but leaves open issues of control and flexibility.

    So I’d like a solution that can either run on a package with my local hosting provider, or run on cheap hardware like a Raspberry Pi connected to my home router, once I figure out how. I’d prefer the latter, but for now I am looking to learn how easy it is to do the former.

    Mastodon and similar tools like Pleroma require various system components my hosting provider doesn’t provide, nor is likely to be willing to provide. Like many other hosters, they do have a library of scripts you can automatically install with all the right dependencies and settings. The ‘social media’ section doesn’t mention Mastodon or any other ‘modern’ variety, but it does list GnuSocial and its predecessor StatusNet. GnuSocial uses the same protocols as Mastodon, OStatus and ActivityPub, so it should be able to communicate with Mastodon.

    I installed it and created an account for myself (with myself as administrator). Then I tried to find ways to federate with Mastodon instances. The interface is rather dreadful, and none of the admin settings seemed to hint at anything beyond the GnuSocial instance itself; there was no mention of anything like federation.

    The interface of GnuSocial

    However, in my profile a button labelled “+remote” popped up, and through that I can connect to people on other instances, such as the people I am already connected to on Mastodon. I did that, and it nicely links to their profiles. But none of their messages show up in my stream. And even if it looks like I can send messages to them from my GnuSocial instance, as I can do things like @someotheruser, they don’t seem to arrive. So if I am indeed sending something, there’s no one listening at the other end.

    I did connect to others externally

    And I can send messages to them, although they do not seem to arrive
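
    One small thing I can already check is whether my GnuSocial account is discoverable at all by other instances, since both OStatus and ActivityPub rely on WebFinger (RFC 7033) for that. A rough sketch in Python, with the account and host as placeholders for my own setup:

```python
import requests


def webfinger(account, host):
    """Ask a host for the WebFinger record of account@host (RFC 7033)."""
    response = requests.get(
        f"https://{host}/.well-known/webfinger",
        params={"resource": f"acct:{account}@{host}"},
        headers={"Accept": "application/jrd+json"},
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    # Placeholder account and host for my own GnuSocial install.
    record = webfinger("ton", "gnusocial.example.org")
    for link in record.get("links", []):
        print(link.get("rel"), "->", link.get("href") or link.get("template"))
```

    If that lookup fails, or returns no useful profile and subscription links, other instances have no way to resolve my account, which would go some way to explaining messages that are sent but never arrive.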

    So that leaves a number of things to explore as next steps. Also, in conversation with Maarten on Mastodon, I noticed that I need to express better what I’m after. Something for another posting. To be continued.