Do y’all understand how easy it is to make a fake tweet from a screenshot? Like by inspecting the browser and changing the text? …. I don’t trust posts I can’t search up on archives. And if you do have a link, archive it (not in an image but using a reputable archiving service).

Jacky Alciné’s words are true, so I thought I’d illustrate.

The general principle here is: if you make a statement about someone or something other than yourself or your personal opinions, you need to back it up with a link to supporting material. “X said on Twitter” needs to be linked to that tweet. Leaving the googling for your source as an exercise to your readers isn’t merely convenient for you, it is actively destructive of the web. The web is links, and those links are a key piece of information your readers need to judge whether what you tweeted/said/blogged might be signal or noise. No links means it’s likely noise, and it will degrade your standing as a source of signal. No links also aids and abets the bots, trolls and fakesters, as it allows them to hide in more noise.

Adding a screenshot, as Jacky Alciné says, is not enough ‘proof’, as it can easily be altered directly in your browser. An example:

Yesterday I posted my first Tweet from my recent brain implant. It was awesome! So awesome in fact, I made a screenshot of it to preserve the moment for posterity.

In reality I posted from Indigenous (see there’s a link there!), a mobile app that provides my phone with IndieWeb reading and publishing capabilities, which I syndicated to my Twitter account (see there’s another link!). Also awesome, but much less awesome than blogging from a brain implant.

The difference between those two screenshots, getting from true to fake, is that I altered the text of the Twitter website in my browser. Every browser allows you to view a website you visit in ‘developer’ mode. That is helpful for e.g. playing around with colors to see what might work better for your site, but you can also use it to alter content; it’s all the same to your browser. See this screenshot, where I am in the process of changing ‘Indigenous’ into ‘brain implant’.
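To make that concrete, here is a rough sketch of how the same edit looks if you do it from the browser’s developer console rather than by clicking around in the element inspector. The selector is an assumption of mine (Twitter’s markup changes regularly), so treat this purely as an illustration of the principle, not as a recipe that will keep working:

```typescript
// Sketch only: run in the browser's developer console while viewing a tweet.
// The CSS selector below is an assumption; inspect the page yourself to find
// the element that actually holds the tweet text.
const tweetText = document.querySelector('[data-testid="tweetText"]');

if (tweetText && tweetText.textContent) {
  // This only changes the copy of the page in *your* browser; nothing is
  // sent to Twitter, but a screenshot taken now will show the altered text.
  tweetText.textContent = tweetText.textContent.replace(
    "Indigenous",
    "brain implant"
  );
}
```

Reloading the page restores the real tweet, which is exactly why a screenshot alone proves nothing.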

But, you say, tweets might have been deleted, and grabbing a screenshot is a good way of making sure I still have some proof if a tweet does get deleted. That’s true, tweets and other content do get deleted. Like the self-congratulatory tweets/VK/FB messages about the downing of MH17 posted by separatist-supporting accounts, which were quickly scrubbed once it became clear a regular passenger flight had been shot out of the air (see Bellingcat‘s overview). Having a screenshot is useful, but it isn’t enough, if only because the originator may simply say you faked it, as that is so easily done in a browser (see above). You still need to provide a link.

Using the Web Archive, or another archiving site, is your solution. The Web Archive’s mission is to preserve as much of the web and other online content as possible. It is a trustworthy source. They save web pages on their own initiative, but you can also submit any URL for preservation yourself, and it will immediately be saved to the archive. Each archived page has its own URL as well, so you can always reference it. (For this reason many links in Wikipedia point to the archived version of a page as it was at the moment it was referenced in Wikipedia.)
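As an illustration, here is a minimal sketch of how you might submit a URL to the Web Archive programmatically via its public ‘Save Page Now’ endpoint at web.archive.org/save/. The exact redirect and response behaviour is an assumption on my part and may change over time; submitting the URL through the form on the archive’s website works just as well.

```typescript
// Sketch: ask the Wayback Machine "Save Page Now" endpoint to archive a URL.
// Needs a runtime with fetch (e.g. Node 18+ or a browser). Assumes the
// endpoint redirects to the freshly created snapshot, which may change.
async function archiveUrl(url: string): Promise<string> {
  const response = await fetch(`https://web.archive.org/save/${url}`);
  if (!response.ok) {
    throw new Error(`Archiving failed with HTTP status ${response.status}`);
  }
  // After redirects, the response URL points at the archived snapshot,
  // e.g. https://web.archive.org/web/<timestamp>/<original-url>.
  return response.url;
}

// Hypothetical example URL; use the address of the tweet or page to preserve.
archiveUrl("https://twitter.com/example/status/123456789")
  .then((snapshot) => console.log(`Archived copy: ${snapshot}`))
  .catch((err) => console.error(err));
```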

I submitted my tweet from yesterday to the Web Archive, where it now has a web address that neither I nor Twitter can change. This makes it acceptable proof of what I did in fact send out as a tweet yesterday.

Some things I thought worth reading in the past few days

  • A good read on how machine learning (ML) currently merely obfuscates human bias, by moving it into the training data and the coding, to arrive at peace of mind through pretend objectivity. By claiming that it’s ‘the algorithm deciding’, you turn ML into a kind of digital alchemy. It introduced some fun terms to me, like fauxtomation and Potemkin AI: Plausible Disavowal – Why pretend that machines can be creative?
  • These new Google patents show how problematic the current smart home efforts are, including their precursors, the Alexa and Echo microphones in your house. They strip you of agency instead of providing it. These particular patents also nudge you to treat your children much the way surveillance capitalism treats you: as a suspect to be watched, with relationships denuded of the subtle human capability to trust. Agency only comes from being in full control of your tools. Adding someone else’s tools (here not just Google’s, but your health insurer’s, your landlord’s, etc.) to your home doesn’t make it smart; it makes it a self-censorship-promoting escape room, a fractal of the panopticon. We need to start designing more technology that is based on distributed use, not on a centralised controller: Google’s New Patents Aim to Make Your Home a Data Mine
  • An excellent article by the NYT about Facebook’s slide to the dark side. When the student-dorm-room defence of “we didn’t realise, we messed up, but we’ll fix it for the future” fails, you weaponise your own data-driven machine against its critics, thus proving those critics right. Weaponising your own platform isn’t surprising, but it is very sobering and telling. Will it be a tipping point in how the public views FB? Delay, Deny and Deflect: How Facebook’s Leaders Fought Through Crisis
  • Some takeaways from the article just mentioned that we should keep top of mind when interacting with or talking about Facebook: FB knew very early on about being used to influence the US 2016 election and chose not to act. FB feared backlash from specific user groups and opted to unevenly enforce their terms of service / community guidelines. Cambridge Analytica is not an isolated abuse, but a concrete example of the wider issue. FB weaponised their own platform to oppose criticism: How Facebook Wrestled With Scandal: 6 Key Takeaways From The Times’s Investigation
  • There really is no plausible deniability for FB’s execs on their “in-house fake news shop”: Facebook’s Top Brass Say They Knew Nothing About Definers. Don’t Believe Them. So when you do need to admit it, you fall back on the ‘we messed up, we’ll do better going forward’ tactic.
  • As Aral Balkan says, that’s the real issue at hand because “Cambridge Analytica and Facebook have the same business model. If Cambridge Analytica can sway elections and referenda with a relatively small subset of Facebook’s data, imagine what Facebook can and does do with the full set.”: We were warned about Cambridge Analytica. Why didn’t we listen?
  • [update] Apparently all the commotion is causing Zuckerberg to think FB is ‘at war‘, with everyone it seems, which is problematic for a company whose mission is to open up and connect the world, and which is based on a perception of trust. A bunker mentality also probably doesn’t bode well for FB’s corporate culture, and hence its future: Facebook At War.

Something that strikes me as odd in addressing fake news is that the effort is almost exclusively focused on the production and distribution of information, not on the skills and strategies of those taking information in. Partly this is understandable, as forcing transparency about how your information might have been influenced is helpful (especially to see whether what you get presented with is the same as what others / everyone else is presented with). But otherwise it’s as if those receiving information are treated as passive consumers, not as agents in their own right.

“Our best defense against hostile influence, whatever its vector, is to invest in critical thinking skills at all levels of the population so that outlandish claims are seen for what they truly are: emotional exploitation for political or monetary gain”, wrote Nina Jankowicz on how Finnish society instills critical thinking skills.

The question of course is whether governments truly want to inoculate society, or merely want to deflect disinformation and manipulation from specific sources. Seen that way, it’s easier to understand where the focus on technology-oriented solutions, or on ones that presume centralised efforts, comes from.

In networks, smartness needs to be at the endpoints, not in the core. There’s a lack of attention to the information strategies and the filtering and interpreting tactics of those receiving information. Crap-detection skills, for instance, need to be developed, and societies have a duty to self-inoculate. I think the obligation to explain* applies here too: showing others what you do and how.

Here’s a list of postings about my information habits. They’re not fixed, and currently I’m in the process of describing them again, and taking a critical look at them. What are your information habits, have you ever put them into words?

*The obligation to explain is something I’ve adopted from my friend Peter Rukavina: “The benefits of a rich, open pool of knowledge are so great that those who have learned have an obligation to share what they’ve learned.”

Some links I thought worth reading the past few days

  • On how blockchain attempts to create fake scarcity in the digital realm, and why banks etc. are therefore all over it: On scarcity and the blockchain by Jaap-Henk Hoepman
  • Doc Searls has consistently good blogposts about the adtech business and how it is detrimental to publishers and citizens alike. In this blogpost he sees hope for publishing. His lists on adverts and adtech should, I think, be on all our minds: Is this a turning point for publishing?
  • Doc Searls wrote this one in 2017: How to plug the publishing revenue drain – The Graph – Medium
  • In my information routines, offline figures prominently, but it usually doesn’t in my tools. It turns out there is a movement to put offline front and center as a design principle: Designing Offline-First Web Apps
  • Hoodie is a backendless tool for building webapps, with an offline-first starting point: hood.ie intro
  • A Berlin-based company putting offline first as its foremost design principle: Neighbourhoodie – Offline First
  • And then there are Service Workers, about which Jeremy Keith has just published a book (see the minimal sketch after this list): Going Offline
  • Haven’t tested it yet, but we need much more of this type of glue, to reduce the cost of leaving silos and, as a precursor to that, to allow people to walk several walled gardens at the same time: Granary
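Since the service workers mentioned above are what make ‘offline first’ work in practice, here is a minimal sketch of one. The cache name and the list of precached URLs are placeholders of my own, not taken from any of the linked articles; the pattern is simply to cache a few core pages at install time, then answer requests from the cache first and fall back to the network:

```typescript
// sw.ts — minimal offline-first service worker sketch (names are placeholders).
/// <reference lib="webworker" />
declare const self: ServiceWorkerGlobalScope;

const CACHE_NAME = "offline-first-v1";
const PRECACHE_URLS = ["/", "/index.html", "/style.css"];

// On install, store a few core pages so they are available without a network.
self.addEventListener("install", (event) => {
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) => cache.addAll(PRECACHE_URLS))
  );
});

// On fetch, answer from the cache first and fall back to the network.
self.addEventListener("fetch", (event) => {
  event.respondWith(
    caches.match(event.request).then((cached) => cached ?? fetch(event.request))
  );
});
```

Register it from a page with navigator.serviceWorker.register('/sw.js') and the cached pages keep working when the connection drops.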