
Time for an RSS Revival

Wired is calling for an RSS revival.

RSS is the most important piece of internet plumbing for following new content from a wide range of sources. It allows you to download new updates from your favourite sites automatically and read them at your leisure. Dave Winer, forever dedicated to the open web, created it.
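By way of illustration (this sketch is mine, not from Wired or Winer), the core of that plumbing is small enough to show in a few lines of Python, here using the feedparser library; the feed URL is a placeholder.

```python
# Minimal sketch of what any RSS reader does at its core:
# fetch a feed, parse it, and list the new entries for later reading.
# Requires the third-party feedparser library: pip install feedparser
import feedparser

# Placeholder URL; substitute any site's RSS/Atom feed.
FEED_URL = "https://example.com/feed.xml"

feed = feedparser.parse(FEED_URL)

print(feed.feed.get("title", "(untitled feed)"))
for entry in feed.entries:
    # Each entry carries at least a title and a link back to the source.
    print(f"- {entry.get('title', '(no title)')}: {entry.get('link', '')}")
```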

I used to be a very heavy RSS user. I tracked hundreds of sources on a daily basis, not as news but as a way to stay informed about the activities and thoughts of people I was interested in. At some point that stopped working: popular RSS readers were discontinued, most notably Google Reader; many people migrated to the Facebook timeline; platforms like Twitter stopped providing RSS feeds to make you visit their platform; and many people stopped blogging. But with FB in the spotlight, there is some interest in refocusing on the open web, and with it on RSS.

Currently I am repopulating my RSS reading ‘antenna’ from scratch, following around 100 people again.

Wired, in its call for an RSS revival, suggests a few RSS readers. As I always have, I use a desktop RSS reader, currently ReadKit. The FB timeline presents stuff to you based on Facebook’s algorithmic decisions. As mentioned, I definitely would like to have smarter ways of shaping my own information diet, but then with me in control and not as the one being commoditised.

So it’s good to read that RSS reader builders are looking at precisely that.
“Machines can have a big role in helping understand the information, so algorithms can be very useful, but for that they have to be transparent and the user has to feel in control. What’s missing today with the black-box algorithms is where they look over your shoulder, and don’t trust you to be able to tell what’s right,” says Edwin Khodabakchian, cofounder and CEO of RSS reader Feedly (which currently has 14 million users). That is more or less precisely my reasoning as well.

Algorithms That Work For Me, Not Commoditise Me

Stephanie Booth, a long-time blogging connection, has been writing about reducing her Facebook usage and increasing her blogging. At one point she says:

As the current “delete Facebook” wave hits, I wonder if there will be any kind of rolling back, at any time, to a less algorithmic way to access information, and people. Algorithms came to help us deal with scale. I’ve long said that the advantage of communication and connection in the digital world is scale. But how much is too much?

I very much still believe there’s no such thing as information overload, and fully agree with Stephanie that the possible scale of networks and connections is one of the key affordances of our digital world. My RSS-based filtering, as described in 2005, worked better with more information than with less. Our information strategies need to reflect, and be part of, the underlying complexity of our lives.
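The 2005 post isn’t quoted here, so the following is only a hypothetical sketch of that kind of filtering, with made-up interest keywords: a transparent scoring pass over everything that comes in, keeping only the best few. The more sources feed into it, the better the top of the ranking gets.

```python
# Hypothetical sketch of RSS-based filtering (not the actual 2005 setup):
# score every incoming entry against a personal list of interests and
# keep only the best matches, however many feeds are flowing in.
MY_INTERESTS = {"open web": 3, "rss": 2, "networked agency": 3, "blogging": 1}

def score(title: str, summary: str) -> int:
    """Transparent scoring: sum the weights of the interests that appear."""
    text = f"{title} {summary}".lower()
    return sum(w for term, w in MY_INTERESTS.items() if term in text)

def filter_entries(entries: list[dict], keep: int = 20) -> list[dict]:
    """Rank entries from all feeds together and keep a readable top slice."""
    scored = [(score(e.get("title", ""), e.get("summary", "")), e) for e in entries]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [e for s, e in scored[:keep] if s > 0]
```

Because the reading list stays capped at a fixed size, adding another hundred feeds only improves what surfaces; it never lengthens what you have to read.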

Algorithms can help us with that scale, just not the algorithms that FB uses on us. For algorithms to help, like any tool, they need to be ‘smaller’ than us, as I wrote in my networked agency manifesto. We need to be able to control their settings, tinker with them, deploy them and stop them as we see fit. The current application of algorithms, as they usually need lots of data to perform, more or less demands a centralised platform like FB to work. The algorithms that will really help us scale are the ones we can use for our own particular scaling needs. For that, the creation, maintenance and usage of algorithms needs to have a much lower threshold than it has now. That is why I placed algorithms in my ‘agency map’.
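As a purely hypothetical illustration of what ‘smaller’ than us could look like (the names and weights here are mine, not from the manifesto): an algorithm whose every setting sits in plain sight, owned by the user, who can tinker with it or switch it off entirely.

```python
# Hypothetical sketch of an algorithm that stays 'smaller' than its user:
# every knob is a visible setting the user owns, changes, or disables.
from dataclasses import dataclass, field

@dataclass
class MySettings:
    enabled: bool = True         # the user can stop the algorithm outright
    recency_weight: float = 1.0  # how strongly newer items are favoured
    # Per-source trust weights, set and revised by the user, not inferred.
    source_weights: dict = field(default_factory=dict)

def rank(entries: list[dict], settings: MySettings) -> list[dict]:
    """Rank feed entries using only the user's own inspectable weights."""
    if not settings.enabled:
        return entries  # switched off: the plain unranked list comes through

    def score(e: dict) -> float:
        trust = settings.source_weights.get(e.get("source", ""), 1.0)
        return trust - settings.recency_weight * e.get("days_old", 0)

    return sorted(entries, key=score, reverse=True)

# Tinkering means editing the settings directly; there is no black box:
settings = MySettings(recency_weight=0.5, source_weights={"stephanie": 2.0})
```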

Going back to a less algorithmic way of dealing with information isn’t an option, nor, I think, something to desire. But we do need algorithms that really serve us and perform to our information needs. We need fewer algorithms that purport to aid us in dealing with the daily river of newsy stuff but really commoditise us at the back-end.