Probably the top left gives the most realistic information. Image by Brooke Novak, license CC BY

An organisation that says it wants to work data-driven and sees ethics as a key design ingredient needs imho to take a very close look at how it sets KPIs and other indicators. I recently came across an organisation that says both those things, but whose process for setting indicators appears to have been left as a naive exercise to internal teams.

To begin with, indicators easily become goals in themselves, and people will start gaming the measurement system to hit the set targets. (Think of call centres picking up the phone and then immediately disconnecting, because they are scored on the number of calls answered within three rings, while the length of the calls they do pick up isn't checked.)

Measurement also isn't neutral. It's an expression of values, whether or not you have articulated those values. When you measure the number of traffic deaths as an indicator for road safety, for instance, but not injuries or accidents as such, nor their locations, you'll end up minimising traffic deaths but not maximising road safety. The absence of deaths isn't the presence of road safety. Deaths are just one expression, albeit the most irreparable one, of the consequences of unsafety. Different measurements lead to different real-life actions and outcomes.

'Gauges' by Adam Kent, license CC BY

When you set indicators you need to evaluate what they cover, and more importantly what they don't cover. You need to check whether the overall set of indicators is balanced, where some indicators by definition deteriorate when others improve (so balance needs to be sought deliberately). And you need to check whether the assumptions behind each indicator have been made explicit and, where needed, dealt with.
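To make that review step concrete, here is a minimal sketch of what such a check could record; the indicators, the fields and the balancing metric are all illustrative assumptions, not anything from the organisation mentioned above.

```python
# Illustrative review of an indicator set: every entry below is a made-up
# example, not taken from any real organisation's KPI list.
indicators = [
    {
        "name": "traffic deaths",
        "covers": "fatal accidents",
        "leaves_out": ["injuries", "accidents as such", "accident locations"],
        "assumptions": ["fewer deaths means safer roads"],
        "counterbalanced_by": None,  # nothing in the set pulls the other way
    },
    {
        "name": "calls answered within 3 rings",
        "covers": "pick-up speed",
        "leaves_out": ["call duration", "whether the caller was actually helped"],
        "assumptions": ["a quickly answered call is a well-handled call"],
        "counterbalanced_by": "issue resolution rate",
    },
]

# Flag indicators with no counterweight, unexamined assumptions, or known gaps.
for ind in indicators:
    if ind["counterbalanced_by"] is None:
        print(f"'{ind['name']}' has no balancing indicator")
    for assumption in ind["assumptions"]:
        print(f"'{ind['name']}' rests on the assumption: {assumption}")
    print(f"'{ind['name']}' leaves out: {', '.join(ind['leaves_out'])}")
```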

Otherwise you are bound to end up with blind spots, lack of balance, and potential injustices. The indicators you define also determine what data gets collected, and thus what your playing field is when you have a 'data driven' way of working. That way any blind spot, lack of balance or injustice will end up even more profoundly in your decisions. Because where indicators mostly look back in time at output, data-driven use of the data underlying those indicators actively determines actions and thus (part of) future output, turning your indicators into a much more direct, and sometimes even automated, feedback loop.

Only if you've deliberately defined your true north can you use your measurements to determine the direction of your next steps. 'Compass' by Anthony, license CC BY-ND

My company has now fully moved out of Slack and into Rocket.Chat. We're hosting our own Rocket.Chat instance on a server in an Amsterdam data center.

We had been using Slack since 2016, both for ourselves and with some network partners we work with. We never invited (government) clients in, because we couldn't guarantee where the shared data would be located. At some point we passed the free tier's limits, meaning we'd have to upgrade to a paid plan to keep access to our full message history.

Rocket.Chat is an open source alternative that is offered as a service, but can also be self-hosted. We opted for a Rocket.Chat-specific package with OwnCube. It's an Austrian company, but our Rocket.Chat instance is hosted in the Netherlands.

Slack offers an export function for all your data that works very well. Rocket.Chat can easily import such a Slack archive, including user accounts, channels and everything else.
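For anyone making the same move, here's a small sketch of the kind of sanity check you could run on the Slack archive before handing it to Rocket.Chat's import feature. It assumes the standard layout of a Slack export (channels.json, users.json, and one folder of daily JSON message files per channel); the filename is hypothetical.

```python
"""Quick sanity check of a Slack export zip before importing it
into a Rocket.Chat instance."""
import json
import zipfile
from collections import Counter

EXPORT_ZIP = "slack-export.zip"  # hypothetical filename

with zipfile.ZipFile(EXPORT_ZIP) as zf:
    channels = json.loads(zf.read("channels.json"))
    users = json.loads(zf.read("users.json"))

    # Count messages per channel by summing the daily message files.
    messages = Counter()
    for name in zf.namelist():
        parts = name.split("/")
        if len(parts) == 2 and parts[1].endswith(".json"):
            messages[parts[0]] += len(json.loads(zf.read(name)))

print(f"{len(users)} users, {len(channels)} channels")
for channel in channels:
    print(f"#{channel['name']}: {messages[channel['name']]} messages")
```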

With the move complete, we now have full control over our own data and access to our entire history. The cost of hosting (11.50 / month) is less than what Slack would charge for just 2 users on its annual plan (12.50 / month), and we have 14 users. That works out to over 85% in cost savings. Adding users, such as clients during a project, no longer means higher costs either, and it will always be a better deal than Slack as long as there's more than 1 person in the company.
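To make the arithmetic explicit, a quick sketch, assuming the per-user price implied by the figures above (12.50 per month for 2 users on Slack's annual plan):

```python
# Cost comparison sketch; the per-user Slack price is inferred from the
# 12.50/month-for-2-users figure, not quoted from Slack's price list.
SLACK_PER_USER_MONTHLY = 12.50 / 2   # implied per-user price on the annual plan
SELF_HOSTED_MONTHLY = 11.50          # flat price of the self-hosted Rocket.Chat package
USERS = 14

slack_monthly = SLACK_PER_USER_MONTHLY * USERS       # 87.50 per month
savings = 1 - SELF_HOSTED_MONTHLY / slack_monthly    # roughly 0.87

print(f"Slack for {USERS} users: {slack_monthly:.2f}/month")
print(f"Self-hosted Rocket.Chat: {SELF_HOSTED_MONTHLY:.2f}/month (flat)")
print(f"Savings: {savings:.0%}")                     # ~87%, i.e. 'over 85%'
```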

We did keep the name 'slack' as the subdomain on which our self-hosted instance resides, to ease the transition somewhat. All of us switched to the Rocket.Chat desktop and mobile apps (Elmine from Storymines helping with navigating the installs and activating the accounts for those who wanted some assistance).

Visually, and in terms of user experience, it's much the same as Slack. The only exception is the creation of bots, which requires some server-side wrangling I haven't looked into yet.

The move to Rocket.Chat is part of a path to more company-wide information hygiene (e.g. we now make sure all of us use decent password managers with the data hosted on EU servers, and the next step is running our own cloud, e.g. for collaborative editing with clients and partners), and more information security.

In the (good and useful!) session by the VNG about the WOO (Wet open overheid, the Dutch Open Government Act) at Overheid360 earlier this month, the attendees were asked several questions. The last one, when you think the implementation of the WOO will be completed, produced the photo above.

I still can't quite get over it. The problem with the WOO is obviously that the 'getting your information management in order' it calls for leads to more work than a government organisation says it can handle or has budget for (or is willing to prioritise).

Everyone in the room, according to this photo, said they would not or could not comply with the law. Nobody said they would have things in order within 5 years, the term stated in the law. Two out of 34 (6%) thought they would manage it in 8 years, and they were labelled enormous optimists. The others thought it would take until 2030 (56%), or never get finished at all (38%).

The WOO is blamed for causing extra work. Having your information management in order, who demands that anyway, seems to be the thinking. Yet the WOO in its current form is already a compromise. The first version was dismissed as unfeasible, and in the new version the legislator gives government institutions five years, along with the obligation to show that you are actually making an effort to catch up within those five years. This second WOO is already a retake. And not even a second chance, but a third.

Forty years ago, in 1980, the WOB (Wet openbaarheid van bestuur) came into force, which regulates public access to government information. Since then hardly anything has been done to make openness a foundational principle of information management. A WOB request is still experienced as a nuisance, because then you have to go searching for where you keep your stuff. Because you never adapted your information management to handle openness requests quickly. In Norway you get the requested information by return of post, but here a WOB request (and every request for documents, in whatever form, is a WOB request; even that realisation is still absent after 40 years) is always extra work, on top of your regular tasks. As if disclosure weren't a legal duty. That has always led to squabbling, and the legislator has only rewarded government institutions for those convulsions (for instance by removing means of enforcement other than going to court).

Now that mandatory active disclosure is drawing nearer, it becomes even more visible that information management is not set up for it. It already wasn't set up for the passive disclosure required by the WOB. Some time ago I met a head of business information at a government institution who asked me, "so you're saying that openness is defined in law?". Yes, that's what I said. And it has been for much longer than anyone in that session where I took the photo above has worked in government.

There are various things that have long been mandatory to disclose actively (think of decisions, permits etc.), and that works. So there is no real reason to assume that it couldn't also be done, within five years, for the list of other items the WOO names.

The slide at the top shows that people have already given up before the WOO even exists.
Apparently it is a very radical idea to draw up a general openness and data/information strategy that also promises to complete the implementation of the WOO neatly on time. An approach in which you see active disclosure as an opportunity. As an instrument with which you can influence the behaviour of all kinds of external stakeholders. Just as you now deploy funding (subsidies) and regulation to influence behaviour, disclosure is a third policy instrument. And the cheapest of the three.

It all reminds me of the image below, which applied in every one of my earlier knowledge management and change projects: "We don't have time for fundamental improvements, because we're already so busy with our normal work and putting out fires."

'Too Busy To Improve' (Square Wheels) by Alan O'Rourke, license CC BY

Ethics is the expression of values in actual behaviour. So when you want to do data ethics it is about practical issues, and about reconsidering entrenched routines. In the past few weeks I successfully challenged some routine steps in a client's organisation, resulting in better and more ethical use of data. The provision of subsidies to individuals is arranged by specific regulations. The regulations describe the conditions and limitations for getting a subsidy, and specify a set of requirements for applying for a subsidy grant. Such subsidy regulations, once agreed, have legal status.

With the client we're experimenting with making it vastly less of an effort, for both the requester and the client, to process a request. Only then does it make sense to provide smaller subsidies to individual citizens. Currently there is a rather high lower limit for subsidies; otherwise the cost of processing a request would be higher than the sum involved, and the administrative demands on the requester would be too big in comparison to the benefit received. Such a situation typically leads to low uptake of the available funding and to ineffective spending, both of which lower the intended impact (in this case reducing energy usage and CO2 emissions).

In a regular situation the drafting of the regulation and the later creation of an application form would be fully separate steps, and the form would probably blindly do what the regulation implies or demands, while also introducing some overshoot out of caution.

Our approach was different. I took the regulation and lifted out all criteria that would require some sort of test, or demands that need a piece of information or data. Next, for each of those criteria and demands, I marked what data would satisfy them, the different ways that data could be collected, and what role it plays in the process. The final step was listing the fields needed in the form and/or those suggested by the form designers, and determining how filling those fields can be made easier for an applicant (e.g. by providing pick lists).

A representation of the steps taken / overview drawn

What this drawing of connections allows is asking questions about the need and desirability of collecting a specific piece of data. It also lets you see what changing a field in the form means for how well the form complies with the regulation, or which fields and which data flows need to change when you change the regulation.
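As an illustration of that traceability, here's a minimal sketch with invented criteria, data items and field names (nothing from the client's actual regulation): mapping each criterion to the data that satisfies it, and each form field to the data it collects, makes both questions mechanical, namely which fields trace back to no criterion and which criteria the form leaves uncovered.

```python
# Hypothetical mapping between regulation criteria, the data that satisfies
# them, and the form fields that collect that data. All names are invented.
criteria = {
    "applicant lives in the municipality": {"data": "address", "source": "base registry lookup"},
    "dwelling built before 1990": {"data": "construction year", "source": "cadastre lookup"},
}

form_fields = {
    "address": "address",                    # field -> data item it collects
    "construction_year": "construction year",
    "household_income": "income statement",  # collected, but required by nothing
}

required_data = {c["data"] for c in criteria.values()}
collected_data = set(form_fields.values())

print("Fields with no backing criterion:", collected_data - required_data)
print("Criteria not covered by the form:", required_data - collected_data)
```

In our case this mapping lived in a drawing rather than in code, but the questions it surfaces are the same.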

Allowing these questions to be asked led to the realisation that several hard demands for information in the draft regulation actually play no role in determining eligibility for the subsidy involved (they were simply holdovers from another regulation used as a template, and things the drafters thought were 'nice to have'). As we were involved early, we could still influence the draft regulation, and those unneeded hard demands were removed just before the regulation came up for an approval vote. Now that we are designing the form, the same overview allows us to ask whether a field is really needed, where the organisation is being overcautious about an unlikely scenario of abuse, or where a field does not match an actual requirement in the regulation.

Questioning the need for specific data, showing how collecting it would complicate the client's work because it comes with added responsibilities, and being able to ask those questions before the regulation was set in stone allowed us to end up with a more responsible approach that simultaneously reduced the administrative hoops for both applicant and client to jump through. The more ethical approach is now also the more efficient and effective one. But only because we were there at the start: had we asked those questions after the regulation was set, they would have increased the cost of doing the ethically better thing.

The tangible steps taken are small, but they have real impact, even if that impact would likely only have become manifest had we not taken those steps. Things that cause less friction get noticed less. Baby steps for data ethics, therefore, but I call it a win.