Fifteen years ago today I blogged this brainstorming exercise about how internet connectivity for objects might make for different and new objects of sociality, a way to interact with our environment differently. Not a lot of that has happened, let alone become common. What has happened is IoT being locked up in device-and-mobile-app pairings. Our Hue lights are tied to the Hue app, and if I’d let it collect e.g. behavioural data it would go to Philips first, not to me. A Ring doorbell (now disabled) and our Sonos speakers are the same. Those rigid pairings are a far cry from me seamlessly interacting with my environment. One exception is our Meet Je Stad sensor in the garden, as it runs on LoRaWAN and the local citizen science community has the same access to the data as I do (and I run a LoRa gateway myself, adding another control point for me).
Incoming EU legislation may help to get more agency on this front. First and foremost, the Data Act, when it is finished, will make it mandatory that I can access the data I generate by using devices like those Hue lights and Sonos speakers, and any others you and I may have in use (the data from the inverter on your solar panels, for instance). It will also allow third parties to use that data in real time. A second relevant law, I think, is the Cyber Resilience Act, which regulates the cybersecurity of any ‘product with digital elements’ on the EU market and makes it mandatory to provide additional (technical) documentation on that topic.
The internet of things increases the role of physical objects as social objects enormously, because it adds heaps of context that can serve relationships. Physical objects have always been social objects, but only in their immediate physical context. … Making physical objects internet-aware creates a slew of possible new uses for them as social objects. And if you [yourself] add more sensors or actuators to a product (object hacks, so to speak), the list grows accordingly.
It shows about 3500 cameras listed in the Netherlands, surely a tiny fraction of the total.
And only a handful in my city, none in my neighbourhood. Again, a low number far from reality.
This reminds me of a game that, I think, Kars Alfrink and/or Alper Çuğun conceptualised, where you had to reach a destination in Amsterdam while avoiding the views of the cameras along the way. It also reminds me how a former colleague had a basic camera detection device in his car years ago that became useless as surveillance cameras at private homes grew in number. It detected not just speed camera signals but also all those other cameras. At some point, driving down a residential street, especially in more affluent neighbourhoods, the warning noises the device made were constant.
I’ll be on the lookout for cams in our area. There are, I know, two in our courtyard (one on our front door, not connected though, and one on a neighbour’s front door).
Why can’t I filter search results in my phone’s app store by who built an app, specifically the jurisdiction they fall under? That would let me better judge what might happen to data gathered by an app.
It seems sharing playlists is no longer an innocent behaviour, nor is playing YouTube with the sound on in the presence of automated speech recognition, like Google’s, Amazon’s and Apple’s cloud-connected microphones in your living room. “CommanderSongs can be spread through Internet (e.g., YouTube) and radio.”
The easiest mitigation of course is not having such microphones in your home in the first place. Another is running your own ASR, with the 95% of commands that are standard handled locally on the device. Edge first, as Candle proposes, not cloud first.
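As a minimal sketch of that edge-first idea (the command list and function below are my illustration, not Candle’s actual design): a device keeps a fixed vocabulary of household commands and matches transcripts locally, so nothing needs to reach a cloud ASR service. A side effect is that hidden-command attacks only work if the embedded command happens to be in the local set.

```python
# Illustrative edge-first command handling: a fixed local vocabulary,
# matched on-device. Names and commands are made up for this sketch.
LOCAL_COMMANDS = {
    "lights on": "turn_lights_on",
    "lights off": "turn_lights_off",
    "play music": "start_playback",
}

def handle_transcript(text: str) -> str:
    """Map a locally recognised transcript to a device action.

    Anything outside the known vocabulary is ignored rather than
    forwarded to a cloud service.
    """
    return LOCAL_COMMANDS.get(text.strip().lower(), "ignored")

print(handle_transcript("Lights ON"))      # turn_lights_on
print(handle_transcript("open the door"))  # ignored
```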
The popularity of ASR (automatic speech recognition) systems, like Google Voice, Cortana, brings in security concerns, as demonstrated by recent attacks. The impacts of such threats, however, are less clear, since they are either less stealthy (producing noise-like voice commands) or requiring the physical presence of an attack device (using ultrasound). In this paper, we demonstrate that not only are more practical and surreptitious attacks feasible but they can even be automatically constructed. Specifically, we find that the voice commands can be stealthily embedded into songs, which, when played, can effectively control the target system through ASR without being noticed. For this purpose, we developed novel techniques that address a key technical challenge: integrating the commands into a song in a way that can be effectively recognized by ASR through the air, in the presence of background noise, while not being detected by a human listener. Our research shows that this can be done automatically against real world ASR applications. We also demonstrate that such CommanderSongs can be spread through Internet (e.g., YouTube) and radio, potentially affecting millions of ASR users. We further present a new mitigation technique that controls this threat.
This week’s heat wave is breaking records across Europe, including here in the Netherlands. So I’ve kept an eye on the temperature in our garden. Our sensor is part of a city-wide network of sensors, which includes two sensors nearby. Of the three, ours indicates the lowest temperature at 36.8 (at 16:45); the other two hover just under 40 and at 41.8 respectively. Such differences are caused by the surroundings of the sensor. That ours is the lowest is because it’s placed in a very green garden, while the others are out on the street. In our completely paved and bricked-up courtyard the temperature is 42.1 in the shade, due to the radiant heat of sun and stones. Goes to show that greenery in a city is key to lowering temperatures.
Three sensors in our neighbourhood; ours is in the middle, showing the lowest temperature. Note that the colour scale is relative, running from 36.6 to 41.8 for these 3 sensors.
In the past days since our return from France the temperature has been steadily rising, as per the graph below (which currently ends at the peak of 36.8 at 16:45). Staying inside is the best option, although the steadily rising nightly minimums (from 15 to above 20) mean that the nights are slowly becoming more uncomfortable, as the outside temperature will stay above the indoor temperature during most or all of the night.
UPDATE: as of noon on 26/7, here you can see how the night minimum jumped 5 degrees in 24 hours, bringing it above the indoor temperature for the entire night, except for a brief moment around 6 am. At noon, the maximum of the day before has already nearly been reached.
The way to make this graph yourself:
Go to meetjestad.net/data, where you can select various data types and time frames. Our sensor is number 51, and I selected a time frame starting at July 19th at midnight. This allows me to download the data as CSV.
The data in that download is tab-separated, not comma-separated, when you select a comma to be used as the decimal separator.
The file contains columns for the sensor number and its latitude and longitude, which are not needed as this is data for just one sensor. Likewise, empty columns for measurements my sensor kit doesn’t have sensors for, such as particulate matter, can be removed. Finally, the columns for battery level and humidity are also not needed on this occasion.
With the remaining columns, time and temperature, it is easy to build the graph. In this case I replaced the timestamps with sequential numbers, as I intend to make a sparkline graph with it later.
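The steps above can be sketched in a few lines of Python. The column names and sample values below are made up for illustration (check them against your own download); the key points are the tab delimiter, converting the decimal comma to a point, and replacing timestamps with sequential numbers.

```python
import csv
import io

# Hypothetical excerpt of a Meetjestad export: tab-separated,
# comma as decimal separator. Column names are assumptions.
raw = """id\tlatitude\tlongitude\ttimestamp\ttemperature
51\t52,15\t5,38\t2019-07-19 00:14\t18,4
51\t52,15\t5,38\t2019-07-19 00:44\t18,1
51\t52,15\t5,38\t2019-07-19 01:14\t17,9
"""

rows = list(csv.DictReader(io.StringIO(raw), delimiter="\t"))

# Keep only the temperature, convert the decimal comma to a point,
# and number the readings sequentially for a sparkline.
series = [
    (i, float(r["temperature"].replace(",", ".")))
    for i, r in enumerate(rows)
]
print(series)  # [(0, 18.4), (1, 18.1), (2, 17.9)]
```

The other columns (sensor number, coordinates, battery, humidity) are simply never read, which has the same effect as deleting them from the file.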
Bayou is an AI that will make software for you based on a basic description. It was trained on all the code available on GitHub and is released as an open source tool. Today a paper on this will be presented at the Sixth International Conference on Learning Representations in Vancouver.