In light of Google’s decision to end sales of its Google Glass eyewear, we believe it is time for us to pause and reset our strategy as well. The last year has shown us that resistance is not futile, and that we can support technological progress without blindly accepting every creepy, invasive technology that corporations want to foist upon us. It has also shown us that public debate can help establish sensible norms around the use of technology in public places.
The ‘Stop the Cyborgs’ campaign was never solely about Google Glass, but about the invasive nature of wearables, dataveillance and the internet of things more generally. However, this minor victory seems like a good time to pause our campaigning activities in order to rethink our strategy and free up time for other projects. We therefore won’t be updating this blog, tweeting, responding to interview requests, or doing any talks for a while.
Glass will probably be back in some form, and there are plenty of other invasive devices, such as the Narrative Clip, drones and spy cams, still being produced, so you might still want to keep your no-surveillance signs up and your Cyborg Unplug plugged in.
Adam & Jack
The data you unconsciously produce by going about your day is being stored up over time by one or more entities. And now it could be used against you in court.
Inspired by the recent case of Fitbit history being used in a personal injury claim, the excellent Kate Crawford discusses the rise of the algorithmic expert witness in The Atlantic. The important point is not just that wearable-tech records may become evidence in a court of law, but that the data – or rather the conclusions drawn by analytics companies – becomes a new kind of witness.
The decisions about what is “normal” and “healthy” that these companies come to depend on which research they’re using. Who is defining what constitutes the “average” healthy person? This contextual information isn’t generally visible. Analytics companies aren’t required to reveal which data sets they are using and how they are being analyzed.
The current lawsuit is an example of Fitbit data being used to support a plaintiff in an injury case, but wearables data could just as easily be used by insurers to deny disability claims, or by prosecutors seeking a rich source of self-incriminating evidence.
We should therefore not only be aware that wearable sensors are surveillance devices, and that the records they produce can be subpoenaed, but also resist the idea that the data is a source of objective truth, and instead see it as the opinion of the experts or companies interpreting it.
Ultimately, the Fitbit case may be just one step in a much bigger shift toward a data-driven regime of “truth.” Prioritizing data—irregular, unreliable data—over human reporting, means putting power in the hands of an algorithm. These systems are imperfect—just as human judgments can be—and it will be increasingly important for people to be able to see behind the curtain rather than accept device data as irrefutable courtroom evidence. In the meantime, users should think of wearables as partial witnesses, ones that carry their own affordances and biases.
Read the article here
In this ‘lab test’ we look at a very popular covert surveillance device, cunningly disguised as that most innocuous of devices, a smoke detector. Unlike wireless cameras disguised as wall clocks or iPod docks, spy devices in this form are far less likely to be tampered with, let alone discovered. Not only are they just out of reach, they’re considered part of the local emergency infrastructure, meaning it’s far less likely someone will take them down for inspection. All the while, from their ceiling-mounted position, they’re ideal for monitoring the activities within a room.
[More details] [Pre-order]
Michael Keller & Josh Neufeld have turned Unraveling Privacy: The Personal Prospectus & the Threat of a Full Disclosure Future – together with Paul Ohm’s Databases of Ruin, an interview with danah boyd, and the argument that big data is a civil rights issue – into a graphic novella called ‘Terms of Service’ for Al Jazeera America. It is a great read that explains what is really at stake.
Read or download the whole thing here
Cyborg Unplug can now be pre-ordered at https://plugunplug.net
Cyborg Unplug is an anti-wireless-surveillance system for the home and workplace. It detects selected devices known to pose a risk to personal privacy and kicks them from your wireless network, breaking their uploads and streams. Detected wireless devices currently include wearable ‘spy’ cameras and microphones, Google Glass, Dropcam, small drones/copters, and a variety of popular spy devices disguised as familiar objects.
€52.- (ca. $66.-)
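The product’s internals aren’t documented in this post, but detection of this kind is typically done by matching the manufacturer prefix (the OUI, i.e. the first three octets) of an observed Wi-Fi MAC address against a blocklist of known device makers. A minimal sketch in Python – the OUI values and the `classify` helper are illustrative examples, not taken from Cyborg Unplug:

```python
# Illustrative sketch of OUI-based device detection. The OUI values below
# are examples only; a real detector would use the IEEE OUI registry.
BLOCKED_OUIS = {
    "f8:8f:ca": "Google Glass (example OUI)",
    "30:8c:fb": "Dropcam (example OUI)",
}

def classify(mac):
    """Return a device label if the MAC's OUI is blocklisted, else None."""
    oui = mac.lower()[:8]  # first three octets, e.g. "f8:8f:ca"
    return BLOCKED_OUIS.get(oui)

print(classify("F8:8F:CA:12:34:56"))  # matches the example Glass OUI
print(classify("00:11:22:33:44:55"))  # unknown vendor -> None
```

A device that matches would then be disconnected from the network; the matching step itself really is this simple.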
Ars reports that the chief executive officer of a mobile spyware maker was arrested over the weekend, charged with illegally marketing an app that monitors calls, texts, videos, and other communications on mobile phones “without detection”.
Can we look forward to ‘Sandy’ Pentland, Zuckerberg, Jeff Bezos, Riccardo (Candy Crush) Zacconi, and whoever the CEO of PassTime is being arrested next? After all, they collect even more personal and location data.
An interesting approach by Jaeyeon Jung & Matthai Philipose of Microsoft Research. The basic idea is to turn off wearable cameras such as the Autographer whenever people are detected by a low-resolution far-infrared imager, unless those people have expressed consent.
Small and always-on, wearable video cameras disrupt social norms that have been established for traditional hand-held video cameras, which explicitly signal when and which subjects are being recorded to people around the camera-holder. We first discuss privacy-related social cues that people employ when recording other people (as a camera-holder) or when being recorded by others (as a bystander or a subject). We then discuss how low-fidelity sensors such as far-infrared imagers can be used to capture these social cues and to control video cameras accordingly in order to respect the privacy of others. We present a few initial steps toward implementing a fully functioning wearable camera that recognizes social cues related to video privacy and generates signals that can be used by others to adjust their privacy expectations.
read the full paper here
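The control policy described above reduces to a simple rule: record only while everyone in view has opted in. A hedged sketch of that rule – the names `camera_should_record`, `detected_people` and `consented_ids` are our own shorthand, not taken from the paper:

```python
# Sketch of the privacy-respecting recording policy: pause the camera
# whenever the imager detects a person who has not opted in.
# The identifiers here are illustrative, not from the paper.

def camera_should_record(detected_people, consented_ids):
    """Record only if every detected person has expressed consent."""
    return all(person in consented_ids for person in detected_people)

print(camera_should_record([], set()))             # True: nobody in view
print(camera_should_record(["alice"], {"alice"}))  # True: consent given
print(camera_should_record(["bob"], {"alice"}))    # False: pause recording
```

The hard part, of course, is the sensing and recognition that feeds `detected_people`, which is what the paper’s far-infrared approach addresses.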
The key decisions that shape people’s lives—decisions about jobs, healthcare, housing, education, criminal justice and other key areas—are, more and more often, being made automatically by computers. As a result, a growing number of important conversations about civil rights, which focus on how these decisions are made, are also becoming discussions about how computer systems work.
The September 2014 report on social justice and technology begins to answer the question “How and where, exactly, does big data become a civil rights issue?”
Read the report here
The report is generally very good and provides concrete examples. However, it claims on the basis of sparse anecdotal evidence that ‘body-worn cameras are poised to help boost accountability for law enforcement and citizens’. We beg to differ – life, as always, is more complicated than that.
There is no such thing as ‘technology.’
Anyone who views critics of particular technologies as ‘luddites’ fundamentally misunderstands what technology is. There is no such thing as ‘technology.’ Rather, there are specific technologies, produced by specific economic and political actors, and deployed in specific economic and social contexts. You can be anti-nukes without being anti-antibiotics. You can be pro-surveillance of powerful institutions without being pro-surveillance of individual people. You can work on machine vision for medical applications while campaigning against the use of the same technology for automatically identifying and tracking people. How? Because you take a moral view of the likely consequences of a technology in a particular context.