“Go Ahead, I’m Listening…” — Siri
Ever had a conversation with a friend, only to see a related ad on your social media feed a few hours later? You’re not alone. Everyone I’ve spoken to in recent months has a story to share. “I was talking about needing a break from work… next thing I know, I get an ad for plane tickets on Instagram!” “A friend and I were talking about a new acquaintance and some hours later, he shows up on both of our suggested friends lists on Facebook!” Such anecdotes have become so commonplace that a quick Google search for “Is Facebook listening?” yields more than 506 million results.
It is therefore unsurprising that tech-utopists and critics alike have been calling attention to the death of privacy in recent years. In familiar Nietzschean fashion, artists, writers, and researchers on both ends of the spectrum are debating the validity of the “privacy is dead” truth-claims. Whilst tech-utopists argue that the loss of privacy is a natural evolution in an information age, tech-critics are wary of the rise of the surveillance economy and its curtailing of individual liberty and freedom — and with good reason. In the wake of a myriad of privacy breaches — from the Cambridge Analytica scandal, to Google Chrome as a proxy for state monitoring, to China’s biometric surveillance, to smart speakers doubling as wiretapping devices — privacy has become an increasingly elusive concept. In the meantime, our loss of privacy greases the wheels of surveillance capitalism.
I grew up in pragmatic Singapore, where we were constantly reminded that there are no free lunches in the world. Being the idealist that I am, I tried to live outside the neoliberal framework of transactional relationships. But perhaps there is some wisdom in the old adage after all. The Santa Claus of Big Tech never gave us these social media platforms for free, and we have been paying ever since we first created an account. Trackers constantly watch our every like, post, share, and conversation, turning our data into profiles that are sold to advertisers, who in turn push products and manipulate our behaviors. We have been paying for the use of these platforms all along, in the form of our data, privacy, and attention.
From Street to Home Surveillance
The key question I’m asking is the extent to which we’ve been outsmarted by our smart devices: to what extent have they become tools of surveillance in the guise of convenience? Don’t get me wrong, I do see the value of the convenience that Alexa, Google, or Siri brings, especially for folks with limited mobility. What I’m more concerned about is the lack of transparency around the kinds of data that are extracted from consumers. Amazon has come under fire for its always-on, always-listening devices, but the collection, analysis, and recording of our data remain a black box (no pun intended). Researchers have also proven these technologies vulnerable to audio- and light-activated adversarial attacks. Yet, by bringing these devices into our homes, we have brought surveillance from the streets into the most intimate of spaces. Put in the context of surveillance capitalism, it is little wonder that Jeff Bezos is the richest man in the world.
In Anatomy of an AI, Crawford and Joler eloquently expose the kinds of physical exploitation that lie behind the production of an Echo and the algorithmic processing of Alexa. Unfortunately, these forms of exploitation are often hidden from view. In the same way, audio surveillance is often invisible, silent, and hidden from sight. What’s more worrying is the extent to which surveillance and the loss of privacy have been normalized. Many users of smart devices are resigned to the fact that giving up privacy is the price to pay for convenience. I’m particularly interested in de-naturalizing the “normal” in the hopes of holding the designers and technologists of these devices to higher standards.
Outsmarting Our Smart Devices
My work looks at building civic tools in response to surveillance capitalism. What can we do as civil society to hold Big Tech to higher standards of user privacy? In response, I’m developing a privacy hardwear kit: a collection of low- to high-tech 3D-printed accessories for Amazon’s Echo, Apple’s Siri, and Google Home, fitted with ultrasonic speakers that blast frequencies above 20 kHz, beyond the range of human hearing. These frequencies saturate the device’s microphone, preventing any form of unwarranted audio recording and processing.
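As a minimal sketch of the jamming principle described above (not the kit’s actual firmware), the tone that an ultrasonic speaker would play can be synthesized in software. The sample rate, tone frequency, amplitude, and file name below are all illustrative assumptions:

```python
import math
import struct
import wave

SAMPLE_RATE = 96_000  # Hz; must exceed twice the tone frequency (Nyquist)
TONE_FREQ = 25_000    # Hz; above the ~20 kHz ceiling of human hearing
DURATION = 1.0        # seconds of tone
AMPLITUDE = 0.8       # fraction of 16-bit full scale

def ultrasonic_samples(freq=TONE_FREQ, rate=SAMPLE_RATE, secs=DURATION):
    """Generate 16-bit PCM samples of a pure ultrasonic sine tone."""
    n = int(rate * secs)
    return [int(AMPLITUDE * 32767 * math.sin(2 * math.pi * freq * t / rate))
            for t in range(n)]

def write_wav(path, samples, rate=SAMPLE_RATE):
    """Write mono 16-bit samples to a WAV file for playback on an ultrasonic speaker."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)  # 2 bytes = 16-bit PCM
        w.setframerate(rate)
        w.writeframes(struct.pack(f"<{len(samples)}h", *samples))

if __name__ == "__main__":
    # Hypothetical output file name for demonstration.
    write_wav("jammer_tone.wav", ultrasonic_samples())
```

In practice, saturating a microphone this way exploits the nonlinearity of its hardware: a strong inaudible tone distorts inside the microphone’s circuitry into broadband noise that drowns out speech, even though humans standing nearby hear nothing.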
Surveillance’s power rests in the fact that it is top-down, one-directional, and often invisible. It looms over us like a dark cloud. I’m interested in giving shape and form to that cloud, in bringing surveillance back into view. In order to subvert audio surveillance, we first need to shine a spotlight on it and call it out in broad daylight. I’m building these accessories for Alexa, Siri, and Google to call attention to the surveillance and privacy breaches brought about by these devices. Further, the hardwear kit calls into question the usefulness of wearable technologies by putting the “wearing” on the smart devices instead of the human. These accessories play with the visual metaphors of sound jamming (i.e. earplugs, earmuffs, plungers) in a camp aesthetic.
The privacy hardwear kit uses the aesthetics of camp as a methodology to hypervisiblize surveillance. Camp has long played and performed between the lines of the visible and invisible, the seen and unseen. It has always been about questioning and pushing the boundaries of normativity and, more specifically, heteronormativity. Camp, though popularized by the mainstream media’s coverage of the Met Gala, stems from deeper subversive roots. It was a protective mechanism for Black queer communities questioning the cultural hegemony of race, gender, and sexuality, one that aimed to dismantle this hegemony through masquerade, performance, and the over-exaggeration of gender. By playing with the hypervisible through wigs, make-up, and clothing, camp performers reveal the extent to which gender is a performance, disrupting the boundaries of heteronormativity. In the same way, I use camp as a methodology to disrupt normative notions of privacy (as an inevitable price to pay for convenience) and to call surveillance into question.
So what happens when you put a wig on Alexa? And what if the wig also obfuscates audio recording by saturating the microphone with ultrasonic sound? Building on the political movement of camp, I use camp as a subversive framework in three ways:
1. As a medium to hypervisiblize surveillance. By putting a wig on Alexa, for example, we shine a spotlight on the device’s insidious capabilities and defamiliarize the familiar.
2. As a tool for obfuscation and the protection of privacy. Kornstein discusses how drag has been used to trick Facebook’s facial recognition algorithms, obfuscating the identities of drag queens. Similarly, I use ultrasonic frequencies to add noise to microphone recordings, protecting the privacy of their users.
3. As an academic methodology for examining emerging technologies. Camp has often been whitewashed by the mainstream media, silencing the subversive historical roots that grew out of Black queer communities. To pay homage to its political discourse, I aim to re-center its power by bringing it into the ivory tower. As an analytical framework for emerging technologies, camp also disrupts the homogeneity of the tech field.
I believe that critical design is a useful medium for reflecting on the speed, growth, and uses of emerging technology. The privacy hardwear kit is one such instance, provoking introspection on the ethics of voice UI design. Ultimately, my end goal is to provide commentary on the vulnerability of these technologies and to hold designers and engineers to higher standards of software design that value individual privacy and autonomy.