
“It’s creepy, don’t you think?” It’s the day after Christmas, and I’m walking with my sister-in-law Mette through the woods of Central Jutland, Denmark. She describes how, just the other day, she had been speaking with friends about their new dog. Now she keeps seeing targeted advertising from pet food companies on her Facebook feed. Knowing that I study surveillance and privacy, she asks me if it could be true: could “they” be listening to her conversations through her cell phone?

Mette’s voice drops an octave with the question, as if someone might be eavesdropping just then, although the path ahead is empty and we haven’t passed another person in the last twenty minutes. We stride on in the cold, and I pull my wool hat down over my ears. The sky above seems woolen too, thick and close and gray. It hangs over us like a cloak, enclosing our hushed conversation as we walk through the still forest.

I explain to Mette that Facebook has filed a number of patent applications that could, in theory, allow the company to use the microphone and front-facing camera on a user’s phone to track their facial expressions, reactions, and ambient sounds, although Facebook maintains that these technologies have not been built into any of its current products. In a recent New York Times opinion article, Sahil Chinoy chose the same word—creepy—to characterize Facebook’s patent applications, which attempt to detect, capture, analyze, and predict intimate aspects of users’ lives.


When I first started my fieldwork on privacy and technology in 2016, privacy was the domain of activists and specialists, the more strident of whom may have been dismissed as conspiracy theorists. However, following the 2018 revelations about high-profile data breaches, election hacking, and Cambridge Analytica, as well as the rollout of the EU’s General Data Protection Regulation (GDPR), Internet privacy has increasingly become a topic of media attention, popular conversation, and Twitter memes. In addition to unease over the creepiness of targeted advertising and predictive analytics, Internet users are increasingly concerned that their personal data may be leaked.

Ethnography offers a unique aperture for understanding how people conceive of privacy in their everyday lives. What type of data gathering is considered creepy and what is benign? Does it depend on the nature of the information, the context of its use, or the identity of the user? Whereas philosophers and legal scholars have struggled to define what exactly constitutes privacy (Thomson 1975, Kahn 2003), ethnographers can trace individuals’ own understandings of privacy by paying attention to the strategies and practices through which people attempt to construct a border between their private and public lives. This sort of boundary work may include technical practices like encrypting data, deleting cookies, and blocking Internet trackers. It may also include social practices like irony, whispers, and pseudonyms. While the nonconsensual disclosure of secrets may disrupt social relationships, there is a long history within the discipline of anthropology that conversely demonstrates how privacy and secrecy are generative for building social bonds, community, and shared identities (see Jones 2014, Mahmud 2014, Manderson et al. 2015, Simmel 1906).

[Image: Three people standing near each other on a train platform, all staring intently at their smartphones.]

Goffman’s “civil inattention” describes how people carefully avoid listening and staring in public, or even pretend not to notice each other, in order to maintain the illusion of personal boundaries. Rather than a form of rudeness or neglect, Goffman characterizes this inattention as a mode of civility, a way of politely sharing public space. rawpixel/Pixabay

Creepiness gestures at the affective nature of privacy intrusions, an unease that is felt at the level of the body. Data collection becomes creepy when it breaches social norms about intimacy, transparency, consent, and trust. While the GDPR has outlined stricter standards for the collection and storage of personal data, creepiness exceeds legal and technical definitions of data (in)security. Framing intrusions in terms of creepiness rather than illegality, injustice, or other rights-based idioms highlights how people often conceive of privacy as a social relation. And an embodied one—as my walk in the woods with Mette illustrates.

While the Internet has provided a new platform for both sharing and collecting personal data, the management of private information in public space is a far older problem. In 1963, sociologist Erving Goffman described how urbanites engage in “civil inattention,” refraining from staring, prolonged eye contact, or eavesdropping when sharing public space with strangers. Ignoring others, in this context, is essential for maintaining social order. Imagine that a couple waiting in the queue ahead of you at a coffee shop is having an obvious disagreement. Their voices grow louder and louder as they argue. What do you do? Look away, perhaps down at your phone, pretending not to hear them. To do otherwise—to gawk or listen in—would be an overstep. It would be creepy.

When people characterize data tracking as creepy, they indicate that companies and governments have transgressed social norms about who the designated recipient of information is and who should look the other way. Because we are accustomed to revealing information in public that we expect to be ignored, it is unsettling when that data is gathered and logged. And even though data tracking and surveillance are increasingly automated, commonplace discursive framings continue to imagine that there is a person on the other end who is watching or listening.


The breach of secrets violates social expectations about how information should be managed. To this point, novelist Sally Rooney illustrates the burden of the secret keeper: a secret is “something large and hot, like an overfull tray of hot drinks that [you have] to carry everywhere and never spill.” The content of the secret is less important than the fact that it was shared in confidence. We might say that there exists an obligation to maintain the privacy of one’s trusted companions, an ethical responsibility to protect their secrets. When companies and governments fail to meet that obligation, consumers and citizens lose trust.

Treating privacy as an ethnographic object is particularly valuable for anthropologists because it mirrors the core methodological and ethical concerns of our discipline—how to build intimacy and trust in order to gain access to privileged information, restricted knowledge, and our informants’ personal lives and social worlds. It can also help us reflect upon how we manage our own privacy and which aspects of our interior selves we choose to reveal while conducting fieldwork. Beyond “getting access” as a methodological problem, an ethnographic analysis of privacy and what it means to people brings our own moral quandaries more sharply into view, such as which moments and characters we make usable for ethnographic analysis and public consumption. How shall we observe, log, and write about people’s lives without being creepy? An anthropology of privacy requires careful ethical consideration of disclosure, discretion, exposure, and betrayal—especially in light of the discipline’s predilection for revelation, drawing back the curtain to reveal society’s rusty underbelly or hidden, beating heart.

Nina Dewi Horstmann is a PhD student in anthropology at Stanford University. Her research uses ethnographic methods to investigate how technology is applied in political contexts and to better understand the social and political problems that technology attempts to solve. Nina was the 2018 winner of the Society for the Anthropology of Europe’s Graduate Paper Prize for her manuscript entitled “The Power to Selectively Reveal Oneself: Privacy Protection Among Hacker-Activists.”

Cite as: Horstmann, Nina Dewi. 2019. “Watching Our Words.” Anthropology News website, April 24, 2019. DOI: 10.1111/AN.1148