In the summer of 2011, by the eastern edge of the Pacific Ocean, the smell of salt permeating the warm breeze, the sound of gulls close and menacing, a friend told me that Miranda July was the real deal. We were young women working in industry towns that dreamt large and drank deeply from their mythologies; my friend a screenwriter in Los Angeles and I a product manager in Silicon Valley, both of us in search of purpose in our professions. My friend sensed in July’s work—in the film July released that year, The Future, in her story collection No One Belongs Here More Than You, in her performance art—a model for making meaning, for smuggling art into life and ordinary life into art, penetrating reality’s surface aslant. I could not yet see how art might orient me in the world. For a time, I was devoted only to the functional and the techno-utopian, saw the future wherever I looked: in the experimental apps I tested on my phone while riding the company bus from San Francisco to a suburban office complex; out my office window, where the robot cars I worked on would roam.
There is something off-putting about technology, Miranda July says in a 2020 interview. Technology seems not for her, she says, not for an artist or a dreamy person. She says this, strikingly, as an artist who has, in her nearly three-decade career, made art with technologies old and new—video, email, PennySaver classifieds, zines, a messaging app, Instagram, internet-crowdsourced artifacts, and online user data. I understand July’s stance now that I have made a life as a writer and have some distance from Silicon Valley and its subsuming cultures; now that enough time has passed for us to collectively reckon with its once-new technologies. There is, July believes, a certain force of insight made available by being at her age in this year, straddling two eras, the analog and the digital. She imagines an old woman living the transition from horse and buggy to automobile: the woman sees the technological breakthrough of her lifetime for what it is and what it is not, allowing her to approach it, as July tells it, with noncompliance.
In a series of three Instagram videos from 2018, filmed on a cellphone angled up from the floor, July marches down a carpeted hallway at her dentist’s office, cradling a manila folder under one arm. She trips, flails, and falls, the folder of loose papers exploding into the air. In each video she repeats this choreography—with a slightly accelerated stumble, a more forceful expulsion of paper, a longer pause after the fall—as if in search of just right, Goldilocks-style. Every time July tumbles, my body reflexively tenses, even though I am aware that she has deliberately composed the scene.
Each of the three Instagram videos ends with July on her hands and knees, whispering a stream of profanities as she gathers herself and the scattered papers from the ground. Yes, I brought the folder to the dentist with this in mind, she writes in her post, it’s my receipts & Bank of America statements from 2005. On an app and a medium otherwise devoted to the performance of ideal selves, July’s performance of ordinary humiliation vibrates with noncompliance. I have revisited these videos often; on one viewing, they brought to mind a computer animation I had seen of an AI learning how to move like a human being. The artificial intelligence—a bipedal model—had to traverse an uneven virtual terrain. It ran obliquely, its legs at an awkward angle to the rest of its body, as if a crab had two legs and walked upright. I was entranced by this interpretation of the human body as a physics problem bounded by physiology. Mostly I was touched by its strangeness, the AI’s naïveté every time it messed up, fell, and tried again.
When I left Silicon Valley in 2018 and began writing in earnest, I wanted to examine anew the questions that had lurked in the background during my decade in tech: How are we to think or act, as machines and algorithms increasingly serve as arbiters and predictors of daily life? How might we enter into a meaningful relation with capital when our performance of self is tracked and used to deliver product and profit? I returned, also, to Miranda July’s art—vital to me for the ways it defamiliarizes our technologically mediated lives, rendering them uncanny, and unpacking their ambient creepiness. “July’s work inhabits das Unheimliche, the uncanny space—a world in which the familiar or close at hand is suddenly made foreign,” wrote Julia Bryan-Wilson in Camera Obscura in 2004. And “creepy,” as Jela Krečič and Slavoj Žižek write, building on Adam Kotsko’s Creepiness, “is today’s name for the Freudian uncanny.”
“Facebook Is Creepy. And Valuable,” read an online headline in The New York Times in 2018, a week after Mark Zuckerberg’s congressional testimony on his company’s mishandling of user data. One accompanying photo depicts a pale-faced Zuckerberg rising from his seat, glancing over his shoulder with a look of misgiving. Another shows Facebook’s headquarters in Menlo Park, photographed at dusk. We see an empty guardhouse; a darkened car exiting between two lines of traffic cones. In the background, an amply lit glass building is eclipsed by a communications tower and obscured by trees. There are no humans visible. This scene, presented as a site of unease, is an imperfect emblem of our contemporary experience of creepiness, that wariness we feel when we notice ads tailing us all over the internet on the basis of what we had thought was a submerged desire.
In downtown Chicago, a train ride from where I now live, are four Amazon Go convenience stores, mere blocks from one another. In these stores, workers in orange Amazon shirts bustle about stocking shelves, preparing food, greeting and fielding questions from would-be customers. If you approach, you will be directed to download an app and hold your phone up to a turnstile; you will be invited to wander the aisles and browse the shelves of snacks, drinks, and ready-to-eat meals, to pick up whatever you desire and leave. Just walk out, exhorts the text painted on a wall in one store. No lines. No checkout. (No, seriously.) By far the store’s most mesmerizing feature is its ceiling: look hard and you will see, amid the seemingly familiar black plaster and black metal scaffolding, a repeating grid of matte black boxes and optical lenses, observing every square inch of the store. Amazon describes this surveillance apparatus as employing “the same types of technologies used in self-driving cars: computer vision, sensor fusion, and deep learning.” With these technologies, the store “automatically detects when products are taken from or returned to the shelves.” This language does not explain how each item taken or returned is uniquely attributed to you—to your meat-self, your right arm reaching for the bottle of cold kombucha. Nor does it explain why the company discourages you from helping another customer, a separate and distinct meat-self, to retrieve an item from the shelf.
One day, after months of stopping and hovering, curious, at the threshold of the Amazon Go store at the train terminal, I stepped through the automatic doors. I imagined my body passing through an invisible membrane, implicitly acceding to a contract of unknown terms, to be yoked to a virtual account of the self and processed, ultimately, into raw material for an algorithm. This transit, from a physical self in a public space to a disembodied self in a privatized space, flickered again through my mind when I read James J. Hodge’s “Sociable Media: Phatic Connection in Digital Art”: “Amazon may hail me ‘personally’—Jim, we recommend X for you—but it does so algorithmically. Amazon.com hails me as market data, not as a person with parents and psychological depth.”
On May 14, 2016, Miranda July debuted a performance piece with the technologist Paul Ford at the New Museum in New York. In an uploaded recording of the piece, named for its date of performance, July sits in front of a projection screen, facing a small audience. She begins with incantatory anaphora, addressing an ambiguous you:
You were born and raised in New York City. You were born in Korea. You were born in Hong Kong and raised in Toronto. You were born on Halloween. You were born and raised in Cairo. You were born in Beirut during the civil war; you slept straight through the bombings. You were born in Brazil. You were born in Lima, Peru. You were born in Winnipeg, Manitoba, and were unschooled until age 13. You were born in Brooklyn to a Jamaican father and a Trinidadian mother. You were born in the town where David Bowie was arrested for smoking pot. You were born in Pančevo, Serbia. You were born in Guiyang, China.
July’s hailing of this multiplicity of yous seems at first enigmatic, perhaps invoking a global spirit or a sense of universal simultaneity. A few in the audience, however, may have felt a prickly sense of uncanny identification, recognizing their own provenances. Ford takes over the microphone and continues in the second person, invoking a you who sexted for the first time at age fifteen, or maybe thirteen, with a stranger from an AOL chat room. July and Ford continue to alternate, in a baton pass that accelerates as their increasingly detailed anecdotes of the yous’ comings and goings are accompanied by images projected behind them: photos of objects and artwork that the plural you has made, of food that the plural you has eaten, of various women who gave birth to you. You made all these things. You dreamt this. You think the following items are bullshit. The moment of realization arrives in waves: more and more audience members register that they are, in fact, the you being addressed.
The 186-person guest list had been provided to July two weeks before the performance. With the help of radio producer Starlee Kine and researcher Elizabeth Minkel, July combed each of the attendees’ Instagram and Facebook pages, online wedding registries, and Airbnb reviews, as well as those of their family members. Their approach, as July put it in the post-performance Q&A, was to “Google the fuck out of everything.” Ford, in parallel, created a tool that indexed the last ten thousand tweets of any audience member with a Twitter account, which he and July then programmatically searched to find and aggregate instances of words like bullshit, sex, and dreams. All this raw material, mined using a combination of manual labor and automation, was then edited, through associative leaps and juxtapositions, to produce an arc of gradual revealing. Even watching the performance after the fact, at a remove, I am pulled into its visceral creepiness. As a technologist intimately familiar with networked life’s data trail, I am still astonished by the detailed and moving composite portrait cobbled together in just two weeks.
“Surveillance capitalism,” as defined by the scholar Shoshana Zuboff, “unilaterally claims human experience as free raw material for translation into behavioral data.” Some of the data go to service improvement; the rest, she writes, are “behavioral surplus, fed into advanced manufacturing processes known as ‘machine intelligence,’ and fabricated into prediction products that anticipate what you will do now, soon, and later.” Reading Zuboff, I pause at her italicized terms and how they manifest, so unremarkably, in Silicon Valley life. Behavioral surplus is abundant: like dead skin cells, it is shed in the course of living and carries information about you. When and where on the screen you tap, what you choose to swipe, how long you linger on a specific segment of a video—such surplus is logged on servers and anonymized. Interpretive work is performed on this captured surplus in hopes of gaining insight into what is useful, profitable, habitual. The apps on your phone, for example, may anticipate how you will use your phone at certain times of day, or in certain locations—the way a barista at your favorite coffeeshop might anticipate your morning order or an office security guard might open the door every evening as you leave. What seems to me creepy about surveillance capitalism is not the fact that we are being watched (a now foregone conclusion) or the motive for the watching (profit). It is creepy because we have so little say over its terms. The word that stands out to me from Zuboff’s definition is unilateral: a one-way relation of power and silent extraction, rather than a mutual and continual reckoning of the public costs and privatized benefits accrued.
July and Ford’s unilateral processes in May 14th, 2016 elicit, on one hand, feelings of vulnerability, exposure, and dread. They also elicit the pleasures of recognition, flattery at being chosen, perhaps a sense of ambient intimacy. Scattered throughout the performance are sentiments and images that audience members would recognize simultaneously as distinctly theirs and as pervasive on the internet in 2016: excitement about Beyoncé’s Super Bowl halftime show, eulogies for David Bowie and Prince, photos of cats, rainbows, and sunsets. The artists’ even handling of this material allows the audience to hold in tension two opposing considerations: that the performance might at once denounce the sameness of our self-projections on the internet and gesture toward our common humanity. Just as the audience begins to ease into the flow, buoyed by collective laughter and a sense of fellow feeling, July and Ford turn up the dial on creepiness by showing a succession of Google Maps screenshots of audience members’ addresses, de-identified, while intoning: You live here. “I was sitting in the audience so I could look at someone… and see the reaction when they realized what was happening,” Kine recalls in Miranda July, a 2020 retrospective of July’s work. “It was a real range of emotions—embarrassment, pride, anger, and a kind of building hysteria as it got more and more boundaryless. People’s phone numbers, addresses, children … It ended with a picture of the same rainbow that multiple people in the audience had taken a picture of the day before.”
The alarm and revulsion we may feel toward surveillance capitalism in all its creepiness is confounded by our complicity in it. Discomfited by giving away our personal information, we do it anyway, for the convenience of skipping the checkout line, the pleasures of personalized entertainment, the possibility of relating across distances, a sense of being in the stream of the world. In this web of complicity, I have been more complicit than most. From my time working in Silicon Valley, my primary experience of technology is as an accumulation of deliberated micro-decisions made by people like me, my colleagues. All of us were trying to do right by our ethics, by the people whom these products were designed to serve. But no matter how individually well-meaning we are, we live and operate, all of us, under an encompassing system of capital with all of its inequities. There are limits to choosing or to being a conscientious objector. This is a reason to make art: to stand both within and at some remove from our realities, to recast them at a fresh slant.
I want to posit that the boundary transgressed in May 14th, 2016 is not only one related to privacy or propriety but one that separates two registers of the self: the psychoanalytical self—which discloses at length, as modeled by Freudian thought and talk therapy—and the phatic self, which asserts its presence simply, the self that tweets, likes Facebook posts, checks in on Pokémon Go, uploads selfies on Instagram. (Damon Young, the scholar from whom I have borrowed these classifications of the self, notes that the technological shifts of the past century have perhaps overwhelmingly replaced the psychoanalytical self with the phatic self.) We expect the phatic self’s discrete grammar of declarative actions to translate into market data, but we do not expect them to gather into a psychoanalytic self, as they do in May 14th, 2016, through narrative synthesis. In an inversion of Hodge’s algorithmic calculus, July conjures a feeling of creepiness by hailing her audience not as market data, but as people with parents and psychological depth.
Several years ago, I plunged into Facebook’s privacy and ad settings, curious to see what the service had surmised about me, based on my phatic trail on the internet. It knew the basic things—that I used an Android phone and a MacBook, and that I had Facebook friends primarily concentrated in the United States, where I live, and Malaysia, where I was born and raised. Because I was at the time living in a new area and committed to learning about its racial histories, and because my reading on the internet led me to clicking on Black websites and reading Black writers and liking Black feminist theory, Facebook classified me, a first-generation Malaysian immigrant to the United States, as African American. In networked life, phatic gestures—clicks, swipes, skips, likes, dislikes, and exits—are used by recommendation engines and predictive algorithms to cluster users into “neighborhoods” of like individuals. As sized up by Facebook algorithms, I lived online in a Black neighborhood.
Phatic signals drive July’s I’m the President, Baby, a 2018 installation for the Victoria and Albert Museum in London, part of an exhibition titled The Future Starts Here. July’s installation was built upon the life and routines of Oumarou Idrissa, a Los Angeles Uber driver and immigrant from Niger whom July met in 2015. Though a U.S. citizen by the time July collaborated with him, Idrissa continued to suffer from insomnia, cultivated, as July wrote in the wall text, from years of fearing deportation. In the installation, Idrissa’s insomnia was documented, by proxy and in near-real time, by four motorized, color-coded, and internet-connected curtains. Each set of curtains corresponded to a frequently used app on his phone—brown curtains for WhatsApp, pink curtains for Uber, green for Instagram, and blue for a sleep-tracker program that July had installed in his bed. The exhibition hours in London coincided with the middle of the night for Idrissa in L.A., and any time he sleeplessly opened or closed WhatsApp, Uber, or Instagram or triggered the sleep tracker, the corresponding curtains mounted on the gallery wall would pull open or draw closed. The installation’s wall text details other facets of Idrissa’s life: his economic circumstances as an Uber Black driver, the twenty-one siblings in Niger with whom he chats on WhatsApp, who depend on him financially.
None of this is outwardly creepy. The work was created with Idrissa’s consent, and any time he needed privacy, the installation could run on a prerecorded script. The museum labels blunted some of the edge of the purely phatic, furnishing details of a psychoanalytic self. “I could tell she was nervous to ask me, she wasn’t 100 percent sure I was gonna say yes, from the body language,” recounts Idrissa for July’s 2020 retrospective. “It’s not like I was worried, I was like, curious, you know? Like everything I do is going to be followed, like on my phone? But at the end of day I do much the same everything, every day the same things…it made it easier to say yes.” Idrissa’s reflections echo the complexity of our participation in networked life and surveillance capitalism—that we have to risk being creepy, or creeped out, to enter into a meaningful relationship with another. Still, there are moments in the installation design that unsettle. For each uninitiated visitor, the curtains’ movements at first seem to occur out of nowhere. The man behind the curtains’ logic is geographically separated from the curtains’ mechanics, making the specific thoughts and desires powering the installation essentially impenetrable. Occasionally and unpredictably, a curtain stutters, a glitch. There is the conceit of the curtains themselves: curtains block light so we can sleep; they also prevent a creepy neighbor from seeing into our private business and prevent us, in turn, from being the creepy neighbor. But when these curtains pull apart, they reveal only a blank, featureless wall. For all the intimate data we offer up, the work suggests, these technologies know nothing of who we truly and fully are—which is, in the final analysis, reassuring and concerning both.
“Now is pretty creepy, if you could possibly, like, feel it for a second,” July says, in a 2016 interview for Tate Modern in London. She pauses, then nods tentatively to the imaginary ticking of each passing moment. The now to which July refers—the raw material for live performance—is the same now that surveillance capitalism extracts and anticipates. In July’s art, I see a model for grappling with the creepiness of our technological nows, not with blind utopianism or reactionary alarm, but with ambivalence, as Patrick Jagoda defines it in Network Aesthetics: “a process of slowing down and learning to inhabit a compromised environment with the discomfort, contradiction, and misalignment it entails.” It is a slowing down and a bearing down, a concerted engagement with a compromised circumstance. July’s art avoids the overdetermined about-ness that can hollow other art engaged with technology—work that is about the death of privacy, or about the quantified self and the drudgery of capitalism. By resisting prejudging the nature of our entanglement with technology, July stays open to both the critical and the redemptive, bringing creepiness’s contradictions into clearer view.
Recently, while trawling the internet, I came across a 2018 chromogenic print by the artist Miao Ying, titled Backfire Your Cookie. I was deep in a sea of open tabs and followed links, skimming articles on my computer and watching videos on my phone, my initial reasons for coming online that evening long forgotten. Amid these diversions I noticed an ad for a pair of sunglasses, flanking an article as I scrolled down the page. The same product had insinuated itself into my Instagram feed that morning; it felt as if a room of marketers had themselves watched me absentmindedly misplace my old pair, locating an opportunity for an upsell. In this context, encountering Miao Ying’s print seemed felicitous: a page-long manifesto for outwitting the omnipresent browser cookie, throwing off the websites and advertisers that wish to track your behaviors online. I glimpsed in it something of July’s noncompliant spirit. Act like a maniac in front of your cookie, urges the manifesto. Click on things that you usually hate just so your cookie will have the wrong data…You can also trade your phone with others. Try people who are not your friends…Surprise, Cookie!
Min Li Chan is a Malaysian essayist and technologist currently based in Chicago. Her work has appeared in The Point Magazine, BuzzFeed Reader, Tin House Online, and Triquarterly. She holds a B.S. in Electrical Engineering from Stanford University, and is an MFA and M.A. candidate at Northwestern University.