
Because Your Algorithm Says So

Examining our (sometimes toxic) relationship with our AI overlords.

Photo-Illustration: The Cut; Photos: Getty Images

Emely Betancourt would rather show you her Notes app than hand over access to her TikTok For You Page. It’s too eerily accurate a virtual mirror, she tells me, one that took hours of scrolling to create. The For You Page is TikTok’s primary feed. At first, it shows you its most palatable offerings: videos with millions of likes, celebrities, Charli D’Amelio — milquetoast, likable content. As you start watching and liking posts, you go deeper into a niche you’ve co-created with the platform’s famous algorithm. Betancourt knows it sounds intense, but she feels like her FYP truly understands her inside and out: “I feel like it’s really a reflection of my subconscious thoughts; even things I never say out loud, it will know.”

Like 100 million other Americans under lockdown, Betancourt started using TikTok during the early days of the pandemic in 2020 — if the now-20-year-old couldn’t be in school and with friends, she could at least go on TikTok to interact with people outside her immediate family. Today, Betancourt spends anywhere from one to three hours a day on TikTok, her most-used app. “My For You Page is literally a culmination of everything that I am,” she says: a “perfect” reflection of her liberal politics and satirical sense of humor, but also of more personal things like her attachment style and trauma. It’s not perfect, of course. No matter the platform, algorithms will never have the full picture of who we are — they didn’t watch us grow up, and they don’t know how we act around friends and family offline. She knows that an app she’s used for only about a year and a half couldn’t possibly fully know her, “but I feel so seen!” Betancourt says. She adds that sometimes, “it’s hard to say that it doesn’t know me all the way because sometimes it does know me all the way.”

So much of our lives — from online dating to search engines to social-media feeds — is mediated by algorithms. And we talk about them as if we actually know much about them. We complain about Facebook’s algorithm and we gush (Betancourt isn’t alone) over TikTok’s. As I write this, some YouTube alpha male is out there uploading videos promising straight men advice on how to “hack” the Tinder algorithm and date like kings, and if you watch any of those videos, the site’s algorithm will take note and serve you more unsolicited dating advice the next time you log in.

In reality, we don’t know nearly enough.

When we talk about “the algorithm” of any given platform, we’re sometimes talking about multiple algorithms that use artificial intelligence to metabolize the data that consumers (that’s us) provide through our interactions with the platform. These algorithms then use that information to curate the platform’s offerings to its users (again, us). In other words: Our likes, swipes, comments, watch time, and clicks give these platforms up-to-the-minute updates on our needs and preferences, and the algorithms use that information to determine what we see and when.
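None of these companies publishes its ranking code, but the core mechanic is simple enough to sketch. Below is a toy version in Python that does what any engagement-driven feed does: collapse each post’s signals into a score and serve the highest scorers first. Every signal, weight, and number here is invented for illustration; nothing in it is TikTok’s actual math.

```python
# A toy engagement-based feed ranker, for illustration only.
# Real platforms use machine-learned models over thousands of signals;
# every weight and number below is made up, not disclosed by TikTok.
from dataclasses import dataclass


@dataclass
class Post:
    title: str
    likes: int              # how many users liked it
    watch_fraction: float   # average share of the video viewers actually watched (0 to 1)
    shares: int


# Hypothetical weights: watching to the end counts far more than a passive like.
WEIGHTS = {"likes": 0.05, "watch_fraction": 100.0, "shares": 2.0}


def score(post: Post) -> float:
    """Collapse a post's engagement signals into a single ranking number."""
    return (WEIGHTS["likes"] * post.likes
            + WEIGHTS["watch_fraction"] * post.watch_fraction
            + WEIGHTS["shares"] * post.shares)


def build_feed(posts: list[Post], n: int = 3) -> list[Post]:
    """Serve the n highest-scoring posts first."""
    return sorted(posts, key=score, reverse=True)[:n]


if __name__ == "__main__":
    candidates = [
        Post("celebrity clip", likes=500, watch_fraction=0.2, shares=5),
        Post("niche hobby video", likes=40, watch_fraction=0.95, shares=2),
        Post("viral dance", likes=1200, watch_fraction=0.5, shares=80),
    ]
    for post in build_feed(candidates):
        print(f"{score(post):6.1f}  {post.title}")
```

With these made-up weights, the niche video people watched to the end outranks the celebrity clip most viewers skipped. That, in miniature, is the seduction: the more faithfully you watch, the deeper into your niche the feed takes you.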

Exactly what that data counts for and how it’s used to offer us everything from TikToks to dating prospects is proprietary information kept secret from us. And it doesn’t help that we’re only just becoming aware of the algorithms that shape and mold our digital worlds. Congress and the relevant regulating bodies, like the FTC, have only recently begun homing in on commercial algorithms, which they’ve deemed too consequential to go totally unregulated.

What’s more, what little we do learn can quickly become obsolete, since these algorithms are updated, refined, and tinkered with almost endlessly — in ways that sometimes make headlines, like when Instagram phased out (and reintroduced) the chronological feed, but that usually go unannounced.

So we mythologize.

When we decide that an algorithm can “understand us” and it matches us with songs, people, and TikToks that align with our needs and desires, we slip into a sort of weird devotion. When we match with the same person over and over on dating apps, we wonder if it’s a sign. We say things like “TikTok’s algorithm knew I was bi before I did” and are so impressed with its perceived ability to “know us” that we start to worry that more sinister surveillance practices are at play. Our algorithmically orchestrated encounters with people on dating apps or psychology buzzwords on social media start to feel preordained, as if the fact that the algorithm put something in our path Means Something™.

Of course, that once-accurate mirror can easily — and often does — start to stretch and distort your reflection, and you begin to question if you even know what you look like. “I get a lot of TikToks about high-functioning autism and ADHD,” Betancourt tells me, and while she does have anxiety, the fact that the algorithm would think those videos applied to her made her think, “Do I have ADHD? Do I have high-functioning autism?” The habit of relating to things on her FYP made it easier to trust some of the algorithm’s more out-there suggestions: “It kind of plants that seed.” She says she has no reason to seek out a diagnosis for any of these conditions, but the fact that she even considered it made her realize how much credibility she gave her FYP. “Although there are gems and really valuable stuff on TikTok,” she explains, “not all of it is necessarily applicable to you, and you kind of have to draw the line and say, This is relatable, but I’m not necessarily autistic.”

Algorithmically curated feeds like TikTok’s — and even ones like Spotify’s and Tinder’s — can connect us with people and ideas that expand our worlds and minds while also making us feel more seen and less alone. But they can also make us feel really alienated, misunderstood, and commodified when they use our own data to show a warped version of ourselves.

Natasha Dow Schüll is an anthropologist and associate professor in New York University’s Department of Media, Culture, and Communication whose work focuses on the psychic life of technology and the relationship between data and the self. “You can say all the smart reasons why you’re against these things, but if you look at people’s behaviors, even my own,” Schüll says, “at the end of the day, people like being recognized.” Along with all the value we get from engaging with these algorithms, we also get what Schüll describes as a sense of ambivalence that comes from being subtly aware of “this risky tendency that sort of looks like a spiral that you get caught in, where you’re at the bottom of this vortex or well and these forces are limiting your growth as a person and sort of pinning you and fixing you.”

Both in research and in public discussions about the power of these commercial algorithms, we often run into that question of addiction by design. “Just like a slot machine, every swipe, every date, every pull of the handle changes who you are in some way and further compels you and engages you,” Schüll says. “It’s this very dynamic flow of investment in both directions and we do not have any specific regulation that takes that into account.” We give these algorithms our time, our personal information, our likes — and make ourselves vulnerable — in exchange for that connection and understanding. We are, to varying degrees, okay with being surveilled as long as we get to feel seen.

The stakes are even higher with dating apps. When their algorithms determine whom you see and whom you’re shown to, they can easily start to feel like an authority on whom you should be attracted to and who should be attracted to you.

The way Tinder collects data, for example, is skewed: Users can choose from over a dozen gender-identity labels and select multiple sexual-orientation labels from a list of nine, but they’re asked to indicate whether they’re looking to date men, women, or everyone. How can Tinder’s algorithm be expected to understand queer culture? In Hannah Sullivan Facknitz’s experience as a nonbinary bisexual person, it can’t.

To Sullivan Facknitz, a 30-year-old grad student based in Vancouver, being perceived by the algorithm felt the same as being perceived by any institution with power — like, for instance, a university. “The way I could be perceived by reading my résumé or my college transcripts, you see all the classes I took and the grades I got in them. They tell you something about me.” But there’s a huge gap in their transcript “where I flunked out and then came back.” A person reading a transcript with a gap might have to hear an explanation and learn more before making a decision. Any deviation from the norm could be counted against someone, even if they do get a chance to explain themselves. In the case of a dating-app algorithm (and even a hiring algorithm), we get sorted before we get a chance to explain ourselves.

On top of that, Sullivan Facknitz says, Tinder encourages quick-reaction swipes, and they tend to match with what they call their “impulse type,” the dating-app equivalent of the checkout counter’s candy rack: the familiar type they’ve already dated without success, which is why they’re using Tinder in the first place. Instead of “meeting someone new,” they’re stuck in an unsatisfying, even harmful, swipe-right loop.

Interfacing with algorithmically curated apps like Tinder, for Sullivan Facknitz, worsened a feeling of being trapped in their worst pattern: “I would match with people who were like the men who victimized me and I was very confused and I dug into myself, emotionally and in a really destructive way, to try to excavate what the hell the algorithm was seeing in me that made me bad enough to deserve these men.” They felt like the algorithm saw something obvious they didn’t and kept coming back to figure out what it was. It took time and some personal growth for them to realize, “The algorithm was not smarter than me, it could [not] somehow see me clearer than I could see myself.”

That clarity has helped Sullivan Facknitz establish better relationships with the algorithms in their life — ones that let them wrest back a bit of control. TikTok, in fact, helped them realize they have ADHD, not by convincing them but by prompting the question. “It helped me ask that initial question,” they explain, adding that they then turned to members of their community who had been diagnosed for further guidance, and then to medical professionals. “And it was a question I’d had my entire life. It wasn’t something the algorithm figured out for me; this was just another piece of information that helped me put the question together.”

It’d be totally naïve to trust that the companies behind these algorithms have our best interests at heart. TikTok’s main source of revenue is ad sales, and as a recent New York Times column headlined “How TikTok Reads Your Mind” put it, “the app wants to keep you there as long as possible.” Tinder, on the other hand, makes most of its money from subscriptions, so it makes sense that its algorithm is just good enough to hook you on the product, to keep you coming back for new prospects rather than make good on its promise of actually helping you connect. “Something that can be polar opposite in experience, in terms of healthy and harmful,” Schüll says, “are equally productive in terms of capitalism.”

The conversation sways between two extremes. “Some people would say that the answer to all of this is that you need a better algorithm,” Schüll says. Maybe smarter algorithms (that use even more data) can do a better job of understanding us — changes, ambivalences, inconsistencies, and all — rather than boxing us into a fixed version of ourselves. “This is the technological answer,” Schüll notes. But then there are those who’d say more technology will jeopardize some unquantifiable human quality — maybe our souls? — and that we should just put down our phones. “That’s an extreme humanist answer,” Schüll adds. “And I don’t go that way, either, because I feel like we’re all technological beings.”

She doesn’t have a clean-cut answer but concludes, “I find it just as ridiculous to say that there is something anti-human about algorithms that contaminates our experience.” We’ve had too many meaningful experiences with algorithms and technology to believe it’s that black-and-white.

If you’ve ever read a horoscope that made you feel a wave of dread (or made you annoyed at the idea of having your fate handed to you), then you can see how our relationship to these algorithms resembles our relationship to astrology. In a blog post, astrologer Alice Sparkly Kat offers some useful insight into how algorithms see us: “The predictable person that the algorithm imagines you to be is a corporate fiction.” Somewhat similarly, Sparkly Kat writes, astrology is a language that can help people describe themselves, “but it is also a technology that can try to tell you who you are.” Once upon a time, astrology was invoked to reduce the whole of a person to what the stars said they were. We are not our sun signs, nor are we who our algorithms say we are. We’ve since learned to use astrology as a tool for interpretation, a language for working through things we’re trying to understand. Maybe we can turn algorithms into better tools, too. For Schüll, step one is obviously regulation (“at the moment, it’s what we have to work with”).

In the end, these algorithms are just human creations. Like the front-facing camera on your phone, they distort what we see. There is no such thing as a perfectly accurate reflection, and maybe, with more awareness of how the mirror warps, we can harness its power for ourselves.
