TikTok’s algorithm knows. People speak of the unseen program governing the platform’s “For You” page, where videos populate based on ones you’ve previously interacted with, as an omniscient, omnipresent god. The algorithm has figured out your every interest and hobby, every thought you’ve ever had. More than once, it’s been alleged to have figured out that a person is queer before they knew themselves. The machine genuinely feels like it’s handpicking videos just for you—which is why everyone should pay close attention when the app allows some people to turn it off later this month.
TikTok will soon allow users in Europe to disable the personalized feed. It’s an update meant to satisfy a component of the European Union’s Digital Services Act (DSA) that requires the internet’s largest social-media sites to let users opt out of being algorithmically targeted. The regulation, part of an aggressive push in Europe in recent years to rein in tech platforms, is geared toward better protecting people’s rights online and mitigating risks to democracy such as the spread of disinformation. For anyone who chooses to hide from TikTok’s all-knowing algorithm, the For You feed will become something like a “For Everyone” feed, filled with broadly popular videos that don’t take into account individual interests—or whatever the algorithm perceives those interests to be.
This new, normie TikTok will be a choice, so its widespread impact is likely to be minimal: Algorithm experts doubt that many people in Europe will actually use the non-algorithmic option. Even so, the change opens the door to a strange social experiment. Europeans are about to have access to a TikTok parallel dimension. “So much of the experience of TikTok is that weird sense that you’re being profiled—this idea that each thing that you see is somehow related to you,” Nick Seaver, an anthropology professor at Tufts University and the author of Computing Taste, a book about algorithmic recommendations, told me. What happens when that goes away?
For anyone who doesn’t spend as many hours a day as I do scrolling through videos of ’90s Eurodance parodies and self-appointed watchdogs screaming at pickpockets, I’ll note that many people experience TikTok mainly through its default personalized algorithmic recommendations. The app has other feeds, including one for videos by TikTokers you choose to follow, but For You is the main show. TikTok did recently launch the ability for users to “refresh” their For You algorithm—allowing them to start over—though it’s unclear how many people even know that feature exists.
In this light, the coming non-personalized feed is likely to look very different to Europeans accustomed to TikTok knowing them better than they know themselves. In a press release earlier this month, TikTok said that the new feed will show people “popular videos from both the places where they live and around the world.” Search results on the platform will also be non-personalized. (I asked TikTok for more details, and a representative directed me to the company’s press release.) Petter Törnberg, an assistant professor at the Institute for Logic, Language and Computation at the University of Amsterdam, thinks the updated feed will feel a lot like when a new user first fires up TikTok, before the algorithm is calibrated. Expect a lot of internet mainstays—sports, cats, pimple popping, cooking, ASMR—alongside plenty of the weird viral junk that any social-media user is all too familiar with at this point. Törnberg created an account to see what it would serve him on Day One. “It felt kind of like the lowest common denominator of human culture,” he told me over email.
That is to say, on the basic level of user experience, a depersonalized version of TikTok might be … worse. Targeted feeds are tangled up in many problems: the runaway spread of misinformation, the creation of insular echo chambers of like-minded users, political polarization. But the tension in efforts to solve these problems is that personalization is also useful. As Chris Bail, a Duke University professor of sociology and public policy, put it to me, “Curation is one of the miracles of the internet and social media in particular.” If you like to watch videos about turtles, you might also like watching videos about Gila monsters. A fan of It’s Always Sunny in Philadelphia will want to see videos about the show, whereas someone who hates TV but loves to cook would rather be served cooking videos.
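The curation trade-off described above can be sketched as a toy program. Everything here is invented for illustration (the video catalog, the topics, the boost factor) and bears no relation to TikTok's actual recommendation system; it shows only how a popularity-only "For Everyone" ranking differs from one re-weighted by a user's watch history:

```python
from collections import Counter

# Hypothetical catalog entries: (video_id, topic, global_view_count).
videos = [
    ("v1", "turtles", 1_200),
    ("v2", "gila-monsters", 300),
    ("v3", "soccer", 90_000),
    ("v4", "cooking", 45_000),
    ("v5", "asmr", 60_000),
]

def for_everyone_feed(catalog, k=3):
    """Non-personalized ranking: global popularity only, identical for every user."""
    ranked = sorted(catalog, key=lambda v: v[2], reverse=True)
    return [video_id for video_id, _, _ in ranked[:k]]

def for_you_feed(catalog, watch_history, k=3):
    """Crudely personalized ranking: boost topics the user has watched before."""
    affinity = Counter(watch_history)  # topic -> how often the user watched it
    def score(v):
        _, topic, views = v
        return views * (1 + 100 * affinity[topic])  # arbitrary boost factor
    ranked = sorted(catalog, key=score, reverse=True)
    return [video_id for video_id, _, _ in ranked[:k]]

history = ["turtles", "turtles", "gila-monsters"]  # a reptile enthusiast
print(for_everyone_feed(videos))      # ['v3', 'v5', 'v4']
print(for_you_feed(videos, history))  # ['v1', 'v3', 'v5']
```

With a watch history full of reptile clips, the personalized ranking surfaces the niche turtle video that the popularity-only feed never would; strip the personalization away and every user gets the same mainstream hits.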
Of course, by helping people find stuff to connect with, social-media giants are also helping their own bottom line. Social platforms are engagement machines; they vacuum up data and profit from the ads users see with every minute spent on their sites. Research suggests that personalization leads people to use social media for longer; those who turn it off may use TikTok less. So a boring and bad TikTok feed might be a fix, in its own way. “One of the main concerns around TikTok is that the algorithm is highly addictive,” Törnberg wrote. “If you remove the algorithm, you will certainly solve this issue, for the simple reason that using the app will become a terrible experience.” Perhaps you’ll get an afternoon back, having successfully avoided the mindless allure of endless videos about your pet interest. (Or perhaps you’ll merely find yourself looking for something else to entertain you online. Reddit, here we come.)
If enough Europeans came to adopt the “For Everyone” feed, TikTok could in theory begin to feel like a throwback to a more mainstream era of media consumption—think peak broadcast TV, viewers all watching the same thing. As Seaver explained, one criticism of recommendation systems is that they have eroded the public’s sense of being part of a shared audience. A return to a centralized “most popular”–style feed could restore a sense of collective culture. But that would require people to abandon the miracle of curation. Researchers I spoke with told me that, based on what we know about adoption on other platforms that offer algorithm-free versions, such as Instagram, most people will probably not make the jump—and may even be totally unaware of the new option.
All things considered, it will be hard for TikTok’s upcoming change to feel satisfying. The EU law is a significant move; for the first time, users will technically have a choice. But in TikTok’s hands, that choice feels like it’s between two bad options: algorithmic servitude versus an avalanche of soccer clips. TikTok is offering up “a sort of a useless alternative,” Alessandro Gandini, a sociologist who studies algorithms at the University of Milan, told me. The easier choice—the more entertaining choice—is to keep sliding deeper down the algorithmic rabbit hole. Little changes, and everyone is left with the same vexing questions: How much do we actually value personalization? At what cost?
For people who do find themselves tempted to enter the fray of the depersonalized feed, it will be fascinating to see if anything changes about how they come to view the algorithms themselves. Simply having the ability to compare the two feeds side by side might, in some small ways, shift the narrative around TikTok’s almighty algorithm. Perhaps, for instance, some people might see that many of their seemingly hyper-personal recommendations are actually quite generically popular. A peek behind the curtain could make everything feel a little less magical—or less creepy. According to Bail, the stories we tell ourselves about algorithms matter. “In some sense, they’re more important than what the algorithms do themselves, because they are shaping things like our policies, and they’re shaping people’s opinions about whether and how to use social media,” he told me.
Researchers’ understanding of exactly how much sway algorithms can have over people’s behavior is still in flux. Their role in siloing online communities and boosting misinformation suggests plenty to be concerned about, although earlier this month, new papers—notably funded by Facebook—challenged the popular narrative about the platform’s role in polarizing America. Over a series of experiments in 2020, researchers tweaked a subset of users’ Facebook feeds in various ways—flipping them to chronological, for example—and measured the effect on their political attitudes. They found that such tweaks did virtually nothing to alter users’ political views. As my colleague Kaitlyn Tiffany noted, the research by no means acquits Facebook, but it does add evidence to the notion that the relationship between algorithms and American politics is more complex than social-media algorithms = evil and bad.
Of course, Facebook is different from TikTok. That’s part of what makes the upcoming launch so interesting: Many people have experimented with non-algorithmic or less targeted social media—Reddit, Twitter’s (now X’s) chronological-timeline option, old Instagram, old Facebook. But we haven’t gotten to see what an algorithm-free version of the popular short-form-video platform might look like. In the meantime, a whole complicated mythology has been built up around TikTok’s secretive algorithm. Very little research on it actually exists, but that seems likely to change: The new EU law will also force TikTok to turn over data to academics. They, alongside TikTok fans in the EU, will finally get to put that mythology to the test.