TikTok Was Designed for War

As Russia’s invasion of Ukraine plays out online, the platform’s design and algorithm prove ideal for the messiness of war—but a nightmare for the truth.

Russia’s invasion of Ukraine is not the first social media war—but it is the first to play out on TikTok. The 2011 Arab Spring was fomented and furthered on Twitter and Facebook. Clips of Syrian children choking from chemical weapons filled social media timelines in 2018. And the Taliban’s capture of Kabul, with all the chaos that wrought, was live-tweeted into our homes last year. Images of unspeakable horrors supplanting the banality of status updates and selfies is nothing new. But the current conflict is a very different kind of social media war, fueled by TikTok’s transformative effect on the old norms of tech. Its more established competitors fundamentally changed the nature of conflict, but TikTok has created a stream of war footage the likes of which we have never seen, from grandmothers saying goodbye to friends to instructions on how to drive captured Russian tanks.

So much of TikTok’s success comes down to both how visual it is and how instant it is. From memes and dance crazes to the storming of the US Capitol, it captures and clips the world with an immediacy other platforms can’t. As Russia prepared to invade Ukraine, it became a boon for open source investigators trying to track troop movements, and has provided immediate, quickfire footage of what’s happening as Ukrainians fight for their future.

TikTok’s rise is—and always has been—a result of how easy it is to use. Its in-app editing and filters make it easier than any other platform to capture and share the world around us. If Facebook is bloated, Instagram is curated, and YouTube requires a shedload of equipment and editing time, TikTok is quick and dirty—the kind of video platform that can shape perceptions of how a conflict is unfolding. And as anyone who’s browsed social media in the last week knows, what happens on TikTok rarely stays on TikTok.

“As an analyst of what’s happening in Ukraine at the moment, I’m getting 95 percent of my information from Twitter,” says Ed Arnold, research fellow in European security at the Royal United Services Institute for Defence and Security Studies (RUSI). “Before that, 90 percent of your information would come from official sources, like intelligence sources.” But among the flurry of tweets, Arnold has noticed a strange trend: A significant chunk of the videos being shared are emblazoned with the TikTok watermark. “It’s odd,” he says.

But it makes sense. TikTok is ubiquitous, user-friendly, and popular with younger audiences, says Arnold. As of July 2020, 28.5 million of Russia’s 144 million people used TikTok, according to internal data seen by WIRED. (Data for Ukraine was not available.) “Out of all the social media, TikTok is the one that is most visual[ly engaging],” says Agnes Venema, a national security and intelligence academic at the University of Malta.

TikTok is a firehose of content. Internal documentation presented by the company, dating back to June 2020, suggests at least 5 million videos are posted per hour. Getting specific content in front of eyeballs on the For You page is the job of TikTok’s algorithm. It’s the thing that can propel nobodies into superstardom overnight, and can also mean that shaky footage of the aftermath of a Russian missile attack can potentially be seen by millions of people within minutes of being uploaded.

TikTok’s algorithm feeds people videos it believes they are hungry to see. And there’s plenty of appetite for videos about war right now: In the eight days between February 20 and February 28, views on videos tagged with #ukraine jumped from 6.4 billion to 17.1 billion—a rate of 1.3 billion views a day, or 928,000 views a minute. (Content tagged #Украина, Ukraine in Cyrillic, is almost as popular, with 16.4 billion views as of February 28.)
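Those view-rate figures can be sanity-checked with simple arithmetic. A back-of-envelope sketch, using only the numbers quoted above:

```python
# Rough check of the #ukraine view-rate figures cited in the article.
start_views = 6.4e9   # views on Feb 20
end_views = 17.1e9    # views on Feb 28
days = 8

per_day = (end_views - start_views) / days
per_minute = per_day / (24 * 60)

print(f"{per_day:.2e} views per day")       # ~1.34e9, i.e. roughly 1.3 billion a day
print(f"{per_minute:,.0f} views per minute")  # ~928,819, in line with the ~928,000 figure
```

The per-minute figure comes out slightly above 928,000 because the per-day rate is closer to 1.34 billion than the rounded 1.3 billion.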

Many of TikTok’s most viral Ukraine videos have been shared by Marta Vasyuta, a 20-year-old Ukrainian currently based in London. When Russia invaded, Vasyuta found herself stranded outside the country and decided to co-opt her TikTok profile, which only had a few hundred followers, into a platform to share footage of the conflict from Telegram with the wider world. “If you post a video from Ukraine, it will be likely for only Ukrainians or Russians to see it,” she says. That quirk is a result of how TikTok often localizes videos it shows on its For You page. Hoping that her location in London would help footage from Ukraine sidestep the algorithm, she began posting. Until she was blocked from posting by TikTok late last week—something she thinks may have been caused by Russian bots mass-reporting her profile—she had gained 145,000 followers. (A message from TikTok shows Vasyuta was temporarily barred from posting for three videos and one comment that breached the platform’s community guidelines. TikTok did not respond to a request for clarification on what rules were broken.)

Despite the suspension, plenty of Vasyuta’s videos have a half-life far beyond TikTok, thanks to the ease with which videos can be downloaded and reshared on other social media platforms.

Sharing videos off-platform has long been a tool deployed by parent company ByteDance to help promote TikTok. One of Vasyuta’s TikTok videos, showing bombs raining down on Kyiv, has been seen 44 million times on TikTok—and shared beyond the app nearly 200,000 times. Where it’s gone is difficult to tell—TikTok’s method of sharing removes the ability to trace a video back to its source—but a search of Twitter shows plenty of videos shared from TikTok on the platform.

But that immediacy and reach on and off TikTok comes at a price. Emotive videos can cause people to overlook whether or not information is legitimate. Couple that with a younger, sometimes less media-literate audience, and it’s a recipe for trouble. “Disinformation is really aimed at trying to elicit an emotional response,” says Venema. “It’s the stuff that gets you outraged, that gets you emotional, that tugs on the heartstrings. Combine those two, and that’s why there’s so much of it.”

How emotion can help create a viral hit is best shown in one video showing a soldier in military fatigues, gently coasting down to the grain fields below with a grin spread across his face. The video, posted to TikTok and reshared on Twitter, racked up 26 million views on the app and purported to give a glimpse into the Russian invasion of Ukraine. Except it didn’t. The video dated back to 2015, and was originally posted on Instagram, fact checkers found.

To address this, TikTok has partnered with independent fact-checking organizations, but it has struggled more than some of its more established social media competitors to slow the spread of fake or distorted news on its platform. The reason? Again, it comes down to TikTok’s design.

Prior research has shown that fake news travels six times faster than legitimate information on social media—in large part because of its ability to trigger a strong emotional response. TikTok’s interface, which throws users headlong into an immersive, endless stream of snappy content, is built to monopolize attention. Even legitimate information can spread by appealing to outrage—and there are few things more outrageous than what’s going on in Ukraine at present. “It’s called collective sensemaking,” says Claudia Flores-Saviaga, who studies disinformation, crowdsourcing, and social computing at Northeastern University. “That is something very normal during crisis events like wars or natural disasters.” And with tools like duetting and stitching, which allow people to easily become creators themselves by responding to existing videos, TikTok encourages everyone to collectively make sense of what’s going on—or muddy the truth.

So far, pro-Ukrainian profiles have dominated the discourse on TikTok. However, videos of invading Russian forces causing destruction also play into the hands of Putin, warns Flores-Saviaga. “Social media is definitely being weaponized,” she says. That includes dis- or misinformation, designed either to intimidate or to profit from the enormous appetite for war content. And it’s here that TikTok is struggling to keep pace.

From bogus livestreams to video game clips being repurposed as on-the-ground footage of the invading forces, TikTok has come under scrutiny for its inability to police content. US non-profit Media Matters for America has highlighted numerous instances of the app being abused to amplify false content. TikTok did not answer questions about the scale of fake content on its platform relating to the war in Ukraine, or how far content around the conflict had been shared off-platform. TikTok also did not respond to questions about how many moderators it employs and how many videos and livestreams the app has taken down. Sara Mosavi, a TikTok spokesperson, says the company continues to “monitor the situation with increased resources to respond to emerging trends and remove violative content, including harmful misinformation and promotion of violence,” but declines to give specific details.

The system can also be gamed in the other direction, by silencing accounts the algorithm has favored, like Vasyuta’s. If TikTok’s features have inadvertently made it ideal for sharing videos of war, critics argue it needs to be more open about how it’s handling the war. When Ukraine was last invaded in 2014, Vasyuta was barely a teenager. Her country has now been in a state of war for eight years. “And I’d guess that 90 percent of the people who’ve seen my TikToks didn’t know about that,” she says. She’s watched as TikTokers post livestreams from Russian cities and speak out against Putin’s invasion. “Everyone knows the truth,” she says. “Now it’ll be really hard for the Russian government to continue lying. They’re saying these are Ukrainian soldiers bombing the cities in Ukraine. That they ‘just came to save us.’ We don’t need to be saved. We need to be heard.”

The issue is that Vasyuta couldn’t post on TikTok for nearly three days because of a moderation error, and it’s still not clear which rules she supposedly broke. Her suspension was lifted early only after WIRED contacted TikTok about it.

Social media has long struggled with scale. TikTok’s popularity has soared, collapsing the time taken to reach a billion users from eight years (in the case of Instagram) to four, and accelerating the speed at which video clips go viral. But in the process, it has run into the same old problems.

When TikTok has worked correctly, it has helped the world understand the horrors going on in Ukraine. But when the app’s systems have been gamed by bad actors, it has tainted the world’s understanding of the war and sowed confusion far beyond the normal fog of war. Part of that, says Flores-Saviaga, could be a result of TikTok’s inability to process the scale of information it has created. She points out that on a platform where millions of videos are posted each day, an algorithm and content moderation system with 99 percent reliability would still let vast numbers of videos slip through the net. Fixing it is a million-dollar question, says Flores-Saviaga. “There needs to be a balance,” she says. “And this type of balance is why a lot of disinformation is slipping through the radar.”
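The scale problem Flores-Saviaga describes is easy to illustrate with arithmetic. A hypothetical sketch, combining her illustrative 99 percent reliability figure with the upload rate of at least 5 million videos per hour cited earlier (the true rates are not public):

```python
# Illustrative only: even highly reliable moderation leaks at TikTok's scale.
uploads_per_day = 5_000_000 * 24  # at least 5M videos/hour, per the article
catch_rate = 0.99                 # hypothetical 99% moderation reliability

missed_per_day = uploads_per_day * (1 - catch_rate)
print(f"{missed_per_day:,.0f} violative videos could slip through per day")  # ~1.2 million
```

Even a system that is wrong only 1 percent of the time would let on the order of a million videos a day through the net, which is why "more accurate moderation" alone cannot close the gap.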
