Wasn’t TikTok supposed to be fun?

There is a predictable trajectory for social media apps. Many of them start out as useful or even pure fun. But when they become popular enough, almost any app becomes a place for consequential discussions on political and social issues as well. And with that comes both meaningful conversations and a litany of nastiness.

This reality has come to TikTok. The app best known for viral dance videos has become a significant source of political and social misinformation, as my colleague Tiffany Hsu explained in a recent article.

Ahead of the recent presidential election in Kenya, a widely shared post on TikTok showed an altered and violent image of one of the candidates with a caption describing him as a killer. (The post was eventually removed.) Untruths about diets and school shootings spread easily on the app, Tiffany reported, as do variations on the Pizzagate conspiracy theory.

And on the serious, if not terrible, side, American politicians and their allies are embracing TikTok to spread their campaign messages and promote policies like the child tax credit.

This may not be exactly what TikTok has in mind. Its executives have described TikTok as an entertainment app. And sure, most people use TikTok, Facebook, Pinterest, Nextdoor, YouTube, and Twitch in fun, productive, and informative ways.

But it’s inevitable that apps have to plan for what will go wrong once online conversations span the whole realm of human interest. That includes political information and social activist movements, as well as nasty insults, incitement to violence, and the sale of bogus products for profit.

“It’s the lifecycle of a user-generated content platform that, once it reaches critical mass, runs into content moderation problems,” said Evelyn Douek, an assistant professor at Stanford Law School whose research focuses on online speech.

The tricky part, of course, is how to manage apps that evolve from “We’re just for fun!” to “We take our responsibilities seriously.” (TikTok said this almost literally in its blog post on Wednesday.)

Pinterest is best known for cute posts about wedding planning or meal inspiration, but it also has policies to weed out false vaccine information and direct people to trustworthy sources when they search for terms related to self-harm. Roblox is a silly virtual world, but it also takes precautions, such as urging people to “be nice,” in case children and young adults try to use the app to do harmful things like bully someone.

TikTok knows that people use the app to discuss politics and social movements, and that those conversations carry potential risks. On Wednesday, TikTok unveiled its plans to protect the 2022 U.S. election from malicious propaganda and unfounded rumors.

Perhaps more than other apps, TikTok doesn’t assume that every post is equally valid or that what becomes popular must be purely the will of the masses. TikTok creates trending hashtags, and reporters have found that the app may have steered people away from certain material, such as posts about the Black Lives Matter protests.

(TikTok is owned by the Chinese tech company ByteDance. And the posts on Douyin, ByteDance’s version of TikTok in China, are strictly vetted, as are all internet sites in China.)

Whether TikTok is more or less effective than Facebook or YouTube at managing what its users post is open to debate. So is the question of whether Americans should be comfortable having their conversations shaped by an app owned by a Chinese company.

To put it bluntly, it stinks that all apps have to plan for the worst of the human condition. Why can’t Twitch just be a place to have fun watching people play video games, without fans misusing the app to stalk its stars? Why can’t neighbors coordinate school bus pickups on Nextdoor without the site also harboring racial profiling or vigilantism? Can’t TikTok just be for fun?

Sorry, no. Mixing people with computer systems that steer their attention toward the most compelling material will amplify both our best and our worst.

I asked Douek how we should think about the prevalence of rumors and falsehoods online. We know we don’t believe every ridiculous thing we hear or see, whether it’s in an app or in conversation at our favorite lunch spot. And it can seem exhausting and counterproductive to cry scandal over every manipulated video or online election lie. It’s also counterproductive to feel so unsure of what to believe that you don’t trust anything. Some days it all looks awful.

Douek talked me out of that fatalism and focused on the need for a harm-reduction plan for digital life. That doesn’t mean our only choices are letting every single app fill up with junk or imposing Chinese-style government control over internet content. There are more than two options.

“As long as there have been rules, people have broken them. But that doesn’t mean platforms shouldn’t try to mitigate the harm their services contribute to and try to create a healthier, rather than unhealthy, public sphere,” Douek said.

This article originally appeared in The New York Times.
