It’s not uncommon these days for a friend to tell you they read some article, about something, somewhere, when what they actually saw was a TikTok about it the night before.
TikTok insists that it’s “first and foremost an entertainment platform,” but people aren’t necessarily using it that way. It’s increasingly becoming an alternative search engine and a major source of information for young adults. Yet a recent study from NewsGuard classified over 20 percent of the videos surfaced by searches for popular news stories on the app as misinformation. And researchers at Stanford and in the British Journal of Developmental Psychology have found that young people are especially at risk of believing misinformation and unproven conspiracies online.
The app has taken some measures against misleading information ahead of the midterms, launching an “Elections Center” to connect users to authoritative election information. But misinformation on TikTok isn’t limited to political topics like elections and COVID-19; all kinds of banal misinformation float around on the app. The platform doesn’t allow live links, making it difficult for creators to cite their sources and for users to quickly verify information. It’s not like TikTok wants you to leave; the infinite scroll encourages more time spent on the app, leaving little opportunity for users to fact-check what they’re watching. Instead, users internalize misinformation and keep scrolling. And while TikTok has been adding select linking features, it’s prioritizing e-commerce.
TikTok isn’t a platform designed for education, and yet young people are increasingly stumbling upon so-called news on their FYPs. In a society where more and more adults get their news from digital sources, including podcasts and social media apps, that makes sense. According to a 2022 Pew Research Center study, 16 percent of surveyed teens say they use TikTok “almost constantly.” That’s not going to change. So what can young users do to protect themselves from misinformation and become more critical consumers?
Peter Adams is the senior vice president of research and design for the News Literacy Project, a nonpartisan education nonprofit founded in 2008 by investigative reporter Alan C. Miller. The organization’s mission is to promote news literacy among students, professionals, and the general public, which it pursues through online lessons known as the Checkology Virtual Classroom, misinformation newsletters, and even an educational phone app, among other resources.
Adams and the News Literacy Project take this problem seriously. “Teens are really vulnerable to a kind of anti-institutionalism, and a kind of nihilism, when it comes to the sources of information,” he explained to Mashable. “A lot of them embrace this idea that all sources of information are out to manipulate them in some way — that all sources of information have secret motives. They have these opinions about mainstream organizations, in ways that really make them vulnerable to manipulation by bad actors and conspiracy theorists.”
That isn’t to say that previous generations were any more engaged. “I’m wary of pushing the idea that there was a time when 13-year-olds and 15-year-olds and even 16-year-olds sat down and read the daily paper, and that we’ve gotten away from that. I think that that sort of golden age, a sort of idealization of a previous nonexistent era, is not helpful,” said Adams. But today’s young people, students, and the chronically online masses are navigating an entirely new, substantially more robust information environment that demands much more of their attention. “It’s also fraught with challenges and pitfalls. We kind of owe it to them to teach them how to navigate that safely, and in a way that empowers them and empowers their civic voices and choices,” he explained.
Misinformation isn’t just a problem on TikTok; other apps like Twitter, YouTube, and even Pinterest are working to quell rampant misinformation from their user bases. But the fight is particularly difficult on an app that’s so good at keeping you swiping.
The combination of the infinite scroll and lack of live links has led to a very specific kind of information economy on the app. Creators whose brand involves discussing news or popular culture (and there are a lot of them) opt to screenshot headlines, summarize articles, and offer their takes. But it’s impossible to know whether the creator even read the article or if they’re just offering a summary of the online discourse. Even worse are the accounts that spread false headlines, whether they know it or not. Without live links, all these possible situations are presented to you in the exact same way: with a familiar face on your FYP speaking authoritatively.
Chances are you’re not actively questioning this content as you quickly scroll through TikTok; you’re just taking the information in at face value. It’s not uncommon for someone to cite a TikTok in a real-life conversation, part of a running joke that when someone says they’ve “read an article,” what they mean is that they saw it on their FYP.
We’ve come to refer to the screenshotted headline problem as TikTok’s lack of primary sources. On Twitter, which has live links, a more usable search function, and both misinformation “prebunks” and timeline flags for misleading content, users can usually find the root source of a viral conversation. TikTok, on the other hand, is more like a dark tunnel where you rely solely on your FYP to bring light to whatever is going on.
We can’t have memes unless we have media literacy
Another crucial difference: It’s a video-first platform.
“We’re sort of hardwired to believe our senses, right? And there’s something about video that’s much more compelling. We believe our eyes and ears because it’s part of our survival. And I think there’s something there with the video-based nature of TikTok,” Adams said. “Not that people don’t encounter videos on other platforms, but it just feels more immediate and real and credible, because you’re seeing and hearing it.”
Most major newsrooms across the country have joined TikTok, producing native social content and publishing bite-sized clips of broadcast footage, in an effort to engage a new audience and turn them into loyal subscribers. But few outlets are effectively adapting their stories for the platform, adding the necessary context about their expertise, the reporting process, and the news itself, without asking people to leave the app.
This issue is compounded by the fact that there is a steadily increasing distrust of media — a 2022 Gallup poll found that only 16 percent of surveyed adults felt they had “a great deal” of trust in newspapers — at the same time that people are forming parasocial relationships with creators, making them more willing to trust user-generated content than standards-based content.
Eleanor Stern, a 25-year-old writer in London, posts video essays about literary analysis and the way we consume books. Her TikToks typically use an article or essay as a jumping-off point. We spoke to her about whether she thinks her viewers leave the app to read her source material.
“I would like to think a substantial amount [of people read the article], but I’m not that optimistic. Because of the fact that you can’t link to things on TikTok, and that people are watching it on their phones, I do think it’s kind of unlikely that people are closing the app and opening up the article,” explained Stern. “That being said, I do tend to get a few comments that are either, like, ‘I’ve read this already, and I liked it, and this prompted me to reread it,’ or ‘wow, this made me read the article, thank you.’ Or ‘this made me read the article, and I hated it.’ And I feel like even if it’s just like bringing a couple of people to a reading experience that’s meaningful and interesting to them… I’m not gonna get everyone to open the article, and that’s fine.”
“I do try at least to heavily highlight the text of the article,” she added. “Obviously, you cannot feature the whole thing, but to put the text front and center… I don’t know how many people are going to open it after, and read it.”
TikTok wants you to go to the real experts
TikTok has made efforts to address the algorithmic spread of misinformation. In 2020 and 2021, it partnered with the National Association for Media Literacy Education (NAMLE) to create educational content for creators and parents about online safety and misinformation. Across the platform, TikTok says it tries to direct users to outside authorities on trending topics, like its COVID-19 and election informational hubs.
TikTok’s fact-checking process targets a specific type of misinformation, the company explained to Mashable, not just any inaccurate information. “We will remove misinformation that causes significant harm to individuals, our community, or the larger public regardless of intent,” a spokesperson said. But the app makes exceptions for “educational, documentary, scientific, artistic, or satirical content, content in fictional or professional settings, and counterspeech. This allows space for medical creators, for example, to bust myths and educate our community about things they might see online.”
Anything that can’t be verified but doesn’t get removed under the misinformation guidelines is flagged as ineligible for recommendation on users’ FYPs.
But the viral spread of inaccurate or misleading information on the app extends beyond partisan news and global health crises. Users share entertainment and pop culture rumors, often presenting satirical tweets and doctored images as fact. Creators try to quickly jump on trend bandwagons, capitalizing on the views and guaranteed engagement of whichever celeb or meme is dominating the zeitgeist, without much regard for fact-checking the seemingly benign information they share.
These videos might not harm anyone immediately — they’re even played off as jokes — but their presence in the app’s ecosystem does have downstream ramifications. “Two or three or four people down the line, someone may believe, someone may act on it, or it may shift someone’s worldview in a way that disempowers them or their community, leads to someone making bad choices, or even embracing a conspiracy theory,” Adams said.
Fact-checking your FYP
So much of the responsibility to verify information lies with the people consuming this content. But there are things you can do to become a more critical consumer of what you watch on your FYP.
Where many TikTok users fall short is in checking what Adams calls their “system one thinking,” or the brain’s automatic, emotional responses to situations. He suggests making a habit of “evaluating the information that you’re scrolling through” and learning to recognize “when something warrants being checked out.” You need to disrupt that automatic response to keep scrolling, Adams said, and always double-check a source’s credibility.
Older ways of determining credibility, like the CRAP test, no longer apply to today’s digital natives, he said. “I’m a big fan of trying to help people think about the differences between standards-based content and user-generated content as two totally different kinds of information.”
User-generated content is most of what you’ll find on your FYP, and there are absolutely no rules to these kinds of videos. “You have to ask very fundamental questions. Is this authentic? Is it being presented out of context? Has it been fabricated or doctored in some way? Is this a credible source in the first place?” Adams explained.
Standards-based content is created under an established set of guidelines before it can be shared with the public. Those processes establish an implicit trust with consumers by answering the basic questions ahead of time. That frees you up to, as Adams put it, “ask about the choices being made — word choices, photo choice, the framing of the story — much more nuanced questions there.”
It’s important to think about these things as you navigate social media. What’s the purpose of the content? Why was it shared? And always fact-check the content yourself with an easy Google search. See an article screenshot and like what the person is summarizing? Great. Now: Go. Read. The. Article.
News organizations, for their part, still have an obligation to help digital news consumers understand what those standards are, and TikTok could be the perfect platform to do so. “The public doesn’t really understand how seriously accuracy and fairness and transparency and independence are taken in newsrooms. There’s a gap between the public understanding of the practice of journalism and the actual practice of standards-based journalism. News organizations have an opportunity to really show the process at work and even explain some of those choices,” he said.
As official newsroom TikTok accounts continue leaning heavily on humor and memes (an approach that has found success for outlets like The Washington Post), the responsibility of creating the kind of informative content Adams describes has fallen to individuals. Reporters like Bethany Dawson of Business Insider and Taylor Lorenz of The Washington Post frequently highlight their own work and reporting processes while sharing big news stories. In doing so, they build a reputation and a sense of trust with their followers. Newsrooms could capitalize on the same dynamic, establishing their standards and editorial processes with TikTok viewers early on so those viewers come to trust future content about trending news.
It also wouldn’t hurt newsrooms to embrace the video-forward, creative outlet TikTok provides. Accounts like NPR’s Planet Money make engaging and informative videos explaining terminology, concepts relevant to socioeconomic reporting, and information necessary for living under capitalism.
No one’s saying we should all delete our TikTok accounts, but we should be consciously consuming the information we scroll past. “There’s a way in which you could use the algorithm for good. Teach the algorithm that you’re interested in credible information and have it sort of prioritize that for you,” Adams suggested. “It’s not a replacement for deliberate consumption of news and a deliberate sort of media diet. But it helps. It helps folks get those pieces of quality information tucked in and around the puppy videos and everything else that they get in their feeds.”