Billy Oberman was using TikTok as a means to an end. Instead of looking for his own entertainment, the New Jersey musician had downloaded the app as a way to promote his content, only browsing occasionally. But fairly quickly, an odd thing happened.
Seemingly out of nowhere, his feed was choked by Stewie, Brian and Peter Griffin. Quite by accident, and against his will, he’d fallen down what he calls the Family Guy “pipeline.”
“You’re watching it and you’re not really taking it in — it’s just something to stimulate you,” he said. “It’s like Cocomelon,” a YouTube channel geared toward infants — but here, aimed at adults.
But what Oberman saw is just a small example of what the few people who have studied it are calling “sludge content.”
And while it seems insidious, Oberman says it’s an experience shared by many on the app: TikTok’s video recommendation algorithm, which is supposed to deliver content based on your interests, relentlessly shows users clips packaged in a very particular, and overstimulating, way.
Am I old or is something very wrong here <a href="https://t.co/6TccyljcFV">pic.twitter.com/6TccyljcFV</a>
The types of videos that make up this experience are everywhere on the app, but it’s unlikely non-users have seen anything like it. That’s because the style of video that Oberman stumbled upon exists almost solely on TikTok, and only came into being in the last few years.
The “pipeline,” as Oberman and others have dubbed it, is basically just segments from Seth MacFarlane’s animated sitcom Family Guy reposted on TikTok — what Canadian YouTuber Savantics referred to as “the new age of piracy: Family Guy episodes being posted in several parts, with soap-cutting underneath, by accounts run by bots.”
Instead of playing alone, the segments sit on top of low-substance, high-interest videos. Sometimes they’re recordings of mobile video games like Knife Jump or Subway Surfers. Other times they are ASMR “satisfying videos” (ASMR is short for “autonomous sensory meridian response”); these show creators squishing and cutting into various substances, like coloured bars of soap, to elicit that response. Sometimes the segments are combined with a third or even fourth video to create a jumbled mess of meaningless visual stimulation.
“I will have a moment of clarity while I’m watching and be like, ‘What am I doing?’ Then I’ll just continue to watch,” said Oberman. “That’s where we’re at, technology and entertainment-wise.”
But cartoon clips taking over feeds is only a symptom of a wider change in media creation and consumption that’s altering the voices, and ideas, that gain audiences — all while going virtually unnoticed.
WATCH | What is the Family Guy pipeline?
“This is an example of this larger trend of dumbed-down content, which is meant to be consumed passively rather than intelligently and actively,” said Saif Shahin, an assistant professor of digital culture at Tilburg University in the Netherlands.
“What TikTok is doing with these videos is allowing people to have distractions on the same screen … [and therefore] have people stay on the same screen for an extended period of time.
“This form of media content is not meant for active engagement,” he added. “While it draws on people’s already limited abilities to be attentive to media for extended periods of time, it then reinforces that and further limits people’s attention spans.”
The Family Guy phenomenon, specifically, has been recognized mostly due to a related meme and the odd fact that a cartoon more than two decades old gained newfound popularity.
But the encompassing trend has been almost completely unrecognized, even as it becomes a dominant media form on one of the most dominant media platforms on Earth — a “digital advertising juggernaut” which made roughly $10 billion US in ad revenue alone last year, according to the New York Times.
The technique is a cousin of the TikTok trend “corecore,” a seemingly carelessly mashed-together style of video-making that Mashable has called a “genuine Gen-Z art form.” But sludge content’s pervasiveness hasn’t been as well noted; in fact, it has passed so successfully under the radar that researchers have no idea where it came from, and it barely even has a name.
Understudied media trend
“Yours is one of the first emails I’ve gotten from a journalist where I was like, ‘We should do a study on that now,'” said Gordon Pennycook, an associate professor of behavioural science at the University of Regina. “Check back in a little bit, because we will probably run some experiments.”
That’s because Family Guy is far from the only source for this video treatment. South Park, The Simpsons and a litany of TV and movie clips have received the same packaging.
“At first it feels like a chaotic jumbled mess that has been hastily thrown together in the hopes that at least one element of it will grab your attention,” reads Kaycia Ainsworth’s essay The Content Culture Crisis. “But its disordered nature is not only intentional, it’s essential. The intention is to not only hook you in, but to disassociate you entirely.”
Given its newness, the trend has attracted a number of proposed names. Content creators interviewed for this article suggested “stim-maxxing” and “stim-tok,” for what it does to the brain.
Ahmed Al-Rawi, an associate professor of social media and communications at Simon Fraser University in B.C., suggested “cocktail content” for how it mixes unrelated ingredients: “Most of the time it’s nonsensical, there is no connection … [but] I don’t think this will stop — it will continue to grow.”
But it was Ainsworth who first labelled the trend, calling it “content sludge,” though for whatever reason the word order was reversed as the idea picked up on Twitter. Still, the few TikTok posts that recognize the trend retain the original phrase, which Ainsworth chose to contrast what she described as the “once rich and fertile mud” of past internet platforms with their current state.
i think the worst part about sludge content is that i have already been doing this. maybe not with family guy clips above subway surfers gameplay. but with games im not fully engaging with and my phone playing youtube videos on the side
“The more our media focuses on producing sensory stimulating content, the more we search it out and begin to require it to avoid boredom,” Ainsworth wrote. “We are so overwhelmed by sensory input and wading through content sludge that we are trained into craving it.”
Sheena Peckham, a digital content executive for the children’s online safety non-profit Internet Matters, likened that training to “second screening,” the pandemic-fuelled habit of using your phone while, for example, watching a movie.
While sludge content can be seen as a kind of built-in second-screening, Pennycook and Al-Rawi both cautioned against a moral panic.
Instead of turning it into a generational critique like past worries over the rise of video games, it would be better for parents to simply be mindful of screentime — and recognize the media we consume is forever changing, they said.
In the abstract, the form isn’t even all that new. Chris Gabriel, creator of the YouTube channel and multimedia project MemeAnalysis, noted its similarity to YouTube commentary, and to the tactic of putting graphics around copyrighted videos to avoid automatic takedowns. While sludge content could have evolved directly from the success of the latter example, Gabriel said there’s a more obvious reason for its current ubiquity.
“Of course,” he said, “young people raised on this rather than on television or film or whatever — yes, they’re going to need things that are faster and faster.”
WATCH | Sludge content and ‘parasocial agency’:
Pennycook said that while the idea is still untested, the greater risk to consumers of sludge content comes from those who use it to try to convince viewers of a particular viewpoint.
“I can see the potential risk for it impacting the way that people process the information, because it’s essentially a form of distraction,” Pennycook, who specializes in the field of misinformation, said.
“Even if having the extra video increases the amount of time that people spend, and those people are successful at ignoring the message, that will still trick the algorithm into showing that video to more people — who may not be as discerning when they see the content.”
YouTuber Blair Chapman — a cognitive science graduate of USC who says he worked at a startup that used sludge content to test and promote engagement — pointed to it as the reason controversial influencers like Andrew Tate and Sneako gained such popularity.
Coining it “parasocial agency,” Chapman said those creators — who often package their opinions as self-help content — create an association between their advice and the feeling of accomplishment watchers get from tasks or video-game levels being completed in the accompanying videos.
“Then you stop watching the content and it’s like, ‘Oh wow, I’m still in the same position and none of this got done,'” he said. “But that’s what makes it such good content: it hooks you and [convinces you] all those things are happening.”
But Betsi Grabe, a researcher of cognitive processes and principal investigator at Indiana University’s Observatory on Social Media, says sludge content is unlikely to hypnotize anyone. Pointing to a field of study called “audio-visual redundancy,” she said that whenever sound and video compete for humans’ attention, video wins.
Because of that, she doesn’t believe sludge content influencers’ diatribes will seep into anyone’s unconscious. Instead, they’ll just ignore them.
“So would you draw eyeballs putting some visual candy to your talking head? Sure. I buy that,” she said. “Would you get your message across more effectively? No. And we know money is to be made by eyeballs, right?”
What is risky, she said, is letting the trend proliferate without researchers, or those watching, aware it even exists — or of how it affects them.