People's Instagram Reels feeds were flooded with violent content this week in what Meta says was a glitch unrelated to changes in its content moderation policies. A Wall Street Journal reporter's feed showed "scores of videos of people being shot, mangled by machinery, and ejected from theme park rides, often back to back" on Wednesday. The videos appeared in the feeds of a large number of Instagram users, including minors, and not all of the videos carried "sensitive content" warnings, the Journal reports.
"It's hard to comprehend that this is what I'm being served," Instagram user Grant Robinson tells the Journal. "I watched 10 people die today." The 25-year-old says many of his male friends around the same age saw similar content, which they don't usually view. Some users said the disturbing content appeared even when they had Instagram's "Sensitive Content Control" at the highest setting, CNBC reports. "We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended," a Meta spokesman said. "We apologize for the mistake."
The spokesman told the Journal there was no connection to Meta's scaling back of content moderation. Meta's policy is to remove content considered particularly violent or graphic, including "videos depicting dismemberment, visible innards, or charred bodies," CNBC reports. When CEO Mark Zuckerberg announced changes earlier this year, including the scaling back of automated systems that block prohibited content, he said the changes would allow more free speech but acknowledged that the company would "catch less bad stuff" on Facebook, Instagram, and Threads, CNN reports.