Instagram officially views the sexual exploitation of children as a "horrific crime," and a statement from parent company Meta insists it is "continuously investigating ways to actively defend against this behavior." A new investigation by the Wall Street Journal and academic researchers suggests the company needs to step things up in a hurry. Instagram "helps connect and promote a vast network of accounts openly devoted to the commission and purchase of underage-sex content," reads the first paragraph of the story. What's more, the investigation found that Instagram's platform isn't merely a passive host for pedophiles—its algorithms actually promote related content. Researchers who set up test accounts were quickly directed to online communities promoting child sexual abuse and flooded with "suggested for you" recommendations that led to traders and sellers of such content.
"That a team of three academics with limited access could find such a huge network should set off alarms at Meta," says Alex Stamos, who now leads the Stanford Internet Observatory but previously served as Meta's chief security officer until 2018. "I hope the company reinvests in human investigators." A Meta spokesperson acknowledged shortfalls but said the company is working on it, having taking down 27 pedophile networks over the last two years. The investigation, however, found that Instagram's own algorithms appeared to work against this effort. After the company blocked one notorious service, researchers found that Instagram's autofill feature suggested workarounds with slight variations on the blocked hashtag name, including by adding "boys" to the end of it. (Read the full investigation.)