Child pornography is shared among TikTok users on private accounts that evade the service’s moderation features and reappear almost as soon as they are deleted, according to a recent Forbes investigation.
Some TikTok users are creating so-called “post-in-private” accounts, whose posts are visible only to the account owner, and then sharing the login credentials with other users, who are expected to post child pornography from their own collections.
Open invitations to these accounts are posted with intentional typos to dodge TikTok’s moderation features, a technique known as “algospeak,” Forbes reported. For example, while a search for “post in private” is blocked, “postinprvs” is not, as of Tuesday.
TikTok creator Seara Adair, a child sexual abuse survivor, drew attention to the phenomenon earlier this year. Forbes reported that she informed the Department of Homeland Security, which did not tell the outlet whether it was investigating, and that she was also contacted by an assistant U.S. attorney for the Southern District of Texas, who did not comment to Forbes.
DHS Special Agent Waylon Hinkle told Adair “we are working on it” in an email on March 31, according to Forbes.
This type of activity is common on some of the top social media apps, including Snapchat, Instagram and Discord, according to Haley McNamara, director of the International Centre on Sexual Exploitation.
“There is this trend of either closed spaces or semi-closed spaces that become easy avenues for networking of child abusers, people wanting to trade child sexual abuse materials,” she told Forbes. “Those kinds of spaces have also historically been used for grooming and even selling or advertising people for sex trafficking.”
One privacy and data policy expert interviewed by Forbes, Jennifer King, said private posting can be legitimately useful. But she questioned why TikTok doesn’t treat multiple logins to a single account as a red flag, saying, “You can absolutely know this is happening.”
King also described the situation as “a race against time,” as accounts are often made with the expectation they’ll be deleted quickly.
“You either post a ton of CSAM [child sexual abuse material] or consume a bunch of CSAM as quickly as possible,” she said. “It’s about distribution as quickly as possible.”
TikTok told Forbes that all videos, even private ones, are subject to its AI moderation and can also be reviewed by humans.
The service said it found no violations when Forbes used its in-app tools to report content promoting post-in-private groups. But when the outlet sent the content to TikTok in an email, the company removed it immediately, Forbes reported.
In a statement, a spokesperson reaffirmed TikTok’s “zero tolerance for child sexual abuse material and this abhorrent behavior which is strictly prohibited on our platform,” adding that when such content is found, the company makes reports to the National Center for Missing & Exploited Children.