Employees at Teleperformance, a third-party moderation company that works with TikTok and other firms, say they were asked to review a disturbing daily spreadsheet known as the DRR, covering TikTok moderation standards. The spreadsheet allegedly contained content that violated TikTok's guidelines, including "hundreds" of images of children who were naked or being abused. The employees said hundreds of people at TikTok and Teleperformance could access the content from inside and outside the office — opening the door to a wider leak.
Teleperformance denied to Forbes that it had shown sexually exploitative content to employees, and TikTok said its training materials have "strict access controls and didn't include visual examples of CSAM," though it did not confirm that all third-party vendors meet that standard.
The employees told Forbes a different story, one with serious legal stakes. Content moderators are routinely confronted with CSAM (child sexual abuse material) posted across many social media platforms. But child abuse imagery is illegal in the United States and must be handled carefully. Companies are required to report such content to the National Center for Missing and Exploited Children (NCMEC) and then preserve it for 90 days, while minimizing the number of people who see it.
The allegations here go far beyond that limit. They suggest that Teleperformance showed these images and videos to employees as examples of how to tag content on TikTok, while taking a loose, permissive approach to controlling access to that material. One employee said she contacted the FBI to ask whether the practice constituted criminal distribution of CSAM; it is unclear whether a case was opened.