TikTok moderators say they were shown recordings of child sexual abuse during training

According to a Forbes investigation, TikTok's moderation team allegedly had wide-ranging, insecure access to unlawful photographs and videos, raising questions about how the company handles child sexual abuse material (CSAM).

Employees of Teleperformance, a third-party moderation company that works with TikTok and other businesses, claim they were required to review a troubling spreadsheet on TikTok moderation guidelines called the DRR, or Daily Required Reading. The spreadsheet allegedly featured "hundreds of photographs" of minors who were nude or being molested, among other material that violated TikTok's rules. According to the workers, hundreds of people at TikTok and Teleperformance could have accessed the content from both inside and outside the office, potentially leading to a wider leak.

While TikTok said its training materials have "tight access controls and do not feature visual instances of CSAM," it did not confirm that all third-party vendors were held to that standard. Teleperformance denied to Forbes that it had exposed staff to sexually exploitative material.

The employees tell a different story, and as Forbes explains, it's a legally risky one. Content moderators routinely have to deal with CSAM posted across social media platforms. But depictions of child abuse are unlawful in the US and must be handled with care: companies are required to report the material to the National Center for Missing and Exploited Children (NCMEC), then preserve it for 90 days while minimizing the number of people who see it.

These accusations go much further than that. They claim that Teleperformance showed staff graphic images and videos as examples of what to tag on TikTok while playing fast and loose with access to that content. One employee says she contacted the FBI to ask whether the practice constituted criminally disseminating CSAM, though it's unclear whether an investigation was opened.

The complete Forbes piece is well worth reading. It describes a situation in which moderators were unable to keep up with TikTok's rapid growth and were instructed to watch crimes against children for reasons they felt made no sense. It's an odd situation, and a terrifying one if accurate, even by the convoluted norms of online debates about children's safety.

Source: The Verge
