TikTok under US government investigation over child sexual abuse material



TikTok is under investigation by US government agencies over its handling of child sexual abuse material, as the burgeoning short-form video app struggles to moderate a flood of new content.

Dealing with sexual predators has been an enduring challenge for social media platforms, but TikTok’s young user base has made it an especially attractive target.

The US Department of Homeland Security is investigating how TikTok handles child sexual abuse material, according to two sources familiar with the case.

The Department of Justice is also reviewing how a specific privacy feature on TikTok is being exploited by predators, said one person with knowledge of the case. The DOJ has a longstanding policy of not confirming or denying the existence of ongoing investigations.

“It is a perfect place for predators to meet, groom and engage children,” said Erin Burke, unit chief of the child exploitation investigations unit at Homeland Security’s cyber crime division, calling it the “platform of choice” for the behaviour.

The investigations highlight how TikTok is struggling to cope with the torrent of content generated by more than 1bn users. The company, owned by China’s ByteDance, has more than 10,000 human moderators worldwide and has been rapidly hiring staff in this area.

The business is booming. A forecast from Insider Intelligence puts TikTok’s advertising revenue at $11.6bn this year — up threefold from last year’s $3.9bn.

Mark Zuckerberg, Meta chief executive, has cited the popularity of TikTok among young people as a principal reason for slowing interest in its longer-established social media platforms such as Facebook and Instagram.

But Meta has more experience in dealing with problematic material, with about 15,000 moderators globally alongside automated systems designed to flag posts.

Between 2019 and 2021, the number of TikTok-related child exploitation investigations by Homeland Security increased seven-fold.

Social media networks use detection technology trained on a database of images collected by the National Center for Missing and Exploited Children (NCMEC), the centralised organisation to which companies are legally required to report child abuse material.

TikTok reported nearly 155,000 videos last year, whereas Instagram, which also has more than 1bn users, made nearly 3.4mn reports. TikTok did not receive any takedown requests from NCMEC last year, unlike rivals Facebook, Instagram and YouTube.

“TikTok has zero-tolerance for child sexual abuse material,” the company said. “When we find any attempt to post, obtain or distribute [child sexual abuse material], we remove content, ban accounts and devices, immediately report to NCMEC, and engage with law enforcement as necessary.”

However, Homeland Security’s Burke claimed that international companies such as TikTok were less motivated to work with US law enforcement. “We want [social media companies] to proactively make sure children are not being exploited and abused on your sites — and I can’t say that they are doing that, and I can say that a lot of US companies are,” she added.

TikTok said it had removed 96 per cent of content that violated its minor-safety policies before anyone had viewed it. Videos of minors drinking alcohol and smoking accounted for the majority of removals under these guidelines.

One pattern, which the Financial Times verified with law enforcement and child safety groups, involves content being procured and traded through private accounts whose passwords are shared with victims and other predators. Code words are used in public videos, usernames and biographies, but the illegal content is uploaded using the app’s “Only Me” function, which makes videos visible only to those logged into the profile.

Seara Adair, a child safety campaigner, reported this trend to US law enforcement after first flagging the content on TikTok and being told that one video did not violate policies. “TikTok talk constantly about the success of their artificial intelligence but a clearly naked child is slipping through it,” said Adair. All accounts and videos referred to TikTok by the FT have now been removed.

“We are deeply committed to the safety and wellbeing of minors, which is why we build youth safety into our policies, enable privacy and safety settings by default on teen accounts, and limit features by age,” TikTok added.
