
Meta, X, TikTok CEOs testify before Senate on child safety


(AP) – Sexual predators. Addictive features. Self-harm and eating disorders. Unrealistic beauty standards. Bullying. These are just some of the issues young people are dealing with on social media — and children’s advocates and lawmakers say companies are not doing enough to protect them.

On Wednesday, the CEOs of Meta, TikTok, X and other social media companies are testifying before the Senate Judiciary Committee about child exploitation on their platforms, as lawmakers, families and advocates are growing increasingly concerned about the effects of social media on young people’s lives.

While Meta CEO Mark Zuckerberg is a veteran of congressional hearings since his first one over the Cambridge Analytica privacy debacle in 2018, it will only be the second time for TikTok CEO Shou Zi Chew and the first for Linda Yaccarino, the CEO of X. Snap CEO Evan Spiegel and Discord CEO Jason Citron are also scheduled to testify.

TikTok CEO Shou Zi Chew arrives to appear before the Senate Judiciary Committee’s hearing on online child safety on Capitol Hill, Wednesday, Jan. 31, 2024, in Washington. (AP Photo/Mark Schiefelbein)

“We understand that they are companies and they have to make profit. But when you’re faced with really important safety and privacy decisions, the revenue in the bottom line should not be the first factor that these companies are considering,” said Zamaan Qureshi, co-chair of Design It For Us, a youth-led coalition advocating for safer social media. “These companies have had opportunities to do this before; they failed to do that. So independent regulation needs to step in.”

Meta will likely be a central focus of the hearing, as the Menlo Park, California-based tech giant has been sued by dozens of states that say it knowingly and deliberately designed features on Instagram and Facebook that addict children to its platforms and failed to protect them from online predators.

New internal emails between Meta executives released by Sen. Richard Blumenthal’s office show Nick Clegg, president of global affairs, and others asking CEO Mark Zuckerberg to invest in hiring additional people to “strengthen our position on wellbeing across the company” as concerns grew around social media’s effects on youth mental health.

“From a policy perspective, this work has become increasingly urgent over recent months. Politicians in the U.S., U.K., E.U. and Australia are publicly and privately expressing concerns about the impact of our products on young people’s mental health,” Clegg wrote in an August 2021 email.

He wrote that the company is “being held back” by a lack of investment in these efforts, “which means that we’re not able to make changes and innovations at the pace required to be responsive to policymaker concerns.” Among the problem areas the email notes are “problematic use” of the apps, such as excessive use, as well as bullying and harassment, and suicide and self-injury.

The emails released by Blumenthal’s office don’t appear to include a response, if there was any, from Zuckerberg. In September 2021, The Wall Street Journal released the Facebook Files, its report based on internal documents from whistleblower Frances Haugen, who later testified before the Senate.

Clegg circled back on the August email late last year, proposing a scaled-down investment and telling Zuckerberg that the funding is important to ensure the company can back up its “external narrative of well-being on our apps.” It’s not clear if there was a response from the CEO.

Meta CEO Mark Zuckerberg arrives to appear before the Senate Judiciary Committee’s hearing on online child safety on Capitol Hill, Wednesday, Jan. 31, 2024, in Washington. (AP Photo/Mark Schiefelbein)

On Wednesday, Zuckerberg is expected to tout the more than 30 existing tools and features designed to help parents and teens, according to a prepared testimony released ahead of the hearing.

The company has been beefing up its child safety features in recent weeks, announcing earlier this month that it will start hiding inappropriate content from teenagers’ accounts on Instagram and Facebook, including posts about suicide, self-harm and eating disorders. It also restricted minors’ ability to receive messages from anyone they don’t follow or aren’t connected to on Instagram and on Messenger, and added new “nudges” to try to discourage teens from browsing Instagram videos or messages late at night. The nudges encourage kids to close the app, though they do not force them to do so.

But critics and child safety advocates say its actions fall short of meaningful changes that would address kids’ safety.

“Looking back at each time there has been a Facebook or Instagram scandal in the last few years, they run the same playbook. Meta cherry picks their statistics and talks about features that don’t address the harms in question,” said Arturo Béjar, a former engineering director at the social media giant known for his expertise in curbing online harassment who recently testified before Congress about child safety on Meta’s platforms.

“Instagram promises features that end up hidden in settings that few people use. Why is ‘quiet mode’ not the default for all kids?” Béjar added. “Meta says that some of the new work will help with unwanted advances. It is still not possible for a teen to tell Instagram when they’re experiencing an unwanted advance. Without that information how can they make it safer?”

X, formerly Twitter, said its CEO Linda Yaccarino was in Washington last week to meet with senators to talk about how the company is addressing child sexual exploitation, along with a broad range of other topics that included privacy, artificial intelligence, content moderation and misinformation.

“As an entirely new company, X has strengthened its policies and enforcement to tackle CSE. We are now taking action on users that distribute this content and also taking immediate action on the networks of users who engage with this horrible content,” the company said in a blog post Friday.

Google’s YouTube is notably missing from the list of companies called to the Senate Wednesday. That’s even though more kids use YouTube than any other platform, according to the Pew Research Center. Pew found that 93% of U.S. teens use YouTube, with TikTok a distant second at 63%.

“The thing about YouTube is that it kind of flies under the radar,” said Larissa May, the founder and executive director of the nonprofit #HalfTheStory, which helps teens develop healthy relationships with technology. “I think Meta has gotten so used to taking so much of the heat for the issues that young people are facing. But it’s actually much, much bigger than that.”

AP Business writer Haleluya Hadero contributed from Jersey City, New Jersey.

