In a move to address rising concerns from parents and activists, Sam Altman-led OpenAI is hiring a “specialist” for its Child Safety team to build processes and policies dedicated to child safety.
What Happened: Microsoft Corp.-backed OpenAI has formed a Child Safety team to investigate ways of preventing misuse or abuse of its AI tools by children. The company disclosed the formation of this team in a job listing on its careers page.
The Child Safety team will collaborate with platform policy, legal, and investigations groups within OpenAI, as well as external partners. Their primary focus is on managing “processes, incidents, and reviews” related to underage users. The team is currently seeking to hire a child safety enforcement specialist.
This development comes on the heels of OpenAI’s recent partnership with Common Sense Media to develop kid-friendly AI guidelines. The move also signals OpenAI’s intent to comply with policies governing minors’ use of AI and to steer clear of negative publicity.
Children and teenagers increasingly turn to generative AI tools for help with schoolwork and personal issues. However, the potential misuse of these tools has sparked concerns, leading to calls for stricter guidelines on children’s use of GenAI.
Why It Matters: The formation of the Child Safety team by OpenAI comes in the wake of growing concerns about children’s potential misuse of AI tools.
In January 2023, the New York City Department of Education banned students and teachers from using the AI-powered chatbot ChatGPT on its devices or internet networks.
Following this, in February 2023, educators across Silicon Valley began reimagining teaching and testing methods due to concerns about ChatGPT being a serious “cheating temptation.”
Disclaimer: This content was partially produced with the help of Benzinga Neuro and was reviewed and published by Benzinga editors.
© 2024 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.