Artificial Intelligence (AI) has become part of our daily lives in one way or another. As tech companies invest heavily in AI, they are also responsible for safeguarding their platforms, especially for children. With those concerns in mind, big companies including Google, OpenAI, Discord, and Roblox have jointly formed a new child safety group, launching the Robust Open Online Safety Tools (ROOST) initiative at the AI Action Summit in Paris.
Google, OpenAI, Discord & Roblox founded a new child safety group to tackle online harm
ROOST is a new non-profit organization that aims to improve child safety online. Companies that join the group will have access to free, open-source AI tools, which they can use to detect, review, and report child sexual abuse material (CSAM).
Per the joint press release, the initiative aims to bring together “the expertise, resources, and investments” needed to create more accessible and transparent safety infrastructure for tackling online harm involving AI. Several philanthropic organizations, including the Knight Foundation, the Patrick J. McGovern Foundation, and the AI Collaborative, have also backed the initiative.
Bluesky, Microsoft, Mozilla, and GitHub are also among the initiative’s major partners. The initiative is certainly a welcome step, but it remains to be seen whether it can achieve its ambitious goals.
The founding partners have a spotty history when it comes to regulating their platforms
Take Roblox as an example: the company has a poor track record of moderating its platform. In fact, last year, Hindenburg Research criticized Roblox’s child safety practices in a short-seller report.
Google, for its part, has made meaningful efforts on child safety online, but it is still struggling to combat the distribution of explicit AI-generated content. For now, all we can do is wait and see whether the child safety group lives up to its promise.