# Roblox, Discord, and OpenAI Unite to Launch ROOST, a Non-Profit Initiative for Child Safety in the AI Era

Date: February 11, 2025

Tech giants launched the ROOST initiative, a non-profit effort to provide free, open-source AI tools that detect and prevent child exploitation.

In a major move to tackle online child exploitation, tech leaders Google, Roblox, Discord, and OpenAI have launched ROOST (Robust Open Online Safety Tools), a nonprofit dedicated to providing free, open-source AI tools to detect and prevent harmful content. The initiative, unveiled at the Artificial Intelligence Action Summit in Paris, will develop scalable safety infrastructure to protect young internet users across platforms.

With $27 million in funding from philanthropic organizations like the Knight Foundation and Patrick J. McGovern Foundation, ROOST will unify existing detection tools and leverage AI to identify and report child sexual abuse material (CSAM). Industry leaders have stressed the urgency of such measures, especially as child exploitation cases continue to rise.

Eric Schmidt, former Google CEO and founding partner of the initiative, shared:

> “ROOST addresses a critical need to accelerate innovation in online child safety.”

He emphasized that smaller companies and non-profits often lack access to essential safety technology, and ROOST aims to bridge that gap by making these tools widely available.

## ROOST to Offer AI-Powered Moderation Tools and Expand Industry Collaboration

ROOST will focus on open-source AI-powered moderation tools, allowing companies to integrate automated detection systems through APIs. The initiative builds on existing industry efforts, such as Discord’s Lantern cross-platform safety project and Roblox’s AI-driven voice safety classifier, which processes over 400,000 hours of active speech daily to detect harmful content.
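ROOST has not yet published an API specification, so the shape of that integration is still unknown. As a rough illustration of what API-based moderation of this kind typically looks like, here is a minimal sketch of a service submitting user-generated text to a hosted detection endpoint. The URL, request fields, and response format below are entirely hypothetical placeholders, not ROOST's actual interface.

```python
# Hypothetical sketch only: ROOST has not published an API at the time of
# writing. The endpoint, fields, and response shape are illustrative.
import requests

MODERATION_API_URL = "https://api.roost.example/v1/moderate"  # placeholder URL


def moderate_text(text: str, api_key: str) -> dict:
    """Send a piece of user-generated text to a hypothetical moderation
    endpoint and return its classification result as a dict."""
    response = requests.post(
        MODERATION_API_URL,
        headers={"Authorization": f"Bearer {api_key}"},
        json={"content": text, "content_type": "text"},
        timeout=10,
    )
    response.raise_for_status()
    # Assumed response shape, e.g. {"flagged": true, "categories": [...]}
    return response.json()


if __name__ == "__main__":
    result = moderate_text("example chat message", api_key="YOUR_KEY")
    if result.get("flagged"):
        print("Content flagged for review:", result.get("categories"))
```

In practice, a platform would call an endpoint like this from its ingestion pipeline and route flagged items to human moderators or automated enforcement, which is the general pattern the initiative's API-based tools would plug into.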

Naren Koneru, Roblox’s Vice President of Engineering, Trust, and Safety, underscored the significance of collaboration in this effort: 

> “ROOST is working on three core areas of online safety that are mission-critical for Roblox and other platforms.”

He noted that companies can enhance their detection capabilities and improve moderation at scale by partnering with experts in machine learning and AI safety.

Despite ongoing efforts, some details remain unclear, including how ROOST’s AI moderation systems will integrate with existing first-line CSAM detection tools like Microsoft’s PhotoDNA. However, the initiative’s leaders remain optimistic about its potential to set a new industry standard for child protection.

By Arpit Dubey