Date: February 11, 2025
Tech giants launched the ROOST initiative, a non-profit effort to provide free, open-source AI tools that detect and prevent child exploitation.
In a major move to tackle online child exploitation, tech leaders Google, Roblox, Discord, and OpenAI have launched ROOST (Robust Open Online Safety Tools), a nonprofit dedicated to providing free, open-source AI tools to detect and prevent harmful content. The initiative, unveiled at the Artificial Intelligence Action Summit in Paris, will develop scalable safety infrastructure to protect young internet users across platforms.
With $27 million in funding from philanthropic organizations like the Knight Foundation and Patrick J. McGovern Foundation, ROOST will unify existing detection tools and leverage AI to identify and report child sexual abuse material (CSAM). Industry leaders have stressed the urgency of such measures, especially as child exploitation cases continue to rise.
Eric Schmidt, former Google CEO and a founding partner of the initiative, said:
“ROOST addresses a critical need to accelerate innovation in online child safety.”
He emphasized that smaller companies and non-profits often lack access to essential safety technology, and ROOST aims to bridge that gap by making these tools widely available.
ROOST will focus on open-source AI-powered moderation tools, allowing companies to integrate automated detection systems through APIs. The initiative builds on existing industry efforts, such as Discord’s Lantern cross-platform safety project and Roblox’s AI-driven voice safety classifier, which processes over 400,000 hours of active speech daily to detect harmful content.
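What that API-based integration will look like has not been published, but a minimal, hypothetical sketch can illustrate the general pattern of calling a hosted moderation classifier from a platform's upload pipeline. The endpoint URL, request fields, and response schema below are illustrative assumptions, not an actual ROOST interface:

```python
# Hypothetical sketch of wiring automated detection into an upload flow by
# calling a moderation service over HTTP. The endpoint, field names, and
# response format are placeholders, not a published ROOST API.
import requests

def check_image(image_url: str) -> bool:
    """Ask a (hypothetical) moderation endpoint whether an uploaded image
    should be flagged for human review and reporting."""
    resp = requests.post(
        "https://moderation.example.org/v1/scan",  # placeholder endpoint
        json={"content_url": image_url, "content_type": "image"},
        timeout=10,
    )
    resp.raise_for_status()
    result = resp.json()
    # Assume the classifier returns a boolean flag alongside its scores.
    return result.get("flagged", False)

if __name__ == "__main__":
    if check_image("https://cdn.example.org/uploads/sample.jpg"):
        print("Content flagged: route to human review and reporting.")
```

In practice, platforms would pair such a call with hash-matching against known material and with escalation to trained human moderators rather than relying on a single automated check.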
Naren Koneru, Roblox’s Vice President of Engineering, Trust, and Safety, underscored the significance of collaboration in this effort:
“ROOST is working on three core areas of online safety that are mission-critical for Roblox and other platforms.”
He noted that companies can enhance their detection capabilities and improve moderation at scale by partnering with experts in machine learning and AI safety.
Despite ongoing efforts, some details remain unclear, including how ROOST’s AI moderation systems will integrate with existing first-line CSAM detection tools like Microsoft’s PhotoDNA. However, the initiative’s leaders remain optimistic about its potential to set a new industry standard for child protection.
By Arpit Dubey
Arpit is a dreamer, wanderer, and tech nerd who loves to jot down tech musings and updates. Armed with a Bachelor's in Business Administration, a knack for crafting compelling narratives, and a specialization that spans predictive analytics, FinTech, SaaS, healthcare, and more, he crafts content that is as strategic as it is compelling. With a Logician mind, he is always chasing sunrises and tech advancements while secretly preparing for the robot uprising.