Roblox is releasing an open-source AI system named Sentinel to proactively detect predatory language within in-game chats. This move follows criticism and lawsuits regarding child safety on the platform. Sentinel analyses chat snapshots, comparing them to indexes of benign and harmful conversations to detect potential child endangerment.
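Roblox has not published Sentinel's internals, but the comparison described above can be illustrated with a toy sketch: build aggregate "indexes" from benign and harmful example conversations, then flag a chat snapshot that sits closer to the harmful index than the benign one. Everything here (the bag-of-words vectors, the cosine metric, the example phrases, the margin) is an invented stand-in, not Sentinel's actual method.

```python
# Illustrative sketch only; all data and thresholds are invented assumptions.
from collections import Counter
import math

def vectorize(text):
    """Bag-of-words vector: a crude stand-in for a real text embedding."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse Counter vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def index_centroid(examples):
    """Merge example conversations into one aggregate vector (the 'index')."""
    total = Counter()
    for text in examples:
        total += vectorize(text)
    return total

# Tiny hypothetical indexes of benign and harmful conversation patterns.
BENIGN_INDEX = index_centroid([
    "nice build want to trade pets",
    "gg that obby was hard",
])
HARMFUL_INDEX = index_centroid([
    "keep this secret do not tell your parents",
    "move to another app and send pictures",
])

def flag_snapshot(snapshot, margin=0.05):
    """Flag a chat snapshot that is meaningfully closer to the harmful index."""
    v = vectorize(snapshot)
    return cosine(v, HARMFUL_INDEX) - cosine(v, BENIGN_INDEX) > margin

print(flag_snapshot("do not tell your parents this is our secret"))
print(flag_snapshot("gg nice obby want to trade"))
```

In a production system the indexes would be learned from large labelled corpora and the snapshots scored with proper language-model embeddings; the margin-based comparison simply makes the "closer to harmful than benign" idea concrete.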
Sentinel has already led to 1,200 reports to the National Center for Missing and Exploited Children in the first half of 2025 by identifying early signs of potential child endangerment, such as sexually exploitative language. Roblox is open-sourcing Sentinel so that other platforms can utilise it as well. The platform, which has over 111 million monthly users, already blocks the sharing of videos and images in chats and attempts to filter out personal information.