With input from CNN, Roblox, the Guardian, and Reuters.
If you want to chat on Roblox soon, you’re going to have to prove how old you are — either by uploading a government ID or letting an AI tool scan your face.
The gaming platform, which has more than 150 million users and a huge child audience, is rolling out a new age-check system that it says will help stop kids from chatting with adult strangers. The move comes as Roblox faces a growing pile of lawsuits and angry parents accusing it of failing to protect children from grooming and abuse.
Roblox lets users build and play games, hang out in virtual worlds and chat with each other. Unlike many big platforms, it openly targets kids: roughly one-third of its users are under 13, and it has long marketed itself as a way for children to learn to code through play.
But in recent years, Roblox has been dragged into the spotlight for all the wrong reasons. Families and state officials accuse the platform of being used by sexual predators to target kids, in some cases as young as seven. Attorneys general in Kentucky and Louisiana have sued Roblox for allegedly harming children. Florida’s attorney general issued a criminal subpoena and called the platform a “breeding ground for predators.” Texas has also filed suit.
Individual families are suing, too — including one mother who says her 15-year-old son died by suicide after being groomed via Roblox and chat app Discord.
Roblox says it’s “deeply troubled” by any case where kids are harmed and insists safety is its top priority. Now it’s betting that stricter age checks — powered by AI — can rebuild trust.
Under the new rules, anyone who wants to use chat on Roblox will have to verify their age. There are two options:
- Upload a government ID (for users 13 and older), or
- Use “Facial Age Estimation,” an AI tool from identity firm Persona that uses your device’s camera to estimate your age.
Here’s the basic flow for the AI option:
- You open the Roblox app and start the age-check process.
- Using your front-facing camera, you record a short video or follow prompts to move your face in different directions.
- The AI scans your face and drops you into one of several age “buckets”:
  - Under 9
  - 9–12
  - 13–15
  - 16–17
  - 18–20
  - 21+
From there, your chat options are locked to your age range (and nearby ranges):
- A user estimated to be around 12 can chat with players 15 and younger, but not with those 16 or older.
- Someone around 18 can chat with users 16 and up.
- For the youngest players, the defaults are stricter still: in-game chat is off for kids under 9 unless a parent gives consent, and chat outside of games remains restricted for users under 13.
The idea is to recreate something like school age bands — elementary, middle, high school — and stop random 30-year-olds from DM’ing 10-year-olds they don’t know.
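To make the banding concrete, here is a minimal sketch in Python (not Roblox’s own code) of how age-banded chat eligibility could work. The bucket names mirror the bands above, but the “same or adjacent bucket” rule and the `can_chat` helper are assumptions inferred from the two published examples, not a confirmed policy; the real system also layers parental consent and the under-13 defaults on top.

```python
# Hypothetical model of age-banded chat eligibility.
# Assumption: two users may chat when their buckets are the same or adjacent,
# which matches the examples above (a ~12-year-old reaches 13-15 but not 16+;
# an ~18-year-old reaches 16-17). Roblox has not published the exact matrix.

AGE_BUCKETS = ["under 9", "9-12", "13-15", "16-17", "18-20", "21+"]

def can_chat(bucket_a: str, bucket_b: str) -> bool:
    """True if the two buckets are identical or directly adjacent."""
    i, j = AGE_BUCKETS.index(bucket_a), AGE_BUCKETS.index(bucket_b)
    return abs(i - j) <= 1

# Examples from the article:
assert can_chat("9-12", "13-15")       # ~12 can reach players 15 and younger
assert not can_chat("9-12", "16-17")   # ...but not those 16 or older
assert can_chat("18-20", "16-17")      # ~18 can reach users 16 and up
```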
There will also be “Trusted Connections”: people who’ve passed an age check and know each other in real life (like parents, siblings or close friends) will be allowed to chat across age groups, as long as the child is at least 13. Roblox says it’s working on additional tools to make direct parent–child communication easier for kids under 13 as well.
If the AI gets your age wrong, users over 13 can upload an ID to correct it, and parents linked through parental controls can fix their child’s age. Roblox says facial images and video used in the age-check process are only used for that purpose and are deleted once an age bucket is assigned.
Roblox is pitching this as a big moment both for itself and the wider industry.
The company says it will be the first major gaming or communication platform to require age checks just to access chat, something that goes well beyond the usual “type in your birthday and promise you’re 13” approach that dominates social media.
“We believe this will become the gold standard for communication safety,” the company said, arguing that “all chat is monitored,” text is filtered based on age, and image and video sharing in chat is banned entirely. It also says it has launched 145 new safety features and tools since January 2025.
Matt Kaufman, Roblox’s Chief Safety Officer, framed the age checks as part of a broader push to make the platform genuinely age-appropriate:
“Our priority is safety and civility. We want to make Roblox a safe, positive, age-appropriate experience for everybody… We see it as a way for our users to have more trust in who they’re talking to.”
Some child-safety advocates cautiously welcome the move but say Roblox is playing catch-up.
Beeban Kidron, founder of the UK’s 5Rights Foundation, called it “time for gaming companies to put their responsibilities to children at the centre of their services,” but pointed out that Roblox has “been slow to address predatory behaviour” and previously allowed adult strangers and older teens easy access to millions of younger users. Still, she said, if the company really follows through, it could set a new benchmark.
Parents’ groups are keeping the pressure on. The nonprofit ParentsTogether Action has organized a “virtual protest” inside Roblox calling for stronger default privacy settings for younger kids and broader safety reforms.
Scanning users’ faces is always going to raise privacy questions, especially when kids are involved.
Roblox stresses that the Facial Age Estimation system is designed to be “privacy-protective”:
- Images and video captured during the age check are processed by Persona and deleted immediately afterward.
- The tool doesn’t try to identify who you are — only which age band you fall into.
- ID checks are only available to users 13 and older, in part because younger kids are covered by stricter privacy rules.
Kaufman says the AI is “typically pretty accurate within one or two years” for users between 5 and 25. Roblox declined to give a specific accuracy rate, but says the system includes “robust fraud checks,” such as:
- Making users move their heads or follow on-screen prompts to prove they’re a live person, not a photo.
- Looking for repeated use of the same face.
- Flagging other “anomalies” that might suggest someone is trying to spoof the system.
On other platforms, some users have tricked age-estimation tools by pointing the camera at video game characters, photos of older people or even deepfakes. Roblox insists it has factored those tactics in — though real-world testing will be the true proof.
Roblox is phasing in the age checks:
- Now: Age checks are available on a voluntary basis globally. Users can opt in early to keep uninterrupted access to chat once the checks become mandatory.
- Early December: Age checks become mandatory for chat in Australia, New Zealand and the Netherlands.
- Early January: The requirement will roll out to the rest of the world wherever chat is available.
After that, Roblox plans to extend age checks beyond chat, including:
- Access to social media links on profiles, communities and game pages;
- Creator collaboration tools like Team Create in Roblox Studio, so only users of similar ages can work together.
The company says the goal is simple: keep minors on-platform, inside an environment where it controls the safety tools and filters, and limit the chance they’ll be pushed toward less-regulated chats and social networks.
Roblox’s new system is arriving under intense legal and public pressure.
Multiple lawsuits describe what lawyers call the “systemic predation of minors” on the platform, accusing Roblox of failing to properly screen users and to design features that protect children. In one recent case, the family of a 13-year-old girl alleges that a predator posed as a child on Roblox, built a fake emotional bond, persuaded her to share her phone number and then coerced her into sending explicit photos and videos of herself.
Roblox argues that no system can ever be perfect, but that it’s doing more than most to keep kids safe, pointing to parental controls, strict media-sharing bans, AI and human moderation, and now age-banded chat.
Kaufman has been blunt that the company wants to set an example:
“It’s not enough just for one platform to hold a high standard for safety. We really hope the rest of the industry follows suit… to raise the protections for kids and teens online everywhere.”
Whether AI face scans and age-banded chat will be enough to convince parents — or judges — that Roblox is truly safe remains to be seen. But one thing is clear: for millions of kids and teens who treat Roblox as their digital playground, the way they talk to other players is about to change in a very real way.