Roblox’s AI-based age verification is a complete mess

Just a few days after it launched, Roblox’s much-hyped, AI-powered age verification system is already a complete mess.
Roblox’s facial scanning system, which estimates people’s ages before they can access the platform’s chat features, rolled out in the United States and other countries around the world last week, after initially launching in a few locations in December. Roblox says the system is meant to let users chat securely with others in their own age group.
But players are already in revolt because they can no longer chat with their friends, developers are demanding that Roblox roll back the update, and, most importantly, experts say that not only does the AI frequently misjudge young players as adults and vice versa, but the system also does little to solve the problem it was designed to address: the flood of predators using the platform to groom young children.
In fact, WIRED found several examples of people advertising age-verified accounts for minors as young as 9 on eBay for as little as $4.
After WIRED reported the listings, eBay spokeswoman Maddy Martinez said the company was removing them for violating the site’s policies.
In an email, Matt Kaufman, Roblox’s chief security officer, told WIRED that a change of this magnitude on a platform with more than 150 million daily users takes time.
“You can’t flip a switch and instantly build something that didn’t exist before,” he said. “To expect the system to be perfect overnight is to ignore the scale of this undertaking.”
Kaufman said the company was pleased with the adoption, adding that “tens of millions of users” have already verified their age, which he said proves that “the vast majority of our community values a safer, more age-appropriate environment.”
The company also responded to some criticism in an update posted Friday, writing: “We are aware of instances where parents are verifying the age of their children, leading to the children being verified as 21 and older. We are working on solutions to resolve this issue and will share more here soon.”
Roblox announced the age verification requirement last July as part of a series of new features designed to make the platform safer. The company has come under intense pressure in recent months after several lawsuits alleged it failed to protect its youngest users and made it easier for predators to groom children.
Attorneys general in Louisiana, Texas and Kentucky also filed lawsuits against the company last year, making similar allegations, while Florida’s attorney general issued criminal subpoenas to assess whether Roblox “helps predators access and harm children.”
Roblox claims that requiring people to verify their age before allowing them to chat will prevent adults from freely interacting with children they don’t know.
While verification is optional, declining it means losing access to the platform’s chat features, one of the main reasons people use Roblox.
To verify their age, people are asked to take a short video using their device’s camera, which is processed by a company called Persona that estimates their age. Alternatively, users can upload a government-issued photo ID if they are 13 or older.
Roblox says all personal information is “deleted immediately after processing.” However, many online users say they are reluctant to carry out age verification due to privacy concerns.
People who have verified their age are only allowed to chat with players in the same or adjacent age groups. For example, people verified as under 9 years old can only chat with players under 13 years old, while players verified as 16 years old can chat with players between 13 and 20 years old.



