Gaming platform Roblox will require its users to verify their age using facial recognition software to chat with other players, a company executive said, with the goal of limiting communication between children and adults on the platform.
The company announced this new requirement amid criticism from government officials around the world, who claim that the Roblox platform does not protect its young users from sexual predators and child exploitation. In the United States, Roblox also faces lawsuits from the attorneys general of Texas, Kentucky and Louisiana, as well as individuals, over concerns about child safety.
Roblox will require users to take a selfie, which it will use to estimate their age, explained Matt Kaufman, the company’s chief safety officer. Each user will then be assigned to an age group, such as 9-12 or 13-15, and will only be able to communicate with other users in the same group.
“Legislators are faced with a lot of companies saying, ‘It’s too complicated. We couldn’t do it,’” Kaufman said, adding that Roblox aims to set an example.
The company will roll out the requirement in Australia, New Zealand and the Netherlands in December, and to users in other countries starting in January. Australia passed a law last year barring minors under 16 from creating social media accounts; Roblox is exempt from that measure.
According to an October report, the company recorded an average of 151.5 million daily active users in the third quarter, making it one of the most popular online destinations for children.
With information from Reuters












































