Business

Roblox to Launch Age-Based Accounts to Enhance Child Safety

Roblox Corporation announced it will roll out new age-based user accounts in early June as part of its ongoing efforts to improve safety for children on its platform. The California-based company said these accounts will customize content, communication, and parental controls according to users’ ages.

Children aged 5 to 8 will be automatically assigned a Roblox Kids account, while users aged 9 to 15 will receive a separate account called Roblox Select. Users under 9 will be barred from the chat function entirely, and those between 9 and 15 will have limited chat capabilities. Once users turn 16, they will gain full access to the gaming platform and most of its features, except for certain content restricted to users 18 and older.

In addition to these age-based accounts, Roblox will offer parents enhanced controls allowing them to block specific games and manage direct chat options for their children. This follows the company’s earlier implementation of an age verification system designed to reduce communication between adults and users under 16.

The new safety initiatives come amid multiple lawsuits filed by families accusing Roblox of failing to adequately safeguard children from harmful content and interactions with older users. Roblox, which serves more than 150 million daily users, functions as a virtual platform where players create and play games together.

Roblox CEO and co-founder Dave Baszucki highlighted the company’s safety measures in a November 2025 interview, expressing confidence that these features would set a new benchmark for safety standards across gaming and social applications.

Why it matters

Roblox’s introduction of age-specific accounts addresses significant concerns around child safety in online environments, where unrestricted communication and inappropriate content have posed longstanding risks. By tailoring the user experience and strengthening parental oversight, Roblox aims to reduce children’s exposure to harmful interactions, a critical goal given the platform’s large and predominantly young user base. The move also reflects increasing regulatory and legal pressure on digital platforms to better protect minors.

Background

Roblox has faced scrutiny and legal challenges over claims that it failed to adequately protect children from predatory behavior and inappropriate material. Earlier in 2026, the company introduced age verification to prevent adult users from contacting children under 16. These efforts are part of a broader trend among online services to implement stricter safety policies and address growing concerns about the risks for young users in digital communities.


About the author

Giorgio Kajaia

Giorgio Kajaia is a writer at Goka World News covering world news, politics, business, climate, and public-interest stories. He focuses on clear, factual, reader-first journalism grounded in credible reporting, official statements, and publicly available source material.
