To say Roblox is a popular game is like saying the sky is blue: obvious, but apt, as the platform boasts more than 151 million daily active users. With a player base that massive, it's a no-brainer that major companies have been diving into making their own Roblox experiences to appeal to the youth. Companies like IKEA, Hyundai, Vans, Spotify, and more have all made efforts to enter the space. Meanwhile, game makers like Poppy Playtime's Mob Entertainment have built tie-ins of sorts to their popular games.
One company you aren't likely to see, though, is Disney. Many assumed this was because of the company's $1.5 billion stake in Epic Games, but that might not be the case. The decision not to partner with Roblox is kind of wild, in a way, as Disney would no doubt find a lot of success there. Yes, its Fortnite partnership has done well, with the recent The Simpsons deal drawing plenty of players to Fortnite as well as traffic to Disney Plus, but much of Disney's library is geared more toward a Roblox-aged audience.
A Variety report suggests the Mouse isn't too keen on the tactics the game maker uses to keep children safe, or the lack thereof. Roblox has a notable problem with predators, and some of it has gone as far as being swept under the rug under the guise of, ironically, user safety. Notably, the company banned a known content creator, Schlep, who specialized in catching predators and turning the information over to the police so a proper investigation could go forward, work that has led to the arrest of multiple predators.
Now, Roblox is adding safety measures, such as age checks before users can access communication features, to prevent children and adults from interacting. These checks are currently limited to select countries but will expand to the US by January 2026. For its part, if you want to believe the company, a rep told Variety that Roblox regularly engages with its partners about safety and has "rigorous safety measures" ranging from AI to a team of moderators working 24/7.
“[We engage] regularly with our partners to provide visibility into platform-wide updates and ongoing investments in safety advancements. We have rigorous safety measures in place from advanced AI models to an expertly trained team of thousands moderating our platform 24/7 for inappropriate content.”