Discord is taking a significant step toward stronger child safety measures with a major global policy update. On February 9th, 2026, the company announced the rollout of "teen-by-default" settings for all users worldwide, set to begin in early March through a phased implementation for both new and existing accounts. This approach automatically applies age-appropriate restrictions unless users prove they are adults, responding to growing international pressure for better online protections for minors.
Under the new system, unverified users will face several limitations:
- No access to age-restricted servers, channels, or commands.
- Inability to speak in stage channels (Discord's livestream-like voice spaces).
- Automatic content filters for graphic or sensitive material.
- Warning prompts on friend requests from unfamiliar users.
- Direct messages from non-friends routed to a separate, filtered inbox.
Normal direct messaging and non-age-restricted servers remain unaffected. For existing age-restricted servers, Discord will "obfuscate" them with a black screen until verification occurs, preventing viewing or interaction. Users also cannot join new age-restricted servers without completing the process.
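The DM-routing rule described above can be sketched in a few lines. This is an illustrative toy, not Discord's implementation; the `User` type, field names, and inbox labels are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class User:
    verified_adult: bool  # hypothetical flag: has this account verified as an adult?

def route_direct_message(recipient: User, sender_is_friend: bool) -> str:
    """Decide which inbox an incoming DM lands in (illustrative sketch).

    Per the policy described above: unverified accounts have DMs from
    non-friends diverted to a separate, filtered inbox; DMs between
    friends, and all DMs to verified adults, go to the main inbox.
    """
    if not recipient.verified_adult and not sender_is_friend:
        return "filtered_inbox"
    return "main_inbox"

# Unverified (teen-default) account receiving a message from a stranger:
print(route_direct_message(User(verified_adult=False), sender_is_friend=False))
# Verified adult receiving the same message:
print(route_direct_message(User(verified_adult=True), sender_is_friend=False))
```

The point of the sketch is that only one combination (unverified recipient, non-friend sender) is diverted; everyday messaging between friends is untouched, which matches Discord's claim that normal use is unaffected.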
To lift these restrictions, adults can verify through one of several methods. The primary option is facial age estimation, in which users submit a short video selfie processed by AI entirely on-device; Discord emphasizes that the data never leaves the device. If the estimate (teen or adult) is inaccurate, users can appeal or switch to submitting a photo of a government-issued ID. These documents are handled by third-party vendors and, in most cases, deleted immediately after confirmation, with no retention of personal details like name, location, or document type.
Discord stresses this is facial age estimation, not biometric scanning or facial recognition. After an October 2025 data breach exposed age verification data held by a previous partner, the company immediately stopped using that vendor and switched to a new one. A new age inference model also runs in the background, analyzing metadata such as gaming activity, Discord usage patterns, behavioral signals (like working hours), and time spent online. If the system has high confidence a user is an adult, no further verification is needed. In some cases, multiple methods may be required for full access.
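A background inference model that only waives verification at high confidence can be sketched as a thresholded score over behavioral signals. This is a toy illustration only: Discord has not published its model, and the signal names, weights, and threshold below are invented for the example.

```python
def infer_adult_confidence(signals: dict[str, float]) -> float:
    """Toy weighted score over behavioral signals, each pre-normalized
    to [0, 1]. The features and weights are hypothetical; Discord's
    actual model and inputs are not public.
    """
    weights = {
        "account_age_years": 0.25,        # older accounts lean adult
        "working_hours_activity": 0.35,   # activity during typical work hours
        "adult_rated_game_activity": 0.40,
    }
    # Clamp each signal into [0, 1] and sum the weighted contributions.
    return sum(
        w * min(max(signals.get(name, 0.0), 0.0), 1.0)
        for name, w in weights.items()
    )

def needs_verification(signals: dict[str, float], threshold: float = 0.9) -> bool:
    """Only users the model scores as adult with high confidence skip
    explicit verification; everyone else keeps teen-default limits
    until they verify."""
    return infer_adult_confidence(signals) < threshold

# A long-standing account with strong adult-leaning signals:
print(needs_verification({
    "account_age_years": 1.0,
    "working_hours_activity": 1.0,
    "adult_rated_game_activity": 0.9,
}))
```

The key design point the sketch captures is the asymmetry: a low or middling score never unlocks anything, it simply leaves the restrictions in place, so the cost of a false negative is an extra verification prompt rather than a safety failure.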
Savannah Badalich, Discord's global head of product policy, explained in an interview with The Verge that the changes primarily affect truly adult or sensitive content. "A majority of people on Discord are not necessarily looking at explicit or graphic content," she said. "We're really talking about things that are truly adult content and age-inappropriate for a teen." Most users should see minimal disruption to everyday chat, gaming, or community use.
This global expansion builds on regional pilots in the UK and Australia launched last year. Those early implementations faced workarounds (such as using Death Stranding's photo mode), which Discord patched within a week. Badalich acknowledged ongoing efforts to counter circumvention attempts through continuous bug fixes.
The policy aligns with broader industry and regulatory trends. Governments worldwide are pushing for age assurance on platforms popular with youth, driven by concerns over exposure to inappropriate content, grooming risks, and mental health impacts. Discord's move follows similar steps by other services and comes amid ongoing scrutiny in the US, including lawsuits against platforms for child safety failures.
Privacy remains a key concern for many users. The October breach heightened fears about sharing IDs, and some adults may opt out of verification to avoid data risks, accepting teen-level restrictions instead. Badalich admitted the rollout could lead to user attrition, stating, "We do expect that there will be some sort of hit there, and we are incorporating that into what our planning looks like. We'll find other ways to bring users back."
Discord has established a Teen Council, open for applications from 13-17-year-olds until May 1st, 2026, to gather youth input on future safety features. The company positions these changes as balancing protection with the platform's core values of community and privacy.
As the phased rollout approaches, users should prepare for potential prompts when accessing certain servers or features. While the majority of Discord's 200 million+ monthly active users engage in non-sensitive activities, the update underscores the platform's evolving approach to safety in an increasingly regulated digital landscape. Whether it curbs risks without alienating core users remains to be seen, but Discord appears committed to iterative improvements based on feedback and compliance needs. Will this change how you use Discord, or whether you keep using it at all?