Discord has announced that, beginning in early March, it will place all existing and new profiles in teen-appropriate mode by default.
Profiles will remain in teen-appropriate mode until users prove they are adults. Changing a profile to “full access” will require verification by Discord’s age inference model, a new system that runs in the background to help determine whether an account belongs to an adult, without always requiring users to verify their age.
Savannah Badalich, Head of Product Policy at Discord, explained the reasoning:
“Rolling out teen-by-default settings globally builds on Discord’s existing safety architecture, giving teens strong protections while allowing verified adults flexibility. We design our products with teen safety principles at the core and will continue working with safety experts, policymakers, and Discord users to support meaningful, long term wellbeing for teens on the platform.”
Platforms have been facing growing regulatory pressure—particularly in the UK, EU, and parts of the US—to introduce stronger age-verification measures. The announcement also comes as concerns about children’s safety on social media continue to surface. In research we published today, parents highlighted issues such as exposure to inappropriate content, unwanted contact, and safeguards that are easy to bypass. Discord was one of the platforms we researched.
The problem in Discord’s case lies in the age-verification methods it has made available, which require either a facial scan or a government-issued ID. Discord says the video selfies used for facial age estimation never leave a user’s device, but facial age estimation is known to be unreliable for some people.
Identity documents submitted to Discord’s vendor partners are also deleted quickly—often immediately after age confirmation, according to Discord. But, as we all know, computers are very bad at “forgetting” things and criminals are very good at finding things that were supposed to be gone.
Besides all that, the effectiveness of this kind of measure remains an issue. Minors often find ways around these systems, using borrowed IDs, VPNs, or false information, so strict verification can create a false sense of safety without fully eliminating risk. In some cases, it may even push activity into less regulated or more opaque spaces.
As someone who isn’t an avid Discord user, I can’t help but wonder why keeping my profile teen-appropriate would be a bad thing. Let us know in the comments what your objections to this scenario would be.
I wouldn’t have to provide identification and what I’d “miss” doesn’t sound terrible at all:
- Mature and graphic images would be permanently blocked.
- Age-restricted channels and servers would be inaccessible.
- DMs from unknown users would be rerouted to a separate inbox.
- Friend requests from unknown users would always trigger a warning pop-up.
- Speaking on server stages would not be allowed.
Given the amount of backlash this news received, I’m probably missing something—and I don’t mind being corrected. So let’s hear it.