Discord Is Treating Every Account Like a Teenager Until You Prove Otherwise
News · 5 min read

1AM Gamer Team

11 February 2026, 13:00

Your Discord account is about to get a lot more restricted, and the only way out is handing over your face or your ID.

On 9 February 2026, Discord announced a global rollout of what it calls "teen-by-default" settings. Starting in early March, every single account on the platform, new or old, gets automatically placed into a teen-appropriate experience. No exceptions. No grandfathering in long-term users. Everyone starts from the same restricted baseline.

This isn't entirely new. Discord tested the approach in the UK and Australia last year. Now it's coming for the rest of the world.

What Teen Mode Actually Restricts

So what does being stuck in teen mode actually mean for your account?

Quite a bit, as it turns out.

  • Sensitive content stays blurred permanently, and you lose the ability to turn that filter off
  • Age-restricted channels, servers, and app commands become inaccessible entirely
  • Direct messages from people you don't know get routed to a separate inbox by default, and only verified adults get to change that setting
  • Friend requests from unfamiliar users come with warning prompts
  • You cannot speak on stage in any server

For most casual users, some of that probably sounds fine. But for anyone in communities that involve mature discussion, creative content, or any kind of age-gated server, the restrictions cut deep.

How You Get Out of It

Two options: a facial age estimation via video selfie, processed on your device (Discord says the video never leaves it), or a government-issued ID submitted to one of Discord's "vendor partners" for review.
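
To make the privacy claim concrete, here's a minimal sketch, not Discord's actual code, with every function and field name invented, of what "processed on your device" implies for the data flow: the selfie frames are consumed by a local model, and only a coarse age-band result is included in anything sent off the device.

```python
# Illustrative sketch only, NOT Discord's implementation. Every function and
# field name here is invented. The point it demonstrates: with on-device age
# estimation, the raw selfie frames stay local and only a coarse age-band
# result ever ends up in the payload sent to the server.
from enum import Enum


class AgeBand(Enum):
    UNDER_18 = "under_18"
    ADULT = "adult"
    UNCERTAIN = "uncertain"


def estimate_age_band_locally(video_frames: list[bytes]) -> AgeBand:
    """Stand-in for an on-device ML model; the frames never leave this function."""
    # A real client would run a facial age-estimation model over the frames.
    # This placeholder exists only to show the data flow.
    return AgeBand.ADULT if video_frames else AgeBand.UNCERTAIN


def build_verification_payload(video_frames: list[bytes]) -> dict:
    """Only the derived band is included in what gets transmitted."""
    band = estimate_age_band_locally(video_frames)
    return {"age_band": band.value}  # no frames, no biometrics in the payload


if __name__ == "__main__":
    frames = [b"\x00" * 1024]  # pretend camera capture
    print(build_verification_payload(frames))  # {'age_band': 'adult'}
```

If the client really behaves this way, there's nothing server-side to leak later, which is the argument Discord is leaning on.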

Discord also says it's running an "age inference model" in the background, which analyses account tenure, device data, and activity patterns to estimate whether an account belongs to an adult. For some users, that passive system alone is enough to clear them without lifting a finger. Others will be prompted to verify.
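
Discord hasn't published how that inference model works, but as a rough illustration, a passive age-inference signal can be thought of as a weighted score over exactly those kinds of account features. The features, weights, and threshold below are entirely made up for the sake of the example.

```python
# Hypothetical illustration only: Discord has not published its age inference
# model. This toy heuristic just shows how signals like account tenure and
# activity patterns could combine into a single "likely an adult" score.
# All features, weights, and the 0.7 threshold are invented.
from dataclasses import dataclass


@dataclass
class AccountSignals:
    account_age_days: int          # how long the account has existed
    payment_method_on_file: bool   # e.g. a card linked to the account
    adult_server_ratio: float      # share of joined servers that are age-gated
    school_hours_activity: float   # 0..1, how much activity falls in school hours


def adult_likelihood(s: AccountSignals) -> float:
    """Return a score in [0, 1]; higher means 'more likely an adult'."""
    score = min(s.account_age_days / 3650, 1.0) * 0.4   # long tenure helps
    score += 0.3 if s.payment_method_on_file else 0.0   # payment data is a strong hint
    score += s.adult_server_ratio * 0.2
    score += (1.0 - s.school_hours_activity) * 0.1      # school-hours activity counts against
    return score


if __name__ == "__main__":
    veteran = AccountSignals(3000, True, 0.4, 0.1)
    newcomer = AccountSignals(30, False, 0.0, 0.8)
    for name, acct in (("veteran", veteran), ("newcomer", newcomer)):
        score = adult_likelihood(acct)
        print(f"{name}: score={score:.2f}, auto-cleared={score >= 0.7}")
```

In this made-up example, the decade-old account with a payment method clears the threshold automatically, while the month-old account that's active during school hours doesn't, and would be the one prompted to verify.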

Savannah Badalich, Discord's Head of Product Policy, said: "Rolling out teen-by-default settings globally builds on Discord's existing safety architecture, giving teens strong protections while allowing verified adults flexibility."

The verification is a one-time process. Once done, your account adjusts to your confirmed age group and you don't get asked again. Your verification status is also private, so no other users see whether you've verified.

The Privacy Problem Nobody Can Ignore

Here's where it gets uncomfortable.

In October 2025, Discord disclosed a breach involving a third-party customer service vendor, 5CA, in which approximately 70,000 government ID photos were exposed. The attackers, linked to the Scattered LAPSUS$ Hunters group, attempted to extort Discord and claimed the real number of compromised records was far higher than Discord's official figure. Some reports suggested the actual scope included over 520,000 age-verification tickets.

Discord has since cut ties with 5CA and says it no longer permanently stores identity documents or video selfies. The company is now working with different vendor partners and says documents are "deleted quickly, in most cases immediately after age confirmation."

That's a reasonable set of assurances on paper. Whether anyone believes them is a different matter.

The community reaction has been pretty stark. Reddit threads have filled with users threatening to delete their accounts, calling the changes "game over for Discord," and pointing out that submitting your government ID to a company with that data breach history requires a significant leap of faith. "I categorically cannot trust tech companies with that kind of personal data," wrote one user, which summed up a feeling shared across most of the threads discussing it.

Discord, speaking to The Verge, acknowledged it expects some users to leave the platform entirely because of this, with Badalich noting the company will "find other ways to bring users back." That's a fairly unusual thing to say out loud.

The Workaround Situation

When the same rollout hit UK users last year, some figured out they could fool Discord's facial age scan using Death Stranding 2's in-game photo mode. Discord says that loophole was closed within a week. Expect more attempts when this goes global in March.

Minors finding ways around verification systems is, honestly, a structural problem with any age-gating approach. VPN usage spiked noticeably in the UK when its own age verification laws came in, and a chunk of Discord's userbase will almost certainly look for alternatives or workarounds rather than handing over biometric data.

The Bigger Picture

Discord isn't acting alone here. Roblox recently made facial verification mandatory for chat access. YouTube launched age-estimation technology in the US last year. The UK's Online Safety Act, Australia's social media age restrictions, and various US state-level laws have pushed platforms toward more aggressive age-checking. Discord's move fits a clear industry-wide direction.

Whether it actually keeps younger users safer is a different question entirely. Critics have noted that strict verification often pushes activity into less regulated spaces rather than eliminating it, and that the real cost falls on adult users who now have to choose between privacy and access.

Discord says it also wants to hear from teenagers directly. The platform is launching its first Teen Council, a group of 10-12 teens aged 13-17 based in the US, with applications open through May 2026. The idea is to bring actual teen perspectives into how Discord shapes its safety features rather than adults making assumptions.

The phased global rollout begins in early March 2026. If you want to avoid being locked into teen mode, the clock is ticking.

Discord · Age Verification · Teen Mode · Privacy · Online Safety · Social Media · Gaming News · Data Breach · Child Safety · Face Scan
