Discord will soon require users globally to verify their age before accessing adult content. Users must confirm their age with a face scan or by uploading an official ID. The platform says it has more than 200 million monthly users.
The company says the new safety measures will place all users into a teen-appropriate environment by default. Discord already enforces age checks in the UK and Australia to comply with online safety laws, and it will expand the system globally from early March.
Savannah Badalich, Discord’s policy head, said teen safety remains a top priority. She added that teen-by-default settings strengthen protections while giving verified adults more flexibility.
Default settings will restrict content and messaging
Discord says the new default rules will limit what users can see and how they communicate. Only verified adults will be able to access age-restricted communities or unblur sensitive material.
Users will also lose the ability to view direct messages from unknown contacts unless they complete age verification.

Drew Benvie, head of social media consultancy Battenhall, said the move toward safer communities is positive, but warned that rolling the system out across millions of communities could prove challenging.
Benvie said Discord could lose users if the verification system fails. He added that stricter safety measures could attract users who prefer platforms built with safety in mind.
Age verification relies on account and activity data
Discord said it will use inference tools to identify users likely to be adults. Badalich said most adults will not need manual verification. The system will analyze account tenure, device data, activity patterns, and aggregated community signals.
She said Discord will not access private messages or message content during the verification process.
Privacy concerns and previous data breaches
Users can upload an ID photo or record a video selfie for AI-based age estimation. Discord said it will not store verification information: face-scan data will not be collected, and ID uploads will be deleted once verification is complete.
Privacy advocates have warned that such methods could threaten users' privacy. Discord faced criticism in October after a breach at a third-party verification company potentially exposed ID photos of roughly 70,000 users.
IPO ambitions and rising regulatory pressure
The announcement follows reports that Discord explored a public share offering in early January. The company also launched a teen advisory council to support its safety initiatives.
The move brings Discord in line with platforms such as Facebook, Instagram, TikTok, and Roblox, which have introduced similar teen safety measures. Benvie said other social networks will closely monitor how users respond to Discord's rollout.
Many of these teen protections came in response to growing pressure from lawmakers. Discord CEO Jason Citron faced intense questioning about child safety at a US Senate hearing in 2024, alongside leaders from Meta, Snap, and TikTok.

