Why Is Discord Requiring Face Scans Now?

TL;DR

Discord is rolling out age verification globally, requiring users to submit a face scan or government-issued ID to access adult content. The platform will blur adult content by default, with only verified adults able to opt out. The move responds to mounting regulatory pressure and Discord's push to make the platform safer for teens.

What Happened

According to The Verge, Discord is launching a global rollout of age verification that will require users to scan their face or upload a government ID to prove they are 18 or older. Users who don't verify will have adult content blurred by default and will be unable to access age-restricted servers or channels.

BBC News reported that the platform, which has 200 million monthly users, will blur adult content by default as part of the changes. Discord is also introducing "teen by default" settings globally, restricting direct messages from strangers and limiting exposure to potentially harmful content for younger users.

Discord says it is partnering with a third-party provider to handle the verification process. The company says face scans are processed to estimate age and are not stored, and that government IDs are deleted after the check is completed.

Why People Are Talking About It

Biometric age verification at this scale is unusual for a social platform. Most major services rely on self-reported birth dates, which are trivially easy to bypass. Discord's decision to require face scans or government IDs sets a new precedent for how platforms might enforce age restrictions going forward.

The privacy implications are significant. Collecting biometric data, even temporarily, creates a target for data breaches and raises questions about how thoroughly the third-party verification provider deletes submitted information. Users in regions with strict data protection laws may face additional questions about how their biometric data is handled.

Discord is used heavily by gaming communities, open-source projects, and developer teams — groups that have historically been vocal about privacy changes on platforms they rely on.

Key Viewpoints

Child safety advocates have pushed for stricter verification. Regulatory pressure worldwide has been mounting on platforms to go beyond self-reported ages, and Discord's move aligns with this direction.

Privacy concerns center on biometric data collection. Even with Discord's stated policy of not retaining face scans or IDs, the act of submitting biometric data to a third party introduces risk. Users have no independent way to verify that data is actually deleted after processing.

The "teen by default" approach shifts the burden. Rather than asking minors to opt into protections, Discord is restricting access by default and requiring adults to prove their age to unlock content. This inverts the typical platform model where everything is accessible unless restricted.
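The default-deny model described above can be sketched in a few lines of Python. This is purely illustrative; the class and function names below are assumptions for the sake of the example, not Discord's actual API or implementation.

```python
# Sketch of a default-deny content gate, modeled on the "teen by default"
# approach. All names here are illustrative, not Discord's API.
from dataclasses import dataclass

@dataclass
class User:
    age_verified: bool = False      # True only after a successful age check
    opted_into_adult: bool = False  # verified adults must still opt out of blurring

def show_adult_content(user: User) -> bool:
    """Adult content stays blurred unless the user is a verified adult who opted in."""
    return user.age_verified and user.opted_into_adult

# Default state: everything restricted.
print(show_adult_content(User()))                   # False
# Verification alone is not enough; the adult must also opt in.
print(show_adult_content(User(age_verified=True)))  # False
print(show_adult_content(User(age_verified=True, opted_into_adult=True)))  # True
```

The key design point is that the permissive branch requires an affirmative action at every step: a user who does nothing gets the restricted experience, which is the inverse of the opt-in safety settings most platforms use.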

What's Next

Users who want continued access to age-restricted content on Discord can prepare by having a government-issued ID ready or ensuring their device camera works for face scanning when the feature rolls out.

Privacy-conscious users can review Discord's data processing agreements and the third-party verification provider's privacy policy once details are published. Tools like browser-based privacy extensions or VPN services won't bypass biometric checks, but understanding what data is collected remains important.

Other platforms are likely watching Discord's rollout closely. If the approach reduces regulatory pressure without significantly hurting user growth, other major platforms may consider similar verification systems. Developers building community tools on Discord's API should watch for any changes related to age-verification status, which could affect bot and integration behavior.
