Discord adopts facial recognition in child safety crackdown

Messaging platform Discord announced Monday it will implement enhanced safety features for teenage users globally, including facial recognition, joining a wave of social media companies rolling out age verification systems.

The rollout, beginning in early March, will make teen-appropriate settings the default for all users, with adults needing to verify their age to loosen protections including content filters and bans on direct messaging, the company said.

The San Francisco-based platform, popular among gamers, will use facial age estimation technology and identity verification through vendor partners to determine users’ ages.

Software running in the background will also help estimate users' ages without always requiring direct verification.

“Nowhere is our safety work more important than when it comes to teen users,” said Savannah Badalich, Discord’s head of product policy.

Discord said the measures come with privacy protections: video selfies used for age estimation never leave users' devices, and submitted identity documents are deleted promptly.