Discord announced on Monday that it will soon require all users globally to confirm their age, via a facial scan or by uploading a form of identification, before they can access adult content.
In a press statement, Discord said the enhanced teen safety features being rolled out globally advance the company’s long-standing goal of making the app safer and more welcoming for users aged 13 and older.
The chat platform said the update will automatically give all new and existing users worldwide a teen-appropriate experience, with revised communication settings, limited access to age-gated areas, and content filtering designed to preserve the privacy and close connections that characterize Discord.
Discord allows people to create and join groups based on their interests. The group messaging service says it has more than 200 million monthly users.
Discord already requires certain users in the UK and Australia to confirm their age in order to comply with online safety regulations. The platform said it will extend age checks to all new and existing users globally starting in early March this year. Some users will therefore need to complete an age-verification process to change certain settings or to access sensitive content across servers, age-restricted channels, app commands, and certain message requests.
The community server app said the new default settings will limit what users can see and how they can communicate. Only users who verify as adults will be able to access age-restricted forums and unblur sensitive content. The platform added that until users pass its age checks, they won’t be able to view direct messages sent by unknown users.
Drew Benvie, head of the social media consultancy Battenhall, said efforts to make social media a safer place for all users are worth supporting.
Discord’s move comes amid growing global concern over how social media platforms expose children and teenagers to harmful content and addictive design features.
Governments, regulators, and courts are increasingly scrutinizing tech companies to determine whether they are doing enough to protect young users, and recent measures reflect mounting pressure to raise online safety standards across the industry.
The European Union on February 6 accused TikTok of breaching the bloc’s digital regulations with “addictive design” features that lead to compulsive use by children.
EU regulators said their two-year probe found that TikTok has not done enough to evaluate how features such as autoplay and infinite scroll may affect users’ physical and emotional health, particularly that of children and “vulnerable adults.”
The European Commission said it believes TikTok should change the “basic design” of its service.
The world’s largest social media corporations, including TikTok, face a series of landmark trials in 2026 seeking to hold them accountable for harms to children who use their services. Opening arguments in one such trial began February 9 in Los Angeles County Superior Court.
The suit alleges that Google’s YouTube and Instagram’s parent company, Meta, deliberately harm and addict children. TikTok and Snap, originally named in the lawsuit, reached settlements for undisclosed amounts.
American lawyer Mark Lanier said in his opening statement that the case is as “easy as ABC,” which he said stands for “addicting the brains of children.” He also called Google and Meta “two of the richest corporations in history” that have “engineered addiction in children’s brains.”
Plaintiffs’ attorney Donald Migliori said in his opening statement that Meta misrepresented the safety of its platforms, designing its algorithms to keep young people online despite knowing that minors are vulnerable to sexual exploitation on social media.