EU Launches Investigation into Snapchat Over Child Safety Failures

The EU has opened a formal investigation into Snapchat over child safety failures, including grooming risks, illegal content, and weak age verification.

Stock image courtesy of Deposit Photos (depositphotos.com)

The European Commission has opened a formal investigation into Snapchat, raising serious concerns about the platform's failure to protect children and teenagers from grooming, criminal recruitment, and exposure to illegal products.

The Commission suspects the platform may allow adults to masquerade as young users in order to contact children, with the aim of sexual exploitation or recruitment for criminal activities.

The investigation was launched on 26 March 2026 and focuses on whether Snapchat is meeting its obligations under the EU's Digital Services Act (DSA), which sets strict standards for online platforms operating within the European Union. Non-compliance could result in fines of up to 6% of Snapchat's global annual turnover.

Henna Virkkunen, Executive Vice-President for Tech Sovereignty, Security and Democracy at the European Commission, said:

"From grooming and exposure to illegal products to account settings that undermine minors' safety, Snapchat appears to have overlooked that the Digital Services Act demands high safety standards for all users. With this investigation, we will closely look into their compliance with our legislation."

Five Areas Under Scrutiny

The investigation covers five specific areas of concern.

On age verification, Snapchat currently relies on users self-declaring that they are over 13, which the Commission considers insufficient to keep children off the platform. It suspects this approach neither prevents under-13s from signing up nor reliably identifies users under 17 who require additional protections.

On default account settings, the Commission suspects that children and teens are automatically recommended to other users through the 'Find Friends' system, and push notifications remain enabled by default. When creating an account, users are not offered adequate guidance on privacy and safety features, nor given an explanation of how to adjust account settings.

Regarding illegal content, the Commission suspects that Snapchat's content moderation tools are not effective in preventing the spread of information pointing users to the sale of illegal products, such as drugs, or age-restricted items, including vapes and alcohol.

The investigation also raises concerns about reporting mechanisms for illegal content, which the Commission suspects are neither easy to access nor user-friendly, and may use so-called dark patterns in their design. Finally, the Commission suspects that users are not properly informed about how to file internal complaints.

Dutch Investigation Absorbed into Broader Probe

The investigation builds on earlier action by the Netherlands Authority for Consumers and Markets (ACM), which launched a probe in September 2025 into the availability of vapes on Snapchat following a request by the Dutch Youth Smoking Prevention Foundation. The Commission has since determined that the risks are systemic in nature, and the ACM investigation will now form part of the broader EU-level proceedings.

Snapchat's Response

A Snapchat spokesperson said the safety and wellbeing of all Snapchatters is a top priority and that the company has worked for years to raise the bar on safety, continuously reviewing, strengthening, and investing in its safeguards. The company also stated it has acted proactively and transparently to meet DSA requirements and will fully cooperate with the investigation.

The Commission will now gather further evidence, including sending formal requests for information to Snapchat and conducting interviews and inspections. It is also empowered to adopt interim measures and issue a non-compliance decision if necessary.

How Parents Can Help Keep Their Children Safe on Snapchat

While the investigation is ongoing, parents in Ireland can take practical steps right now to protect younger users on the platform.

Check your child's age and account type. Snapchat requires users to be at least 13, and accounts for users aged 13 to 17 should be set up as "teen" accounts with additional protections. However, as the EU investigation highlights, these safeguards rely on accurate age information being provided at sign-up.

Review privacy settings together. Go into Settings, then Privacy Controls, and make sure "Who Can Contact Me" and "Who Can View My Story" are set to "Friends" only, rather than "Everyone." Turn off the Snap Map feature, or set it to Ghost Mode, so your child's location is not visible to others.

Disable Quick Add and Find Friends. These features can make your child visible to strangers and can be turned off under Settings, reducing the risk of unknown adults making contact.

Turn off push notifications for new friend requests. This reduces pressure on younger users to respond to contact from people they do not know.

Talk openly about what to do if something feels wrong. Encourage your child to screenshot and report any uncomfortable contact using the in-app report function, and to tell a trusted adult immediately. The report option is accessible via the three-dot menu on any profile or message.

Enable Family Centre. Snapchat's Family Centre tool allows parents to link their account to their child's and monitor their activity, including setting content controls, without being able to read private messages. It can be accessed under Settings on a parent's Snapchat account.

Parents with specific concerns can also contact Webwise, Ireland's internet safety awareness initiative at webwise.ie, or Hotline.ie for reporting harmful online content.
