

Commission scrutinises safeguards for minors on Snapchat, YouTube, Apple App Store and Google Play under the Digital Services Act


The European Commission has initiated the first investigative actions following the Guidelines on Protection of Minors under the Digital Services Act (DSA).

The Commission is requesting Snapchat, YouTube, Apple and Google to provide information on their age verification systems, as well as on how they prevent minors from accessing illegal products, including drugs or vapes, or harmful material, such as content promoting eating disorders.

Tech Sovereignty Executive Vice President Henna Virkkunen said: “We will do what it takes to ensure the physical and mental well-being of children and teens online. It starts with online platforms. Platforms have the obligation to ensure minors are safe on their services – be it through measures included in the guidelines on protection of minors, or equally efficient measures of their own choosing. Today, alongside national authorities in the Member States, we are assessing whether the measures taken so far by the platforms are indeed protecting children.”

The Commission is requesting Snapchat to provide information about how it prevents children under 13 years of age from accessing its services, which is prohibited by the platform's own terms of service. The Commission is also requesting Snapchat to provide information on the features it has in place to prevent the sale of illegal goods, such as vapes or drugs, to children.

With regard to YouTube, in addition to information on its age assurance system, the Commission is seeking more details on its recommender system, following reports of harmful content being disseminated to minors.

For the Apple App Store and Google Play, the Commission is seeking information on how they manage the risk of users, including minors, being able to download illegal or otherwise harmful apps, including gambling apps and tools to create non-consensual sexualised content, the so-called ‘nudify apps’. The Commission is also seeking to understand how the two app stores apply apps’ age ratings.

To ensure effective enforcement of the Guidelines on Protection of Minors across all platforms, large and small, the Commission is taking further actions with the national authorities to identify the platforms posing the greatest risk to children.

