STRASBOURG (FRANCE) (ITALPRESS) – The European Parliament has approved, with 483 votes in favour, 92 against and 86 abstentions, a non-binding resolution expressing strong concern for the physical and mental health of children online and calling for greater protection against manipulative strategies that can foster addiction and undermine minors' ability to focus and to engage healthily with digital content. To help parents manage their children's digital presence and ensure age-appropriate online interactions, Parliament proposes setting 16 as the minimum age in the EU for access to social media, video-sharing platforms and AI-based virtual companions, while allowing children aged 13 to 16 to access them with parental permission.
MEPs welcome the Commission's efforts to develop an EU age-verification app and the European Digital Identity Wallet (eID). They point out, however, that age-assurance systems must be accurate and must respect minors' privacy. Nor do such systems relieve platforms of their responsibility to ensure that their products are safe and age-appropriate. To encourage better compliance with the Digital Services Act and other rules, MEPs propose that senior managers be held personally liable in cases of serious and persistent failure to comply with the provisions, especially those concerning child protection and age verification.
Parliament also calls for banning the most harmful addictive practices and disabling by default other features that foster addiction in minors, including infinite scrolling, autoplay, pull-to-refresh, reward loops and harmful gamification; blocking websites that do not comply with EU rules; countering persuasive technologies, such as targeted advertising, influencer marketing, addictive design and dark patterns, within the upcoming digital fairness legislation; banning engagement-based recommender systems for minors; applying the rules of the Digital Services Act to online video game platforms and banning loot boxes and other randomised gaming features, such as in-app currencies, fortune wheels and pay-to-progress mechanics (those that push the user to spend money to advance faster); protecting minors from commercial exploitation, including by prohibiting platforms from offering financial incentives to child influencers; and urgently addressing the ethical and legal challenges posed by generative AI tools, such as deepfakes, companion chatbots, AI agents and AI-based "nudify" apps (which generate manipulated images of people without their consent).
"I am proud of this Parliament and of the fact that we know how to unite to protect minors online," said rapporteur Christel Schaldemose (S&D) during the plenary debate. "Together with rigorous and consistent enforcement of the Digital Services Act (DSA), these measures will significantly raise the level of protection for children. We are finally drawing a line. We are telling platforms clearly: your services are not designed for minors, and this experiment ends here."
97% of young people use the Internet every day, and 78% of children aged 13 to 17 check their devices at least once an hour. At the same time, one in four minors makes "problematic" or "dysfunctional" use of the smartphone, that is, shows behavioural patterns associated with addiction. According to the 2025 Eurobarometer, more than 90% of Europeans consider it urgent to act to protect children online, particularly as regards the negative impact of social media on their mental health (93%), online bullying (92%) and the need for effective tools to restrict content unsuitable for their age (92%). Member States are beginning to respond with measures such as age limits and verification systems.
