A scientist’s opinion: interview with Francesca Pisanu on recent social media bans for children

“A social media ban alone cannot address the full spectrum of risks children face online,” says international children’s rights expert Francesca Pisanu. She is an EU Advocacy Officer at Eurochild, where she informs and influences EU policy and legislation to advance children’s rights across Europe.


How do you balance the legitimate need to protect children from harm with their equally legitimate right to participate in digital public life?

Francesca Pisanu: Society accepts age limits where activities pose meaningful risks to children’s health and development. The question is not whether every child is harmed in the same way, but whether the overall risk profile of social media, combined with these platforms’ scale and design, justifies protective safeguards.

Child participation is a right under the UN Convention on the Rights of the Child, and governments are responsible for upholding it. In many cases, social media has filled gaps left by insufficient investment in safe community spaces, youth services and mental health support. States must stop outsourcing these responsibilities to social media platforms. Instead, they should ensure that children have safe offline and online environments that support development, well-being, participation, and protection simultaneously.


Could outright bans create a false sense of security — and potentially push young people toward less regulated parts of the internet?

Francesca Pisanu: This is a real risk, and experiences such as Australia’s approach will need careful monitoring over time. Restrictions alone cannot eliminate exposure to risk. If bans are introduced without broader safeguards, they may create the impression that risks have been solved while harmful design practices remain unchanged.

There is no excuse for platforms to maintain high-risk environments for “unregistered” users. Platforms must be safe by default, with high privacy, safety, and content protections applied universally, not only to logged-in profiles.

It is reasonable to assume that children will circumvent restrictions, as they do with other age-regulated products. Any restriction should be accompanied by alternatives: safe digital environments, accessible youth services, and meaningful opportunities for participation. Otherwise, children will simply migrate to platforms with weaker protections.


How meaningful is it to design child protection policy while leaving children and adolescents themselves out of the conversation?

Francesca Pisanu: Children have the right to be heard in decisions affecting them, and evidence consistently shows that they hold nuanced views and want stronger protection.

Consultations conducted by Eurochild across Europe demonstrate that children understand both benefits and risks of social media. They ask for things like private-by-default accounts, limits on how personal information can be shared, and algorithmic nudges that remind them when they have been scrolling too long. Their perspectives help policymakers move beyond simplified debates and design measures that are realistic, trusted, and effective.


Could interventions aimed at platforms – such as algorithmic accountability, stricter moderation, or limits on engagement-maximising design – be more effective and rights-compatible?

Francesca Pisanu: These should not be seen as alternatives to age restrictions but as essential structural measures that must accompany any policy response. The current business model treats children’s identities, emotions, and behaviours as monetisable assets.

The current debate is often framed as “ban or regulate platforms”, but this is a false opposition. Whether bans are introduced or not, platforms must comply with children’s rights. Core protections, including safe-by-default design and limits on addictive features, must apply to all users universally, independent of age thresholds.


What role should schools, families, and public health frameworks play in building digital literacy?

Francesca Pisanu: Although essential, they cannot replace regulation, and responsibility cannot be placed on children or parents. Digital literacy helps children navigate risks and exercise agency, with schools, families, and youth workers playing a crucial role.

However, existing EU frameworks, including the Digital Services Act, the General Data Protection Regulation (GDPR) and the AI Act, must be fully enforced, and upcoming legislation should address harmful design and commercial manipulation.

A social media ban alone cannot address the full spectrum of risks children face online. Effective protection requires a whole-of-society approach combining regulation, education, public investment, and platform accountability.
