APRIL 25 — There is a comforting idea in policy circles: we do not need to restrict children’s access to social media, because with enough regulation, education, and corporate responsibility, these platforms can be made safe from within.

It is reassuring, but it is also dangerously incomplete.

The recent joint letter by NGOs opposing a social media ban for under-16s frames the issue as a choice between rights and restrictions, between freedom and overreach. But that framing collapses the moment we ask a more basic question: when a system is already exposing children to harm at scale, what do you do first?

You reduce exposure, and everything else comes after.

1) Social media is not neutral

Today’s social media platforms are not neutral spaces.
They are engineered environments designed to capture attention and shape behaviour. For children, the relationship is not participation but asymmetry.

And the harms are no longer theoretical. In Malaysia alone, the Malaysian Communications and Multimedia Commission (MCMC) identified 957 cases of offensive content involving children on social media in just the first 11 months of 2025.
These include exploitation material serious enough to trigger nationwide police operations and arrests.

Unicef has warned that non-consensual sexual content, including sextortion and abuse involving minors, is now the most severe online threat facing Malaysian children. Many victims never report what happened.

Globally, the situation is deteriorating further.
Law enforcement agencies have raised alarm over a surge in online sextortion cases, with hundreds of thousands of reports annually and devastating mental health consequences, including suicide in some cases.

2) Failing safeguards

Even where safeguards exist, they are failing. A 2025 study found that accounts set up to mimic 13-year-olds were exposed to harmful content faster and more frequently than adult accounts, sometimes within minutes of use.

This is the reality the joint letter does not confront directly. Instead, it proposes a familiar solution: regulate platforms, improve literacy, consult stakeholders, conduct impact assessments.

All necessary, but all too slow.
Platform reform is not immediate. It requires legislation, enforcement, technical redesign, and continuous oversight of some of the most powerful corporations in the world. Even in advanced economies, regulators are still struggling to make platforms meet basic safety standards.

3) What rights?

Children do not live in policy timelines but in real time.
Telling them to remain inside a harmful system while adults negotiate better rules is not neutrality. It is a choice to accept continued exposure.

This is where the argument about “rights” needs to be clearer. Children do have rights to expression and access to information. But they also have a right to protection, safety, and healthy development.
These rights must be balanced, not selectively invoked.

Every society already draws age-based boundaries. We regulate when children can drive, vote, work, or consume alcohol, not to deny rights but to recognise vulnerability. Social media should not be treated as an exception.

4) Harm-reducing policy

Critics argue that bans are blunt, imperfect, and may be bypassed.
That is true. But no policy is perfect. The question is whether it reduces harm.

Even if some teenagers circumvent restrictions, reducing direct exposure for the majority still matters.
Even if age verification raises privacy concerns, those are governance and design challenges, not reasons for inaction.

What does not make sense is rejecting protection because it is not flawless.

5) Protect children, reject corporate overreach

There is also a deeper imbalance in this debate. We are more afraid of state overreach than of corporate overreach. We scrutinise governments for setting boundaries, but we accept that private platforms can shape children’s attention, behaviour, and emotional lives at scale, with limited accountability.

That is not neutrality, but surrender.
A democratically enacted, reviewable policy to protect minors is not authoritarian; it is governance.

And it does not have to stand alone. Regulate platforms, strengthen enforcement, improve digital literacy, and support parents. But at the same time, reduce children’s exposure to systems that are already known to be unsafe.

Even Unicef, while cautioning against relying solely on bans, acknowledges that children are already encountering bullying, grooming, and sexual exploitation online, and that the current system is failing them.

That is the point. This is not a choice between reform and restriction; it is a question of priority. Fixing platforms is necessary, but protecting children cannot wait.
