A letter to the Honourable Marc Miller from the Safer Online Spaces Coalition
Hon. Marc Miller, PC, MP
Minister of Canadian Identity and Culture
House of Commons
Ottawa
26 January 2026
Dear Minister Miller:
We are writing because Canada is at risk of drifting once again on one of the most urgent and pervasive issues facing Canadian families: the safety of children and youth online.
Legislation to ensure safer online spaces for children and youth has been promised for years, yet Canadians are still waiting. Prime Minister Carney has spoken clearly about protecting children as a collective responsibility. Parents, educators, health care professionals, and young people themselves see the damage being done every day by platforms that are designed to maximize attention, data extraction, and profit at the expense of well-being. We are encouraged by recent media reports that the government plans to introduce legislation.
The urgency is real. Canada now ranks near the bottom of high-income countries on youth well-being. A significant share of Canadian adolescents report negative mental-health impacts linked to online activity, including compulsive use, diminished self-worth, cybervictimization, and self-harm behaviours. These trends are reflected in rising hospital admissions. These outcomes are not accidental; they are the predictable result of systems designed and allowed to operate without guardrails.
At its core, this is a market failure. Digital platforms have little financial reason to redesign products that are addictive, data-harvesting, and optimized for engagement – even when those designs expose children to violence, exploitation, harmful content, scams, and manipulation. The costs of these choices are borne by families, schools, hospitals, and communities, not by shareholders. Public oversight must correct this imbalance.
Canada therefore needs a focused, credible, made-in-Canada Online Safety Act that is built around public oversight and accountability and includes special protections for children and youth.
Such a framework should start with a clear duty on platforms to act responsibly: to assess foreseeable risks of serious harm, especially to children and youth, and to take reasonable steps to prevent them. This must be paired with independent public oversight: an arm's-length regulator with real enforcement powers and technical expertise. Voluntary measures and self-regulation have repeatedly failed. Instead, platforms and services delay, evade, and selectively comply, treating harm as a business expense rather than a legal responsibility.
A regulator also addresses a practical problem that Parliament cannot: the speed of technological evolution. Algorithms, recommender systems, advertising models, and AI-driven tools constantly change. An independent body can update standards, mandate audits, and respond to emerging risks without waiting years for legislative amendments.
Protecting children and youth must be central to this effort. Age- and developmentally appropriate privacy and design standards, limits on data collection, meaningful controls over algorithmic feeds, and restrictions on advertising targeted at minors are not radical ideas; they are proportional safeguards in an environment where children are uniquely vulnerable. AI-driven services, including chatbots, must also fall clearly within scope.
Additionally, victims of child sexual abuse material suffer wide-ranging harms throughout their lives, frequently exacerbated by the technology industry's failure to act or to respond to complaints. For this reason, the new framework must include robust requirements for the removal of child sexual abuse material.
Just this month, the American Academy of Pediatrics, which represents nearly 70,000 pediatricians, became the latest expert body to demand accountability and public oversight of tech companies. Experience has shown that companies often act only after tragedy or litigation forces their hand. Canadian families should not have to wait for that moment.
Legally guaranteed, secure access to platform data is also essential for public-interest researchers. Such access would allow Canada to evaluate risks, measure outcomes, and adjust policy based on evidence rather than industry assurances.
We have also learned from recent legislative experience. The previous bill, C-63 (the Online Harms Act), was met with broad support where it focused on a duty of care, transparency, and systemic accountability. Where it faltered was in overreach, insufficient consultation on new provisions, and fragmentation across departments. A more focused bill can avoid those mistakes.
Finally, while Canada must navigate a complex geopolitical and trade environment, international experience shows that firm, well-designed accountability regimes can withstand pressure. The European Union and the United Kingdom have held their course, and Canada can draw lessons from the Digital Services Act and the Online Safety Act, respectively. Retreating pre-emptively would only signal that sustained lobbying by powerful foreign corporations can override the public interest.
Children and youth are already living with the consequences of inaction on online harms. Canadians have expressed strong, durable support for meaningful rules that protect children while preserving an open internet. The question is whether Canada chooses to shape its digital environment deliberately, or allows it to be shaped by corporate self-interest.
We urge you to make the introduction and passage of a focused Online Safety Act a top priority in the 45th Parliament.
Sincerely,
The Safer Online Spaces Coalition
Amanda Todd Legacy Society
Canadian Centre for Child Protection
Children’s Healthcare Canada
Canadian Medical Association
Canadian Paediatric Society
Inspiring Healthy Futures
The Hospital for Sick Children (SickKids)
cc: Hon. Anna Gainey, PC, MP
Minister of State for Children and Youth
Hon. Marjorie Michel, PC, MP
Minister of Health