Youth social media: Why proposed Ontario and federal legislation won’t fix harms related to data exploitation

Ontario school board lawsuits against social media giants including Meta, Snapchat and TikTok are seeking damages – money paid as a remedy – for the disruption of the educational system.

Author

  • Teresa Scassa

    Canada Research Chair in Information Law and Policy, L’Université d’Ottawa/University of Ottawa

A growing volume of evidence indicates that young people have become addicted to social media. It suggests social media platforms are designed to foster such addiction, that online activities contribute to behaviour such as bullying and harassment, and that excessive use of social networks can harm students’ mental health, even influencing suicide.

Ontario school boards, speaking as a coalition called Schools for Social Media Change, argue that “social media products, designed for compulsive use, have rewired the way children think, behave and learn” and that “schools are unfairly bearing the brunt of the learning and mental health epidemic caused by the alleged negligent conduct of social media companies.” The lawsuits come as 95 per cent of Ontario schools report needing more resources to support student mental health.

At the core of the litigation are concerns about the impact on young people of social media companies’ practices. But neither lawsuit victories nor existing or proposed Ontario or federal privacy or AI legislation will prevent problems related to the rampant collection and processing of human-derived data.

Boards in U.S. and Canada

Four Ontario school boards announced that they were suing social media giants including Meta, Snapchat and TikTok in March 2024. Five other school boards and two private schools also filed suit shortly afterwards.

These actions follow a flood of lawsuits launched in the U.S. by over 200 school districts against social media companies.

The U.S. lawsuits link social media engagement with a decline in students’ mental health. One U.S. statement of claim describes the situation as “perhaps the most serious mental health crisis [the nation’s children, adolescents and teenagers] have ever faced.”

The Canadian lawsuits make similar claims. For example, one alleges that the defendant social media companies “employ exploitative business practices and have negligently designed unsafe and/or addictive products” that they market and promote to students.

Regulating digital information

The litigation on both sides of the border is novel. In Canada it has also been somewhat controversial. When asked about the Ontario lawsuits, Premier Doug Ford called them “nonsense,” suggesting that the school boards should focus on educating students.

Shortly after the launch of these lawsuits, the Ontario government introduced Bill 194. This bill proposes, among other things, new regulation of the digital information of children and youth in schools and in children’s aid societies.

Nonetheless, what is proposed in the bill won’t address what these lawsuits attempt to tackle: the impact on education of how social media companies engage with children and youth – including in time spent out of school. Ontario’s Information and Privacy Commissioner, in her submission on Bill 194, recommends largely replacing what the government proposes with improvements to existing privacy law.

Similarly, the province’s school cell phone ban tackles only one dimension of a much bigger problem.

Impact of company practices on youth

The Canadian lawsuits against social media giants are not framed as privacy claims. Indeed, school board-led litigation could not raise such claims, since any privacy rights are those of the children and youth who engage with social media and not those of the school boards.

The damage alleged by the school boards is the disruption of the operation of schools, but at the core of the litigation are concerns about the impact on young people of social media companies’ practices.

While privacy claims are not part of the school board litigation, they are not far from the surface. Social media user data fuels these companies’ business models, incentivizing them to engage in practices that draw users in, and that drive continued engagement and social dependence. Although all users are affected by these practices, evidence suggests that children and youth are particularly susceptible to becoming addicted.

Data gathered through engagement on these platforms also fuels targeted advertising, which can foster insecurities around body image and other self-confidence-affecting concerns of young people.

Privacy laws out of step?

The roots of the harm alleged by the boards are therefore in personal data collection and processing. However, the consequences far transcend the individual privacy harms recognized in privacy laws or privacy torts. This suggests that our privacy laws are out of step with contemporary data practices.

It would be tempting to take comfort from the fact that Bill C-27, currently before Parliament’s Standing Committee on Industry and Technology, proposes long-awaited reforms to Canada’s private sector privacy law in the form of a new Consumer Privacy Protection Act.

It also contains a new law that would regulate the development and use of artificial intelligence (AI) technologies. Unfortunately, even if the bill is passed into law before the coming election (which seems increasingly unlikely), these reforms will do little to address the broader systemic harms impacting our society that come from the exploitation of personal data.

Legislation falling short

The proposed Consumer Privacy Protection Act takes only small steps to recognize the sensitivity of children’s information. It falls far short of the United Kingdom’s age-appropriate design code of practice for online services.

Further, although the proposed Artificial Intelligence and Data Act would set parameters for the design, development and deployment of AI systems, it defines harms in individual terms – and doesn’t acknowledge group and community harms from algorithm-driven practices, such as the disruption of the educational system.

The European Union’s AI Act is not so limited. In its first recital, it describes its broad goals to ensure “a high level of protection of health, safety, fundamental rights … including democracy, the rule of law and environmental protection.”

What the school boards are advancing in their litigation are novel claims for redressing what they and a growing body of experts say are harms rooted in the collection and processing of human-derived data. These harms go beyond the individuals whose data is harvested and impact society more broadly.

As this litigation unfolds, we should be asking: When new bills to regulate AI or privacy are introduced, how will they equip us to address the group and social harms of personal data exploitation?

The Conversation