Meta: Will the Social Media Giant Be Forced to Protect Its Young Users?
Consumers and states are bringing lawsuits against social media conglomerate Meta, seeking billions of dollars in damages and substantial changes to the company’s allegedly addictive technologies. (Naomi Nix, The Washington Post). Meta runs Instagram, Facebook, WhatsApp, and other popular technology platforms. (Meta.com). Despite the company’s commitment to “keeping people safe and making a positive impact,” many users and state governments believe Meta leverages addictive methods to encourage teen engagement on Instagram and Facebook. (Meta.com; Jonathan Stempel et al., Reuters). States and individuals are pushing for Meta to take accountability for its addictive algorithms and to make changes protecting the mental health of its minor Instagram and Facebook users, changes that would likely affect company policies and Meta’s stakeholders. (Meta.com; Jonathan Stempel et al., Reuters). This post considers the social media policies giving rise to widespread claims against Meta, as well as the potential effects of such claims.
In New York, a thirteen-year-old girl is suing Meta for $5 billion in damages, alleging that Meta’s addictive features caused her mental health and school performance to decline. (Naomi Nix, The Washington Post). The teen claims Instagram’s addictive features, including the visible “like” count, have led to depression, anxiety, a worsened body image, and diminishing grades. Id. The suit pushes for more robust parental controls that would allow parents to limit the addictive aspects of Instagram to which their children are exposed. Id. The plaintiff hopes to certify a class action and distribute any damages among other affected users. Id.
In California, Alice Prakash, a twenty-seven-year-old long-time user of Meta’s platforms, brought claims for strict liability and common law negligence against the company. Prakash v. Meta Platforms, Inc., 2024 Cal. Super. LEXIS 1627, *15-16. Prakash’s two claims for strict liability are based on (1) Meta’s “defective design” of its platforms, which creates an environment “not reasonably safe” for its users, especially minors, and (2) Meta’s failure to warn minors and their parents of potential “mental, physical, and emotional harms arising from the foreseeable use of their social media products.” Id. Prakash’s common law negligence claim alleges Meta knew or should have known about its social media platforms’ harms to minors and failed to mitigate those harms. Id. at *16. Prakash references a former Facebook employee who provided “internal documents showing that Meta was aware that its platforms and products cause significant harm to its users, especially [] children.” Id. at *16-17. While the full effects of Meta’s addictive technologies are not yet known, Meta continues to develop functions that keep users’ attention. Id. at *17.
Prakash points to several features that contribute to Meta’s addictive qualities and explain why teens are highly susceptible to social media addiction. Id. at *24-28. Instagram and Facebook offer an endless feed of posts from accounts the user follows, and Instagram has a “discover” page with infinite posts from accounts the user does not follow. Id. at *26. One of Instagram’s most addictive tactics is intermittent variable rewards (“IVR”), delivered through features like comments, shares, and a “like” button. Id. at *26-27. IVR fosters an addictive online environment “by spacing out dopamine triggering stimuli with dopamine gaps—a method that allows for anticipation and craving to develop and strengthens the addiction with each payout.” Id. at *27-28. Minors are at an especially high risk of developing a social media addiction, driven by Instagram’s loose age-verification standards and teens’ lack of impulse control and self-regulation. Id. at *32-33. Even Meta’s own analysis of Instagram’s effects on teens revealed a rise in mental health issues due to app use. Id. Prakash is just one of many individuals to bring such a case against Meta, and states have joined in to hold the company accountable.
In October of 2023, thirty-three state attorneys general filed a joint complaint against Meta and its Instagram platform, claiming the company contributes to teen social media addiction and mental health struggles. (Jonathan Stempel et al., Reuters). Specifically, they allege Meta deceived its users about the safety of its technologies and knowingly made its platforms addictive to young people, in violation of state and federal unfair and deceptive trade practices statutes. (Jonathan Stempel et al., Reuters; Isaiah Poritz et al., Bloomberg Law). New York Attorney General Letitia James stated, “‘Meta has profited from children’s pain by intentionally designing its platforms with manipulative features that make children addicted to their platforms while lowering their self-esteem.’” (Office of the New York State Attorney General). Additionally, the states claim Meta illegally collected data from children under thirteen, in violation of the Children’s Online Privacy Protection Act (“COPPA”). Id. The states conducted research on Meta’s mental health effects, concluding that children’s use of the platform is related to “depression, anxiety, insomnia, interference with education and daily life, and many other negative outcomes.” Id. Meta Chief Executive Mark Zuckerberg denied that the company puts profit above the safety of its users, citing Facebook’s new “Meaningful Social Interactions” in its News Feed and Meta’s work to show only positive advertisements. (Jonathan Stempel et al., Reuters).
Approximately nine states, including Massachusetts, filed their own claims against Meta. Massachusetts alleged Meta “violated G.L. c. 93A § 2, and created a public nuisance, by designing and using addictive design features on Instagram to exploit children’s psychological vulnerabilities, and falsely represented to the public that its features were not addictive and that Meta prioritized youth health and safety.” Commonwealth of Massachusetts v. Meta Platforms, Inc., Memorandum and Order on Motion to Dismiss, Docket No. 2384CV02397, 1 (Mass. Super. Ct. Oct. 24, 2023). Massachusetts points to “Platform Tools” that Meta uses to undermine minors’ ability to manage their time spent on its platforms, including incessant notifications, IVR, infinite scroll and autoplay, and ephemeral features. Id. at 2-4. The complaint alleges Meta knows many of its users are under the minimum age of thirteen but fails to remove them from the app in order to support its user growth. Id. at 4. It also emphasizes that Instagram has consistently declined to implement features that would address teen mental health concerns. Id. In response, Meta moved to dismiss the claims, but the court found Meta’s arguments unpersuasive. Id. at 6, 18, 19, 20-28.
The results of these lawsuits will likely affect Meta’s policies and even nationwide legislation. (Naomi Nix, The Washington Post). Meta has already felt immediate negative effects from these lawsuits: its shares dropped 0.6% on the Nasdaq after the states sued the company, and it likely wants to avoid further economic harm to its stakeholders. (Jonathan Stempel et al., Reuters). The company has already implemented new policies and features meant to mitigate some of the alleged harms to children, such as messages urging teens to stop scrolling and more extensive parental controls. (Naomi Nix, The Washington Post). Still, these measures are unlikely to quell the widespread outrage against the company. Id. In July 2024, “the Senate passed a pair of bills to expand online privacy and safety protections for children, including by forcing digital platforms to take ‘reasonable’ steps to protect children from drug addiction, sexual exploitation and bullying.” Id. This is the first step in parents’ and youth activists’ movement to make social media safer for kids. (Cristiano Lima-Strong, The Washington Post). Support for the bills in the House is less certain, and some lawmakers have expressed concerns that the bills could limit free speech. Id. Other critics of the bills, including tech companies and the American Civil Liberties Union, believe the bills could deprive minors of important content on topics like abortion and vaccines. (Moira Warburton et al., Reuters). More safeguards protecting youth mental health will likely arise as these lawsuits continue to move through the judicial process. (Naomi Nix, The Washington Post).
As young users, parents, and states begin the battle against Meta, it will be crucial to watch whether the company implements features to protect its users. While Meta might revise its policies voluntarily, it is doubtful that the company would sacrifice the income generated by its young users. Id. It will likely take an unfavorable legal outcome for Meta to make real changes. Whether or not the states prevail on their claims, the alleged harmful effects of Meta’s platforms are now visible to the public. Public scrutiny will inform parents of the potential danger to their children and could weaken the push for children to engage with social media. Both states and individuals want Meta to acknowledge its allegedly addictive features and their mental health implications. (Naomi Nix, The Washington Post). However, it will not be easy to fight a tech giant like Meta. Meta consistently claims that its users’ health and safety are top priorities; the judicial process, however, might conclude otherwise.