The provided excerpts reveal how Meta's social media platforms, particularly Instagram, have been linked to serious physical and mental health harms, especially among young users, including higher rates of depression, anxiety, sleep problems, and suicidal thoughts. Meta's engagement-maximizing design choices have worsened the situation, creating a concerning positive feedback loop. Young users are especially vulnerable to these effects, and the rise of social media during their formative years has long-term consequences for their well-being. The excerpts also highlight Meta's knowledge of these issues and its questionable actions, or lack thereof, to address them.
The United States v. Meta Platforms court filing of October 24, 2023 is part of HackerNoon's Legal PDF Series. This is part 27 of 100.
D. Meta’s Platform features cause young users significant physical and mental harm, of which Meta is keenly aware.
508. Increased use of social media platforms, including those operated by Meta, results in physical and mental health harms, particularly for young users, who experience higher rates of major depressive episodes, anxiety, sleep disturbances, suicide, and other mental health concerns.[14]
509. Social media use among young users began a dramatic increase in the United States in 2012, when Meta acquired Instagram to expand its youth appeal. Instagram grew from 50 million users in 2012 to over 500 million by 2016, with a significant share of its user base composed of young users.
510. As Meta focused on designing features to increase time spent on its Platforms, heavy consumers of social media began to exhibit worse mental health outcomes than light consumers.[15]
511. Hours spent on social media and the internet have become more strongly associated with poor psychological health (such as self-harm behaviors, depressive symptoms, low life satisfaction, and low self-esteem) than hours spent on electronic gaming and watching TV.[16] Making matters worse, heavier social media use has led to poorer sleep patterns (e.g., later sleep and wake times on school days and trouble falling back asleep after nighttime awakening) and poorer sleep quality.[17]
512. Such sleep interference in turn causes or exacerbates symptoms of depression and anxiety.[18] Lack of sleep also has negative physical effects, including interfering with the antibody response to vaccines.[19]
513. These physical and mental harms are particularly acute for young users, who are less able to self-regulate the time they spend on social media platforms. When companies like Meta design platforms to exploit young users’ psychological vulnerabilities, the harms are compounded. Researchers call this a positive feedback loop: those who use social media habitually are less able to regulate their behavior; that habitual use, in turn, leads to more social media use; and that additional use restarts the cycle, making it even harder to regulate the problematic behavior.[20]
514. Young users are at a formative stage of development where they are both especially vulnerable to excessive social media use and especially sensitive to its ensuing impacts. Research indicates that going through puberty while being a heavy social media user interferes with a sensitive period for social learning.[21] Heavy use of social media in this sensitive developmental period can have negative impacts on long-term life satisfaction.[22]
515. Young users—who are particularly attuned to the fear of missing out (FOMO)—often feel an extra need to be connected at night and frequently wake up throughout the night to check social media notifications.[23] Socializing at night makes it harder for young users to sleep.[24]
516. Young users who use social media for more than five hours per day are three times more likely than non-users not to get enough sleep,[25] contributing to the associated physical and mental health impacts.
517. Children who use social media for more than five hours per day are many times more likely to have clinically relevant symptoms of depression than non-users.[26]
518. Beginning with Instagram’s rise in popularity in 2012, the Centers for Disease Control and Prevention (CDC) observed in its Youth Risk Behavior Survey that the percentage of high school students “who experienced persistent feelings of sadness or hopelessness” skyrocketed over the subsequent decade.[27]
519. Over this same time period, there has also been an increase in youth hospitalization rates for suicidal ideation and suicide attempts. In 2008, prior to the rise of Instagram, hospital visits for suicidal ideation and attempts represented only 0.66% of visits across all age ranges. By 2015, as Instagram’s popularity grew, that share had nearly tripled, with suicidal ideation and attempts accounting for 1.82% of all visits; the highest rates of increase were among youth aged 12 to 17.[28]
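(Editor’s note: the arithmetic behind that comparison is simple division of the two percentages quoted above:

$$\frac{1.82\%}{0.66\%} \approx 2.76$$

an increase of roughly 176%, i.e., nearly a threefold rise.)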
520. The youth mental health crisis fueled by social media platforms has been particularly detrimental for girls and young women.
521. Immediately before Instagram’s rise in popularity and usership, the major indicators of poor mental health among U.S. girls and young women were stable or trending downward.
522. Beginning with Instagram’s rise in popularity in 2012, however, the rates of suicides, self-poisonings, major depressive episodes, and depressive symptoms among girls and young women jumped dramatically.[29]
523. Particularly concerning is the rise of suicidal ideation among girls over the period of Instagram’s surge. According to the CDC’s Youth Risk Behavior Survey, in 2011, 19% of high school girls seriously considered attempting suicide. By 2021, that figure reached 30%.[30]
524. This increase in suicidal ideation among girls has been matched by an increase in suicide attempts. In just one decade of Instagram’s rising popularity, the rate of high school girls who attempted suicide increased by 30%.[31]
525. Increased rates of suicidal ideation and attempts have led to an overall higher rate of completed suicide among young girls. Indeed, in 2013 alone—the year after Instagram’s surge in popularity among young users—the suicide rate for 13-year-old girls jumped by around 50%.[32]
526. This youth mental health crisis fueled by social media platforms like Instagram only stands to worsen. The COVID-19 pandemic has exacerbated excessive social media use. The increase in consumption of digital and social media by young users during this time is linked to an increase in “ill-being” and media addiction.[33] [Redacted]
527. Meta is not only fully aware that the worsening youth mental health crisis is fueled by social media platforms, but has long known that its Platforms are directly contributing to this crisis.
528. [Redacted]
529. Meta’s design choices and practices take advantage of and contribute to young users’ susceptibility to addiction. They exploit psychological vulnerabilities of young users through the false promise that meaningful social connection lies in the next story, image, or video and that ignoring the next piece of social content could lead to social isolation.
530. [Redacted]
531. Meta has conducted detailed internal research that demonstrates the mental health impacts of its Platforms on young users, notably a “Teen Mental Health Deep Dive” that surveyed over 2,500 young users in the U.S. and U.K.
532. Through this “Teen Mental Health Deep Dive,” Meta identified that young users are coping with a variety of emotional issues, including not having “enough friends” or having friends “who aren’t really their friends” (52%), having “to create a perfect image” and not being “honest about feelings” (67%), wanting to “hurt [or] kill themselves” (14%), feeling “down, sad, []depressed[,] [a]lone, or lonely” (62%), and feeling “not good enough [or] [a]ttractive” (70%).
533. The broad takeaway from Meta’s “Teen Mental Health Deep Dive” was that “[s]ocial media amplifies many of the age-old challenges of being a teenager. The always-on nature of social media means that teens’ social lives have infiltrated into every part of life without a break.” [Redacted]
534. Meta has found that Instagram specifically impacted young users, with one in five teens stating that Instagram makes them feel worse about themselves.
535. Elaborating further, [Redacted] teens responded that Instagram use led to them feeling “not good enough,” with [Redacted] 24% reporting the feelings started on Instagram.
536. Meta knows that “[t]eens blame Instagram for increases in the rates of anxiety and depression among teens.” Instagram’s deliberate design features, such as “comparisons of followers and like counts,” exploit teens’ vulnerability to social comparison, creating a negative feedback loop that leads to mental health harm including self-esteem, anxiety, and insecurity issues.
537. Meta also knows that although “young people are acutely aware that Instagram can be bad for their mental health,” they feel “compelled to spend time on the app” because Meta has designed its Platforms to exploit young users’ “fear of missing out on cultural and social trends.”
538. These problems are not confined to Instagram but implicate Facebook as well. When Facebook was rolled out to college campuses from 2004 to 2006, researchers compared the rollout at particular colleges to the subsequent mental health of those colleges’ students. After Facebook arrived on campus, students suffered worse mental health: they used mental-healthcare services more, their academic performance suffered, and so did their job prospects.[34]
539-551. [Redacted]
552. As Meta’s Platforms disturb sleep, fuel adverse mental health consequences, facilitate social comparison, cause anxiety, and fail to prevent bullying and harassment, this combination has a dangerous effect on young users. [Redacted]
553-561. [Redacted]
562. Similarly, even though Meta knows that its Platforms are harmful to teenagers’ mental health, Meta externally characterizes Instagram as a source of support for teens struggling with thoughts of suicide and self-injury and mental health issues generally, including in Mosseri’s December 8, 2021 congressional testimony.
563. [Redacted]
564. Meta goes to great lengths to distance itself from the reality that Meta’s Platforms are harmful for teen mental health. For example, when M.R., a 14-year-old, committed suicide after being exposed to suicide and self-injury content on Instagram, Meta sent an executive to a U.K. coroner’s court to deny that its Platform played any role in M.R.’s suicide—[Redacted]
565. During an official inquest investigating the role that social media platform content played in M.R.’s death, and as reported by the Guardian on September 30, 2022, a Meta executive said that such content was “safe” for children to see. The coroner rejected this claim, finding instead in his October 13, 2022 report that M.R. “died from an act of self-harm whilst suffering from depression and the negative effects of on-line content” that she had not sought out, but that the Platforms’ algorithms had pushed on her.
566. The coroner’s inquest report continued:
The platform operated in such a way using algorithms as to result, in some circumstances, of binge periods of images, video clips and text some of which were selected and provided without requesting them. These binge periods . . . are likely to have had a negative effect on [M.R.] . . . In some cases, the content was particularly graphic, tending to portray self-harm and suicide as an inevitable consequence of a condition that could not be recovered from. The sites normalised her condition focusing on a limited and irrational view without any counterbalance of normality.
567. The coroner further observed that “[t]here was no age verification when signing up to the on-line platform” and that M.R.’s parents “did not have access, to the material being viewed or any control over that material.” Unsurprisingly, M.R. was under the age of 13 when she began using Instagram.
568-573. [Redacted]
574. Meta similarly downplayed the issue of compulsive use on its Platform. [Redacted]
575. In addition to downplaying statements about the harms of its Platforms, Meta also mischaracterizes platform features as helpful to well-being when in fact they are designed to fail.
576. To illustrate, Meta knows that its features contribute to teens struggling with the amount of time they spend on Meta’s Social Media Platforms such as Instagram. Meta researchers noted that “[t]eens talk about the amount of time they spend on Instagram as one of the ‘worst’ aspects of their relationship to the app.” Meta researchers observed that in conversations, teens had “an addicts’ narrative about their use” and “wish[ed] they could spend less time caring about it, but they can’t help themselves.”
577. While Meta adopted so-called “time management” tools, in reality, those tools cannot effectively counteract the overwhelming power of features like infinite scroll, autoplay, and other use-inducing features.
578. In 2018, Meta launched “Daily Limit,” a feature it claimed would enable users to restrict the amount of time they spend on Instagram each day. Despite the feature’s name, it does not enable users to restrict the amount of time they spend on the app.
579. Instead, Daily Limit serves a pop-up notification whenever a user reaches the maximum amount of time they wish to spend on Instagram each day. But this feature was designed so that the user can easily dismiss the notification and return to using Instagram unimpeded.
580. Moreover, the Daily Limit pop-up notification invites the user to reconsider their preferred time limit. Upon information and belief, similar to nudges described above (where, if a user turns their notifications off, Meta nudges the user to turn notifications back on), Meta designed the Daily Limit feature to regularly tempt users, especially young users, to revert to harmful, time-maximizing settings each and every time the user reaches their chosen limit.
581. In December 2021—just one day before Mosseri was scheduled to appear before Congress, and shortly after a whistleblower thrust the well-being issues Meta causes teens onto the national stage—Instagram launched the “Take a Break” tool. Take a Break sends users a popup notification when they have spent more than a specified period of time scrolling without interruption.
582. As with the Daily Limit notification, the Take a Break notification is easily dismissed for a quick return to more infinite scrolling.
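To make the design critique in paragraphs 578–582 concrete, here is a minimal sketch of a “soft” time-limit flow of the kind the filing describes: the popup can be dismissed, and can even nudge the user toward a longer limit, but no branch ever gates further use. All names, types, and thresholds here are hypothetical illustrations, not Meta’s actual implementation.

```typescript
// Hypothetical sketch of a dismissible "daily limit" popup, as alleged in the
// filing: the notification interrupts, but never blocks, continued use.
type LimitAction = "dismiss" | "raiseLimit";

interface DailyLimitState {
  limitMinutes: number;     // user-chosen daily cap (minimum raised from 10 to 30 minutes in 2022, per ¶584)
  minutesUsedToday: number; // accumulated time in the app today
}

function onMinuteElapsed(
  state: DailyLimitState,
  prompt: (message: string) => LimitAction, // UI callback returning the user's choice
): DailyLimitState {
  const used = state.minutesUsedToday + 1;
  if (used < state.limitMinutes) {
    return { ...state, minutesUsedToday: used };
  }

  // Limit reached: show the popup. Note that neither branch below ends the session.
  const action = prompt(
    `You've reached your ${state.limitMinutes}-minute limit. Dismiss, or raise your limit?`,
  );

  if (action === "raiseLimit") {
    // The popup itself invites reconsidering (i.e., raising) the limit (¶580).
    return { limitMinutes: state.limitMinutes + 15, minutesUsedToday: used };
  }

  // "dismiss" simply returns the user to the feed, unimpeded (¶579).
  return { ...state, minutesUsedToday: used };
}
```

A hard cap would instead end or lock the session in the limit-reached branch; the filing’s allegation is precisely that no such branch exists, so the user must re-make the decision to stop at every popup.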
583. [Redacted]
584. [Redacted] Once the whistleblower report was no longer front-page news, Meta further watered down the Daily Limit tool: while users could initially select a Daily Limit as low as ten minutes, in February 2022, Meta quietly raised the minimum to 30 minutes.
585-587. [Redacted]
588. In designing its Daily Limit and Take a Break features, Meta could have provided young users with robust tools that, once enabled, empowered them to effectively self-regulate their use of Meta’s Social Media Platforms.
589. But instead of being able to set it and forget it, young users who make what can be a difficult choice to limit their daily use or take a break must make this difficult decision over and over again. Meta’s design choices make the proverbial wagon that much easier for young users to fall off.
590. Upon information and belief, Meta does so because it does not want its users to avail themselves of tools that could help protect them from the addictive nature of Meta’s Platforms.
591. Moreover, Meta has repeatedly made misleading statements regarding its own internal research on user harms on its Platforms.
592. For example, Meta claims that it conducts research to make its Platforms safer for teens. During congressional testimony on September 30, 2021, Davis stated that “we conduct this research [about young people’s experiences on Instagram] . . . to minimize the bad and maximize the good.” [Redacted]
593. As another example, in August 2021, Senators Richard Blumenthal and Marsha Blackburn wrote to Zuckerberg with detailed questions concerning the nature and findings of Meta’s research on “the effects of social media platforms on kids’ well-being.” The senators specifically asked whether Meta’s research had “ever found that its platforms and products can have a negative effect on children’s and teens’ mental health or well-being.” Meta’s letter in response failed to disclose its own studies demonstrating that the answer was yes.
594. Beginning in September 2021, the Wall Street Journal published a series of articles based on documents leaked by whistleblower Haugen, which detailed Meta’s knowledge of the harms associated with using Meta’s platforms.
595. Meta—at the direction of its highest officers—publicly downplayed the results of the company’s own research. Meta criticized its researchers’ methods and conclusions, and the company crafted statements that sidestepped the negative experiences that its research showed many teen users—especially teen girls—had on its platforms.
596. For instance, in a September 26, 2021, blog post, Meta’s Vice President of Research Pratiti Raychoudhury suggested that some of the presentations relied upon by the Wall Street Journal used “shorthand language . . . and d[id] not explain the caveats on every slide” because they were “created for and used by people who understood the limitations of the research.”
597. [Redacted] the Hard Life Moments research—which revealed that some Instagram users experiencing certain mental health struggles believed the Platform exacerbated those issues—[Redacted]
599. [Redacted]
600. More broadly, and as the New York Times reported in October 2021, Meta’s external response to the leaks “angered some employees who had worked on the research.” As one researcher noted, the company was in effect “making a mockery of the research.”
601-603. [Redacted]
604. Yet, on September 30, 2021, when Senator Blackburn asked Davis in a congressional hearing how Meta was “restricting access to data internally” and whether Meta’s “policies changed since the Wall Street Journal articles,” Davis responded, “not that I’m aware of certainly.”
605. Meta knows that its Social Media Platforms caused, and continue to cause, harm to young users.
606. [Redacted]
607. Nevertheless, Meta repeatedly failed to implement changes over the years to address these ongoing harms.
608. In 2017, Facebook’s former Vice President for User Growth publicly stated that he prohibits his own children from using Facebook, and Meta researchers wrote in a public post that they were “worrie[d] about [their] kids’ screen time.”
609-610. [Redacted]
611. Instead of listening to its employees’ concerns and prioritizing user well-being and safety, Meta disbanded its responsible innovation team, which was devoted to addressing “the potential downsides of its products.”
612-628. [Redacted]
629. [Redacted] Bejar, former Meta Director of Site Integrity and former consultant to Meta, testified in 2023 that Zuckerberg ignored his appeals for Meta to prioritize user well-being and engage in a “culture shift” to ensure teen safety on its Platforms. As Bejar further testified, Meta “know[s] about harms that teenagers are experiencing in its product, and they’re choosing not to engage about it or do meaningful efforts around it.”
630. Despite the direct, personal experience of Meta’s employees of the harms of Meta’s design and features, Meta’s own internal studies documenting the harmful effects of these features, the opinions of many external experts and whistleblowers, and the voices of Meta’s young users themselves [Redacted] Meta has persisted in developing and deploying features that exploit young users’ psychological vulnerabilities and significantly harm young users in its pursuit of profit.
[14] See, e.g., Jonathan Haidt & Jean Twenge, Social Media and Mental Health: A Collaborative Review (unpublished manuscript, on file with New York University) (last visited Oct. 23, 2023); Jacqueline Nesi et al., Handbook of Adolescent Digital Media Use and Mental Health (Cambridge Univ. Press 2022).
[15] See, e.g., Jean Twenge & W. Keith Campbell, Digital Media Use Is Linked to Lower Psychological Well-Being: Evidence from Three Datasets, 90 Psychiatric Q. 311 (2019).
[16] Jean Twenge & Eric Farley, Not All Screen Time Is Created Equal: Associations with Mental Health Vary by Activity and Gender, 56 Soc. Psychiatry & Psychiatric Epidemiology 2017 (2021).
[17] Holly Scott et al., Social Media Use and Adolescent Sleep Patterns: Cross-Sectional Findings from the UK Millennium Cohort Study, 9 BMJ Open 1 (2019); Garrett Hisler et al., Associations Between Screen Time and Short Sleep Duration Among Adolescents Varies by Media Type: Evidence from a Cohort Study, 66 Sleep Med. 92 (2020).
[18] Megan A. Moreno & Anna F. Jolliff, Depression and Anxiety in the Context of Digital Media, in Handbook of Adolescent Digital Media Use and Mental Health 227 (2022); see also, e.g., Hugues Sampasa-Kanyinga et al., Use of Social Media is Associated With Short Sleep Duration in a Dose-Response Manner in Students Aged 11 to 20 Years, 107 Acta Paediatrica 694, 694-700 (2018).
[19] Karine Spiegel et al., A Meta-analysis of the Associations Between Insufficient Sleep Duration and Antibody Response to Vaccination, 33 Current Biology 998 (2023).
[20] Maria T. Maza et al., Association of Habitual Checking Behaviors on Social Media with Longitudinal Functional Brain Development, 177 JAMA Pediatrics 160 (2023).
[21] See, e.g., Amy Orben et al., Windows of Developmental Sensitivity to Social Media, 13 Nature Comm. 1649 (2022).
[22] Id.
[23] Anushree Tandon et al., Sleepless Due to Social Media? Investigating Problematic Sleep Due to Social Media and Social Media Sleep Hygiene, 113 Computers in Human Behavior 106487 (2020).
[24] Regina J.J.M. van den Eijnden et al., Social Media Use and Adolescents’ Sleep: A Longitudinal Study on the Protective Role of Parental Rules Regarding Internet Use Before Sleep, 18 Intl. J. Envtl. Res. Pub. Health 1346 (2021).
[25] Sampasa-Kanyinga et al., supra note 18; see also Marian Freedman & Michael G. Burke, Social Media and Sleep Duration-There Is a Connection!, 35 Contemp. Pediatrics J. (2018).
[26] Twenge & Farley, supra note 16.
[27] Youth Risk Behavior Survey, Data Summary & Trends Report: 2011-2021, at 61, Ctrs. for Disease Control & Prevention (2023).
[28] Gregory Plemmons et al., Hospitalization for Suicide Ideation or Attempt: 2008-2015, 141 Pediatrics 1, 4-5 (2018); see also Brett Burstein et al., Suicidal Attempts and Ideation Among Children and Adolescents in US Emergency Departments, 2007-2015, 173 JAMA Pediatrics 598, 598-600 (2019).
[29] Jean Twenge, Increases in Depression, Self-Harm, and Suicide Among U.S. Adolescents After 2012 and Links to Technology Use: Possible Mechanisms, 2 Psychiatric Res. Clinical Prac. 19 (2020).
[30] Youth Risk Behavior Survey, supra note 27.
[31] Id.
[32] Haidt & Twenge, supra note 14, at 316.
[33] Laura Marciano et al., Digital Media Use and Adolescents’ Mental Health During the Covid-19 Pandemic: A Systematic Review and Meta-Analysis, 9 Front Pub. Health 793868 (2021).
[34] See Press Release, MIT Sloan School of Management, Academic Study Reveals New Evidence of Facebook’s Negative Impact on the Mental Health of College Students (Sept. 27, 2022), https://archive.today/tv6Ff.
About HackerNoon Legal PDF Series: We bring you the most important technical and insightful public domain court case filings.
This court case, 4:23-cv-05448, retrieved on October 25, 2023, is part of the public domain. The court-created documents are works of the federal government and, under copyright law, are automatically placed in the public domain and may be shared without legal restriction.