
Social media is a regular part of everyday life, shaping how millions of people find information, seek entertainment, and connect with others. Platforms like TikTok market themselves as entertainment, using short videos and viral trends to keep users engaged.[i] For many users, these engaging, or addictive, features go unnoticed because they have become the reality of the social media landscape. As concerns over social media’s effects on young people’s mental health have grown more prevalent,[ii] however, the question is no longer whether social media can be harmful, but when that harm becomes the responsibility of the platform itself.[iii]
For years, debates over social media’s impact on younger generations circulated widely in academic journals and parent forums, framed as social, psychological, or parental problems rather than legal ones.[iv] Meanwhile, the law has treated social media platforms primarily as publishers under Section 230 of the Communications Decency Act (“CDA”), shielding them from liability for harm arising from user-generated content.[v] Beginning in 2022, that framework faced pushback as plaintiffs alleged that addiction was not just a byproduct of platform use, but a foreseeable result of deliberate product design.[vi]
Addiction-based liability cases are not entirely unprecedented, as they mirror earlier tobacco litigation. Plaintiffs in social media addiction cases are drawing from a well-established playbook used in the 1990s, when cigarette manufacturers were accused of creating an addictive product while publicly denying both its addictiveness and its harms.[vii] In those cases, liability depended on whether tobacco companies understood the addictive nature of their products, reinforced that dependence through design, and failed to warn consumers of known risks.[viii] That history helps frame today’s social media cases, which turn on the same three factors.[ix] The viability of any social media claim depends on whether courts treat the alleged harm as arising from a protected publishing activity or from the product’s design decisions.[x] Section 230 of the CDA bars claims that would require platforms to alter how they publish or recommend third-party content.[xi] In addiction liability cases, however, plaintiffs argue that liability can attach even where no specific piece of content is at issue, because the injury flows from the way the product itself operates.
At center stage of the social media addiction litigation is K.G.M., a nineteen-year-old plaintiff who alleges that her compulsive use of Instagram, Snapchat, TikTok, and YouTube beginning at age ten led to depression, anxiety, and body dysmorphia.[xii] In 2023, her case was selected as a “bellwether” case in a California proceeding involving hundreds of similar claims brought by families, school districts, and state attorneys general.[xiii] The “bellwether” structure tests how juries respond to a broader legal theory — here, the theory that social media companies deliberately engineered their platforms to addict their users.[xiv]
K.G.M.’s lawsuit focuses not on any single video or post, but on platform design, alleging that social media companies deliberately incorporated features that foreseeably foster addiction in users,[xv] including variable reward schedules, frictionless content delivery, and barriers to disengagement.[xvi] Under this theory, social media platforms cause the alleged harm not through the substance of the content they deliver but through how they deliver it. For the first time, courts are asking jurors to decide whether features such as infinite scroll, autoplay, persistent notifications, and recommendation algorithms can constitute actionable design defects rather than mere design choices.[xvii] The legal question becomes: when does a social media platform’s engaging design cross the line from protected business choice to actionable product defect or tortious conduct?
Just hours before jury selection on January 26, 2026, TikTok confidentially settled with K.G.M., leaving Meta and YouTube to proceed to trial alone.[xviii] TikTok’s settlement was particularly notable given the volume of internal evidence that may never reach a jury.[xix] Investigative reporting revealed internal company documents suggesting that TikTok executives were aware of the platform’s capacity to foster compulsive use in minors, sometimes within thirty-five minutes of continuous viewing.[xx] Despite this, TikTok continued to refine its algorithm to maximize retention even while acknowledging that the platform’s built-in safeguards had little to no effect on reducing time spent on the app.[xxi] Internal research reportedly linked excessive use of TikTok to anxiety, sleep deprivation, diminished executive brain function, and other mental health harms.[xxii] This evidence bears directly on foreseeability and corporate knowledge, undermining claims that the alleged addiction harms caused by TikTok’s platform were speculative or unintended.
While TikTok’s settlement does not resolve the question of whether social media platforms are legally responsible for addiction-based injuries, it does mark a shift in how those claims are being framed and received. In the months ahead, if they do not settle, Meta and YouTube will head to trial in K.G.M.’s case, where jurors will be asked to decide whether algorithm-driven design can be treated as a defective product, particularly when aimed at an impressionable audience.[xxiii] The answer will not just affect the future of social media litigation but may redefine the limits of product liability in a digital age.
[i] See Matthew Bergman, TikTok Lawsuit for Teenage Harm, Soc. Media Victims L. Ctr. (Jan. 22, 2026), https://socialmediavictims.org/tiktok-lawsuit/ [https://perma.cc/WWN5-ZVC4] (“TikTok’s algorithm purposely targeted these children with dangerous content to increase their engagement time on the platform and drive revenue.”).
[ii] Id. (“Unfortunately, using the platform is uniquely dangerous to young people. Using TikTok has been linked to several physical and mental health harms, including depression, eating disorders, loneliness, and even death or suicide.”).
[iii] See Shannon Bond, Meta and YouTube head to trial over harm to children after TikTok settles, CapRadio (Jan. 27, 2026), https://www.capradio.org/news/npr/story?storyid=nx-s1-5684196 [https://perma.cc/M322-D62S] (“A key question will be whether tech companies deliberately built their platforms to hook young users, contributing to a youth mental health crisis.”).
[iv] See Ariel Zilber, TikTok reaches last-minute settlement with 19-year-old who said app gave her depression, body dysmorphia, N.Y. Post (Jan. 27, 2026), https://nypost.com/2026/01/27/business/tiktok-reaches-last-minute-settlement-with-woman-who-blamed-app-for-depression-body-dysmorphia/ [https://perma.cc/S2V6-XGM5] (“Multiple large-scale reviews have found consistent links between heavy social media use and worsening mental health.”).
[v] See Danny Tobey et al., Navigating the digital dilemma: Court addresses social media liability in adolescent addiction litigation, DLA Piper (Jan. 11, 2024), https://www.dlapiper.com/en-us/insights/publications/2024/01/navigating-the-digital-dilemma-court-addresses-social-media-liability-in-adolescent-addiction [https://perma.cc/2H5P-6YTQ] (“The court reasoned that allegations targeting defendants’ role as publishers of third-party content fell within Section 230’s immunity provisions. This included features such as providing endless content, distributing ephemeral content, and the timing and clustering of third-party content.”).
[vi] See Bond, supra note iii (“The suits accuse Instagram, Facebook, YouTube, TikTok and Snapchat of engineering features that make their apps nearly impossible for kids to put down, like infinite scroll, auto-play videos, frequent notifications and recommendation algorithms, leading in some cases to depression, eating disorders, self-harm and even suicide.”).
[vii] See Omar Kabir, Meta, TikTok, and YouTube could be the next big tobacco, CTECH (Oct. 28, 2025), https://www.calcalistech.com/ctechnews/article/5upff41tl [https://perma.cc/ZA5W-SPMX] (“[F]our major tobacco manufacturers signed the Tobacco Master Settlement Agreement . . . [which] led to the release of millions of documents proving that tobacco companies had long concealed evidence of the harm caused by their products.”).
[viii] Id. (“The [Tobacco Master Settlement] agreement reshaped the industry’s public standing. It imposed restrictions on the advertising and sale of cigarettes and other tobacco products, and it led to the release of millions of documents proving that tobacco companies had long concealed evidence of the harm caused by their products.”).
[ix] See Dara Kerr, Tech giants head to landmark US trial over social media addiction claims, Guardian (Jan. 27, 2026), https://www.theguardian.com/technology/2026/jan/27/social-media-trial-meta-tiktok-youtube [https://perma.cc/67DU-56R7] (“Lawyers for the plaintiffs are using a playbook similar to what was used against tobacco companies in the 1990s, which focused on cigarettes being addictive and companies publicly denying that for decades while knowing their products’ harms.”).
[x] Id. (“[S]ince [social media companies] had long been absolved from liability because of section 230. This time was going to be different . . . jurors must look not only at content on the platforms, but also at the companies’ design choices.”).
[xi] See Tobey et al., supra note v (“Section 230 of the CDA generally affords providers of interactive computer services immunity from liability under state or local law as publishers of third-party content generated by its users.”).
[xii] Id. (“[P]laintiff K.G.M, now 19 years old, says her use of Instagram, Snapchat and TikTok led to depression, anxiety and body dysmorphia.”).
[xiii] See Kerr, supra note ix (“Her case will be the first of about 22 ‘bellwether’ trials . . . .”).
[xiv] Id. (“Her case will be the first of about 22 “bellwether” trials, which are used as test cases to gauge juries’ reactions and potential verdicts. Ultimately, the landmark trials will cover thousands of lawsuits that have been coordinated in what is known as a judicial council coordination proceeding (JCCP).”).
[xv] See Kabir, supra note vii (“At the core of the claims is the allegation that social media companies deliberately engineered their platforms to foster addiction, resulting in higher rates of depression, anxiety, sleep and eating disorders, self-harm, and suicide, particularly among young users.”).
[xvi] See Tobey et al., supra note v (“The plaintiffs identified several features of the defendants’ platforms that they allege are defective . . . endless-content feeds, intermittent variable rewards, ephemeral (or disappearing) content, deployment of notifications to attract and re-attract users, algorithm-based prioritization of content . . ..”).
[xvii] See Kerr, supra note ix (“The judge in the case ruled in November that jurors must look not only at content on the platforms, but also at the companies’ design choices.”).
[xviii] See Bond, supra note iii (“Meta, the owner of Instagram and Facebook, and Google’s YouTube will stand trial in California state court after TikTok settled the lawsuit on the eve of the trial.”).
[xix] See Bobby Allyn et al., TikTok executives know about the app’s effect on teens, lawsuit documents allege, NPR (Oct. 11, 2024), https://www.npr.org/2024/10/11/g-s1-27676/tiktok-redacted-documents-in-teen-safety-lawsuit-revealed [https://perma.cc/DEK2-JQZJ] (“For the first time, internal TikTok communications have been made public that show a company unconcerned with the harms the app poses for American teenagers.”).
[xx] Id. (“TikTok determined the precise amount of viewing it takes for someone to form a habit: 260 videos. After that, according to state investigators, a user ‘is likely to become addicted to the platform.’”).
[xxi] Id. (“One document shows one TikTok project manager saying, ‘Our goal is not to reduce the time spent.’ In a chat message echoing that sentiment, another employee said the goal is to ‘contribute to DAU [daily active users] and retention’ of users.”).
[xxii] Id. (“TikTok’s own research states that ‘compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety,’ according to the suit.”).
[xxiii] Id. (“The court’s consideration of claims against social media platforms as ‘products’ with potential design defects continues to extend the realm of product liability into the digital space.”).