Plaintiffs Defeat Meta, TikTok, and Other Defendants' Efforts to Dismiss Sprawling Teen Social Media Addiction Litigation

As reported by Law360 (subscription required) and other outlets, a California federal judge has denied attempts by Meta and other social media companies to dismiss a federal class action lawsuit brought by Lieff Cabraser on behalf of parents and their children, who allege that the tech giants intentionally design their platforms to hook young users in a predatory manner that damages their mental health.

The lawsuit claims that the companies have failed to warn teens and their parents about the risks of addiction to social media apps such as Instagram and TikTok. Plaintiffs say that specific app features, such as image filters, can harm the mental health of young people by promoting unrealistic beauty standards, leading to body image issues, eating disorders, anxiety, depression, and even suicide.

In her ruling denying the social media companies’ motions to dismiss, U.S. District Judge Yvonne Gonzalez Rogers determined that certain features of the platforms, such as image filters, could be treated like products. According to a Reuters report, Judge Gonzalez Rogers noted that “the companies legally owed a duty to their users arising from their status as product makers and could be sued for negligence over their duty to design reasonably safe products and to warn users of known defects.”

The plaintiffs are represented by Lieff Cabraser partner Lexi Hazam and co-counsel from Motley Rice and Seeger Weiss. “Today’s decision is a significant victory for the families that have been harmed by the dangers of social media,” plaintiffs’ counsel told CNN in a joint statement. “The mental health crisis among American youth is a direct result of these defendants’ intentional design of harmful product features.”

Full articles are available on the Law360 (subscription required), CNN, and Reuters websites.

Learn more about the teen/youth social media injuries litigation.
