Tech giants Meta, Google, ByteDance, and Snap will be defendants in a civil trial brought by thousands of plaintiffs who argue that social media use contributed to “an unprecedented mental health crisis” among teenagers, after a superior court judge denied the companies’ latest attempt to have the case dismissed on summary judgment.
In a written statement, plaintiffs’ attorneys John Morgan and Emily Jeffcott praised the ruling as historic, saying it will enable families across the country to demand accountability for the allegedly disastrous impact of social media addiction. “The defendants have been accused of knowing for years about the negative consequences of their platforms on developing minds. We hope to bring the facts to a jury and pursue justice for the damage done by these social media sites.”
The flood of personal injury lawsuits has been consolidated into a single coordinated court action in downtown Los Angeles. The judge has selected three bellwether cases, and all three plaintiffs are proceeding under pseudonyms: K.G.M., R.K.C., and Moore. One of the three is expected to go before a jury in January. At a minimum, the outcomes of these initial cases will serve as the basis for a settlement with the remaining plaintiffs.
The companies behind Facebook, Instagram, YouTube, TikTok, and Snapchat all argued in their motions for summary judgment that they were shielded from the claims by Section 230 of the Communications Decency Act, which protects online platforms from liability for the actions of third parties. It was the content, they contended, not the apps themselves, that harmed minors.
Superior Court Judge Carolyn Kuhl had already dismissed some of the plaintiffs’ claims under Section 230. For example, she agreed to drop allegations against TikTok over its viral challenges, including the blackout challenge, which dared users to choke themselves into unconsciousness, and the Benadryl challenge, which dared them to take dangerous doses of the drug.
Those, she said, were archetypal examples of third-party content. She also ruled that Section 230 barred negligence claims based on the defendants’ failure to take down content that allegedly depicted child sexual abuse.
However, many of the remaining claims, she stated, concerned features built by the apps’ own software engineers, such as the infinite scroll that has become a staple of Instagram or the autoplay options on YouTube and TikTok, design choices intended to keep users attached to the app.
In her new ruling, Kuhl said the companies had offered no new argument or authority to persuade the court to change its conclusions. “Meta may certainly argue to the jury that K.G.M. was injured by the content she viewed,” she wrote. But the plaintiff has presented evidence that Instagram’s features, which compel users to view content compulsively, were a significant factor in her harm.
Meta had also contended that its decisions about how to structure and present third-party speech were protected by the First Amendment.
Kuhl rejected this argument as well, again citing an earlier ruling in which she held that the allegedly addictive nature of the defendants’ platforms, such as endless scroll, is not comparable to a publisher’s editorial choices about what to publish. Rather, the claims rest on harm caused by alleged design features that shape how the plaintiffs interact with the platforms, regardless of the third-party content they access.
Google raised another defense: that K.G.M. had not shown her social media addiction was caused by her use of YouTube. Kuhl disagreed.
“Google is correct that the causation issues in this case are admittedly complicated,” she wrote. “Those causation issues should be resolved by a jury with the benefit of all admissible evidence at trial.”
Snap and TikTok made similar arguments, which were also rejected.
Snap’s lawyers said in a written statement: “We continue to prepare for trial and look forward to demonstrating why the plaintiffs’ claims against Snap are both factually and legally flawed. Snapchat was built differently from traditional social media: it opens to the camera, and Snapchatters can engage with their family and friends in a space where their privacy and safety are of utmost importance.”
Lawyers representing the other companies either declined to comment or did not respond to emails.
Video clips of K.G.M.’s deposition were played for the first time by the plaintiffs’ attorneys during last week’s oral arguments on the summary judgment motions.
She said strangers had sent her more than 50 sexually explicit photos on Snapchat, and that she had experienced what she described as sexual grooming. She also used photo-altering filters on Instagram and Snapchat, which she said led to body dysmorphia and low self-esteem.
Other plaintiffs have reported that social media caused or worsened anxiety, depression, and suicidal urges. Some described an addiction to likes that kept them up late at night; others said they spent eight hours a day on social media.
If the case proceeds to trial, it would mark the first time a court hears arguments on whether social media is harmful and addictive to minors.
A similar action brought by 42 states is moving through federal court in San Francisco, though at a slower pace than the Los Angeles case.