Are Product Liability Lawsuits the Way to Hold Tech Companies Accountable?

By Dylan Moses, J.D. Candidate, 2024

At their core, social networking companies sell products and services. So why don’t we regulate them that way?

There is no doubt that social media companies like Facebook (now Meta) and Twitter have been boons for the internet. These platforms offer immeasurable opportunities to connect public leaders with constituents, businesses with consumers, and communities across the globe. But for all their value, these companies often help amplify the world’s social problems.

To foster growth and engagement, platform companies’ algorithms reward users who create engaging content with the opportunity to become an “influencer” or go “viral.”1 The consequences of social media’s amplification of divisive content, however, are far-reaching.2 One need only look at the Christchurch massacre or the January 6th insurrection to see that division on social media, paired with each platform’s appetite for user engagement, eventually produces negative externalities.3 Should platform companies be liable for those externalities?

Historically, Section 230 of the Communications Decency Act (“CDA 230”) has shielded platforms from this liability. CDA 230 provides that platforms are not liable as either the speaker or publisher of third-party content.4 But the statute is silent on the liability of a platform that actively incentivizes dangerous or incendiary content, and the Ninth Circuit recently allowed just such a claim to proceed.

In Lemmon v. Snap, Inc., the Ninth Circuit ruled that Snap, maker of the popular application Snapchat, could be held liable for negligence under a theory of product liability.5 Lemmon involves the tragic story of teens who died in a car crash while traveling at nearly 120 mph. Immediately before the accident, the teens were using Snapchat’s speed filter, which overlays a user’s real-time speed on a snap, to send “snaps” of their joyride to other users. The plaintiffs, parents of the victims, contended that Snap knew or should have known that its product was “incentivizing young drivers to drive at dangerous speeds,” given a number of media reports of similar accidents.6 Snap, however, argued that the plaintiffs had not actually stated a claim for relief and that, even if they had, CDA 230 immunized Snap from liability.7 Although the district court granted Snap’s motion to dismiss, the Ninth Circuit reversed, holding that the plaintiffs’ claim sought to hold Snap liable in “its distinct capacity as a product designer,” not in its role as either a speaker or publisher.8

While critics of this litigation strategy argue that product liability lawsuits of this type would be a detriment to online freedom of expression, CDA 230 immunity was never meant to shield platforms from all liability.9 Senator Ron Wyden, one of the co-authors of CDA 230, cared about preserving speech and protecting “the little guy, the startups . . . essential for innovation and competition,” because he knew that liability for what a third party posts on a platform’s website (and the attendant threat of litigation) would make it infeasible for startups to grow.10 Yet there is no way Senator Wyden could have foreseen that the platforms born of his legislation would come to dominate the industry and amplify the worst of their users’ content.11 And while liability should ultimately be imposed on the real-world participants in dangerous events like the Christchurch massacre or the January 6th insurrection, we can be certain that neither Senator Wyden nor Congress intended to shield platforms from liability for their role in fomenting those events.

Social media has been, and remains, a boon for the internet. Yet the question today is whether CDA 230 immunity should extend to platforms that knowingly incentivize user-generated content in ways that would reasonably lead to real-world harm. The court in Lemmon opened a new way of examining liability for social media companies’ products and reward systems, one that offers a strategic avenue for litigation in other areas of social media liability and accountability.