Argues lower court precedent undermines longstanding state authority and is out of step with congressional intent
OAKLAND – California Attorney General Rob Bonta, alongside a bipartisan coalition of attorneys general, filed an amicus brief in Gonzalez v. Google urging the U.S. Supreme Court to interpret Section 230 of the Communications Decency Act to allow social media companies to be held liable when they use algorithms to make targeted recommendations of harmful third-party content. In the brief, the attorneys general urge the Supreme Court to reverse the lower court’s decision immunizing Google subsidiary YouTube from suit under the Communications Decency Act. They ask the Court to interpret "publisher immunity" in a way that allows websites and apps to be held liable for the targeted promotion of content, preserving states’ historic authority to determine appropriate remedies for harms to private parties.
“Under the lower courts’ current, overly broad interpretation of Section 230, states are severely hampered from holding social media companies accountable for harms facilitated or directly caused by their platforms,” said Attorney General Bonta. “This was certainly not Congress’s intent when it carved out a narrow exception in the Communications Decency Act. Companies like Google are not just publishing material from users, they are exploiting it to make a profit. I urge the Supreme Court to adopt a more reasonable view of ‘publisher immunity’ under the Communications Decency Act that is in line with Congress’s intent.”
In Gonzalez v. Google, the families of several victims of an ISIS terrorist attack in Paris, France, sued Google for harm allegedly caused by YouTube’s video-recommendation algorithm, which promoted radicalizing and recruiting videos posted by ISIS. The lower court dismissed the claims seeking to hold YouTube accountable for those recommendations, reasoning that Section 230 of the Communications Decency Act grants YouTube immunity from such claims.
In the brief, the attorneys general urge the Supreme Court to interpret “publisher immunity” in Section 230 in a way that does not insulate social media companies from liability for making targeted recommendations of harmful third-party content to consumers. In recent years, lower courts have adopted a broad interpretation of Section 230, providing websites and apps like Google with immunity beyond anything Congress could have intended when the Communications Decency Act was enacted. This broad immunity has resulted in the widespread displacement of state laws and the erosion of traditional state authority to determine appropriate remedies for harms to private parties.
Attorney General Bonta is committed to holding social media companies accountable for the harms caused by their platforms. He has launched bipartisan, nationwide investigations into Meta and TikTok for providing and promoting their social media platforms to children and young adults despite knowing that use of these platforms is associated with physical and mental health harms. The investigations are targeting, among other things, the techniques Meta and TikTok use to increase the frequency and duration of engagement by young users and the resulting harms caused by such extended engagement. Attorney General Bonta was also part of a bipartisan coalition of attorneys general that urged Meta to abandon plans to launch a version of Instagram for children under the age of 13. Following heavy criticism and shocking new reports from The Wall Street Journal and other publications, Meta paused development of the new platform.
Attorney General Bonta joins the attorneys general of Alabama, Alaska, Arkansas, California, Colorado, Connecticut, the District of Columbia, Idaho, Illinois, Indiana, Kentucky, Louisiana, Massachusetts, Minnesota, Mississippi, Nebraska, New Hampshire, New Jersey, New York, North Carolina, Oregon, Rhode Island, South Carolina, South Dakota, Tennessee, Vermont, and Virginia in filing the brief.
A copy of the amicus brief is available here.