
CHT Files Amicus Brief Highlighting the Importance of Design for Regulating Social Media and AI

This post is part of The Catalyst newsletter series. Subscribe here for future resources.
CHT files an amicus brief with the Ninth Circuit Court of Appeals in support of Attorney General Bonta in NetChoice v. Bonta.

Background 

In 2022, the California Legislature unanimously passed the Age Appropriate Design Code Act, a law that would require online platforms to proactively consider how their product design impacts the privacy and safety of children and teens in California. Shortly after, NetChoice, a tech lobbying group whose members include Meta, TikTok, Google, Amazon, and other major tech companies, filed a lawsuit challenging the law (NetChoice v. Bonta), seeking to stop it from taking effect.

In October 2023, Judge Beth L. Freeman of the US District Court for the Northern District of California preliminarily enjoined the law in response to NetChoice’s challenge, halting the Age Appropriate Design Code Act from going into effect while the courts review the merits of the case. California Attorney General Rob Bonta appealed the decision shortly after, sending the case up to the Ninth Circuit Court of Appeals and extending the legal battle.

In December 2023, Center for Humane Technology joined more than 60 experts and advocacy groups, collectively representing more than 1.8 million Americans, to support the Age Appropriate Design Code in amicus curiae briefs countering NetChoice’s lawsuit. Center for Humane Technology was represented by the Social Media Victims Law Center, the first firm to file product liability claims against social media platforms based on youth addiction.

Amicus Overview

Center for Humane Technology argued that the preliminary injunction should be vacated, as the district court ignored the broader reality of how these products are built and how they harm children. The district court granted the injunction on the premise that the law could infringe on companies’ First Amendment rights. That reasoning, however, failed to acknowledge the distinct ways that computer code is used to structure digital products, and the fact that choices made in code in turn dictate the physical, mental, and emotional experiences of young users. From our brief:

“The district court failed to recognize that the only way to protect vulnerable kids from [AI] is to regulate their data and code with forward-looking regulatory schemes focused on requiring safe digital designs rather than outcome-based regulations destined to quickly become obsolete.”

Three Takeaways

  1. It is critical to construct regulatory regimes that are forward-looking and adaptable, given that rapidly advancing technologies such as artificial intelligence will amplify and expand existing online harms for youth. 
  2. Regulating the root cause of harms from advanced digital technologies – like social media and artificial intelligence – requires addressing the design of these products, which comes down to their data and their code. 
  3. If upheld, the court’s decision would hinder the government’s ability to regulate any advanced digital technologies, whether social media or more nascent AI products, in a way that meaningfully protects consumers from their harms. 

To read the amicus brief in full, click here.

Published on December 20, 2023
