Injunction Junction: NetChoice v. Bonta and Securing the Future of Teen Online Privacy and Safety
Oct 2, 2023 by BBB National Programs Privacy Initiatives Team
California’s Age-Appropriate Design Code Act (AADC), a sweeping internet oversight law aimed at protecting minors from design decisions that magnify online harms, has been preliminarily enjoined by the U.S. District Court for the Northern District of California on the ground that enforcement of the Act would likely violate the First Amendment to the U.S. Constitution.
What does this mean for the state of teen privacy laws and related protections for minors (primarily ages 13-17)?
While the preliminary injunction is certainly not the final word on the constitutionality of California’s approach to regulating online harms, the meticulous district court decision granting the injunction scrutinizes virtually all of the AADC’s requirements and prohibitions. The injunction, and the reasoning underlying the district court’s decision, raises important questions for current and future efforts to regulate digital content and platform design, and creates an entry point for establishing a robust minimum bar of protections for teens.
Meanwhile, companies that offer digital products to minors still face scrutiny from legislators, regulators, and plaintiffs. And while the preliminary injunction may give digital platforms some room to breathe, legal and reputational risks abound for companies that shy away from high standards of safety and privacy for teenagers.
Companies: Prepare for the Shifting Teen Privacy Landscape
In the absence of legal certainty and clear industry standards for teen data privacy and online safety, BBB National Programs is seeking to fill the gap by providing third-party accountability support to companies through the TeenAge Privacy Program (TAPP). Participating companies will be able to recognize potential areas of risk to teens, identify best practices and design patterns that harmonize business objectives and teen safety, and seek an achievable standard for product and platform design that encourages trust, civility, and meaningful online interaction.
TAPP is timely: there is a widespread sense of urgency for industry to proactively commit to a set of best practices rather than wait for lawmakers to decide for them. By doing so, companies serving this vulnerable teen demographic can demonstrate a high level of accountability and safety to their users while avoiding duplicative or unnecessary legal and regulatory scrutiny.
Legislative Background
The tension between online child safety regulation and the First Amendment is virtually as old as the commercial internet itself. While Congress led the first regulatory efforts, state lawmakers have taken up the mantle in the absence of successful federal proposals.
Federal Activity
- Congress’s first attempt to regulate minors’ access to obscene and indecent content in cyberspace, the Communications Decency Act of 1996 (CDA), was struck down on First Amendment grounds by the Supreme Court in the landmark Reno v. ACLU, 521 U.S. 844 (1997).
- The next year, Congress tried again to regulate the digital distribution of “material harmful to minors” with the Child Online Protection Act of 1998 (COPA).
- Even though Congress had tailored the objectionable CDA language to better address the First Amendment concerns highlighted by the Reno Court, a district court enjoined COPA’s enforcement in 1999, and the Supreme Court upheld that injunction in Ashcroft v. ACLU, 542 U.S. 656 (2004), finding the law likely unconstitutional.
Following these defeats, federal lawmakers largely moved away from content-based digital regulations. While Congress has continued to introduce controversial bills that may raise First Amendment concerns (e.g., the Kids Online Safety Act and the RESTRICT Act), virtually no content-based regulation of minors’ access to internet services has successfully come out of Congress since the school- and library-focused Children’s Internet Protection Act of 2000 (CIPA), which imposed content filtering and obscenity restrictions as a condition of federal education funding.
State Activity
Since 2022, multiple states have enacted laws that explicitly regulate digital platforms and services likely to be accessed by children. California kicked off this recent legislative activity by passing the AADC, a law modeled after the Age-Appropriate Design Code guidance issued by the UK Information Commissioner’s Office under the UK GDPR. Subsequently, Utah, Texas, Delaware, Arkansas, Ohio, and Louisiana all adopted laws (or specific provisions in broader laws) that impose additional obligations on social media companies to protect minors: for example, requirements to verify the age of users, to obtain parental consent or give parents access to minors’ accounts and content, or to create additional protections, default settings, or consent options for teens aged 13-17.
While the goal of protecting minors from unwanted and unlawful contact is laudable, states have sought to impose guardrails and prohibitions that will inevitably restrict user access to online speech. Where that is the case, courts are likely to follow a line of reasoning similar to the district court’s in the AADC injunction. In the opinion granting the preliminary injunction, the federal district court judge found that 10 provisions of the AADC would likely be ruled unconstitutional under the intermediate scrutiny of the Central Hudson test for commercial speech.
In particular, the court took issue with requirements to perform a data protection impact assessment, to estimate the age of users within reasonable certainty, to set high default privacy and safety settings, and to enforce published terms, policies, and community standards. The court also took issue with the AADC’s restrictions on dark patterns, on profiling minors, and on collecting, selling, sharing, and retaining children’s data. While the reasoning is nuanced, the court found that the AADC’s goal of protecting minors did not fit the actual restrictions the California legislature sought to impose, as the AADC was overbroad and could chill a substantial amount of commercial speech not directly relevant to protecting minors.
The AADC preliminary injunction is the second injunction this year arising from litigation by NetChoice, an industry group that has repeatedly challenged children’s privacy and safety laws in court.
What’s Coming Next?
While injunctions may prevent the enforcement of certain laws, the underlying desire to establish guardrails for teen online privacy and safety will not lose its political salience.
This past summer, the White House brought teen online mental health issues to the forefront by establishing an interagency task force and issuing a call to action focused on these very issues. And just this week, the National Telecommunications and Information Administration (NTIA) issued a request for stakeholder comments on protecting the safety of children and teens online. All eyes remain on what comes next in this critical space.
The bottom line: Legislators, regulators, and plaintiffs are still seeking avenues to heighten privacy protections for minors and establish accountability for digital platforms.
A few important policy trends to watch:
- A class action lawsuit against Google for child privacy violations has been allowed to proceed in federal district court in California after a finding that the Children’s Online Privacy Protection Act (COPPA) did not preempt such claims under state law.
- State legislators in Maryland and Minnesota are still looking to introduce their own Age-Appropriate Design Codes, drafted to address the constitutional issues that have held up enforcement of the California law.
- Delaware recently passed a comprehensive consumer privacy law that requires opt-in consent to direct targeted advertising to, or sell the data of, minors under 18.
- The Federal Trade Commission (FTC) has been looking to expand its use of Section 5 unfair or deceptive practices authority to bring actions like its case against Epic Games, where design defaults were scrutinized as unfair for allowing adults to contact minors, including teenagers, who are not protected under COPPA. FTC Commissioner Alvaro Bedoya has called on the agency to hire child psychologists to better shape investigations based on harm to minors.
Given this avalanche of activity focused on teen privacy and safety, businesses that operate an online platform should seek to implement a form of privacy by design that both mitigates risks to minors and protects the business from legal and reputational risks.
As a recognized leader in children’s privacy since the founding of the Children’s Advertising Review Unit (CARU) in 1974, BBB National Programs can help companies comply with the law, adopt data privacy best practices, and, through TAPP, pave the way toward a strengthened framework for protecting minors’ data.
To learn more about joining TAPP, contact us at TAPP@bbbnp.org.