Center for Industry Self-Regulation

BBB National Programs’ Center for Industry Self-Regulation (CISR), a 501(c)(3) non-profit, was created to harness the historic power of self-regulation, also called soft law, in the United States to empower business accountability. CISR is dedicated to education and research that supports responsible business leaders in developing fair, future-proof best practices, and to educating the general public on the conditions necessary for industry self-regulation.

Harnessing the Power of Self-Regulation to Empower Business Accountability

For Funders

Our research explores how to solve collective challenges in the business community, calling on decades of experience operating independent self-regulatory and co-regulatory programs.



For Business

Learn about the challenges facing your industry to help identify opportunities for new best practices that will enhance the trust and respect of consumers, partners, and regulators.




In the Incubator


TeenAge Privacy Program (TAPP)

The TAPP Incubator project has designed safeguards for the personal data of teens, building a bridge between privacy protections for children and adults that can serve as a global model. The TAPP Roadmap is an operational framework designed to help companies develop digital products and services that consider and respond to the heightened potential for risks and harms to teenage consumers, and to ensure that businesses collect and manage teen data responsibly.

Get the Roadmap

AI in Hiring and Recruiting

In the recruiting and hiring process, where algorithms increasingly aid human decision making, how can we combine important technological innovation with a proactive approach to employment law regulations and future-proof standards? The AI Incubator project has developed the Principles and Protocols for Trustworthy AI in Recruiting and Hiring, a global baseline standard for the use of AI applications in recruitment and hiring. It provides practical, actionable guidance for employers and vendors seeking to leverage AI technology responsibly and equitably.

Learn More

Emerging Areas of Interest

Connected Vehicles: As cars become smarter and more interconnected, do the rules of the road need to change? How do we anticipate the new normal of safety, security, and data protection, while ensuring that businesses remain on a level playing field and consumers are heard?

The Metaverse: The rules of the road for the metaverse, which is being hailed as the next big technological revolution, are still being written. How can we ensure consumers are protected while encouraging innovation as businesses explore this next digital frontier?
Get Involved






CISR focuses on research that addresses industry-wide challenges to develop fair, future-proof best practices.







Injunction Junction: NetChoice v. Bonta and Securing the Future of Teen Online Privacy and Safety

Oct 2, 2023, 12:13 PM by BBB National Programs Privacy Initiatives Team
While the AADC injunction is not the final word on the constitutionality of California’s approach to regulating online harms, the injunction—and the reasoning that underlies the district court’s decision—raises important questions and creates an entry point to establish a robust minimum bar of protections for teens.

California’s Age-Appropriate Design Code Act (AADC), a sweeping internet oversight law aimed at protecting minors from design decisions that magnify online harms, has been temporarily enjoined by the U.S. District Court for the Northern District of California on the basis that enforcement of the Act would violate the First Amendment to the U.S. Constitution. 

What does this mean for the state of teen privacy laws and related protections for minors (primarily ages 13-17)? 

While the preliminary injunction is certainly not the final word on the constitutionality of California’s approach to regulating online harms, the meticulous district court decision granting the injunction scrutinizes virtually all the AADC’s requirements and prohibitions. The injunction—and the reasoning that underlies the district court’s decision—raises important questions and creates an entry point to establish a robust minimum bar of protections for teens, focused on current and future efforts to regulate digital content and platform design. 

Meanwhile, companies that offer digital products to minors still face scrutiny from legislators, regulators, and plaintiffs. And, while the preliminary injunction may allow digital platforms an opportunity to breathe, legal and reputational risks abound for companies that shy away from high standards of safety and privacy for teenagers.


Companies: Prepare for the Shifting Teen Privacy Landscape

In the absence of legal certainty and clear industry standards for teen data privacy and online safety, BBB National Programs is seeking to fill the gap by providing third-party accountability support to companies through the TeenAge Privacy Program (TAPP). Participating companies will be able to recognize potential areas of risk to teens, identify best practices and design patterns that harmonize business objectives and teen safety, and seek an achievable standard for product and platform design that encourages trust, civility, and meaningful online interaction. 

TAPP is timely because there is a widespread sense of urgency for industry to proactively commit to a set of best practices rather than wait on lawmakers to decide for them. By doing so, companies serving this vulnerable teen demographic can demonstrate a high level of accountability and safety to their users while avoiding duplicative or unnecessary legal and regulatory scrutiny.


Legislative Background

The tension between online child safety regulation and the First Amendment is virtually as old as the commercial internet itself. While Congress led the first regulatory efforts, state lawmakers have taken up the mantle in the absence of successful federal proposals.

Federal Activity

  • Congress’s first attempt to regulate minors’ access to obscene and indecent content in cyberspace, the Communications Decency Act of 1996 (CDA), was struck down on First Amendment grounds by the Supreme Court in the landmark Reno v. ACLU, 521 U.S. 844 (1997).
  • The next year, Congress tried again to regulate the digital distribution of “material harmful to minors” with the Child Online Protection Act of 1998 (COPA).
  • Even though Congress had tailored the objectionable CDA language to better reflect the First Amendment concerns highlighted by the Reno Court, COPA was enjoined from enforcement by a district court in 1999 and ruled unconstitutional by the Supreme Court in Ashcroft v. ACLU, 542 U.S. 656 (2004).


Following these defeats in the 1990s, federal lawmakers largely moved away from content-based digital regulations. While Congress has continued to introduce controversial bills that might raise First Amendment concerns (e.g., the Kids Online Safety Act, the RESTRICT Act), virtually no content-based regulation to prevent minors’ access to internet services has successfully come out of Congress since the school- and library-focused Children’s Internet Protection Act of 2000 (CIPA), which imposed content filtering and obscenity restrictions as a condition for federal education funding.

State Activity

Since 2022, multiple states have enacted laws that explicitly regulate digital platforms and services likely to be accessed by children. California kicked off this recent legislative activity by passing the AADC, a law modeled after the Age-Appropriate Design Code guidance issued by the UK Information Commissioner’s Office under the GDPR. Subsequently, Utah, Texas, Delaware, Arkansas, Ohio, and Louisiana all adopted laws (or specific provisions in broader laws) that established additional obligations for social media companies to protect minors in some form: for example, by imposing requirements to verify the age of users, seeking parental consent or parental access to minors’ accounts and content, or creating additional protections, default settings, or consent options for teens aged 13-17.

While the goal of protecting minors from unwanted and unlawful contact is laudable, states have sought to impose guardrails and prohibitions that will inevitably restrict user access to online speech. Where that is the case, courts are likely to follow a line of reasoning similar to the district court’s in the AADC injunction. In the opinion granting a preliminary injunction, the federal district court judge found that 10 provisions of the AADC would likely be ruled unconstitutional under the intermediate-scrutiny Central Hudson test.

In particular, the court took issue with requirements to perform a data protection impact assessment, to estimate the age of users within reasonable certainty, to set high default privacy and safety settings, and to enforce published terms, policies, and community standards. The court also took issue with the AADC’s restrictions on dark patterns, on profiling minors, and on collecting, selling, sharing, and retaining children’s data. While the reasoning is nuanced, the court found that the AADC’s goal of protecting minors did not fit the actual restrictions the California legislature sought to impose, as the AADC was overbroad and could chill a substantial amount of commercial speech not directly relevant to protecting minors.

The AADC preliminary injunction is the second injunction this year brought about by litigation from NetChoice, an industry group that has called children’s privacy and safety laws into question.


What's coming next?

While injunctions may prevent the enforcement of certain laws, the underlying desire to establish guardrails for teen online privacy and safety will not lose its political salience. 

Just this past summer, the White House brought teen online mental health issues to the forefront by establishing an interagency task force and issuing a call to action focused on these very issues. And just this week, the National Telecommunications and Information Administration (NTIA) put out a call for stakeholder comments on protecting children’s and teens’ safety online. So, all eyes remain on what to do next in this critical space.

The bottom line: Legislators, regulators, and plaintiffs are still seeking avenues to heighten privacy protections for minors and establish accountability for digital platforms. 

A few important policy trends to watch:

  • A class action lawsuit against Google for child privacy violations has been allowed to proceed in federal district court in California after a finding that COPPA (the Children’s Online Privacy Protection Act) did not preempt plaintiffs from making such claims under state law.
  • State legislators in Maryland and Minnesota are still looking to introduce their own Age-Appropriate Design Codes that learn from the constitutional issues that have held up enforcement of the California law.
  • Delaware recently passed a comprehensive consumer privacy law that requires opt-in consent to direct targeted advertising to, or sell data from, minors under 18.
  • The Federal Trade Commission (FTC) has been looking to expand its use of Section 5 “unfair or deceptive acts or practices” authority to bring cases like its case against Epic Games, where design defaults were scrutinized as unfair for allowing adults to contact minors, including teenagers who are not protected under COPPA. FTC Commissioner Alvaro Bedoya has called on the agency to hire child psychologists to better shape investigations on the basis of harm to minors.


Given this avalanche of activity focused on teen privacy and safety, businesses that operate an online platform should seek to implement a form of privacy by design that both mitigates risks to minors and protects the business from legal and reputational risks.

As a recognized leader in children’s privacy since the founding of the Children’s Advertising Review Unit (CARU) in 1974, BBB National Programs can help companies comply with the law and with data privacy best practices, and, through TAPP, pave the way toward a strengthened framework for protecting minors’ data.

To learn more about joining TAPP, contact us at





Press Release

Justin Connor Named Executive Director for The Center for Industry Self-Regulation, a Foundation Created by BBB National Programs

McLean, VA – May 17, 2022 – Recognizing a timely opportunity to promote and grow the next generation of independent industry self-regulation programs, The Center for Industry Self-Regulation today named Justin Connor as its inaugural Executive Director. The announcement was made by Eric D. Reicin, President...

Read the Press Release