BBB National Programs Issues Compliance Warning for Use of AI in Child-Directed Advertising and Data Collection

McLean, VA – May 1, 2024 – BBB National Programs’ Children’s Advertising Review Unit (CARU) today issued a new compliance warning on the application of CARU’s Advertising and Privacy Guidelines to the use of Artificial Intelligence (AI).

The CARU compliance warning puts advertisers, brands, endorsers, developers, toy manufacturers, and others on notice that CARU’s Advertising and Privacy Guidelines apply to the use of AI in advertising and the collection of personal data from children. 

The warning states that CARU will strictly enforce its Advertising and Privacy Guidelines in connection with the use of AI, given the potential risks its use may pose, including manipulative influencer marketing, deceptive claims, and improper privacy practices. The warning also cautions marketers to take particular care not to deceive children about what is real and what is not when children engage with realistic AI-powered experiences and content.

“Our Compliance Warning stresses the importance of responsible advertising and data privacy practices in the children’s space, particularly in AI-driven advertising and data collection,” said Dona Fraser, Senior Vice President, Privacy Initiatives at BBB National Programs. “We call on marketers, brands, and developers to prioritize transparency, safety, and compliance with CARU’s Advertising and Privacy Guidelines as we all work to maintain the well-being of children as AI becomes commonplace.”

CARU’s Guidelines are widely recognized industry standards designed to assure that advertising directed to children is not deceptive, unfair, or inappropriate for its intended audience and that, in an online environment, children’s data is collected and handled responsibly. CARU monitors child-directed media to ensure compliance with its Guidelines, seeking the voluntary cooperation of companies and, where necessary, referral for enforcement action to an appropriate federal regulatory body, usually the Federal Trade Commission (FTC), or to a state Attorney General. 

CARU’s Advertising Guidelines 

CARU’s Advertising Guidelines apply to advertising in all media, regardless of whether AI is used to create or disseminate the ads, including digital worlds where altered, simulated, and synthetic content is powered by AI. 

Brands using AI in advertising should be particularly cautious of the potential to mislead or deceive a child in the following areas:

  • AI-generated deep fakes; simulated elements, including the simulation of realistic people, places, or things; or AI-powered voice cloning techniques within an ad. 
  • Product depictions, including copy, sound, and visual presentations generated or enhanced using AI indicating product or performance characteristics.
  • Fantasy, via techniques such as animation and AI-generated imagery, that could unduly exploit a child’s imagination, create unattainable performance expectations, or exploit a child’s difficulty in distinguishing between the real and the fanciful.
  • The creation of character avatars and simulated influencers that engage directly with children and can mislead them into believing they are interacting with a real person.

CARU’s Advertising Guidelines also note that advertisers should take measures when using generative AI to depict people to ensure the depictions reflect the diversity of humanity and do not promote harmful negative stereotypes. 

CARU’s Privacy Guidelines

CARU’s Privacy Guidelines apply to online data collection and other privacy-related practices for online services that target children under 13 years of age, and to operators that have actual knowledge they are collecting personal information from children under 13 years of age.  

Because AI offers unique opportunities to interact with children who may not understand the nature of the information being sought or its intended use, brands using AI in online services should be particularly cautious in the following areas:

  • Requirements and responsibilities when collecting personal information from a child under the Children’s Online Privacy Protection Act (COPPA).
  • Reliance upon third-party generative AI technology to operate and process data, which may require verifiable parental consent (VPC).
  • Parental deletion requests, because a child’s personal information that has been input into an AI system may be nearly impossible to retrieve and delete.
  • The requirement that AI-connected toys and online services obtain VPC and properly disclose their collection practices in their Privacy Policy prior to any collection, use, or sharing of children’s personal information, whether through their own online service or with a third-party generative AI service.

View CARU’s AI Compliance Warning here.

Subscribe to the Ad Law Insights or Privacy Initiatives newsletters for an exclusive monthly analysis and insider perspectives on the latest trends and case decisions in advertising law and data privacy.

Latest Decisions

National Advertising Review Board Finds Glad ForceFlex MaxStrength Trash Bag “25% More Durable” Claims on Packaging Not Misleading

New York, NY – May 23, 2024 – A panel of the National Advertising Review Board (NARB) determined that The Glad Products Company’s “25% more durable” claim for Glad ForceFlex MaxStrength bags, as it appears on the packaging of the 45-, 34-, and 20-bag sizes, is not misleading.

Read the Decision Summary
Direct Selling Self-Regulatory Council Recommends Healy World Discontinue Health-Related and Earnings Claims in Compliance Inquiry

McLean, VA – May 21, 2024 – The Direct Selling Self-Regulatory Council (DSSRC) has recommended that Healy World, a direct selling company that markets consumer health and wellness products, discontinue certain health-related product performance claims made on social media by its salesforce members.  

Read the Decision Summary
National Advertising Division Refers Problem Pregnancy Center to the MA Attorney General and Social Platforms for Review

New York, NY – May 16, 2024 – The National Advertising Division (NAD) referred advertising claims made by Problem Pregnancy, a crisis pregnancy center, to the Massachusetts Attorney General and social media platforms after the company failed to respond to NAD's inquiry.

Read the Decision Summary
National Advertising Division Finds Certain Compostability Claims for HoldOn Bags Supported; Recommends Others be Modified or Discontinued

New York, NY – May 16, 2024 – The National Advertising Division determined that HoldOn Bags has a reasonable basis to claim that its trash bags break down in compost environments, but recommended other claims be discontinued. 

Read the Decision Summary