Continuing to Evolve: the 10s, 20s, and the Future of CARU

May 29, 2024 by The Children's Advertising Review Unit

(Did you read CARU in the 90s and 00s?)

The confluence of social media, apps, and digital advertising in the 2010s and 2020s generated new issues that inspired multiple revisions to the Children’s Advertising Review Unit (CARU) Advertising Guidelines to keep pace with advancements in technology, as well as compliance warnings to address new platforms breaking onto the scene. 

Starting in 2004, social media took the stage, followed closely by the Apple iPhone in 2007 and its App Store in 2008. By 2010, the term “app” was officially the word of the year, and in 2012 Google followed suit with the launch of the Google Play Store.

 

Advertising in the App Age

At this time, companies were innovating quickly, and CARU was monitoring the marketplace to ensure that compliance with advertising and privacy guidelines was front of mind for companies marketing their cool new products and services to children under age 13.

Here are some of CARU’s notable advertising cases in the early 2010s:

May 2013 (CARU Case #5594): Toy Box Apps, LLC’s Mall Girl app allowed users to shop for and purchase items, including pets, at a virtual mall with game currency and real money. Messages such as “Your pet is going to be taken away by the SPCA for animal neglect! Pay a fine of 6 Cash to keep your pet” pressured users to spend money.

  • CARU Guideline: Advertising should not urge children to ask parents or others to buy products. Advertisers should avoid using sales pressure in advertising to children, e.g., creating a sense of urgency by using words such as “buy it now.”

 

February 2014 (CARU Case #5685): OUTFIT 7 – Talking Tom Cat 2, an early example of an advergame, featured advertisements for other game apps, including large pop-ups, none of which were labeled as ads.

  • CARU Guideline: On websites directed to children, if an advertiser integrates an advertisement into the content of a game or activity, then the advertiser should make clear, in a manner that will be easily understood by the intended audience, that it is an advertisement. 

 

Filling Privacy Gaps

At this time, the emergence of social media and mobile apps created critical privacy gaps for younger users. As a result, CARU’s monitoring of the child-directed marketplace for compliance with its Privacy Guidelines became even more important.

Here are some of CARU’s notable privacy cases from the 2010s:

April 2018 (CARU Case #6171): CARU referred the Musical.ly app, today known as TikTok, to the FTC, resulting in a landmark FTC case and settlement. The social media service required account creation (email address, username, password) to view and create content but lacked an age screening mechanism, allowing the disclosure of PII without notice and verifiable parental consent.

  • CARU Guideline: Among other requirements, operators must obtain “verifiable parental consent” before they collect personal information that will be publicly posted, thereby enabling others to communicate directly with the child online or offline.

 

March 2019 (CARU Case #6268): In a case involving HyperBeard’s KleptoCats app, CARU, in coordination with the Digital Advertising Accountability Program, referred HyperBeard to the FTC for not participating in the self-regulatory inquiry.

May 2019 (CARU Case #6274): The Facebook app, where children were falsifying their age due to the lack of an effective age-gating mechanism.

  • CARU Guideline: Among other requirements, age-screening mechanisms should be used in conjunction with technology, e.g., a persistent cookie, to help prevent underage children from going back and changing their age to circumvent age-screening.

 

Revising the Guidelines

In the 2020s, it was time for change. Not only was technology continuing to advance at a rapid pace, but social media had begun to influence younger generations of internet users.

In 2021, CARU, in partnership with CARU Supporters – companies dedicated to improving the advertising landscape for children – revised its Advertising Guidelines.

The revised guidelines reflected the growth in online platforms and new immersive forms of child-directed interactive media over the past decade, more specifically addressing digital media, video, influencer marketing, apps, in-game advertising and purchase options in games, social media, and other interactive media in the children’s space. 

Among other additions, the revised CARU Advertising Guidelines:

  • Hold advertisers accountable for negative social stereotyping, prejudice, or discrimination.
  • Contain a new section dedicated to in-app and in-game advertising and purchases.
  • Incorporate updated FTC guidance on endorsements and influencer marketing.
  • Raise the covered age from children under 12 to children under 13.
  • More clearly spell out the factors that determine when an ad is primarily directed to children under age 13.

 

In August 2022, CARU brought its first case related to its new provision on negative social stereotyping and discrimination: Moose Toys’ Fail Fix Total Makeover Doll (CARU Case #6443).

In this case, advertising messages in commercials, influencer social media posts, and product packaging directed to children under age 13 were found to propagate negative stereotypes regarding girls’ personal appearance and beauty standards as well as to portray outdated and harmful racial and ethnic stereotypes. 

CARU’s Guidelines urge advertisers to remember the responsibility they have in the child-directed marketplace and to recognize the importance of diversity and inclusion in both ad creative and product development.

 

Warning: the Metaverse and AI

At around the same time, CARU had another technological advancement to address: the metaverse. 

In August 2022, CARU issued a compliance warning regarding advertising practices directed to children in the metaverse, putting advertisers, brands, influencers and endorsers, developers, and others on notice that CARU’s Advertising Guidelines apply to the metaverse just as they have applied to every platform and online environment before it. 

Following the warning, knowing that brands would need guidance on how to properly apply CARU’s Guidelines to this new world, CARU and CARU Supporters got to work to develop clear guardrails for child-directed advertising and privacy in the metaverse. Those were released a year later, in October 2023, providing companies with realistic, actionable recommendations and best practices to responsibly develop metaverse experiences directed to children and comply with existing advertising and privacy law. 

Over the last year, technological advancement has not slowed down. This year, all eyes are on AI. In May 2024, as a reminder to brands that CARU’s Guidelines apply to any platform or new technology, CARU issued a compliance warning for the use of AI in child-directed advertising and data collection. 

CARU knows that most companies marketing to children want to do the right thing – they just need direction. So, as with the Metaverse Guardrails, CARU and CARU Supporters will once again come together to develop guardrails for the use of AI in child-directed products, services, and campaigns. 

Interested in joining those brands working to strengthen responsible child-directed marketing and privacy practices in the marketplace? Become a CARU Supporter.

To be the first to know when the new guardrails are available, subscribe to the Children’s Corner monthly newsletter or follow us on social media (@CARUAdReview / LinkedIn).


Interested in diving deeper into CARU’s cases? Read summaries of all CARU cases in the Case Decision library. To access the full case decisions for all cases, including CARU’s earliest cases, subscribe to the Online Archive.
