Know the Rules: How to Be Age Appropriate in the Metaverse

Aug 23, 2022 by Rashida Gordon, Attorney, Children’s Advertising Review Unit, BBB National Programs

The metaverse is often described as the next digital frontier. On any given day, you can find articles offering opinions on what the metaverse will be and what it will look like; some argue that it is already here.

In a recent Forbes article, Eric Reicin, President and CEO, BBB National Programs, cited a McKinsey report claiming that spending on metaverse advertising is on track to total between $144 billion and $206 billion by 2030.

According to David Kleeman, Senior Vice President of Global Trends at Dubit, “there’s no actual metaverse yet.” At the same time, Kleeman notes, “there are several platforms we might call ‘proto-metaverses’ – Roblox, Fortnite, Minecraft, and Meta’s Horizon Worlds, for example. But they’re each standalone, unattached to the others, and the metaverse will be distributed and connected.” 

Whether it is already here or arriving soon, the ‘proto-metaverse’ or the metaverse of the future, an open experience where users can jump between worlds with a consistent identity, will undoubtedly have a profound impact on the digital footprint and online experiences of children. And the presence of children in the metaverse means there are several considerations that advertisers, operators of online services, and developers should bear in mind as they begin to navigate this space and the risks it presents to children.  

So where do you start?

Since 1974, from TV advertising to today’s online environment, the Children’s Advertising Review Unit (CARU) Self-Regulatory Guidelines have provided guidance to advertisers, helping them to follow industry best practices. As technology has shifted, the Guidelines have been revised to also help operators and developers focus on the new spaces where children are active in today’s digital world, including the metaverse.

CARU’s Guidelines for child-directed advertising are designed to protect children from deceptive or inappropriate advertising in any media. CARU’s Privacy Guidelines are also designed to ensure that, in an online environment, children’s data is handled responsibly. 

The foundation of the Guidelines is that children have limited knowledge, experience, sophistication, and maturity and therefore that advertisers and operators in online spaces, including in the metaverse, have special responsibilities to ensure they are engaging with children appropriately. This includes determining if their advertising or online service is directed to children, meaning either that it is directed primarily to kids under age 13 or it targets a “mixed audience” of both kids under age 13 and older teens or adults. If it is, advertisers and operators must understand the additional responsibilities and legal requirements that come with advertising to children or collecting and handling their personal data.  

Here are a few things to keep in mind:

 

Make it your business to know your audience. 

Even if you did not intend to target children under 13, it may become apparent that children are engaging with your online service or advertising. In either case, you should take care to comply with CARU’s Advertising and Privacy Guidelines and the Children’s Online Privacy Protection Act (COPPA) to ensure your practices align with important truth-in-advertising principles and privacy rules. 
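One way a mixed-audience service can learn who it is actually serving is a neutral age screen: ask for a date of birth without hinting at a “right” answer, then route users who identify as under 13 into a restricted experience until parental consent is obtained. The sketch below is purely illustrative; the helper functions (enableRestrictedMode, requestParentalConsent, enableFullExperience) are hypothetical placeholders, not part of CARU’s Guidelines, COPPA, or any real SDK.

```typescript
// Minimal, illustrative sketch of a neutral age screen for a mixed-audience service.
// All helper functions are hypothetical placeholders a real service would implement.

interface AgeScreenResult {
  birthYear: number;
  birthMonth: number; // 1-12, as entered on a neutral date-of-birth prompt
}

async function enableRestrictedMode(): Promise<void> {
  console.log("Child experience: no personal data collection, no targeted ads.");
}
async function requestParentalConsent(): Promise<void> {
  console.log("Starting a verifiable parental consent flow.");
}
async function enableFullExperience(): Promise<void> {
  console.log("General-audience experience enabled.");
}

function ageFrom(result: AgeScreenResult, today: Date = new Date()): number {
  const years = today.getFullYear() - result.birthYear;
  // Subtract one year if the birthday month has not yet arrived this year.
  return today.getMonth() + 1 < result.birthMonth ? years - 1 : years;
}

async function handleAgeScreen(result: AgeScreenResult): Promise<void> {
  if (ageFrom(result) < 13) {
    // Treat the user as a child until a parent's consent is verified.
    await enableRestrictedMode();
    await requestParentalConsent();
  } else {
    await enableFullExperience();
  }
}
```

The point of the design is simply that the age prompt collects no more than it needs and does not nudge children toward an answer that unlocks the full experience.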

 

Identifying advertising becomes even more complex in the metaverse.

Basic truth-in-advertising principles require that advertising be identifiable as advertising. Yet in today’s complex digital world, advertising is often presented in a way that makes it difficult for users, especially children, to distinguish advertising from non-advertising content, a problem commonly referred to as “blurring.” Brand-sponsored worlds and experiences are common in virtual reality spaces and the metaverse. Often, these experiences are designed as advergames or are blended so seamlessly into organic content that children cannot tell they are advertising. 

CARU’s Advertising Guidelines point to best practices that can help avoid confusion. They state that to prevent blurring of advertising and content, advertisers should use design techniques such as text size and color, positioning, and other visual or contextual clues such as borders or background shading to clearly delineate advertising from non-advertising content. In addition, where disclosures are needed to identify an advertising message or to distinguish advertising from non-advertising content, the Guidelines require those disclosures to be clear and conspicuous, considering not only the advertising format and media used but also children’s limited vocabulary and level of cognitive and language skills. 

In the metaverse, advertisers and operators should anticipate and stay aware of how the child audience will interact with the metaverse experience, including how, when, and where ads will be shown to them and/or how influencers will engage in the space. This process will help you to determine such things as where applicable disclosures should go, what clear and conspicuous disclosures should look like, and how frequently they should appear.   
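To make those decisions concrete, one approach is to describe each in-world disclosure declaratively, covering wording, placement, repetition, and minimum size, so it can be reviewed and reused across experiences. The sketch below is only an illustration of that idea; the interface and its field names are assumptions made for this example, not terms from CARU’s Guidelines.

```typescript
// Hypothetical sketch: a declarative description of an in-world ad disclosure.
// The fields and values are illustrative only, not CARU requirements.

interface AdDisclosure {
  text: string; // simple wording a young child can understand
  placement: "start-of-experience" | "adjacent-to-ad" | "persistent-overlay";
  repeatEveryMinutes?: number; // re-show during long sessions, if applicable
  minFontScale: number; // relative size so the notice stays legible
}

const brandedWorldDisclosure: AdDisclosure = {
  text: "This game is an ad for Brand X",
  placement: "start-of-experience",
  repeatEveryMinutes: 10,
  minFontScale: 1.5,
};
```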

 

When collecting data, get familiar with and comply with COPPA.

Metaverse spaces are likely to involve the collection of large amounts of personal information. Any operator collecting, using, sharing, or storing children’s data should be aware of and comply with COPPA’s requirements. If an advertiser or operator is collecting, using, or disclosing personal information from children under 13, COPPA requires that they first obtain verifiable parental consent before such collection, use, or disclosure.
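A common engineering pattern, sketched below purely as an illustration, is to gate every write of a child’s personal information behind a consent check, so nothing is collected, used, or disclosed until a parent’s consent has been verified. The ConsentStore interface and collectPersonalInfo function are hypothetical names, and nothing in this sketch substitutes for the legal standard COPPA actually sets.

```typescript
// Hypothetical sketch: gate collection of a child's personal information behind
// verifiable parental consent. Names here are illustrative, not drawn from COPPA.

type ConsentStatus = "none" | "pending" | "verified";

interface ConsentStore {
  getStatus(childUserId: string): Promise<ConsentStatus>;
}

async function collectPersonalInfo(
  childUserId: string,
  data: Record<string, unknown>,
  consents: ConsentStore
): Promise<boolean> {
  const status = await consents.getStatus(childUserId);
  if (status !== "verified") {
    // No verified parental consent yet: do not collect, use, or disclose.
    return false;
  }
  // Persist only the data actually needed for the activity.
  console.log(`Storing data for ${childUserId}:`, data);
  return true;
}
```

Centralizing the check in one function makes it harder for a new feature to start collecting data before the consent flow has completed.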

And in data privacy, the details matter. COPPA requires operators to provide privacy notices of their collection, use, and disclosure practices with respect to children, written in clear, complete, and easily understood language, and to post the notices in places where parents can easily locate them. Even if privacy practices are operating smoothly on the back end, they must be clearly stated upfront.

 

Advertising to children should be age appropriate. 

If an operator’s site was initially designed for older teens and young adults, but it becomes apparent that a large majority of the audience is now children, say between the ages of 8 and 11, the operator has a responsibility to ensure that the advertising shown is appropriate for that child audience. For example, the operator should determine whether what is being depicted condones or encourages practices that are detrimental to children’s health or wellbeing. 
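One way to operationalize that responsibility, shown here only as a sketch, is to tag each ad creative with the youngest age it has been reviewed as appropriate for and filter what is served once the audience is known to skew younger. The AdCreative shape and minAppropriateAge field are illustrative assumptions, not CARU terminology.

```typescript
// Hypothetical sketch: filter ad creatives for an audience that now skews younger.

interface AdCreative {
  id: string;
  minAppropriateAge: number; // youngest age the creative was reviewed as appropriate for
}

function adsForAudience(creatives: AdCreative[], youngestLikelyViewerAge: number): AdCreative[] {
  return creatives.filter((ad) => ad.minAppropriateAge <= youngestLikelyViewerAge);
}

// Example: the audience now skews 8-11, so only creatives reviewed as
// appropriate for 8-year-olds (or younger) remain eligible.
const eligible = adsForAudience(
  [
    { id: "energy-drink-promo", minAppropriateAge: 16 },
    { id: "building-blocks-promo", minAppropriateAge: 6 },
  ],
  8
);
console.log(eligible.map((ad) => ad.id)); // ["building-blocks-promo"]
```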

 

Know, and understand, the rules of the road. 

As of today, there is no established standard for the type or size of platforms that will make up the metaverse. It is likely that the metaverse will be a mix of both closed platforms and open, shared spaces with sandboxes. This mix of platform types means that when it comes to compliance with applicable rules and regulations, there is not now, nor is there likely ever to be, a one-size-fits-all approach. What we do know is that children will be active in the metaverse, so advertisers and operators need to plan now and begin to tailor their advertising and privacy practices in the metaverse with children in mind. While metaverse spaces are still evolving, CARU’s Self-Regulatory Advertising and Privacy Guidelines are a great place to start to be sure you get it right. 

Is the metaverse already here? Probably. Is the metaverse coming soon? Absolutely. Will the metaverse evolve? Of course, it will. And as the metaverse does evolve, CARU will continue to advise businesses on implementing CARU’s Advertising and Privacy Guidelines and age-appropriate design approaches to ensure children’s safety and wellbeing are considered and protected. CARU also will be monitoring these online spaces and taking action when practices there violate our guidelines. 
