BBB National Programs’ Children’s Advertising Review Unit (CARU) has released new Guardrails for Child-Directed Advertising and Privacy in the Metaverse. As explained in a BBB press release, the Guardrails are intended to provide companies with best practices as they navigate the complexities of engaging with children in metaverse experiences. The Guardrails offer “actionable recommendations” on developing metaverse experiences directed to children, complying with existing advertising and privacy law, and engaging responsibly with children online. These guidelines build on earlier CARU guidance regarding metaverse activities.
As explained in greater depth below, the key takeaways are:
- Be transparent. All advertising should be easily recognizable as advertising.
- Understand when content becomes advertising (and what is not advertising).
- Know when, how, and how often to properly disclose advertising, including influencer content and endorsements.
- Protect children’s privacy.
- Become familiar with the requirements of the Children’s Online Privacy Protection Act.
- Understand platforms’ privacy practices.
- Be aware of relevant state-level and international privacy regulations.
Turning first to advertising, the Guardrails urge companies to be transparent in their materials, noting that, given children's age, maturity, and developmental stage, companies may need to use more obvious disclosures, such as stating "this is an ad," to help kids distinguish advertising from editorial content. Notably, the Guardrails state that companies should conduct their own diligence to determine whether they comply with truth-in-advertising and transparency laws, beginning with determining when content becomes advertising. Companies should consider, among other things, (1) whether the content functions as an endorsement of a specific product or brand, (2) whether there is a call to action or sales messaging, and (3) whether the content is sponsored. The line between advertising and non-advertising content is thin and sometimes blurry, and the Guardrails call for evaluating content through the eyes of a child to ensure compliance. Companies should also consider whether the advertising is safe and appropriate for children, an inquiry that stretches beyond form to subject matter, legality, and the emotional response the advertising may evoke in children.
Further, companies should follow best practices for disclosures, paying special attention to disclosures in social media, in ads specifically meant to promote products, and for branded products. Companies should err on the side of disclosure, keep the age of their target audience in mind, use clear, simple language, and clearly separate advertising from non-advertising content.
Finally, the Guardrails address children's privacy under the Children's Online Privacy Protection Act (COPPA). COPPA offers guidance for companies to determine whether their material is directed at children or instead reaches a "mixed audience." Companies should determine which category they fall into and then apply the corresponding guidelines. The FTC provides a set of factors for determining whether a site or metaverse platform is child-directed and, therefore, subject to more stringent requirements for data retention and processing.
Hunton’s team has extensive experience helping our clients design and deploy child-directed advertising and child-privacy compliant experiences in the metaverse and beyond.