A Not-So-Sweet Sixteen? Teen Online Privacy and Safety Faces New Policy Dilemmas
Aug 17, 2023 by BBB National Programs Privacy Initiatives Team
Pop culture powerhouse Barbie teaches us that corporations can have a long-lasting impact on children and teens, and the Federal Trade Commission (FTC) seems to agree, taking an aggressive stance on children's and teens' privacy in recent months.
For instance, the FTC’s recent complaint and proposed order against Amazon over its Echo Dot product line alleges that Amazon violated the Children’s Online Privacy Protection Act (COPPA) by failing to honor deletion requests and data minimization requirements for children’s voice and geolocation data. The FTC has also been actively enforcing its children’s privacy authorities in the gaming industry, as seen in the recent complaint against Microsoft over practices related to Xbox Live services, which echoed many of the alleged violations in last year’s $275 million Epic Games settlement.
While FTC enforcement under COPPA and Section 5 of the FTC Act has dominated recent reporting on children’s online privacy and safety efforts, the recent 9th Circuit decision in Jones v. Google demonstrates that, under US law, states and private litigants may also play a significant role in shaping how teens and children are protected from online harms.
In Jones, a lawsuit brought on behalf of a class of children alleged that Google engaged in conduct that violates various state tort and consumer protection laws and that substantially overlaps with acts prohibited under COPPA. Google argued that COPPA preempts these state-law claims because COPPA contains no private right of action (a mechanism that would allow private litigants, rather than the government, to enforce the law’s prohibitions).
While COPPA does contain an explicit preemption provision, a 9th Circuit panel ruled that COPPA does not bar states from enacting laws that supplement the protections and remedies COPPA provides; the statute merely prohibits states from adopting laws that are inconsistent with Congress’s objective of protecting children. As such, 9th Circuit precedent now supports the notion that states may build legislation and enforcement regimes on top of COPPA’s existing, exacting requirements.
This decision (which Google is asking the full 9th Circuit to rehear en banc) could not have come at a more uncertain time for children’s privacy, as a handful of state legislatures have already taken the plunge and adopted their own stringent requirements for designing online services for minors. So, now what? Below, we provide an overview of the evolving teen privacy policy landscape, followed by a set of clear areas of focus for companies seeking to comply.
The Changing State Teen Privacy Landscape
The main categories of actions taken at the state level to combat online harms to minors include:
- Comprehensive state consumer privacy laws that include specific provisions on rights and protections for minors.
- Online trust and safety laws narrowly focused on regulating social media platforms.
- Laws with implications for content moderation.
Comprehensive Consumer Privacy Laws
In May, the Florida Legislature passed the Florida Digital Bill of Rights, a comprehensive consumer privacy law. While the bill has several unique elements unseen in previous consumer privacy legislation, its provisions on protecting children from online harm have been compared to last year’s California Age-Appropriate Design Code Act (AADC).
Section 2 of the Florida statute establishes prohibitions specific to the processing of personal information from children—defined as users under the age of 18. Where a company knows that children use its platform, it is prohibited from processing children’s personal information in ways that result in substantial harm or privacy risk.
Unlike the California AADC, the Florida Digital Bill of Rights does not create a general duty for companies to act in the best interest of children. Instead, Florida’s law defines explicit and reasonably foreseeable categories of harms and privacy risks to minors, including mental health harms, addictive behaviors, violence and harassment, sexual exploitation, and deceptive or unlawful marketing. Like the AADC, however, Florida’s Digital Bill of Rights takes effect in July 2024.
Meanwhile, in Connecticut, recent amendments under S.B. 3, An Act Concerning Online Privacy, Data and Safety Protections, require consent from a minor (or from a parent, if the minor is under 13) before the minor’s personal data can be processed. Companies must establish safeguards and tools to protect minors from certain communications and to enable them to exercise their data subject rights. Further restrictions apply to targeted advertising, profiling, the sale of personal data, and automated decision-making.
Social Media Trust & Safety Laws
Some states have taken a narrower focus on minors’ protections, adopting specialized legislation regarding the use and design of social media platforms.
In May, Montana Governor Greg Gianforte signed S.B. 419 into law, banning the download and use of the blockbuster social media app TikTok on all commercial and personal devices within the state. This goes a notable step beyond previous federal proposals to ban TikTok on all government employee devices and similar executive orders issued by governors in a handful of other states.
Immediately after adopting a comprehensive privacy law in June, the Texas legislature passed the Securing Children Online through Parental Empowerment Act (SCOPE Act). This law seeks to strengthen parental autonomy over minors’ data and assign stringent legal duties to social media companies to prevent practices that could harm known minors. For example, the SCOPE Act provides parents the ability to submit specific requests regarding their minor’s data, including access, download, and deletion requests.
Similarly, the Utah Social Media Regulation Amendments passed in May require parents to consent to the processing of data of minors under 18 and to have access to their minor’s account, including information about it. Arkansas’ Social Media Safety Act closely mirrors Utah’s legislation, prioritizing express parental consent, age verification, and data minimization for any personal information used to verify age.
Laws with Content Moderation Consequences
Both California and Texas have specialized data privacy laws with First Amendment implications. The California AADC, which is already facing legal challenges, is an example of a children’s tech law undergoing First Amendment scrutiny over potential content moderation. That scrutiny also extends to the effect the AADC’s impact assessment requirement (one not found in federal bills) might have on companies’ First Amendment rights. Texas’ SCOPE Act, meanwhile, requires companies to create a strategy to prevent exposing minors to harmful and inappropriate content, a mandate that may itself draw artificial boundaries around speech.
It remains to be seen how government regulation of individual social media use will challenge our norms, practices, and plans. Yet the examples above will set important precedents in the United States around tech accountability, regulation, and governance.
How to Prepare for the Patchwork
BBB National Programs is among a handful of organizations studying minors’ privacy protections, releasing the TeenAge Privacy Program (TAPP) Roadmap to help companies responsibly collect and manage teen data and minimize the likelihood of risks to minors. Even as the growing state patchwork of kids’ and teens’ privacy laws continues to evolve, the Roadmap evolves with it, offering companies operational considerations for their privacy practices to streamline compliance efforts and stay ahead of the curve.
Here are the principal areas where new teen privacy laws are affecting companies’ day-to-day privacy practices (a rough sketch of how a compliance team might encode these state-by-state differences follows the list):
- Parental Consent: Some states (e.g., TX) require parental consent for minors under 18 to sign up for specific types of social media services, while other states (e.g., CT) require parental consent prior to processing data from users between the ages of 13 and 15. Parental consent generally must be explicit and provided by a verified parent before a child may use a social platform.
- Geolocation Data: Collection standards vary, but most states are specific about the types of geolocation data that are acceptable, and precise geolocation data is limited in some way under each state law. Some states (e.g., CT) allow limited geolocation data collection only if such data is needed to provide the digital product or service and only for the period that the respective product or service is being used, while other states (e.g., TX) do not permit geolocation data collection of minors at all.
- Targeted Advertisements: Some laws propose outright prohibitions against targeting advertisements to children and minors under 18 (e.g., UT) while others (e.g., TX) allow targeted advertising if it is needed to provide the online service so long as the digital platform obtains parental consent first. Florida has no restrictions on targeted advertising, but it does require that such practices be clearly stated in a privacy notice, if applicable.
- Knowledge Standards: Most laws (e.g., FL and CT) use an actual knowledge standard where the digital provider is aware that minors use its services. Other laws (e.g., CA) implement a constructive knowledge standard, which requires the provider to err on the side of caution if the service is “likely” to be used by a minor.
- Data Minimization: Most laws mention data minimization practices but vary in language describing standards. Some require companies to only collect personal information that is “reasonably necessary,” while others require a “compelling reason” to collect the data.
- Impact Assessments: While some states have required impact assessments in their comprehensive data privacy bills (e.g., CT and FL), California’s AADC stands alone in requiring an impact assessment for any child or minor-targeted online products or services.
- Age Verification: Some states require companies to implement methods to estimate age, some (e.g., TX) require that a parent or guardian register the minor’s age with the digital platform, and others (e.g., AR) require third-party age verification methods.
- Reporting Features: Some states (e.g., CA and TX) require companies to implement tools or strategies to address potential harms to children, while others (e.g., UT) go so far as to provide a private right of action for children who have been harmed on a social media platform.
- Parental Controls: Texas and Utah have the strictest requirements regarding parental controls. Other states do not generally require parents to have access to minors’ accounts.
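To make the variation above concrete, here is a minimal, illustrative sketch of how a privacy or engineering team might track a few of these state-by-state differences as structured data. The rule values are simplified paraphrases of the summaries in this article, not statements of current law, and all names (the StateTeenRules class, the RULES table, needs_parental_consent) are hypothetical; any real implementation would need to be built and maintained with counsel.

```python
# Illustrative sketch only: a simplified encoding of a few state-level teen
# privacy rules as summarized in this article. Statutes change and contain
# nuances this sketch omits; it is not legal advice.
from __future__ import annotations
from dataclasses import dataclass

@dataclass(frozen=True)
class StateTeenRules:
    # Age below which (per this simplified reading) parental consent is
    # required; None means no such rule was summarized for that state.
    parental_consent_under: int | None
    # Whether minors' geolocation data may be collected at all.
    geolocation_allowed: bool
    # Whether minors may be shown targeted advertising.
    targeted_ads_allowed: bool

# Paraphrased from the bullet points above (hypothetical simplification).
RULES: dict[str, StateTeenRules] = {
    "TX": StateTeenRules(parental_consent_under=18,
                         geolocation_allowed=False,
                         targeted_ads_allowed=True),   # only with parental consent
    "CT": StateTeenRules(parental_consent_under=16,    # consent needed for ages 13-15
                         geolocation_allowed=True,     # only as needed, while in use
                         targeted_ads_allowed=False),
    "UT": StateTeenRules(parental_consent_under=18,
                         geolocation_allowed=True,
                         targeted_ads_allowed=False),
}

def needs_parental_consent(state: str, age: int) -> bool:
    """Return True if the simplified rules for `state` require parental consent."""
    rules = RULES.get(state)
    return bool(rules and rules.parental_consent_under is not None
                and age < rules.parental_consent_under)

if __name__ == "__main__":
    print(needs_parental_consent("TX", 17))  # True under this sketch
    print(needs_parental_consent("CT", 16))  # False under this sketch
```

A production version would also need to track effective dates, knowledge standards (actual vs. constructive), age verification duties, and impact assessment obligations, which is precisely the operational complexity the TAPP Roadmap is designed to help companies navigate.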
At BBB National Programs, we will continue to closely monitor legislation affecting teen privacy. We are also actively updating our TAPP Roadmap as the teen online privacy and safety landscape continues to evolve. We do this work so that businesses will have a trusted guide on how to design safe and meaningful digital experiences for teenage users. If you have questions about our TAPP Roadmap or what these latest changes in state policy might mean for your business, contact us at TAPP@bbbnp.org.
Article Contributors:
Dona Fraser, Senior Vice President, Privacy Initiatives, BBB National Programs
Dr. Divya Sridhar, Director of Privacy Initiatives, BBB National Programs
Miles Light, Counsel, Privacy Technology, BBB National Programs
KimberMarie Faircloth, Legal Intern, Privacy Initiatives, BBB National Programs