Teens Online: Responsible Data Considerations For Business Leaders
Aug 25, 2022 by Eric D. Reicin, President & CEO, BBB National Programs
As we spend more time online, anyone with teenagers in their lives knows just how firmly they are planted in the digital world. Eszter Hargittai, sociologist and professor, chose to describe this demographic as “digital natives.” Technology and social media scholar Danah Boyd adds that “Familiarity with the latest gadgets or services is often less important than possessing the critical knowledge to engage productively with networked situations, including the ability to control how personal information flows and how to look for and interpret accessible information.”
But just because teenagers grew up with technology as “digital natives” does not mean they are all digitally literate and truly aware of the risks they face in digital spaces.
Experts agree that teens are more risk-seeking than younger children but less competent than adults at managing online risks. Though the technologies teens adopt offer real benefits, there are justifiably heightened concerns around teen privacy that are not yet being fully addressed.
In the current landscape of privacy protections, there are unique potential harms specific to teens that businesses looking to engage responsibly with this age group should consider.
Currently, the only federal privacy law intended to protect a certain age group is the Children’s Online Privacy Protection Act (COPPA), which is designed to protect children under age 13. Though several pending pieces of proposed legislation (e.g., the American Data Privacy and Protection Act, the Children and Teens’ Online Privacy Protection Act, and the Kids Online Safety Act) attempt to protect teens’ privacy online, it is unclear whether any will be enacted into law during this session of Congress.
On the state level, the California Senate Judiciary Committee has been discussing the California Age Appropriate Design Code Bill (AADC), which requires companies to “prioritize the privacy, safety, and well-being of children over commercial interests,” as well as the Social Media Platform Duty to Children Act, which would empower California to take “reasonable, proportional, and effective steps to ensure that its children are not harmed by addictions of any kind.” As with the federal bills, it is unclear as of this writing whether either will be enacted into law this year.
What is clear is that California legislators modeled the AADC after the U.K.’s Age Appropriate Design Code, which went into effect last fall and has made an initial impact in the U.K. For example, "Google made SafeSearch its default browser for minors, YouTube turned off autoplay for those under 18, and TikTok and Instagram disabled direct messages between children and adults they do not follow." (Full disclosure: Google is one of our National Partners.)
Potential Harms Unique To Older Teens
In addition to harms outlined in my recent article on the metaverse, such as the overcollection of data, age-inappropriate content, and cyberbullying, thoughtful business leaders engaging with teens in the digital space also should consider the following:
- The creation of a digital footprint outside of a teen’s control or awareness: According to a report from Youth Tech Health, “Teens tend to first disclose and then evaluate consequences. The way teens learn how to manage privacy risk online is often very different from how adults approach privacy management. The process is more experiential in nature for teens.” Given teens’ propensity for doing now and evaluating later, they can be more vulnerable to creating a digital footprint outside of their own control.
- Amplification of interests or insecurities in a way that intensifies harmful thoughts or behaviors: Teens can be more susceptible to the impact of instant validation. As Dr. Ana M. Hernandez-Puga put it, “They can watch social media accounts nearly 24/7 to check how many likes appear on a post or photo. If those likes and shares fail to materialize, they are often left feeling unloved, imperfect, and insecure.”
- Inadequate information provided for teen comprehension/safe use of the product: Given all the unique concerns and vulnerabilities of teens, one would think that companies would specifically design their products with these users in mind. However, the authors of a recent report found the opposite: “Companies focus largely on their imagined ‘average’ user, which tends to be younger (but still adult), white, and male.” This focus often leads to the introduction of features that are addictive and prey on teens’ insecurities and desire for social approval.
Establishing Responsible Data Ethics
So, what do responsible data ethics for teens look like? To address that question, here are four guiding principles for business leaders to consider.
- Build awareness of data privacy among teens. I believe businesses have a responsibility to help educate teen consumers about privacy risks and how to manage their privacy choices. To do this, consider disclosing (in terms that teens can easily understand) what types of personal data the business will collect and how the teen user can control their personal information. When appropriate, businesses can also provide parents with educational resources to support their teens’ privacy awareness.
- Responsibly process data from teens. Even if your product or service is not targeted directly to teens, if there is any potential for teenage consumers, consider proactively reviewing your privacy practices through the lens of a teen and their unique needs. What do your default settings look like? How are you using sensitive personal information? How accessible are your privacy choices? Whenever possible, require teens to provide affirmative opt-in consent to the collection of personal information.
- Build guardrails for teen interactions. When a system facilitates the sharing of information among people, trust and safety should be considered with special regard for the unique needs of teens.
- Focus on appropriate content for teens. With any system that pushes out content, especially content that is tailored based on the interests and behaviors of an individual, teens’ unique risks and harms should be considered. For example, consider avoiding targeted content to teens using a single factor that might amplify existing insecurities.
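To make the default-settings and opt-in guidance above concrete, here is a minimal illustrative sketch of how a product team might encode age-appropriate privacy defaults. All names here (the `PrivacyDefaults` fields and the `defaults_for_age` helper) are hypothetical examples, not an actual standard or any specific platform’s API; real age thresholds and settings would depend on the product and applicable law.

```python
from dataclasses import dataclass

@dataclass
class PrivacyDefaults:
    """Hypothetical bundle of per-user default privacy settings."""
    profile_public: bool                    # is the profile visible to everyone by default?
    autoplay_enabled: bool                  # does recommended content autoplay?
    dms_from_non_followers: bool            # can strangers send direct messages?
    personalization_opt_in_required: bool   # must the user affirmatively opt in to
                                            # data-driven personalization?

def defaults_for_age(age: int) -> PrivacyDefaults:
    """Pick the most protective defaults for minors: private-by-default
    profiles, no autoplay, restricted direct messages, and affirmative
    opt-in consent before personal data is used for personalization."""
    if age < 18:
        return PrivacyDefaults(
            profile_public=False,
            autoplay_enabled=False,
            dms_from_non_followers=False,
            personalization_opt_in_required=True,
        )
    return PrivacyDefaults(
        profile_public=True,
        autoplay_enabled=True,
        dms_from_non_followers=True,
        personalization_opt_in_required=False,
    )
```

The design choice worth noting is that the protective settings are the *defaults* for minors, so a teen who never touches a settings screen still gets the safer configuration, consistent with the "private by default" approach the U.K. code has pushed platforms toward.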
Business leaders should not ignore the risks their products might pose to teens and should proactively address these risks and design their products with their most vulnerable potential users in mind.