Fifty Shades of Consumer Health Data: How a Risk-Based Approach Provides More Clarity

Jun 14, 2023 by Divya Sridhar, Ph.D., Director, Privacy Initiatives, BBB National Programs

As previously discussed, the FTC’s recent efforts to build a record of the problematic activities happening in the overlapping digital consumer health and ad tech spaces are a laudable plan of action, though one likely to generate both positive and negative unintended consequences in the near term.

On one hand, we know that industry will need to act in an agile manner, reviewing the themes from the enforcement actions and the Washington My Health, My Data Act to determine the stronger protections it will need to put in place regarding the treatment, processing, sharing, and sale of consumer health data. On the other hand, in the absence of guidance and clarity about where the lines are drawn, industry faces the challenge of reading between the lines and relying on common sense to determine a path forward.

Rather than stand idly by, letting the regulatory activity set off alarm bells – leading companies to bar data processing in certain states, create distinct consumer experiences state by state, or halt marketing, research, development, and product deployment in the consumer health space – industry can look to third-party accountability agents such as BBB National Programs for privacy compliance expertise, clarity about potential priorities and next steps, and governance support as it works to reconcile all of the regulatory activity in the fast-changing consumer health privacy environment.


Is there a way to categorize the risk of consumer health data?

We have pulled together a brief list of routine examples of consumer health information that, at face value, may carry one level of risk. But depending on the context and risk associated with the use of that data, whether it is combined with other data sources and data elements, and whether it is made available in the public domain, that same information could be subject to differing levels of regulation and enforcement activity.

In addition, we have categorized some “prohibited” use cases of sensitive consumer health data. These reflect the findings of recent FTC enforcement actions and the guidelines developed by the industry thought leaders behind the digital advertising self-regulatory principles, including their perspective on the intersection of consumer health and advertising.

Categorizing Sensitive Consumer Health Data

Prohibitions on Sensitive Consumer Health Data

The data in these diagrams should be further reviewed for its risk profile, whether appropriate consumer consent has been gathered, and any actual harm it brings to the consumer, based on a variety of factors, including:

  1. The risk of the data’s use case;
  2. The primary purpose for which the data is collected and the use of the data for one or more secondary purposes (and appropriate consents obtained, particularly in the realm of third-party advertising or tracking);
  3. Whether the data is combined with other data elements in a “higher risk” category; and
  4. The context in which the data is used, including use in a sensitive health-related location, such as a reproductive healthcare clinic.


Other factors, such as how the data is retained and securely stored, also matter and would need to be taken into consideration when determining the level of sensitivity of the data.
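To make the interplay of these factors concrete, here is a minimal, purely illustrative sketch in Python. The field names, weights, and tier thresholds are all hypothetical assumptions invented for this example – neither the article nor any regulator prescribes a scoring formula – but the sketch shows how the same data element can land in very different risk tiers depending on secondary use, data combination, and context:

```python
from dataclasses import dataclass

@dataclass
class HealthDataContext:
    """One consumer health data use; fields mirror the factors above (illustrative only)."""
    base_use_risk: int                    # 0 = low, 1 = moderate, 2 = high-risk use case
    secondary_use_without_consent: bool   # reused beyond its primary purpose, no consent
    combined_with_higher_risk_data: bool  # linked with data elements in a "higher risk" category
    sensitive_location: bool              # e.g., tied to a reproductive healthcare clinic visit

def risk_tier(ctx: HealthDataContext) -> str:
    """Roll the factors into a coarse tier; the weights are arbitrary placeholders."""
    score = ctx.base_use_risk
    score += 2 if ctx.secondary_use_without_consent else 0
    score += 1 if ctx.combined_with_higher_risk_data else 0
    score += 2 if ctx.sensitive_location else 0
    if score >= 4:
        return "prohibited/high"
    if score >= 2:
        return "elevated"
    return "routine"

# A step count shown only to the user stays routine; the same reading resold to
# advertisers and tied to a clinic visit does not.
print(risk_tier(HealthDataContext(0, False, False, False)))  # routine
print(risk_tier(HealthDataContext(1, True, True, True)))     # prohibited/high
```

The point of the sketch is the shape of the analysis, not the numbers: context and combination, not the data element alone, drive the risk determination.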

The bottom line is that, all things being equal, not all consumer health data should be viewed the same – there are clearly differing levels of risk associated with general purpose data use, its primary vs. secondary purpose, the risk profile or harms it generates, and more.


How can a risk-based framework specific to consumer health data support regulators?

Enforcement agencies have limited capacity to enforce the wide range of requirements now manifesting in the patchwork of state consumer privacy laws.

In the context of consumer health, companies need guidance that explains and refines best practices for sensitive consumer health data, incorporating examples of where these data elements would likely fall into scope and contexts in which, even when the data is in scope, it presents such a low risk profile that closely enforcing the preliminary expectations set out in the Washington My Health, My Data Act or similar laws would be unnecessary.

To supplement that guidance, industry accountability programs that review, reinforce, and verify these best practices can provide an early review and a potential backstop for enforcement.

The idea is not for the United States to follow in the footsteps of the European Union and other countries, but instead to enshrine a reasonable consumer standard in privacy law and demonstrate that accountability matters. More robust data protection can be achieved by identifying the data’s risk profile, coupled with the actual harms imposed, rather than hypothetical expectations of what could potentially cause or predict harm.

Concurrently, industry should obtain the appropriate levels of consent, fulfill its duty of care toward all consumer health data processed, and take reasonable steps – including incorporating impact assessments and risk mitigation techniques – to ensure the highest standard of protections for all consumer health data.

In the age of wearables, wellness apps, generative AI, the metaverse, and other quickly changing “moving targets” of the emerging tech world and its enthusiastic consumers, we must seek to build reasonable frameworks around data use based on its risk profile rather than focusing on a blanket approach that will chill innovation and limit new markets in research and development. 

Otherwise, we limit the potential to meet admirable goals, whether crowdsourcing a cure for cancer, making consumer health applications and services accessible to special needs populations, or pursuing other innovations now baked into the 247-year-old pursuit of the American dream.

On a similar note, read Consumer health data: A risk-based approach to digital privacy.
