Fifty Shades of Consumer Health Data: Unclear Expectations for Digital Privacy

May 30, 2023 by Divya Sridhar, Ph.D., Director, Privacy Initiatives, BBB National Programs

While momentum continues to build around what a regulated consumer health privacy landscape looks like, the environment remains shrouded in shades of gray. And, perhaps, gray is what U.S. laboratories of democracy are made of – especially in recent years, as the Federal Trade Commission (FTC) has left the door open on its interpretation of what it means for companies to process “sensitive” consumer health data without being caught in the crosshairs of unfair and deceptive practice enforcement. 

And since the FTC has shared its intention to extend the application of the Health Breach Notification Rule to health app developers, we expect that net to widen.

Even with the passage of momentous consumer health privacy legislation in Washington state this year, industry has had to fill policy and regulatory gaps with commitments to best practices for treating sensitive data. Industry has also had to watch closely and pick up on regulatory themes across consumer health, advertising, and tech policy, including in the Premom, GoodRx, BetterHelp, and Flo decisions (just to name a few).  

Meanwhile, California and Colorado are poised to begin enforcement of their new regulations, which go far beyond what is in statute regarding the treatment of sensitive data, inferences, and health data as a subcategory. 

Some of these new state consumer privacy laws could further bring into scope provider websites and apps, as well as traditional entities covered by the Health Insurance Portability and Accountability Act (HIPAA). HIPAA-covered entities may believe they are compliant even while they are processing, sharing, or selling consumer health data through their patient intake forms, marketing sites, and general messaging portals – activities that may now fall into scope or draw further scrutiny.

Yet, controversy over interpretation remains between state and federal regulators and the courts. Just this month, an Idaho federal judge hearing the FTC’s lawsuit against Kochava dismissed the complaint, finding that the company’s use of geolocation data did not reflect “actual harm” – leaving the industry further confused about the implications for regulating sensitive data inferences, especially in the aftermath of states such as Colorado passing groundbreaking regulations that seek to define and regulate “sensitive data inferences” as their own category of data.


Washington’s Approach to Consumer Health Data

Washington has paved the way as the first state to create guardrails around consumer health data – likely soon to be followed by other copycat laws, including the quickly moving Connecticut Senate Bill 3, that could fill the void left by the landmark Dobbs decision overturning Roe v. Wade. 

What makes the recent legislative and regulatory framework particularly challenging is that companies and consumers alike need to read between the lines to understand what these lawmakers mean when it comes to upholding robust protections for “consumer health information,” a blanket term that has come to encompass a wide array of health-specific data and health-related data, including social determinants of health, geolocation, and other interwoven data elements. 

For example, consider publishers of health content and health websites offering resources: consumers may be merely conducting a “search” about their health or wellness activities. But Washington’s law would bring such publishers into scope, effectively placing the same expectations on them as the law would place on a company processing genetic or biometric data used to identify a specific individual.

And, even within subcategories of data, there are further fractures. Biometric data is defined so broadly in Washington’s health law that it would capture a variety of applications: 

  1. Biometric data identifying a specific individual – such as iPhone biometrics used for identification, and 
  2. More general AR/VR and health and wellness applications, which use biometric data to match facial geometry to someone’s physical characteristics to pair consumers with the right product (for example, virtual sunglasses or makeup applications – though many such applications don’t store or retain the biometric data beyond the point-in-time interaction).


Per Washington’s law, all “products or services that are targeted to consumers in Washington” fall under the same scope, without a definition of what “targeted” to consumers means. Entities regulated by the new law are not scoped by revenue or data processing thresholds; instead, they are scoped by the data collected and processed about the consumer, if that data is processed in the state of Washington.


Would a risk-based approach to health data help alleviate confusion? 

To date, a risk-based approach to consumer health data does not (to our knowledge) exist. The ten state consumer privacy laws each use their own baseline definitions of sensitive data, which in some cases parse health data separately or include health data within the definition of “sensitive data” and/or “sensitive data inferences.” Given so many differing definitions, one common risk-based framework would help with interpretation.

We are seeing similar models that may provide useful benchmarks and more reasonable indices of risk in the AI space. For example, the National Institute of Standards and Technology is building the AI Risk Management Framework to support more trustworthy practices in AI development, design, and deployment, an interesting model that the consumer health space could use to further parse the sensitivities across different types of consumer health data.

On the other hand, when we look to HIPAA, data is uniformly treated as part of a single confidential category of “protected” health information, with all communications strictly reserved for the provider-patient relationship (and any respective business associates). Other jurisdictions have taken a blanket opt-in approach to regulating data – most notably the GDPR – but, as we know, research demonstrates that approach’s clear shortcomings.

Academics have looked at the general consequences of regulating data and have shared their perspectives. Professor Dan Solove argues in his recent piece “Data Is What Data Does: Regulating Use, Harm, and Risk, Instead of Sensitive Data” that we should consider the risk and use of the data, rather than trying to regulate the data itself. This would ensure we are future-proofing our approach, mitigating harms raised by the data in real time, and scoping risks so that they are manageable – rather than creating an unreasonable expectation to regulate all risk, particularly given the burden on regulators.

As of now, we believe a sliding scale for the risks carried by consumer health data – at least at face value – should exist, to avoid chilling innovation and imposing unreasonable compliance expectations.
