Fifty Shades of Consumer Health Data: Unclear Expectations for Digital Privacy

May 30, 2023 by Divya Sridhar, Ph.D., Director, Privacy Initiatives, BBB National Programs

While momentum continues to build around what a regulated consumer health privacy landscape looks like, the environment remains shrouded in shades of gray. And, perhaps, gray is what U.S. laboratories of democracy are made of. That is especially true in recent years on this topic, where the Federal Trade Commission (FTC) has left the door open on its interpretation of what it means for companies to process “sensitive” consumer health data without being caught in the crosshairs of unfair and deceptive practices. 

And since the FTC has shared its intention to extend the application of the Health Breach Notification Rule to health app developers, we expect that net to widen.

Even with the passage of momentous consumer health privacy legislation in Washington state this year, industry has had to fill policy and regulatory gaps with commitments to best practices for treating sensitive data. Industry has also had to watch closely and pick up on regulatory themes across consumer health, advertising, and tech policy, as reflected in the Premom, GoodRx, BetterHelp, and Flo decisions (just to name a few).  

Meanwhile, California and Colorado are poised to begin enforcement around their new regulations, which far exceed what is in statute about the treatment of sensitive data, inferences, and health data as a subcategory. 

Some of these new state consumer privacy laws could further bring into scope provider websites and apps, as well as traditional entities that are covered by the Health Insurance Portability and Accountability Act (HIPAA). HIPAA-covered entities may believe they comply with HIPAA even while they are processing, sharing, or selling consumer health data through their patient intake forms, marketing sites, and general messaging portals, activities that may now fall into scope or draw further scrutiny.

Yet controversy over interpretation remains among state and federal regulators and the courts. Just this month, an Idaho federal judge hearing the FTC’s lawsuit against Kochava dismissed the agency’s complaint, finding that its allegations about the company’s use of geolocation data did not demonstrate “actual harm.” The ruling leaves the industry further confused about the implications for regulating sensitive data inferences, especially in the aftermath of states such as Colorado passing groundbreaking regulations that seek to define and regulate “sensitive data inferences” as their own category of data.


Washington’s Approach to Consumer Health Data

Washington has paved the way as the first state to create guardrails around consumer health data – likely soon to be followed by other copycat laws, including the quickly moving Connecticut Senate Bill 3, that could fill the void left by the landmark Dobbs decision overturning Roe v. Wade. 

What makes the recent legislative and regulatory framework particularly challenging is that companies and consumers alike need to read between the lines to understand what these lawmakers mean when it comes to upholding robust protections for “consumer health information,” a blanket term that has come to encompass a wide array of health-specific data and health-related data, including social determinants of health, geolocation, and other interwoven data elements. 

For example, consumers visiting publishers of health content or health websites offering resources may merely be conducting a “search” about their health or wellness activities. Yet Washington’s law would bring such publishers into scope, effectively placing the same expectations on these publishers as the law would place on a company that is processing genetic or biometric data used to identify a specific individual.

And, even within subcategories of data, there are further fractures. Biometric data is defined so broadly in Washington’s health law that it would capture a variety of applications: 

  1. Biometric data identifying a specific individual – such as iPhone biometrics used for identification, and 
  2. More general AR/VR and health and wellness applications, which use biometric data to match facial geometry to someone’s physical characteristics to pair consumers with the right product (for example, products offering virtual sunglasses or makeup application – though many such applications don’t store or retain the biometric data beyond the point-in-time interaction).


Per Washington’s law, all “products or services that are targeted to consumers in Washington” fall under the same scope, yet the law does not define what “targeted” to consumers means. Entities regulated by the new law are not scoped by revenue or data processing thresholds; instead, they are scoped by the data collected and processed about the consumer, if that data is processed in the state of Washington.


Would a risk-based approach to health data help alleviate confusion? 

To date, a risk-based approach to consumer health data does not (to our knowledge) exist. The ten state consumer privacy laws each use their own baseline definitions of sensitive data, which, in some cases, parse health data separately or include health data as part of the definition of “sensitive data” and/or “sensitive data inferences.” Given so many differing definitions, one common risk-based framework would help with interpretation.

We are seeing similar models in the AI space that may provide useful benchmarks and more reasonable indices of risk. For example, the National Institute of Standards and Technology has developed the AI Risk Management Framework to support more trustworthy practices in AI development, design, and deployment, an interesting model that the consumer health space could use to further parse the sensitivities across different types of consumer health data.

When we look to HIPAA, on the other hand, health data is uniformly treated as part of the same confidential category of “protected” health information, with all communications strictly reserved for the provider-patient relationship (and any respective business associates). Other countries have taken a blanket opt-in approach to regulating data, most commonly under the GDPR’s opt-in framework, but, as we know, research demonstrates that model’s clear shortcomings.

Academics have looked at the general consequences of regulating data and have shared their perspectives. Professor Dan Solove argues in his recent piece “Data Is What Data Does: Regulating Use, Harm, and Risk, Instead of Sensitive Data” that we should consider the impact of the risk and use of the data, rather than trying to regulate the data itself. This approach would future-proof our regulatory framework, mitigate the harms raised by the data in real time, and scope the risks so that they are manageable, rather than creating an unreasonable expectation to regulate all risk – particularly given the burden on regulators.

As of now, we believe a sliding scale for the risks carried by consumer health data should exist, at least at face value, to avoid chilling innovation and imposing unreasonable compliance expectations.
