Fifty Shades of Consumer Health Data: Unclear Expectations for Digital Privacy

May 30, 2023 by Divya Sridhar, Ph.D., Director, Privacy Initiatives, BBB National Programs

While momentum continues to build around what a regulated consumer health privacy landscape looks like, the environment remains shrouded in shades of gray. And perhaps gray is what U.S. laboratories of democracy are made of, especially in recent years on this topic, where the Federal Trade Commission (FTC) has left the door open on its interpretation of what it means for companies to process “sensitive” consumer health data without being caught in the crosshairs of unfair and deceptive practices enforcement.

And since the FTC has shared its intention to extend the application of the Health Breach Notification Rule to health app developers, we expect that net to widen.

Even with the passage of momentous consumer health privacy legislation in Washington state this year, industry has had to fill policy and regulatory gaps with commitments to best practices for treating sensitive data. Industry has also had to watch closely and pick up on regulatory themes across consumer health, advertising, and tech policy, as reflected in the Premom, GoodRx, BetterHelp, and Flo decisions (to name a few).

Meanwhile, California and Colorado are poised to begin enforcement of their new regulations, which go well beyond what their statutes say about the treatment of sensitive data, inferences, and health data as a subcategory.

Some of these new state consumer privacy laws could also bring into scope provider websites and apps, as well as traditional entities covered by the Health Insurance Portability and Accountability Act (HIPAA). HIPAA-covered entities may believe they comply with HIPAA even while they process, share, or sell consumer health data through their patient intake forms, marketing sites, and general messaging portals, activities that may now fall within scope or draw further scrutiny.

Yet, controversy over interpretation remains among state and federal regulators and the courts. Just this month, an Idaho federal judge hearing the FTC’s lawsuit against Kochava dismissed the agency’s complaint, finding that the company’s use of geolocation data did not reflect “actual harm.” The ruling leaves industry further confused about the implications for regulating sensitive data inferences, especially after states such as Colorado passed groundbreaking regulations that seek to define and regulate “sensitive data inferences” as a distinct category of data.


Washington’s Approach to Consumer Health Data

Washington has paved the way as the first state to create guardrails around consumer health data, and it is likely to be followed soon by copycat laws, including the fast-moving Connecticut Senate Bill 3, that could fill the void left by the landmark Dobbs decision overturning Roe v. Wade.

What makes the recent legislative and regulatory framework particularly challenging is that companies and consumers alike must read between the lines to understand what lawmakers mean by upholding robust protections for “consumer health information,” a blanket term that has come to encompass a wide array of health-specific and health-related data, including social determinants of health, geolocation, and other interwoven data elements.

For example, consumers visiting publishers of health content or health websites offering resources may be merely conducting a “search” about their health or wellness activities. But Washington’s law would bring such publishers into scope, placing essentially the same expectations on them as it places on a company processing genetic or biometric data used to identify a specific individual.

And, even within subcategories of data, there are further fractures. Biometric data is defined so broadly in Washington’s health law that it would capture a variety of applications: 

  1. Biometric data identifying a specific individual, such as iPhone biometrics used for identification, and 
  2. More general AR/VR and health and wellness applications, which use biometric data to match facial geometry to someone’s physical characteristics to pair consumers with the right product (for example, virtual sunglasses or makeup try-on applications), though many such applications do not store or retain the biometric data beyond the point-in-time interaction.


Per Washington’s law, all “products or services that are targeted to consumers in Washington” fall under the same scope, without a definition of what it means to be “targeted” to consumers. Entities regulated by the new law are not scoped by revenue or data processing thresholds; instead, they are scrutinized based on the data they measure and process about the consumer, if that data is processed in the state of Washington.


Would a risk-based approach to health data help alleviate confusion? 

To date, a risk-based approach to consumer health data does not (to our knowledge) exist. The ten state consumer privacy laws each use their own baseline definitions of sensitive data, which in some cases parse health data separately or fold health data into the definition of “sensitive data” and/or “sensitive data inferences.” Given so many differing definitions, one common risk-based framework would help with interpretation.

We are seeing similar models in the AI space that may provide useful benchmarks and more reasonable indices of risk. For example, the National Institute of Standards and Technology (NIST) has developed the AI Risk Management Framework to support more trustworthy practices in AI design, development, and deployment, an interesting model the consumer health space could use to further parse the sensitivities across different types of consumer health data.

On the other hand, under HIPAA, data is uniformly treated as part of the same confidential category of “protected” health information, with all communications strictly reserved for the provider-patient relationship (and any respective business associates). Other jurisdictions have taken a blanket opt-in approach to regulating data, most notably the GDPR, but, as we know, research demonstrates that model’s clear shortcomings, such as consent fatigue.

Academics have also weighed in on the general consequences of regulating data. Professor Dan Solove argues in his recent piece, “Data Is What Data Does: Regulating Use, Harm, and Risk, Instead of Sensitive Data,” that we should consider the risk and use of the data, rather than trying to regulate the data itself. This approach would future-proof regulation, mitigate the harms raised by data in real time, and scope risks so that they remain manageable, rather than creating an unreasonable expectation to regulate all risk, particularly given the burden it would place on regulators.

As of now, we believe a sliding scale for the risks carried by consumer health data should exist, at least as a starting point, to avoid chilling innovation and imposing unreasonable compliance expectations.
