Protecting Consumer Health Data Privacy Beyond HIPAA

Jun 15, 2022 by Eric D. Reicin, President & CEO, BBB National Programs

If you look at the apps on your phone, chances are you have at least one related to your health—and probably several. Whether it is a mental health app, a fitness tracker, a connected health device, or something else, many of us are taking advantage of this technology to keep better track of our health. Recent research from the Organization for the Review of Care and Health Applications found that 350,000 health apps were available on the market, 90,000 of which launched in 2020 alone.

While these apps have a great deal to offer, it is not always clear how the personal information we input is collected, safeguarded, and shared online. Existing health privacy law, such as the Health Insurance Portability and Accountability Act (HIPAA), primarily governs the way hospitals, doctors’ offices, clinics, and insurance companies store health records online. The health information these apps and health data tracking wearables collect typically does not receive the same legal protections.


Why This Is Potentially Troubling

Without additional protections in place, companies may share (and potentially monetize) personal health information in a way consumers may not have authorized or anticipated. As Sara Morrison explains in Recode, “The app economy is based on tracking app users and making inferences about their behavior to target ads to them. ... That means health apps collect data that we consider to be our most sensitive and personal but may not protect it as well as they should.”

Take GoodRx, for example—an app that helps users save money on prescription drugs by finding price comparisons and coupons. While the app was helping millions of people save money, in early 2020 Consumer Reports found that GoodRx was sharing users’ personal health details with tech and marketing companies. And some of that data was shared further. The company has made changes since then.

More recently, in 2021, Flo Health faced a Federal Trade Commission (FTC) investigation. The FTC alleged in a complaint that “despite express privacy claims, the company took control of users’ sensitive fertility data and shared it with third parties—a broken promise that left consumers feeling ‘outraged,’ ‘victimized’ and ‘violated.’” Flo Health and the FTC settled the matter with a Consent Order requiring the company to get app users’ express affirmative consent before sharing their health information as well as to instruct the third parties to delete the data they had obtained.


Current Landscape Of Health Data Protections

Section 5 of the FTC Act empowers the FTC to initiate enforcement action against unfair or deceptive acts or practices, meaning the FTC can generally act only after the fact, once a company’s privacy practices have misled consumers or caused them unjustified harm. While the FTC is doing what it can to ensure apps are keeping their promises to consumers around the handling of their sensitive health information, the rate at which these health apps are hitting the market demonstrates just how immense a challenge this is.

The FTC chair is speaking out on this issue. In April, during her first public remarks on privacy issues since becoming chair last year, Lina Khan said that the agency would continue to use its existing statutory authorities and its power to police unfair and deceptive data practices to "take swift and bold action" against companies that misuse or fail to adequately secure consumers' personal information.

As to the prospects for federal legislation, commentators suggest that comprehensive federal privacy legislation seems unlikely in the short term. States have begun implementing their own solutions to shore up protections for consumer-generated health data. California has been at the forefront of state privacy efforts, first with the California Consumer Privacy Act (CCPA) of 2018, and more recently by establishing the California Privacy Rights Act (CPRA). Virginia, Colorado, and Utah have also recently passed state consumer data privacy legislation, and other states are considering legislation as well.


The Path Forward

Recently, my organization was selected to implement and house a self-regulatory program for the implementation of the Consumer Privacy Framework for Health Data, released by the Executives for Health Innovation (EHI) and the Center for Democracy and Technology in February 2021.

I think the most critical step for many businesses is to recognize that they are collecting health data and to become familiar with the legal potholes that exist in the landscape. These companies must have robust privacy practices in place and should always err on the side of caution.

For instance, when collecting and sharing consumer health information of any kind, carefully consider whether your privacy practices require opt-in consent. With a lack of clear guidance on certain non-HIPAA data collection and use, choosing an opt-out model may have negative downstream effects for your organization.

Be specific about the data you are collecting. For example, the Digital Advertising Alliance (DAA) Self-Regulatory Principles require opt-in consent for the collection of data regarding “pharmaceutical prescriptions” or “medical records.” (Disclosure: BBB National Programs’ Digital Advertising Accountability Program serves as an accountability agent for DAA, and we are compensated for our work.) And the FTC, in providing privacy best practices for mobile health app developers, also indicates that apps or devices collecting health data should get a user’s “affirmative express consent” before collecting or sharing that data.

In addition to proper consent procedures, companies collecting health data should ensure that their apps and devices that collect consumer health information comply with the FTC’s Health Breach Notification Rule.

While the pandemic certainly contributed to an increased reliance on technology to track personal health data, the use of digital technology to help us stay in tune with our health is not likely to slow down. With any new technology, there is always a period of assessment by the market and a watchful eye cast by regulators. By building in greater privacy protections from the outset, companies can avoid having to make changes down the road in response to future regulation.

Originally published in Forbes.
