The glut of healthcare information being generated by electronic medical records, wearable devices and centralized databases offers the potential to improve care, lower costs and drive efficiencies.
But where should the limits be for the collection and use of all that data?
A team of privacy experts, in collaboration with the Health Data Consortium, a collective of public and private organizations working to improve healthcare through data, has produced a set of reports tackling those issues, dividing the health IT landscape into three primary sectors: consumer tech, government and healthcare providers regulated by the federal HIPAA privacy statute.
Chris Boone, executive director of the Health Data Consortium, framed the inherent tension between the rush to amass vast stores of health information and the attendant privacy implications during an online presentation Thursday.
“Data-driven and information-based systems have quickly become the new paradigm for American healthcare, providing valuable insights on treatment, quality, safety, efficiency in public and personal health,” Boone says. “These developments offer a wealth of opportunities to increase wellness, but raise serious privacy and security questions.”
How is your collected health information being used?
Researchers at the Center for Democracy and Technology (CDT), a privacy advocacy group, worked with the California HealthCare Foundation to develop a set of recommendations patterned after the longstanding Fair Information Practice Principles, addressing areas like notice, choice, consent and security.
The report acknowledges the distinct contours of health data collection and use in each of the three sectors the researchers evaluated. It nonetheless concludes that consumers should have more visibility into how their information is collected and used, and a greater ability to set limits on that information, including the right to compel commercial providers to delete items from their electronic records.
“There needs to be an underlying right that individuals have the right to this data,” says Michelle De Mooy, deputy director of the CDT’s consumer privacy project.
Additionally, the report highlights the security concerns that arise when organizations stockpile sensitive consumer data, and suggests some basic practices like limiting the scope of information collected, as well as the duration for which it is stored.
Those issues, De Mooy argues, apply more or less in equal measure to health IT companies that operate outside the strictures of the HIPAA statute that governs care providers, insurance companies and others.
“Many of the privacy issues that face traditional healthcare entities in the big data era of course also apply to app developers, to wearables device manufacturers and other entities that aren’t necessarily a traditional part of the health ecosystem. As such, questions of data minimization and retention and secondary use … come into play,” she says.
The question of secondary use is perhaps among the thorniest, as so much of the promise of big data is staked on the unanticipated uses and insights that could come from cross-referencing different datasets or developing novel mining algorithms.
Gautam Hans, policy counsel and director of CDT’s San Francisco office, sums up that argument this way: “It’s hard to say what we’re going to do in the future, and therefore it’s hard to know how to communicate that in an effective way.”
And, indeed, Hans allows that any privacy framework must offer providers some flexibility to repurpose patient information for research and other legitimate uses. But he says that process must include de-identification mechanisms and, importantly, up-front notice to patients about how their data might generally be used, along with the chance to decline.
“Transparency doesn’t necessarily need to mean that a provider needs to explicitly define what it might do in the future, but rather that there might be future non-treatment uses of the data,” Hans says. “So, for example, the provider could explain that the data could be used for research purposes” without having to specify the exact nature of what type of research might come down the road.
Using encryption to protect health data
On the security front, the researchers recommend encrypting health data both in transit and at rest, noting that companies in the health space have become popular targets for cybercriminals.
“Health data has become incredibly valuable,” De Mooy says, citing recent high-profile breaches at insurance providers as a “telltale sign” that hackers are gunning for the healthcare sector.
On the public-sector side, she notes that the government is in a unique position, acting at once as a payer and a provider, as well as the largest single steward of citizens’ health data. In that light, both federal agencies and their contractors have a heavy burden when it comes to privacy and security.
“We recommend that all government in the United States be held to the highest possible standard when it comes to the transparency of the collection and use of individual health data,” De Mooy says. “It’s particularly important for entities that provide health-related services to government but do not directly interface with citizens to also clearly describe their data practices and privacy statements available on their website and we know that’s imperfect, but we still believe that privacy policies remain one of the sole systems of accountability for these practices.”