Many Healthcare Apps Fail to Protect Privacy, HHS Held a Competition to Find Fixes

You can’t swing a stethoscope these days without hitting a mobile app that promises to improve your health and well-being.


The fitness conscious can track how many steps they take a day and how many calories they burn. Some apps manage medications and share the status of a patient’s disease with their doctor. Sleep apnea sufferers who strap their face to a CPAP machine at bedtime may come to learn that the contraption shares their use of the machine with their insurer as a condition of coverage. These sleep-challenged folks can monitor their own use of the CPAP by, you guessed it, downloading a free app.


But how safe is the information these apps are sharing? Do users know what is being done with their information, and what is being done to protect it? Based on several studies and the opinions of experts, there is room for improvement.


“Many app developers overlook privacy and security by failing to do one of the most basic first steps of data protection—informing consumers of their practices,” writes Daniel Solove, a renowned privacy law expert and professor at the George Washington University Law School.


Solove pointed to a 2016 study published in the Journal of the American Medical Association, which found that 81 percent of the 211 diabetes apps examined did not have a notice informing consumers about privacy practices. And for the 41 apps (19 percent) that did have privacy policies, the JAMA study reveals, “not all of the provisions actually protected privacy”: more than 80 percent of those apps collected user data and 49 percent shared that data. “Only four policies said they would ask users for permission to share data,” JAMA found. Specifically, the study said, the permissions users must accept to download an app authorized collection and modification of sensitive information, including location tracking, modifying and deleting information, and activating the device’s camera and microphone. “In the transmission analysis, sensitive information from diabetes apps . . . was routinely collected and shared with third parties . . . Of the 19 apps with privacy policies that shared data with third parties, 11 apps disclosed this fact, whereas eight apps did not.”


“Patients might mistakenly believe that health information entered into an app is private (particularly if the app has a privacy policy),” the JAMA article said, “but that generally is not the case.” JAMA said doctors should consider the privacy implications of health apps before encouraging patients to use them.


Solove also noted a much broader study of nearly 18,000 free apps conducted by researchers at Carnegie Mellon University’s CyLab Security and Privacy Institute. That team found that nearly half of these apps didn’t even have a privacy notice, “even though 71 percent of those appear to be processing personally identifiable information and would thus be required to explain how under state laws such as the California Online Privacy Protection Act.” The CyLab post about the study added that “even those apps that had policies often had inconsistencies. For instance, as many as 41 percent of these apps could be collecting location information and 17 percent could be sharing that information with third parties without stating so in their privacy policy.”


A third study Solove noted was one from the Future of Privacy Forum in 2016, which revealed that “only 70 percent of top health and fitness apps had a privacy policy.” Sleep-aid apps get a special shout-out in this study: only 66 percent of them provide a privacy policy, falling below top apps overall (76 percent) and health and fitness apps on average (70 percent).


“These numbers are very problematic,” Solove wrote. “Having a privacy notice is such a fundamental step for protecting privacy. Beyond informing the consumer, the process of creating a privacy notice forces developers to think about the privacy implications of their technology, and it informs experts, NGOs and regulators about what the technology is doing. This is essential for accountability.”


Seeing the challenge with these apps, the U.S. Department of Health and Human Services held a competition, the Privacy Policy Snapshot Challenge, calling for designers, developers and privacy experts to create an online, open-source Model Privacy Notice generator.


Solove entered the competition with R. Jason Cronk, founder of the Enterprivacy Consulting Group, which consults on “privacy by design.” Solove and Cronk won the competition with a tool that easily produces, as Solove describes it, “policies that are clear, comprehensible and visually appealing.”


You can read more about what Solove has to say about the tool—and access it as well—by reading his post here. In addition to teaching at George Washington University Law School, Solove founded TeachPrivacy, which educates individuals and companies about privacy and security. He is also the co-founder and co-chair of the Privacy + Security Forum, an annual event that takes place in Washington, D.C., each October for hundreds of privacy and security professionals from technology companies, other large corporations, government agencies, law firms and more.