Written by Susan Walberg, JD MPA CHC
Healthcare apps have become increasingly prevalent, with people using them for counting steps, monitoring calories, or linking to various medical devices, to name just a few examples. Since the COVID outbreak and the accompanying explosion of telehealth as a healthcare option, however, these apps have proliferated at a remarkable rate. As of 2020, there were 325,000 healthcare apps on the market, with more coming all the time.
Whether you are a consumer who uses such apps, or a provider who wants to develop an app for patients to use, it’s important to understand some of the privacy and security risks that may accompany the use of such tools and what to watch out for.
What Are Healthcare Apps?
An ‘app’ is a small program that can be loaded onto a phone or mobile device to perform a specialized function. There are two main types of healthcare apps in terms of privacy and security regulations, and the rules governing them vary accordingly.
The first type is the applications used by your healthcare provider. They may store your lab or radiology results or be integrated with a medical device for tracking or monitoring purposes, such as an electrocardiography device that monitors heart activity. Or they may be used to coordinate your care.
The second type is personal or private healthcare apps: those an individual can download from an app store to track and manage their diet, exercise, or specific health conditions. There are apps for mental health, diabetes, and, of course, COVID, to name just a few. Many of these apps are free.
Nothing in Life Is Free
First, let’s talk about those ‘free’ apps.
Free apps, how cool is that? Depending on your view, an application that tracks and shares your personal information might not really be ‘free’.
If you go online and look for a free app to help you count calories or manage your diet, for instance, the odds are good that there are advertisements on the app, right? Well, most of those ‘free’ apps, with the ads included, will be sharing your information with the advertisers and perhaps even with other companies, such as the ‘big tech’ companies or other stakeholders or investors.
You may expect this, and you might not care. After all, any online Google search leads to targeted Facebook ads relating to that same subject matter, as many of us have noticed. We may not like it, but we are getting used to the fact that our online activity is not really private.
But when you choose one of those apps, think about what information you are entering, because it is probably not private either. How much of your medical information is being collected to help you manage your diabetes or exercise program? Do you know where that information might be shared? You may accept that your use of the app is not private, just as your Google searches seem to have a direct pipeline to Facebook, but the data you enter is no more protected than the fact of your use. And in this situation, that sharing is not illegal.
How can this health information NOT be private? There must be regulations protecting your privacy, especially when it comes to your healthcare information, right? We hear all the time about HIPAA (The Health Insurance Portability and Accountability Act of 1996) and how your health information can’t be shared.
Just to be clear: HIPAA only applies to apps that are used and offered by your healthcare provider or insurance company (or a similar organization regulated by HIPAA). Those organizations are subject to the HIPAA Privacy and Security regulations (as well as the HITECH and Omnibus rules that followed), so any product they offer in conjunction with their regulated services would typically be subject to the same laws. This does not mean that an app is covered by HIPAA just because your doctor mentions that such tools exist in the marketplace. But if your insurance company, for instance, offers you a tool as part of your plan to help you manage a chronic health condition, HIPAA would generally apply. If you are not sure, it's important to ask.
If the app is, indeed, regulated under HIPAA, that means that privacy and data security controls must be in place. There should be a privacy/security policy that you can review, and you have specific rights with respect to your information and how it’s used. There are limitations around, for instance, how your data can be used or shared for marketing purposes. It also means that there must be a designated privacy and security ‘official’ who has oversight of compliance with these regulations. A company that provides a healthcare app to physician practices, insurance companies, or similar organizations would be considered a ‘Business Associate’ of that provider or insurance company, which means they are subject to the same requirements. HIPAA does provide a broad range of protections, but they are limited to those specific scenarios.
The reality is that few laws govern the privacy of information you voluntarily share in one of these publicly available apps. If you go online and pick an app to track or monitor your own health condition or information, it most likely is not subject to privacy laws.
Apps Provided by Your Physician, Insurance Company, Etc.
The apps used by your doctor’s office or insurance company are subject to much tighter regulation, but they also often contain more personal data. Especially with the increased use of telehealth services, providers’ offices are relying on various applications and platforms to facilitate the provision of healthcare services. These apps, and the companies that offer them, are covered under HIPAA as ‘Business Associates’ of the provider or insurance company if the app uses, stores, or transmits patient health information on behalf of the healthcare organization.
Due to COVID, the government has loosened up the privacy regulations in order to allow greater flexibility in providing telehealth services. While this is good news for providers and the patients needing those services, it also means more potential risk to protected health information (PHI). It’s important to keep in mind that, in addition to whatever information you enter online, a telehealth application likely has requested permission to access your calendar, camera, and microphone.
The good news is that, although providers may have been using some of the less secure apps in the beginning of COVID, just out of necessity, those providers who plan to continue providing telehealth services are working to ensure compliance with privacy and security requirements. App developers are busy developing apps to accommodate this changing market, and compliance is a top concern.
Apps as Medical Devices
There is one type of application which is actually considered by the Food and Drug Administration (FDA) to be a medical device, in addition to being covered under HIPAA (because they are provided in conjunction with healthcare services). Those are the apps that are intended to be used ‘for the diagnosis of disease or other conditions, or the cure, mitigation, treatment, or prevention of disease, or is intended to affect the structure or any function of the body of man’ under section 201(h) of the Food, Drug, and Cosmetic Act. In general, if the purpose or function of the app is to assist in performing a medical device function, it will be treated as a medical device under the FDA. For instance, if the app can be run on a smart phone or other hand-held device and analyzes and interprets EKG waveforms to monitor cardiac irregularities, it would be considered analogous to those software programs that perform the same function and are otherwise regulated as a medical device.
The intent of the FDA is to ensure patient safety related to the use of those devices that could compromise or risk patient health. This oversight is limited to those devices marketed and offered to perform these medical device functions.
Although the FDA’s purview is not privacy or data security, its jurisdiction is noteworthy in terms of regulatory oversight. For purposes of HIPAA, these devices would typically be subject to the Privacy and Security rules because they are used in conjunction with your provider or insurance company, as discussed above.
How Do You Know if Your Data Is Secure?
Start by reviewing the app’s privacy and security policies, if they are published. You can also check an app’s default settings and look for those that impact privacy, such as location tracking. Beware, though, that in some instances turning those options off will make the app more difficult to use.
The bottom line here is caveat emptor…buyer beware. Especially if you’re not ‘buying’ and it’s ‘free’.
How Can Data Be Compromised?
Even when providers, insurance companies, and app developers are focused on compliance with the various privacy and security requirements, PHI can still be compromised, though it is less likely. Common mishaps occur in a number of ways:
- Employee errors. Human errors can occur in any setting: an employee discussing patient information out loud in a non-private setting, clicking on a link that allows a virus or ransomware attack, or accidentally entering an incorrect phone number and sending information to the wrong person. These risks are not limited to technology; they apply to privacy in general.
- Poor access controls. There needs to be a solid process, followed consistently, to ensure that only individuals who need access are given access, and that former employees or business associates are promptly removed when they no longer need it. This also includes business partners whose employees need access in order to provide services to another company or practice. Those employees need their own credentials, not shared, universal access that cannot be tracked.
- Failure to monitor. Any organization that maintains PHI electronically should have a process for routinely reviewing who is accessing sensitive information and following up on any questionable access. Audit trails are part of any good security structure.
- Failure to securely store data. Not only should data be stored in a secure manner, it should also be consistently destroyed/removed when applicable retention periods have expired.
- Inadequate encryption. Sensitive data should be encrypted both in transit and at rest; an unencrypted lost or stolen device is a common source of breaches.
- Workstation and device security. Applications should time-out when not in use, rather than rely on users to remember to do so.
- Failure to conduct a comprehensive risk assessment that includes the various apps and networked devices where PHI is stored or transmitted.
- Increased remote workers. Employees working from home are more likely to use personal devices that don’t have proper levels of encryption and that are, by definition, less private due to the offsite location. Access is much harder to control and networks may not be secure.
The above issues do not pertain only to apps, but in general to information privacy and security, especially in the new era of increased telehealth services. Those issues are also the types of failures HIPAA was designed to prevent and would likely be considered violations, depending on the specific facts. If you are considering using an app or electronic platform where personal information will be entered, it’s recommended that you ask your provider or insurance company who is offering this tool what their privacy and security policies are. If their organization is using and recommending such a tool, they have almost certainly done the review of privacy and security controls. And if you are a provider considering using an app, or an app developer, the above list is for you. You should have designated ‘privacy and security officials’ who ensure the above risk areas are addressed.
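For app developers, a few of the failure modes above (poor access controls, missing audit trails, and no workstation time-out) can be illustrated in code. The sketch below is a hypothetical, minimal example, not a reference implementation: the names (`Session`, `access_record`), the 15-minute timeout, and the in-memory audit list are all assumptions. A real application would use durable, append-only audit storage, a real identity provider, and a timeout dictated by organizational policy.

```python
import time

# Assumed policy value: lock the session after 15 minutes of inactivity.
IDLE_TIMEOUT_SECONDS = 15 * 60


class Session:
    """Tracks one user's session, enforces least-privilege access to
    records, and keeps an audit trail of every access attempt."""

    def __init__(self, user_id, allowed_records):
        self.user_id = user_id
        self.allowed_records = set(allowed_records)  # least-privilege list
        self.last_activity = time.monotonic()
        self.audit_log = []  # in production: durable, append-only storage

    def _expired(self):
        return time.monotonic() - self.last_activity > IDLE_TIMEOUT_SECONDS

    def access_record(self, record_id):
        # Workstation/device security: refuse idle sessions instead of
        # relying on users to remember to log out.
        if self._expired():
            raise PermissionError("session timed out; re-authentication required")
        self.last_activity = time.monotonic()

        allowed = record_id in self.allowed_records
        # Audit trail: log every attempt, allowed or not, so a reviewer
        # can follow up on questionable access later.
        self.audit_log.append({
            "user": self.user_id,
            "record": record_id,
            "allowed": allowed,
            "at": time.time(),
        })
        if not allowed:
            raise PermissionError(
                f"user {self.user_id} may not view record {record_id}"
            )
        return f"record {record_id} contents"
```

The design choice worth noting is that denied attempts are logged before the exception is raised; an audit trail that only records successful access cannot support the “failure to monitor” reviews described above.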
What Are the Risks?
Most people care about the privacy of their health information just because it’s private and not other people’s business. But there are actual risks to consider, which users of these apps should understand:
- Data is shared with third parties for sales and marketing, increasing the targeting of ads you receive.
- Even information that is supposedly ‘de-identified’ can include enough information to make users identifiable, and it may be very sensitive information, for instance relating to mental health or substance abuse.
- Medical identity theft, which can result in someone using your identity to receive free healthcare services or to file fraudulent claims. Healthcare data is valuable for those reasons, which is why it is often targeted by hackers.
- Additional outside companies, such as Facebook or Google, may acquire the information and build user profiles. Once the information is out there in that environment, there is little control over it and it’s difficult to know who could access it or how it could be used.
- Your PHI could be acquired by insurance companies or other healthcare companies that could use it against you in underwriting or pricing determinations. Who else would you not want knowing your private information? An employer? The possibilities are frightening, especially considering that once the information is out there, it’s out there. You can’t put the genie back in the bottle.
Telemedicine and the use of online applications have exploded in recent years, particularly in relation to the COVID pandemic and the resulting changes in the delivery of healthcare. The regulatory framework has not necessarily caught up with the technology, so while HIPAA applies to some applications, many of those being used by consumers are not regulated in terms of protecting sensitive information. Health information can be bought and sold in the marketplace; it has value for advertisers, thieves, and others.
For consumers, just be aware of the potential risks before you start using an app; check the app’s privacy and security policies and consider carefully what information you are comfortable exposing. If the app comes from your provider or insurance company, ask about the security controls and how they are protecting your data.
For providers, consider your own liability in terms of recommending an app and make sure your organization has done its due diligence to ensure proper security measures are in place. You should have your own privacy and security experts evaluate the tool before offering it to patients.
For app developers, be aware that technical security isn’t your only concern; you will want to have assistance from someone with healthcare privacy and security regulatory expertise. This will be something that potential clients and investors will be asking about.
Susan Walberg is a healthcare consultant who works with providers and healthcare start-ups. She can be reached at https://www.susanwalberg.com/