Chris Apgar, CISSP, CCISO


Return from Remote Work: How do you secure remotely used data & devices?

As things ease up and people slowly return to the office, what steps do you need to take to make sure data and devices are secure? It’s not quite a reversal of what covered entities (CEs) and business associates (BAs) went through when all non-essential staff were required to work remotely, but there are some similarities.

Back to HIPAA as Usual

After the national emergency ends, so does OCR’s enforcement discretion.

Reassess telehealth vendors. That means if you made an in-the-moment decision to move forward with a non-HIPAA-compliant video conferencing vendor for telehealth, you need to revisit that choice. Either discontinue using the platform for telehealth and telework or find a vendor who will sign a business associate agreement. If you continue to use a non-HIPAA-compliant vendor and there’s a breach, it’s all on you.

Stop sharing (BAs). When enforcement tightens up again, BAs won’t be permitted to disclose PHI to public health and health oversight agencies.  Only CEs will be permitted to disclose PHI to these agencies.

Teach employees how to create strong wireless passwords. One of the steps CEs and BAs may not have thought to take when remote work and telehealth suddenly became the norm was to require that employees strengthen their home wireless network passwords. Take that step now if you want to continue with some remote work and telehealth, or if providers conduct telehealth from home.

CEs and BAs may need to provide some training on the how-to of creating a strong wireless password. Plus, not all employees will know how to check their wireless network passwords. Remember, internet service providers often set the default password, and employees don’t reset it when setting up their home router. That means these passwords may be easy to crack. If employees know that their provider set the network password, they’ll want to reach out to that provider for instructions on how to change the home router password to meet strength requirements.
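
To make the training concrete, here’s a minimal sketch of how IT staff might generate strong, random Wi-Fi passphrases to hand out to remote employees. The length and character set are illustrative assumptions, not regulatory requirements; what matters is that the passphrase is long, random, and not the router’s default.

```python
import secrets
import string

def generate_wifi_passphrase(length: int = 20) -> str:
    """Generate a random passphrase suitable for a home Wi-Fi (WPA2) network.

    Length and character set are illustrative; longer is generally stronger.
    """
    alphabet = string.ascii_letters + string.digits
    # Drop look-alike characters so the passphrase is easy to type into a router.
    alphabet = alphabet.translate(str.maketrans("", "", "l1IO0"))
    return "".join(secrets.choice(alphabet) for _ in range(length))

if __name__ == "__main__":
    print(generate_wifi_passphrase())
```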

Clean, Patch and Update Remote Work Devices

Check device security settings and hard drives. As employees return from remote work and bring company laptops and tablets back to the office or clinical setting, check these mobile devices to ensure all security settings are where they should be. Also, the lack of timely patches on the devices may leave you open to cybercrime. For example, employees may have turned off device encryption, not updated anti-malware frequently enough or, if employees’ devices are not locked down, there may be non-approved applications installed.
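
What does that check look like in practice? Below is a minimal, Windows-only sketch that reports a drive’s BitLocker status and lists installed programs from the standard Uninstall registry key. It assumes admin rights and only covers machine-wide entries; most organizations would pull the same information from their endpoint management or MDM tooling rather than a script, so treat this purely as an illustration of what to look for.

```python
import subprocess
import winreg

def bitlocker_status(drive: str = "C:") -> str:
    """Return raw BitLocker status output for a drive (Windows, needs admin rights)."""
    result = subprocess.run(
        ["manage-bde", "-status", drive], capture_output=True, text=True, check=False
    )
    return result.stdout

def installed_programs() -> list[str]:
    """List display names from the machine-wide Uninstall registry key."""
    path = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"
    names = []
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path) as root:
        for i in range(winreg.QueryInfoKey(root)[0]):
            with winreg.OpenKey(root, winreg.EnumKey(root, i)) as sub:
                try:
                    names.append(winreg.QueryValueEx(sub, "DisplayName")[0])
                except FileNotFoundError:
                    pass  # Some entries have no display name.
    return names

if __name__ == "__main__":
    print(bitlocker_status())
    for name in sorted(installed_programs()):
        print(name)  # Compare against your approved-application list.
```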

With employees working from home, a number have pulled double duty – working remotely, making sure the children are taken care of, and keeping them up on their classwork. That means there’s a good likelihood that company-owned mobile devices were used for something other than work. Again, check hard drives. Children are quick to tap and install; make sure they didn’t install an application that isn’t approved for use on the device.

Clear off sensitive personal data. The above are also good reasons to remind remote-work employees to delete any sensitive personal data stored on those devices. Now that mobile devices are returning to use at the office and in clinical settings, there’s a chance that personal data may be exposed during routine scanning, patching, and repair of company-owned mobile devices.

Put PHI in Lockdown. Some CEs and BAs locked down company devices used remotely in such a way that the user couldn’t print, take screenshots, or plug in USB drives. A number likely have not. If employees were able to print at home, remind them not to print PHI there and, if they have, to properly shred the paper. It’s a good time to lock those devices down so no one can print PHI at home or plug in a personal USB drive that may be unencrypted and carrying malware.
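
For the USB piece specifically, one common Windows approach is to disable the USB mass-storage driver. The sketch below is illustrative only and assumes Windows endpoints and admin rights; most organizations would push the equivalent setting through Group Policy or MDM rather than a one-off script.

```python
import winreg

# Setting the USBSTOR service's Start value to 4 disables the USB mass-storage
# driver on Windows; 3 re-enables it (load on demand). Requires admin rights.
USBSTOR_KEY = r"SYSTEM\CurrentControlSet\Services\USBSTOR"

def set_usb_storage(enabled: bool) -> None:
    value = 3 if enabled else 4
    with winreg.OpenKey(
        winreg.HKEY_LOCAL_MACHINE, USBSTOR_KEY, 0, winreg.KEY_SET_VALUE
    ) as key:
        winreg.SetValueEx(key, "Start", 0, winreg.REG_DWORD, value)

if __name__ == "__main__":
    set_usb_storage(enabled=False)  # Block USB mass-storage devices.
```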

Hold Remote Work Training – Phishing, Telework

Run a Mock Phishing Exercise. If you haven’t run a mock phishing exercise recently, or at all, now is the time. During the COVID-19 outbreak, cybercriminals have been actively spreading malware, setting up phishing campaigns, and so forth. Mock phishing exercises do a couple of things: (1) they educate employees, or at least the ones who clicked a bad link, and (2) they help you assess risk – how many employees clicked on bad links. All it takes is one to jeopardize your organization, your network, and your PHI.
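
On the risk-assessment half, here’s a minimal sketch of turning exercise results into a click rate. It assumes your phishing platform can export results as a CSV with hypothetical columns named employee and clicked; adjust the column names to whatever your tool actually produces.

```python
import csv

def phishing_click_rate(results_csv: str) -> float:
    """Compute the share of employees who clicked the simulated phishing link.

    Assumes a CSV export with (hypothetical) columns: employee, clicked.
    """
    total = clicked = 0
    with open(results_csv, newline="") as f:
        for row in csv.DictReader(f):
            total += 1
            if row["clicked"].strip().lower() in ("yes", "true", "1"):
                clicked += 1
    return clicked / total if total else 0.0

if __name__ == "__main__":
    rate = phishing_click_rate("mock_phishing_results.csv")  # Hypothetical file name.
    print(f"Click rate: {rate:.1%}")  # Remember: it only takes one click.
```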

Review and vet your telework policy and telework agreement. Many CEs and BAs scrambled when remote work became the norm.  A telework policy may not have even existed, much less an agreement, because no one thought it would be needed. Take some time now to figure out what worked and what didn’t, what’s enforceable, and what’s not. After thinking that through, adopt or update your telework policy and your telework agreement. And after that, be sure to (1) educate your workforce on the updated telework program and (2) make sure you can enforce it.

It Happened Once – It Can Happen Again

Review your business continuity plan (BCP). If you didn’t have a solid BCP before the pandemic, you were likely scrambling when all non-essential workers were required to work from home. Now that you’re slowly getting back to normal, dust off that plan. Check whether it worked or whether you need to make changes because of what went wrong. After any major disaster or disruption, like a pandemic, you need to take a moment to examine your plans and update them to reflect the fact that it may reappear in the future. Start now to put the lessons learned to work and place your organization back on solid ground.

When all is said and done, great job!  Everyone did what was necessary to continue the important work of healthcare, plan, or no plan. You know what else it’s a good time to do? Thank all of those who kept the ball rolling, taking care of patients, and supporting patient care. Again, great job!

Video Hijacking Have You Worried? Try these 5 Steps from the FBI

The healthcare industry reports that video hijacking, or teleconference hijacking, is on the rise as telehealth appointments replace typical in-person ones during the COVID-19 crisis. The FBI has received multiple reports of conferences being disrupted by pornographic images, hate images, and threatening language. It’s yet another reason that, even though OCR has indicated it will not enforce prohibitions on the use of non-HIPAA compliant video conferencing platforms like FaceTime and Skype, covered entities and business associates still need to exercise due diligence to avoid breaches of electronic protected health information (ePHI).

[Read our article on PHI during COVID-19]

Although the press release from the FBI mentions Boston and the New England area, the threat is nationwide. The FBI recommends applying due diligence and caution to cybersecurity efforts. It also provides smart steps that can be taken to mitigate teleconference hijacking threats, listed below.

5 Steps to Help Reduce Video Hijacking Risks

  1. Do not make meetings or telehealth appointments public. If you are using Zoom, there are two options to make a meeting private: require a password or use the waiting room feature and control the admittance of patients or clients. (A scripted way to apply both settings is sketched after this list.)
  2. Do not share a link to a teleconference or telehealth appointment on an unrestricted publicly available social media post. Provide the link directly to specific people.
  3. Manage screen-sharing options. In Zoom, change screen-sharing to “Host Only.”
  4. Ensure users are using the latest version of remote access/meeting applications. In January 2020, Zoom updated their software. In their security update, the teleconference software provider added passwords by default for meetings and disabled the ability to randomly scan for meetings to join.
  5. Ensure that your organization’s telework policy or guide addresses requirements for physical and information security.
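
Step 1 can also be applied when meetings are created programmatically. The sketch below uses Zoom’s v2 create-meeting endpoint as I understand it; the field names should be verified against Zoom’s current API documentation before relying on them, and ZOOM_API_TOKEN is a placeholder for whatever OAuth credential your account uses. Treat it as an illustration that a passcode and waiting room can be enforced by default, not as Zoom’s official integration guidance.

```python
import os
import requests

ZOOM_API = "https://api.zoom.us/v2"
TOKEN = os.environ["ZOOM_API_TOKEN"]  # Placeholder: an OAuth access token for your account.

def create_private_telehealth_meeting(topic: str, passcode: str) -> dict:
    """Create a Zoom meeting with a passcode and waiting room enabled (illustrative)."""
    body = {
        "topic": topic,
        "type": 2,  # Scheduled meeting.
        "password": passcode,
        "settings": {
            "waiting_room": True,       # Host admits each participant individually.
            "join_before_host": False,  # No one joins until the host is present.
        },
    }
    resp = requests.post(
        f"{ZOOM_API}/users/me/meetings",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json=body,
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()
```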

Look at this situation as an ideal opportunity to educate your workforce – or remind them – about the how-tos of solid privacy and security practices that can protect your organization, patients, or clients. The greatest risk is not associated with the technology. The risk lies with the people.  That’s where solid, and ongoing, education comes in.

There’s another thing to look at while you’re distributing security reminders about how to stay cyber safe. Double-check that your telehealth and telework policies are clear, concise, up-to-date and communicated. We’ve run across a few clients who have a telework policy in place but it’s not been clearly communicated to staff. In some cases, the telework policy includes requirements that aren’t being enforced. To avoid this recipe for an ePHI breach disaster, update your telework policy and get it out to your workforce ASAP.

Extensive remote working situations are exposing more risks than many companies previously realized, not the least being how to be sure your policies and procedures cover this situation properly. Not quite sure where to start with updates? We can help, whether you’re updating current policies and procedures or you’ve never finished the “work from home” ones. Give us a call at 503-384-2538 to get things moving.

Attention Business Associates! New OCR Announcement re PHI during COVID-19 Relates to You

On April 2, 2020, the Office for Civil Rights (OCR) at the U.S. Department of Health and Human Services (HHS) announced that, effective immediately, it will exercise its enforcement discretion and will not impose penalties for violations of certain provisions of the HIPAA Privacy Rule against health care providers or their business associates for good faith uses and disclosures of protected health information (PHI) by business associates for public health and health oversight activities during the COVID-19 nationwide public health emergency. The notification can be found here.

Why is this further “enforcement discretion” a new thing? Because the HIPAA Privacy Rule already permits covered entities to disclose PHI for public health purposes, including those related to communicable diseases; it doesn’t permit business associates to do the same. During the COVID-19 pandemic, however, BAs may now disclose PHI to public health officials or health oversight agencies without fear of being penalized.

What types of Business Associates can disclose PHI?

AKA, Does the OCR “enforcement discretion” apply to you?

Business partner Julia Huddleston and I had to think a bit about what types of business associates would be in a position to disclose PHI under this new relaxing of the rules.  We identified several who may be able to make these disclosures:

  • Telehealth vendors
  • Population health vendors
  • Group health plan third party administrators (among others)

That said, business associates will still need to pay attention to disclosures! Enforcement relaxation is not intended to give BAs broad permission to disclose PHI. This disclosure is only to be associated with treating those impacted by COVID-19, reporting where cases are appearing and so forth. Even then, if it is possible, the PHI should be de-identified. At the very least such disclosures need to be kept to the minimum necessary.

During the pandemic, covered entities and business associates have more latitude when it comes to the use and disclosure of PHI. Keep in mind that this is a temporary situation. After the national emergency is lifted, enforcement will resume. This means that business associates will no longer have the latitude to disclose PHI to public health officials and health oversight agencies. The current action is similar to the relaxing of enforcement related to the use of platforms like FaceTime for telehealth. For more information about OCR’s COVID-19 resources, click here.

Are your policies & procedures up to the risks of a suddenly extended remote workforce? Now is a great time to double-check how relevant yours are for security standards, device use and more. Please call or email if you need help – and stay safe!

When It’s OK to Share: OCR’s Novel Coronavirus Disease (COVID-19) Limited Waiver

Novel Coronavirus, aka COVID-19, is on track to stretch our healthcare system to the breaking point, and our healthcare providers along with it. Effective March 15, 2020, OCR published a Limited Waiver of HIPAA Sanctions and Penalties that, during this national emergency, could give care providers one less source of anxiety as they work to save lives.

What the Limited Waiver means to hospitals, emergency rooms & you

Although HIPAA remains in force, the very nature of responding to care demands places a huge strain on healthcare providers. Extraordinary circumstances call for extraordinary measures.

To help reduce the concern of potential financial penalties, the HHS Secretary has (as per the issued publication) “exercised the authority to waive sanctions and penalties against a covered hospital that does not comply with the following provisions of the HIPAA Privacy Rule”:

  • the requirements to obtain a patient’s agreement to speak with family members or friends involved in the patient’s care
    See 45 CFR 164.510(b)
  • the requirement to honor a request to opt-out of the facility directory
    See 45 CFR 164.510(a)
  • the requirement to distribute a notice of privacy practices
    See 45 CFR 164.520
  • the patient’s right to request privacy restrictions
    See 45 CFR 164.522(a)
  • the patient’s right to request confidential communications
    See 45 CFR 164.522(b)

Don’t forget the defining word is “limited.” The limited waiver only applies until the President of the United States or the HHS Secretary terminates the national emergency status. From that point on, the HIPAA Privacy Rule and associated potential penalties are reinstated. Also remember that, national emergency or no, disclosures of personal information are allowed to disaster relief organizations, like the American Red Cross; that leniency lets them notify loved ones of your location. Finally, keep in mind that the waiver applies only to hospitals, including their emergency rooms. Other covered entities, like doctors and health plans, still must comply with all Privacy Rule requirements.

Other resources:

Contact Apgar & Associates for consulting expertise in privacy, information security, HIPAA, HITECH and regulatory compliance. We also guide you through the what and the how of preparation for HITRUST, SOC2 and ISO certifications.

Teleworking Safely: Precautions for Working Remotely during COVID-19

As we cope with the COVID-19 pandemic, it’s important to take a few extra measures to protect your organization, your patients and clients, and your data. Teleworking is now widely encouraged to prevent further spread of the virus, and more and more individuals are working remotely. Now is a good time to address the risks that come with working remotely, especially if workstations are not owned by your organization.

Minimum Employee Needs for Secure Remote Work

As you prepare yourself and your teams for expanded teleworking, here’s a checklist of what you need to do to reduce the risks associated with mobile device use that may be outside of what you would normally permit. If employees will be using their own devices and working remotely, at a minimum they need the following:

  • A secure wireless router that’s either cabled or, if wireless, secured with WPA2
  • A strong home router password
  • A strong device password
  • Up-to-date anti-malware and a firewall
  • Up-to-date patching on the device used
  • If connecting to your network, a secure connection to the network (e.g., VPN, TLS, HTTPS); a quick spot check is sketched after this list
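
As a quick spot check on that last item, here’s a minimal sketch that reports the TLS version a given host negotiates and fails loudly if the certificate doesn’t validate. The hostname is a placeholder; this is a sanity check, not a substitute for a proper VPN or remote-access configuration review.

```python
import socket
import ssl

def check_tls(host: str, port: int = 443) -> str:
    """Connect to host:port and report the negotiated TLS version.

    Certificate validation uses the default trust store; a failure here
    (ssl.SSLCertVerificationError) is itself a red flag worth investigating.
    """
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()  # e.g., "TLSv1.2" or "TLSv1.3"

if __name__ == "__main__":
    # Placeholder hostname: substitute your own remote-access gateway or portal.
    print(check_tls("remote.example-clinic.org"))
```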

If employees are using a company laptop, you need to require the use of a secure connection with a strong password. It would also be a good idea to make sure that, if company workstations will be used, all of the above are addressed. Patching is important to prevent vulnerabilities from being exploited by cybercriminals.

One last caution: Phishing. Now more than ever employees need to be reminded to beware of phishing activity. There are a number of known phishing attacks associated with COVID-19.  Social engineering can result in a breach, ransomware attacks and other damage to your infrastructure and data. It’s a good idea to point employees to the CDC and other reputable sources so they know what sites are safe to visit. That way they can remain up to speed on what’s happening with the pandemic, with less risk.

As was said every episode of Hill Street Blues, stay safe out there!

When you’re making on-the-fly revisions and updates to your policies and procedures during this critical time, you want to help them stick. A tip: make sure they state what you will do, not just what you can do. “If you say it, do it. If you do it, write it down.” Call on Apgar & Associates at 503-384-2538 for help with privacy and information security fundamentals as well as strategic planning.

Are All Ransomware Attacks Breaches?

It’s one of those questions that never goes away. The answer is, “Maybe” and very definitely, “Not always.” Contrary to popular belief, even after a ransomware attack, the safe harbor still applies when it comes to breaches. If your PHI was encrypted prior to the ransomware attack that encrypted (aka “held for ransom”) it, you may very well not have suffered a breach. That means there may be no need to conduct a four-factor risk assessment.

If only it could be so simple. However, per OCR’s weigh-in, you do need to ascertain that the data attacked was encrypted at the time. If it was encrypted, it’s a security incident, but not a data breach. I’ll dig into that shortly.  Far too often I see posts and blogs that adamantly declare, “If a ransomware attack occurs, it must be a breach.”  Not so fast. It’s not so black and white.

OCR has stated that it’s a fact-based determination as to whether or not a breach occurred. If a breach, then you do need to notify OCR, individuals and potentially, the media.  If you run into a consultant (and sometimes counsel) who states that all ransomware attacks absolutely equal a breach, get a second opinion.

Data Encryption & the Burden of Proof

Here’s the flip side – when encrypted PHI may become unsecured, representing a breach due to a ransomware attack. Keep in mind that when you’ve powered up and logged in to your laptop or other mobile device, data may be unencrypted at that moment because you’re accessing it. When ransomware hits and those files are unencrypted at the time of the attack, you may have a breach of unsecured PHI on your hands.

But if you do use full disk encryption and your laptop was not turned on (meaning the data was still encrypted), or if no files were unencrypted at the time of the attack, the PHI was not compromised. No breach occurred.

Also, if the ransomware attack hits backup media that was encrypted at the time of the attack, there is a high likelihood that no PHI breach occurred. Triple-check to be sure, and be able to prove it if OCR comes to call. The burden of proof lies with you.

The burden of proof is greater under other circumstances, like when a ransomware attack occurs and PHI is not encrypted.  At that point, you absolutely need to conduct a four-factor risk assessment.  It bears mentioning, though, that if you have top talent forensic analysts who can prove that no PHI was siphoned off, you still may not be required to notify OCR or individuals because the PHI was not compromised.
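
For the file-level intuition behind “encrypted at the time of the attack,” here’s a minimal sketch using the third-party cryptography library’s Fernet to keep a PHI export encrypted at rest. Note that the HIPAA safe harbor actually turns on encryption consistent with NIST guidance, which in practice usually means full-disk products like BitLocker or FileVault, so treat this purely as an illustration: if the key isn’t reachable from the compromised machine, a stolen ciphertext copy stays unreadable.

```python
# pip install cryptography
from cryptography.fernet import Fernet

def encrypt_file(path: str, key: bytes) -> None:
    """Write an encrypted copy of a file so ciphertext is what sits on disk at rest."""
    f = Fernet(key)
    with open(path, "rb") as fh:
        plaintext = fh.read()
    with open(path + ".enc", "wb") as fh:
        fh.write(f.encrypt(plaintext))

if __name__ == "__main__":
    # The key must live somewhere ransomware (and the attacker) can't reach it,
    # such as a key management service, never alongside the encrypted files.
    key = Fernet.generate_key()
    encrypt_file("patient_export.csv", key)  # Hypothetical file name.
```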

Clearly, it’s not a simple black and white, yes or no answer to the breach question. Be careful. Preserve all evidence. Look closely at the circumstances to make sure no breach occurred that requires notification. But if a consultant or counsel, going on the basis of a blog post, says that you absolutely must notify because ransomware attacks always equal a breach, don’t take my word for it. Just ask OCR.

Compliance Planning includes the “what to do” in the case of a security incident and data breach. Chris Apgar, CISSP and Julia Huddleston, CIPP, CIPM, work with clients nationwide on HIPAA privacy and security compliance, and address the need for assistance with expanded use of electronic health information exchange. They also prep clients for the rigorous process of HITRUST, SOC2 and ISO certifications.

Perimeter Security: It’s the Simple Things That’ll Get You

Are you sure your medical records aren’t accessible by outsiders? Maybe check your perimeter security. I’m not talking about fancy technical security gadgets, but the simple, obvious things like setting a password on your internet-facing applications.

Here’s why I ask. Did you hear about the 187 medical system servers not protected by passwords or necessary perimeter security measures? Thank the recent ProPublica investigation for that bombshell. An example: with just a simple data query, a MobilexUSA server exposed the names of more than a million patients! The investigation uncovered the release of names, birthdates, and in some cases, Social Security numbers.
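
A crude first pass at the “no password at all” problem is simply to see whether your internet-facing endpoints answer without credentials. The sketch below uses placeholder URLs and flags anything that returns 200 OK instead of an authentication challenge; it’s no substitute for a real vulnerability scan, but it catches exactly the kind of wide-open server the investigation describes.

```python
import requests

# Placeholder URLs: substitute your own internet-facing systems.
ENDPOINTS = [
    "https://pacs.example-clinic.org/",
    "https://results.example-clinic.org/api/patients",
]

def check_unauthenticated_access(urls: list[str]) -> None:
    """Flag endpoints that respond with content instead of an auth challenge."""
    for url in urls:
        try:
            resp = requests.get(url, timeout=10, allow_redirects=False)
        except requests.RequestException as exc:
            print(f"{url}: unreachable ({exc})")
            continue
        if resp.status_code == 200:
            print(f"{url}: WARNING, returned 200 with no credentials supplied")
        else:
            print(f"{url}: HTTP {resp.status_code}")

if __name__ == "__main__":
    check_unauthenticated_access(ENDPOINTS)
```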

Get back to the basics. Avoid the obvious errors like

  1. leaving default passwords on servers (ask the State of Utah about their massive breach),
  2. not setting passwords at all and other blatant mistakes.

You lose patient trust, and you lose money. There are notification costs and harm to your reputation, not to mention significant OCR fines. Another big expense? Regulator-imposed corrective action plans (CAPs).

Let’s look at the password issue alone. Basic perimeter security doesn’t stop at changing default server passwords and setting your own. Take it up a notch. Make sure the passwords you set aren’t easy to guess. Get complex. For cybercriminals, it doesn’t take a lot of computing power to crack a simple password. Take it for granted that you need to set complex passwords on all of your devices.
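
To put rough numbers on “it doesn’t take a lot of computing power,” here’s a back-of-the-envelope sketch comparing worst-case brute-force times for two password policies. The guesses-per-second figure is an illustrative assumption, not a benchmark, but the gap between a short simple password and a long complex one speaks for itself.

```python
# Rough brute-force math: keyspace = charset_size ** length.
GUESSES_PER_SECOND = 1e10  # Illustrative assumption for offline cracking hardware.

def worst_case_years(charset_size: int, length: int) -> float:
    """Years to exhaust the full keyspace at the assumed guessing rate."""
    keyspace = charset_size ** length
    return keyspace / GUESSES_PER_SECOND / (3600 * 24 * 365)

if __name__ == "__main__":
    # 8 lowercase letters vs. 14 mixed-case letters, digits, and symbols.
    print(f"8 lowercase chars: {worst_case_years(26, 8):.6f} years")
    print(f"14 mixed chars:    {worst_case_years(94, 14):,.0f} years")
```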

Too often, it’s the simple things that get you. If simple mistakes are why your data is exposed to the internet, you’re setting your organization up for an OCR finding of willful neglect. That will definitely lead to civil penalties or monetary settlements. Remember, fancy technology isn’t your biggest risk; it’s people and easy mistakes with significant implications.

No doubt, limited resources are an issue for smaller healthcare organizations like small clinics and health information technology (HIT) startups.  On the other hand, the adverse impact of not attending to even simple things can put smaller organizations out of business.  If you’re a smaller organization, or just not sure where to start, try the Office of the National Coordinator for Health Information Technology (ONC). There are plenty of no-cost resources available, like the toolkit for providers. Tackling perimeter security can be overwhelming, which is why it’s essential to start small, with the basics.

Chris Apgar, CISSP, CEO and president of Apgar & Associates, LLC is a nationally known speaker and author. He most recently authored the McGraw-Hill Healthcare Information Technology Exam Guide chapter on the regulatory aspects of health IT. Chris is also a nationally recognized expert and educational instructor on information security, privacy, HIPAA, the HITECH Act, state privacy law, and electronic health information exchange.  

RFI Vulnerability Lesson: Beware of Who You (try to) Hack

Isn’t it rewarding when a fellow security professional posts about an attempted hack of his personal website that he turned into a lesson in website security? And in the end, hacked the hacker? That’s exactly what happened with Larry Cashdollar, a senior security response engineer at Akamai. Cashdollar noticed something peculiar in the logs on his personal website. He dug further and turned up signs of someone scanning for remote file inclusion (RFI) vulnerabilities.

Before diving into the details, if you’re not sure what an RFI vulnerability is, definitely ask your web development and website management team if they’re aware of this type of vulnerability. And if they don’t know, they need to do some research to prevent hacking attacks on your websites. You can satisfy your curiosity – and share with your web team – by following this link to more information about it.
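
Classic RFI is a PHP include() problem, but the underlying mistake, loading whatever resource a request parameter names, can be shown in any stack. Below is a hypothetical Flask-style sketch contrasting the unsafe pattern (kept in comments) with an allow-list fix; the route name and templates are made up for illustration.

```python
from flask import Flask, abort, render_template, request

app = Flask(__name__)

# UNSAFE (illustrative): the RFI-style mistake translated to Python, i.e.
# fetching and executing whatever URL the request names.
#
#   @app.route("/widget")
#   def widget_unsafe():
#       remote = requests.get(request.args["src"]).text  # attacker-controlled URL
#       exec(remote)                                      # remote code now runs here
#
# Safer: never derive code or templates from raw input; use an explicit allow-list.
ALLOWED_PAGES = {"home": "home.html", "contact": "contact.html"}

@app.route("/view")
def view():
    template = ALLOWED_PAGES.get(request.args.get("page", "home"))
    if template is None:
        abort(404)  # Unknown page names are rejected, not interpreted.
    return render_template(template)
```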

On to the Hacking Attempt

Larry Cashdollar told The Register his site’s logs showed that a  would-be attacker was probing for RFI holes to trick web applications into running a remote malicious script.  The hacker was trying to load a file using a custom tool that Cashdollar had created (!).

The hacker’s test was a generic one used against websites: figure out the inputs, supply a web address, and see if the application will execute on that input. Unfortunately for the attacker, Cashdollar used the tool’s logs to trace back to the file that the attacker was trying to load. Then Cashdollar assessed that and other files the hacker had ready to execute to take over vulnerable websites, and was able to extract the criminal’s email address and their preferred language – Portuguese.

What was the purpose of the RFI vulnerability probe? The attacker wanted to install phishing pages that masqueraded as a legitimate bank’s login webpage, and then direct victims to the hacker’s page to collect bank account credentials. This was a way around installing more sophisticated code to capture cryptocurrency. It was just a matter of redirecting someone to a malicious site because the initial fake webpage looked legitimate.

3 Big Takeaways from the RFI Vulnerability Probe

Score one for the good guys! In this case the security professional caught and tracked down the attacker. Now we should take it as an alert to the professionals who are responsible for monitoring website security. From Cashdollar’s account of the incident, the big takeaways for website administrators are the importance of:

  1. Diligently monitoring the audit logs
  2. Following a solid patching program for site management tools
  3. Writing web code that cannot be exploited for RFI and other known vulnerabilities.

If your website developers and administrators don’t know and don’t watch, you may not be as lucky as Cashdollar.
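
For takeaway 1 in the list above, here’s a minimal sketch of what watching the logs for this specific threat can look like: scanning a web server access log for query strings that try to pass a full URL as a parameter, the signature of an RFI probe. The default log path and combined-log layout are assumptions; adapt the pattern to your own environment.

```python
import re
import sys

# An RFI probe typically stuffs a full URL into a query parameter,
# e.g. GET /index.php?page=http://evil.example/shell.txt
RFI_PATTERN = re.compile(r"[?&][^=\s]+=(https?|ftp)://", re.IGNORECASE)

def scan_access_log(path: str) -> None:
    """Print access-log lines whose request query string embeds a remote URL."""
    with open(path, errors="replace") as log:
        for lineno, line in enumerate(log, 1):
            if RFI_PATTERN.search(line):
                print(f"{path}:{lineno}: possible RFI probe: {line.strip()}")

if __name__ == "__main__":
    # Path is an assumption; pass your own access log on the command line.
    scan_access_log(sys.argv[1] if len(sys.argv) > 1 else "/var/log/nginx/access.log")
```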

Chris Apgar, CISSP, CEO and president of Apgar & Associates, LLC is a nationally recognized expert and educational instructor on information security, privacy, HIPAA, the HITECH Act, state privacy law and electronic health information exchange.  A nationally known speaker and author, Chris authored the McGraw-Hill Healthcare Information Technology Exam Guide chapter on the regulatory aspects of health IT.

Business Associate or Conduit? Why a BAA likely applies to you.

Ever run into a vendor who claims to be a conduit versus a business associate (BA)? It happens all too often, in my experience. Here’s the problem: the conduit exception is a narrow one. If you’re storing PHI, even encrypted PHI where you don’t have the encryption key, you’re a BA. Sign the business associate agreement (BAA); it applies to you.

Not convinced? Let’s look at the preamble to the Omnibus Rule of 2013. HHS said, “The conduit exception is a narrow one and is intended to exclude only those entities providing mere courier services, such as the U.S. Postal Service or United Parcel Service and their electronic equivalents, such as internet service providers (ISPs) providing mere data transmission services. As we have stated in prior guidance, a conduit transports information but does not access it other than on a random or infrequent basis. Thus, document storage companies maintaining protected health information on behalf of covered entities are considered business associates, regardless of whether they actually view the information they hold.”

With that HHS summary in mind, you can see it’s pretty difficult to market services and storage to the healthcare industry without a BAA and think you won’t run afoul of HIPAA. Yet even as recently as a few years ago, our privacy and information security firm would encounter storage vendors and document sharing vendors who would not sign a business associate agreement. Again, just because you can’t access the PHI doesn’t mean you’re not a business associate.

In OCR’s May 2019 guidance, you’ll find a list of BA liabilities. Those remind BAs of their compliance responsibilities with regard to the HIPAA regulations. OCR’s reminder list also notes that BAs have a duty to execute a business associate agreement with their BA subcontractors. What isn’t mentioned, but is required, is that covered entities (CEs) and BAs must execute a BAA with each other. So if you’re not an internet service provider (ISP), the US Postal Service, or the like, and you store PHI, you need to execute a BAA to be in compliance with HIPAA regulatory requirements.

I’ll end with a cautionary note about vendors convinced they aren’t a business associate. Covered Entities, if your vendor is unwilling to sign a BAA, yet they have access to your PHI, it’s probably a good idea to find another vendor. It may be that your vendor who stores paper charts or other PHI doesn’t realize that they’re a business associate. Or it could be that, in the case of a storage unit, the storage facility owners simply don’t know what’s being stored. But if PHI is involved, then you need to execute a business associate agreement.

Whether you’re a physician practice, a medical transcription service, or a TPA providing a health plan with claims processing services, you’re dealing with HIPAA compliance. Give us a call: 503-384-2538 for help to assure you’re on top of it.

Should you trust Alexa with your health information?

By now you’ve likely heard that Amazon is moving into the HIPAA space with Alexa. In conjunction with their partners, they’re launching what Amazon calls “HIPAA compliant” apps. If only it were that easy to create a HIPAA-covered app, or as Amazon calls it, a skill. As with Amazon Web Services (AWS), it’s ultimately up to the individual developers to honor the law. While Amazon may well be a trusted third party, if developers don’t build apps or “skills” with privacy, security and HIPAA compliance in mind, I wouldn’t trust Alexa with any of my healthcare data.

Having worked with a number of software development companies in the healthcare space, I can tell you that more often than not developers want to create cool and useful things. The problem with that is cool and useful don’t always automatically align with security and privacy needs. In fact, in the development process you won’t often find security and privacy at the top of the priority list.

Time, Trust & Alexa Users

Trust but verify, folks. If I were the covered entity or business associate planning to eventually trust the Amazon platform to adequately secure protected health information, I would want assurances and proof that the developers of any app/skill have privacy and security baked in.

Granted, Amazon has indicated that trust takes time to build. That it will be a while before patients widely trust Alexa with sensitive health information.  Ok, let’s say we’re down the road a way.  Trust has been earned.  Now we face another potential issue that has nothing to do with the platform or any of its available apps/skills.  The concern lies with the end users.  End users are not always savvy when it comes to protecting sensitive personal information. I think this is where there’s a definite need for some education on the part of partners, Amazon and healthcare providers.

Right now, when Alexa is on, it listens to all voices in the room, all the time. How awful would it be if an end user thought he or she was taking advantage of touted HIPAA compliant solutions but instead was airing sensitive information using another Alexa app?  Hopefully Alexa will also be smart enough to not broadcast protected health information like “You need to follow up with your [insert-private-condition-here] specialist” to the whole house.

Alexa (Amazon), are you listening?
