Chris Apgar, CISSP
Isn’t it rewarding when a fellow security professional posts about an attempted hack of his personal website that he turned into a lesson in website security? And in the end, hacked the hacker? That’s exactly what happened with Larry Cashdollar, a senior security response engineer at Akamai. Cashdollar noticed something peculiar in the logs on his personal website. He dug further and turned up signs of someone scanning for remote file inclusion (RFI) vulnerabilities.
Before diving into the details, if you’re not sure what an RFI vulnerability is, definitely ask your web development and website management team if they’re aware of this type of vulnerability. And if they don’t know, they need to do some research to prevent hacking attacks on your websites. You can satisfy your curiosity – and share it with your web team – via this link to more information about it.
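In a nutshell, RFI happens when a web application builds an include or fetch target directly from user input, letting an attacker substitute a URL to their own script. Here’s a minimal, hypothetical Python sketch of the vulnerable pattern and its fix – an illustration only, not code from any real site:

```python
# Hypothetical sketch of the remote file inclusion (RFI) pattern.
# A vulnerable handler uses a request parameter directly as an
# include target; the fix validates against an allowlist.

ALLOWED_PAGES = {"home", "about", "contact"}  # known-good local pages

def resolve_page_vulnerable(page_param):
    # DANGEROUS: if page_param is "http://evil.example/shell.txt",
    # the app would fetch and execute attacker-controlled code.
    return page_param  # used directly as the include target

def resolve_page_safe(page_param):
    # Reject anything that is not a known local page name.
    if page_param not in ALLOWED_PAGES:
        return "home"  # fall back to a safe default
    return page_param

print(resolve_page_safe("http://evil.example/shell.txt"))  # home
print(resolve_page_safe("about"))                          # about
```

The fix is the boring part: never treat user input as a code path, and validate against a closed list of expected values rather than trying to blocklist bad ones.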
On to the Hacking Attempt
Larry Cashdollar told The Register his site’s logs showed that a would-be attacker was probing for RFI holes to trick web applications into running a remote malicious script. The hacker was trying to load a file using a custom tool that Cashdollar had created (!).
The hacker’s test was a generic one used against websites: figure out the input, supply a web address, and see whether the site will execute it. Unfortunately for the attacker, Cashdollar used the tool’s logs to trace back to the file the attacker was trying to load. Then Cashdollar assessed that and other files the hacker had ready to execute to take over vulnerable websites, and was able to extract the criminal’s email address and preferred language – Portuguese.
What was the purpose of the RFI vulnerability probe? The attacker wanted to install phishing pages that masqueraded as a legitimate bank’s login webpage, then direct victims to the hacker’s page to collect bank account credentials. This was a way around installing more sophisticated code to capture cryptocurrency. It was just a matter of redirecting someone to a malicious site, because the initial fake webpage looked legitimate.
3 Big Takeaways from the RFI Vulnerability Probe
Score one for the good guys! In this case the security professional caught and tracked down the attacker. We should also take it as an alert for the professionals responsible for monitoring website security. From Cashdollar’s account of the incident, the big takeaways for website administrators are the importance of:
- Diligently monitoring the audit logs
- Following a solid patching program for site management tools
- Writing web code that cannot be exploited for RFI and other known vulnerabilities.
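The first takeaway, diligent log monitoring, is exactly how Cashdollar caught his attacker. As a rough sketch, RFI probes stand out in access logs because the query string carries a full URL as a parameter value. The log format and pattern below are illustrative only; real detection belongs in your SIEM or WAF:

```python
# Sketch: flag access-log entries that look like RFI probes, i.e.
# requests whose query string tries to pass a full URL as a parameter.
import re

# Matches a query parameter whose value is itself a remote URL.
RFI_PATTERN = re.compile(r"[?&][\w\-]+=(https?://|ftp://)", re.IGNORECASE)

def looks_like_rfi_probe(log_line):
    return bool(RFI_PATTERN.search(log_line))

sample_logs = [
    '203.0.113.9 - - "GET /index.php?page=http://evil.example/shell.txt HTTP/1.1" 404',
    '198.51.100.4 - - "GET /index.php?page=about HTTP/1.1" 200',
]
for line in sample_logs:
    if looks_like_rfi_probe(line):
        print("possible RFI probe:", line)
```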
If your website developers and administrators don’t know and don’t watch, you may not be as lucky as Cashdollar.
Chris Apgar, CISSP, CEO and president of Apgar & Associates, LLC is a nationally recognized expert and educational instructor on information security, privacy, HIPAA, the HITECH Act, state privacy law and electronic health information exchange. A nationally known speaker and author, Chris authored the McGraw-Hill Healthcare Information Technology Exam Guide chapter on the regulatory aspects of health IT.
Ever run into a vendor who claims to be a conduit versus a business associate (BA)? It happens all too often, in my experience. Here’s the problem: the conduit exception is a narrow one. If you’re storing PHI data, even encrypted PHI where you don’t have the encryption key, you’re a BA. Sign the Business Associate Agreement (BAA); it applies to you.
Not convinced? Let’s look at the preamble to the Omnibus Rule of 2013. HHS said, “The conduit exception is a narrow one and is intended to exclude only those entities providing mere courier services, such as the U.S. Postal Service or United Parcel Service and their electronic equivalents, such as internet service providers (ISPs) providing mere data transmission services. As we have stated in prior guidance, a conduit transports information but does not access it other than on a random or infrequent basis as necessary to perform the transportation service. Thus, document storage companies maintaining protected health information on behalf of covered entities are considered business associates, regardless of whether they actually view the information they hold.”
With that HHS summary in mind, you can see it’s pretty difficult to market services and storage to the healthcare industry without a BAA and think you won’t run afoul of HIPAA. Yet even as recently as a few years ago, our privacy and information security firm would encounter storage vendors and document sharing vendors who would not sign a business associate agreement. Again, just because you can’t access the PHI doesn’t mean you’re not a business associate.
In OCR’s May 2019 guidance, you’ll find a list of BA liabilities. Those remind BAs of their compliance responsibilities in regard to HIPAA regulations. OCR’s reminder list also notes that BAs have a duty to execute a business associate agreement with their BA subcontractors. What isn’t mentioned, but is required, is that covered entities (CE) and BAs must execute a BAA with each other. So if you’re not an internet service provider (ISP), or the US Postal Service (and the like), plus you store PHI, you need to execute a BAA to be in compliance with HIPAA regulatory requirements.
I’ll end with a cautionary note about vendors convinced they aren’t a business associate. Covered Entities, if your vendor is unwilling to sign a BAA, yet they have access to your PHI, it’s probably a good idea to find another vendor. It may be that your vendor who stores paper charts or other PHI doesn’t realize that they’re a business associate. Or it could be that, in the case of a storage unit, the storage facility owners simply don’t know what’s being stored. But if PHI is involved, then you need to execute a business associate agreement.
Whether you’re a physician practice, a medical transcription service, or a TPA providing a health plan with claims processing services, you’re dealing with HIPAA compliance. Give us a call: 503-384-2538 for help to assure you’re on top of it.
By now you’ve likely heard that Amazon is moving into the HIPAA space with Alexa. In conjunction with their partners, they’re launching what Amazon calls “HIPAA compliant” apps. If only it were that easy to create a HIPAA covered app, or as Amazon calls it, skill. As with Amazon Web Services (AWS) it’s ultimately up to the individual developers to honor the law. While Amazon may well be a trusted third party, if developers don’t build apps or “skills” with privacy, security and HIPAA compliance in mind, I wouldn’t trust Alexa with any of my healthcare data.
Having worked with a number of software development companies in the healthcare space, I can tell you that more often than not developers want to create cool and useful things. The problem with that is cool and useful don’t always automatically align with security and privacy needs. In fact, in the development process you won’t often find security and privacy at the top of the priority list.
Time, Trust & Alexa Users
Trust but verify, folks. If I were the covered entity or business associate planning to eventually trust the Amazon platform to adequately secure protected health information, I would want assurances and proof that the developers of any app/skill have privacy and security baked in.
Granted, Amazon has indicated that trust takes time to build. That it will be a while before patients widely trust Alexa with sensitive health information. Ok, let’s say we’re down the road a way. Trust has been earned. Now we face another potential issue that has nothing to do with the platform or any of its available apps/skills. The concern lies with the end users. End users are not always savvy when it comes to protecting sensitive personal information. I think this is where there’s a definite need for some education on the part of partners, Amazon and healthcare providers.
Right now, when Alexa is on, it listens to all voices in the room, all the time. How awful would it be if an end user thought he or she was taking advantage of touted HIPAA compliant solutions but instead was airing sensitive information using another Alexa app? Hopefully Alexa will also be smart enough to not broadcast protected health information like “You need to follow up with your [insert-private-condition-here] specialist” to the whole house.
Alexa (Amazon), are you listening?
Consumers on the warpath to protect personal data privacy are making strides in state houses. For instance, here’s an update on Oregon’s Senate Bill 703 regarding the sale of health information. If you use Big Data at all, you’ve probably been following this Bill. It basically says that anyone selling personal health information, even when thoroughly de-identified, would need to pay the source – i.e., you and me – for the privilege. As you may imagine, research groups and analysts who may touch any Oregonian’s de-identified PHI, not to mention privacy officers at the source of de-identified data, are watching this closely.
You can likely thank Facebook backlash for this Bill. Taking personal data and sharing it without user knowledge has caused huge problems for the social media giant. Now we’re hearing that they’re going to reel it all in, but how do you get the genie back in the bottle? The trust is gone, and SB 703 is just one instance of how outraged consumers are at how data is being used.
From a compliance perspective, the information, aka personal data, is already de-identified PHI, so that’s not the basis for the Bill. It’s a clear call for personal data privacy protection well beyond what we’ve seen up to now. You can also look at the yet-to-go-live California Consumer Privacy Act as another example of privacy protection taken to the Nth degree.
This isn’t to say that personal data privacy isn’t important because it absolutely is, it’s merely to point out that the logistical reality of complying with either SB 703 or CCPA is a nightmare we’ve yet to face. You can hardly blame people for playing ostrich when confronted with such a daunting task. You can also hardly blame people for pushing back on companies being able to use personal information for free.
We’ll be writing more on CCPA and its potential effect on business operations both outside and within the healthcare environment. In the meantime, should Oregon’s SB 703 move further down the path to fruition, we’ll weigh in on that, too.
Need specialized insight on these and other data privacy and information security regulations? Contact Apgar & Associates, LLC at 503-384-2538. Our in-the-trenches knowledge and professional consulting will help you and your workforce with compliance and critical certification preparation.
When your goal is to protect PHI on laptops and mobile devices, keep in mind that information security is only as strong as its weakest link. Lenient information security standards exponentially increase the risk to sensitive healthcare data. They can also place you in non-compliance with the HIPAA Security Rule. On top of that, the courts are likely to see lax standards as a security failing in the case of data breaches. Now you’re looking at an expensive lawsuit!
An abbreviated overview of the HIPAA Security Rule’s general requirements calls for covered entities and business associates to do the following:
- Ensure the confidentiality, integrity, and availability of all electronic protected health information the covered entity or business associate creates, receives, maintains, or transmits.
- Protect against any reasonably anticipated threats or hazards to the security or integrity of such information.
- Protect against any reasonably anticipated uses or disclosures of such information that are not permitted or required under Subpart E of this part.
Can you demonstrate device encryption?
CEs and BAs, keep in mind, too, that you can’t take advantage of the HIPAA Breach Notification Rule safe harbor if you can’t demonstrate that stolen devices were actually encrypted at the time. If the device isn’t locked down, it’s hard to prove that the device was secure and that no PHI or PII was accessed when the device was lost or stolen. While Apple tablets and smartphones are natively encrypted, either end users or IT staff need to enable encryption for Android tablets and smartphones, Windows laptops, tablets and smartphones, and Macs. Take the below steps to protect laptops, tablets and smartphones – and to protect PHI.
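For reference, each major platform ships a native tool that can document encryption status: BitLocker’s `manage-bde -status` on Windows and FileVault’s `fdesetup status` on macOS. The sketch below just maps the platform to the appropriate command; actually running it, parsing output, and archiving the evidence (for safe harbor purposes) is left to your MDM or scripting:

```python
# Sketch: which native command demonstrates disk encryption status
# on each platform. The command names are the standard OS tools;
# output parsing and evidence retention are left out.
import platform

ENCRYPTION_STATUS_COMMANDS = {
    "Windows": ["manage-bde", "-status"],   # BitLocker status
    "Darwin":  ["fdesetup", "status"],      # FileVault status (macOS)
}

def encryption_status_command():
    # Returns the status command for this OS, or None if unknown.
    return ENCRYPTION_STATUS_COMMANDS.get(platform.system())
```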
7 Steps to Laptop Data Security & Intrusion Protection
- Remove administrator privileges for all company-owned laptops and lock down devices
- Install and maintain mobile device management tools that support:
  - Remote wipe of hard and flash drives
  - Device tracking in the event a device is lost or stolen
  - Enforced encryption of hard drives and flash drives
- Install and periodically update anti-malware applications
- Install and periodically update firewall applications
- Enforce strong passcodes or passwords and require periodic password changes
- Enable biometric authentication if available
- If using Windows, properly set share and Microsoft New Technology File System (NTFS) permissions to keep network snooping to a minimum and unauthorized users out of sensitive files stored locally
6 Ways to Protect Tablet & Smart Phone Security & Prevent Intrusion
- Remove administrator privileges for all company owned tablets and smartphones and lock down devices
- Install and maintain mobile device management tools (company owned and personally owned; BYOD) that support:
  - Remote wipe of flash drives
  - Device tracking in the event a device is lost or stolen
  - Enforced encryption of flash drives
  - Preferably, segregation of company data from personal data on BYOD devices
- Install and periodically update anti-malware applications (Exception: iPhones and iPads)
- Install and periodically update firewall applications (Exception: iPhones and iPads)
- Require strong passcodes or passwords and regular password changes
- Enable biometric authentication if available
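The strong-passcode item in both lists can be sketched as a simple policy check. The thresholds here are purely illustrative – HIPAA does not prescribe specific password parameters, so set yours based on your risk analysis:

```python
# Sketch of a minimal "strong passcode" policy check an MDM profile
# might enforce. The thresholds are illustrative, not regulatory.
import string

MIN_LENGTH = 12  # illustrative minimum

def meets_policy(passcode):
    return (
        len(passcode) >= MIN_LENGTH
        and any(c.islower() for c in passcode)
        and any(c.isupper() for c in passcode)
        and any(c.isdigit() for c in passcode)
        and any(c in string.punctuation for c in passcode)
    )

print(meets_policy("Winter2025!go"))  # True
print(meets_policy("password"))      # False
```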
Device hardening is considered a reasonable security safeguard, which means it’s a “must do” when it comes to HIPAA compliance and, in some states, state law compliance. Take the necessary steps to protect PHI and avoid the bad headlines, regulatory penalties, lawsuits and lost business. If you need to beef up compliance planning, conduct your security risk analysis, or just aren’t sure where to start with any of it, give us a call: 503-384-2538.
Malware attacks via phishing knocked it out of the park in 2018. Phishing attacks account for an inordinate number of data breaches and compromised networks. In fact, the Identity Theft Resource Center (ITRC) reported that “one-third of all security incidents last year began with a phishing email.” As the cyberattacks get sneakier, everyone – workforce and consumers alike – is at ever-higher risk of breaching data privacy and security.
5 Pointers to Avoid Getting Hooked
- Conduct penetration testing, aka pen testing. Pen testers employ the same tactics as hackers, but to your benefit. You’ll discover how effective your firewalls and patches are as well as how well your workforce “gets” anti-phishing training.
- Conduct phishing-specific training. Human error continues to be a big gap in privacy and security effectiveness. One click or tap on a link or attachment opens the gate to phishing malware. Scenario-specific, interactive, out-of-the-box training sessions make the biggest difference.
- Stay on top of the latest phishing and smishing (mobile device phishing via text) techniques so you can take measures to prevent systems infiltration as well as keep your workforce alerted.
- Encourage transparency internally and externally. Whether it’s an employee who opened the backdoor or a third party partner, you need to know when security has been breached. Promote admitting, “I may have messed up” and what to do the second it happens (aka per security incident response).
- Keep anti-virus, anti-spam and anti-spyware software current. Hackers are smart cookies but if you’re not on top of essential technology safeguards, they don’t even have to try.
If we were going to choose one tech tip and one human error prevention tip to focus on in Q1, we’d select pen testing and anti-phishing training. One pen testing researcher is so intent on lighting a fire against phishing that he published his scarily successful pen test.
And should all prevention measures fail, you’ll need backup. Which brings us to: Keep recent backup system copies readily accessible. If phishing does get through, you’ll want to be able to quickly go back to a “safe” backup so you can get operations back up and running. With response measures in place, the sooner you know, the faster you can act.
A few days ago, after making multiple attempts on behalf of a client to verify and clarify how join.me supports HIPAA compliance, specifically participating in Business Associate Agreements, I found that they do not. In fact, they do not consider themselves subject to HIPAA regulations, regardless of the possibility of PHI being stored on the join.me platform. Therefore – as you’ll see in the exchange below – they “do not sign BAAs.”
So, a warning to those who use join.me and store recordings that include PHI on the join.me platform – join.me is unwilling to execute a business associate agreement with covered entities and business associates. If you need a video communications platform that supports the storage of PHI and is HIPAA compliant, it’s wise to look elsewhere.
Below is a reprint from a warning I posted on LinkedIn just the other day. Please feel free to share your experiences of similar situations and vendors with me in the comments area on that post. Here’s my email exchange with join.me.
I’m attempting to get an answer one last time. I represent a mutual customer who currently uses join.me who is required to comply with HIPAA. Given the fact that protected health information (PHI) may be stored on join.me‘s platform in the form of recordings, join.me is required to sign a business associate agreement with my client. If join.me is unable or unwilling to sign a business associate agreement, I need to recommend that my client change to another conferencing platform such as Zoom or WebEx who will sign a business associate agreement.
On Dec 2, 2018, at 6:42 PM, join.me Support wrote:
Thank you for contacting join.me.
We actually do not sign BAAs because our services are not HIPAA compliant as HIPAA compliance, per se, is applicable only to entities covered by HIPAA regulations (e.g., healthcare organizations).
That being said the technical security controls employed in the join.me service and associated host and client software can meet or exceed HIPAA technical standards. But again, we are unable to sign any BAA’s.
If we have answered your question, we will send you an email in the next few days asking for your feedback. We value your opinion and thank you in advance for taking the time to click on the survey link and letting us know how your experience was with our team.
Thanks again for using join.me.
L*** | Customer Support Representative
My reply to join.me
You (join.me) answered my question. My client will be looking for another vendor. While the functionality may be there to secure the data, my client would be violating HIPAA by continuing to use the join.me platform. As the US Department of Health and Human Services, Office for Civil Rights has stated, claiming to not be a business associate doesn’t mean you actually aren’t one. I also feel a need to remind covered entities and business associates they shouldn’t be contracting with join.me if the platform will be used to store recordings that contain PHI.
Chris Apgar, CISSP
Ultimately, I had to recommend to the client that they not use join.me but check into online video and document storage with vendors who will sign BAAs, such as Zoom or Webex. The instance serves as a reminder that no matter how technically secure a vendor professes to be, if you plan to use their platform or services for anything pertaining to PHI, there needs to be a BAA in place, documenting that they follow HIPAA regulatory requirements as relates to PHI protection. And as I indicated to the customer support representative above, claiming that you’re not a business associate doesn’t magically transform you into not being one!
Privacy & Security Forum Update: OCR Activity, Audit Protocols, Ransomware & the HIPAA Security Rule
Julia and I had the pleasure of attending the 2018 Privacy & Security Forum a couple of weeks ago. One of the sessions I attended was focused on what’s happening at OCR these days. The speaker was Roger Severino, Director of OCR, and the moderator was Adam Greene, partner at Davis Wright Tremaine, LLP. I heard about new OCR activity, got an answer to my question about the future use of the OCR audit protocols, and key OCR takeaways. I have the pleasure of passing the Forum’s highlights on to you.
OCR audit protocols use.
The big news to me was the answer to one of my questions about the OCR audit protocols. For over a year, we’ve been saying that for investigations and enforcement activity, OCR is likely to use the audit protocols that were updated from the phase 2 audits. I took the opportunity to ask the top authority at OCR about future use of the protocols. Mr. Severino confirmed that’s just what OCR intends to do – and may already be doing.
Other OCR activity includes:
- Updating HIPAA/FERPA guidance (jointly with the US Department of Education)
- Issuing a notice of proposed rulemaking (NPRM) or request for information (RFI) on the HITECH Act accounting of disclosures language (the last NPRM was not well received by the industry and privacy advocates)
- Evaluating ways OCR can distribute funds received as part of enforcement related civil monetary penalties and settlement agreements to victims of breaches of their PHI
That’s a fair amount of activity. The only caveat is we don’t know how soon “soon” is.
FBI and FTC weigh in on ransomware attacks.
I also attended a session that featured speakers from the FBI and the FTC. Along with Mr. Severino, the FBI speakers said the first step covered entities and business associates should take if attacked by ransomware is to contact the FBI. The FBI has agents in place to investigate ransomware and to help covered entities and business associates get their data back without paying a ransom. Keep this in mind when you’re updating your security incident response plans, especially given that local law enforcement may not have the resources to assist with an investigation.
Is the HIPAA Security Rule being updated?
There has been much talk over the past few years about the need to update the HIPAA Security Rule. The Director indicated that he thinks there is nothing fundamentally broken with the Security Rule, so it’s unlikely the rule will be amended any time soon. The Security Rule is technology neutral and flexible. It hasn’t become obsolete due to changes in technology, and there has been a lot of change since the rule took effect in 2005.
OCR phase 2 audit results and plans for enforcement.
Mr. Severino shared that OCR was finalizing phase 2 audits and results will be published soon. As far as the audit program goes, he indicated that there would likely be no more formal audits. Instead, the audits would become part of OCR’s enforcement activity. He believes this promotes an enforcement mindset with a higher-level rigor, similar to enforcement activity conducted by the US Department of Justice.
An audience member asked if enforcement would continue unabated or would be curtailed under this administration. The answer: OCR is still on track with enforcement. Mr. Severino would like to see enforcement go down as a reflection of the expansion of a culture of compliance, which OCR has been pushing since 2011. He did add that the industry was far from there today.
Adam Greene asked Mr. Severino to provide three takeaways for the audience. The Director said:
- You need to treat PHI as if it was a bar of gold. That includes conducting periodic risk analyses, encrypting PHI and securing mobile devices.
- “We’re from the government and we’re here to help” – tap into OCR resources through its website, the most popular website for the US Department of Health & Human Services.
- “Help us help you” – review NPRMs, RFIs, and other information OCR would like input from the industry about and provide feedback. Periodically check regulations.gov to check on opportunities to provide OCR feedback.
All in all it was a great conference and good to get information from the proverbial horse’s mouth. Julia will be sharing information about some of the sessions she attended. Look for more in the weeks to come!
Has this happened to your company? The sales team has a hot prospect who wants them to conduct an information security audit. Sales promises that not only can that happen, but also that it will happen by a specific deadline. The problem? No one checked with the C-suite or operations management before committing.
This communication – and timing – disconnect between sales and operations can cost companies both prospects and current customers. Information security is traditionally implemented and maintained behind the scenes. In today’s market, particularly for healthcare vendors, good market positioning means that information security has to be front and center.
As an example, the demand for a SOC 2 audit report is on the rise. Healthcare vendors and other service organizations are being asked for it as proof of a sound information security program. We work with clients as they prepare for and proceed through SSAE 16 SOC 2 audits. In cases where vendors engage a CPA firm to conduct a SOC 2 audit, we find that the decision to go through an information security audit comes from two places: the C-suite and sales. The C-suite sees the audit as a way to retain current customers and to maintain marketability. The sales team looks at it as another strong sales point.
What happens when the sales team over-promises?
If the sales team sells a product or service based on the assumption an information security audit can be done without checking in with its IS department, they may find themselves in a huge bind. It’s even more problematic if the company executed a customer contract along with the promise to conduct a SOC 2 audit. Imagine how that will come back to bite the company when the customer demands a copy of the nonexistent report!
In one instance, a company we’ve worked with in the past lost out on a multi-million dollar deal based on an over-promise. Sales promised a SOC 2 audit would be completed, and the company then delayed it for a couple of years. The prospective client walked away from the table. Remember, the proverbial grapevine works well, in the healthcare industry or otherwise. If you’re doing a great job, people will hear about it. If you fall on your face, they’ll hear about it faster.
Sales teams like to run full steam ahead, promising results, valuable products and enhanced service. That’s a good thing. That’s how companies stay in business and continue to grow. Often, though, IT / IS is left trying to figure out how to keep the promises made.
Vendors for healthcare and other service organizations are under mounting pressure to prove customer data is safe and secure. Information security is a market driver. If sales and the information security team aren’t on the same page, the outcomes could be disastrous for business. So communicate amongst yourselves! Sales, IT and the information security team. Actively involve the C-suite. Then you can be assured the company is steered in the right direction, with the right resources. When promises measure up to delivery, everyone is happy.
I’m often taken aback by some of the marketing material I receive from privacy and security training vendors. This is clearly a “buyer beware” moment. A review of a training vendor’s material can give you some insight into their credibility, particularly if you’re already somewhat knowledgeable about the material that needs to be covered in any privacy and security training session you’re looking to enroll in. The training risk comes when someone doesn’t have a good grasp of the material, because they may well be fed outdated information or, worse, partial truths about HIPAA.
I may be a little sensitive because of the type of privacy and security training that we and some of our partners provide. Timely, current event-relevant, regulation-sensitive training. But in this instance, we received a vendor mailing focused on email integration and texting in the healthcare communications environment. Sounds entirely reasonable, right? Unfortunately, the marketing copy reflected outdated or even misleading information.
Marketing hype or regulatory reality?
The vendor’s privacy and security training marketing materials included these topics and observations, presented as facts:
- Email and texting are in the early adoption stages in healthcare settings. Texting is becoming the preferred engagement, overtaking paging.
- Mobile phone use for texts or calls relating to payment, to provide critical healthcare information or other official purposes is a no-no for providers and violates HIPAA.
- Risk evaluation and management related to business communication that may or may not contain PHI is under scrutiny. Improper exposure may be considered an official breach.
- Violation enforcement can include fines up to $50,000 per day and more.
- Impacts of the Telephone Consumer Protection Act (TCPA) limit the use of cell phones for payment and healthcare purposes unless consent is obtained.
Let’s take it from the top. First of all, texts and emails are common in today’s healthcare environment. While the topic is worth addressing as part of ongoing training (and hopefully touches on serious email threats like phishing), it’s not a new issue.
Secondly, clarification is in order when it comes to texts. HIPAA doesn’t require covered entities to obtain consent before, say, sending an appointment reminder via text message. I do, however, think it’s a courtesy that should be extended because not everyone is comfortable with anything to do with their health being texted to them.
Now to take it a step further, if the email or the text message is encrypted, there are really no HIPAA consent requirements. If the individual requests texts and emails be sent unencrypted, covered entities do need to document that the individual making the request has been informed of the dangers associated with unencrypted transmission of PHI. That’s not the same as obtaining consent.
When it comes to risk evaluation and risk management, yes those are hot items. And while I do wonder what an “unofficial” breach is, I agree the improper exposure of PHI may result in a reportable breach. Please keep in mind that if the exposure is unintentional, like a misdirected email, it may or may not be a reportable breach. That’s where the HIPAA Breach Notification Rule’s four factor risk assessment comes into play.
Here’s where I seriously part ways with the material: the violation enforcement information and the penalties.
If you’re doing the right thing, discover a breach, follow the required investigation and notification process and you timely report the breach to OCR, you likely won’t be fined by OCR. Now, if there is a breach and OCR finds you haven’t conducted a risk analysis, haven’t adopted current and enforceable policies, haven’t trained your staff and so on, then yes, chances are higher that you’ll be paying in the form of a penalty or monetary settlement.
As far as the $50,000 per day, OCR can levy penalties up to $50,000 for a single violation up to a maximum of $1.5 million per calendar year. There’s no reference in any OCR guidance that violations are counted in days. They could in fact be counted as the number of records breached. If, as an example, 1,000 patients’ PHI was breached, OCR could count that as $50,000 X 1,000 (if you’re found guilty of willful neglect). Because the penalty amount calculated this way would exceed $1.5 million, the maximum penalty amount would be levied unless a lower amount was negotiated between OCR and the breaching entity.
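The penalty math in that example can be illustrated in a few lines. The figures are the statutory amounts cited above; OCR’s actual counting method varies and penalties are often negotiated downward:

```python
# Illustration of the penalty arithmetic described above: per-record
# counting hits the annual cap very quickly. Amounts are the figures
# cited in the post, not a prediction of any actual OCR penalty.
PER_VIOLATION_MAX = 50_000   # maximum per single violation
ANNUAL_CAP = 1_500_000       # maximum per calendar year

def estimated_penalty(records_breached):
    raw = PER_VIOLATION_MAX * records_breached
    return min(raw, ANNUAL_CAP)

# 1,000 breached records: $50,000,000 raw, capped at $1.5 million.
print(estimated_penalty(1_000))  # 1500000
```

As the example shows, at $50,000 per record the cap is reached at just 30 records, which is why large breaches under a willful-neglect finding tend to land at the annual maximum.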
Finally, the TCPA. I need to point out that the TCPA was enacted in 1991 – pre-HIPAA – and addressed robocalls. It had nothing specifically to do with text messages and healthcare.
The bottom line on healthcare privacy and security training.
Emails and texting to communicate healthcare information have been going on for years. Keep in mind that while OCR’s “Right to Access” guidance emphasizes the need for covered entities to communicate effectively with patients, there is no reference to text messaging or emailing other than to state that patients can request communications be made using unencrypted email as long as the associated risks are clearly communicated. There is zero reference to text messaging in the guidance or in HIPAA itself.
I wholeheartedly agree that you need to regularly conduct privacy and information security training with your workforce. I also agree that you need up-to-date privacy and security training documentation.
I’m concerned that there are entities not up on the risks and how those risks are associated with patient communication. The first HHS edicts that apply to the use of email to communicate with patients date back to January 2013 (the Omnibus Rule) and February 2014 (the HIPAA CLIA Rule).
Training vendors need to be vetted. If you or your staff are going to take your valuable time to attend any vendor-offered training, you need to know that it has more real-world application to privacy and security risks, engages employees on how they can protect ePHI, and accurately reflects regulatory requirements. More HIPAA realities, less marketing myth.