Sunday, February 16, 2014

mHealth and HIPAA breaches - where are they?

For the last few years we've seen dozens of HIPAA violations revolving around doctors and lost or stolen laptops. More often than not, those laptops are unencrypted and contain patient ePHI.

What we haven't seen is a single reported HIPAA violation from a covered entity over a lost or stolen mobile device. If it has happened, I have missed it.


Back in December I made news on this blog by showing how easy it was (30 seconds of work) to exploit "Certified" mobile health Apps. These were Apps whose makers paid good money to be certified as one criterion of being "safe and secure" and protecting ePHI.

But with the slightest bit of handiwork I demonstrated what anyone with a computer could have done: I was able to pull out unencrypted ePHI, and I was even able to perform a man-in-the-middle attack and capture unencrypted traffic to a cloud server.

Why is this important?

Because even a "Certifying" entity failed to properly validate the security of mobile Apps. This was a certifying entity marketing itself to hospitals as the "Authority" on recommending mHealth software.

But let's step back and look at some figures.

It has been reported since 2010 that 80% of doctors now use their mobile phones while seeing patients. If we take the pool of over 500,000 docs out there, that is some 400,000 mobile devices floating around being used during patient visits.

What I haven't found is any information breaking down how mobile phones are actually used during patient visits. For example, there is a big difference between using a mobile app as a reference and using one that stores patient ePHI on the device.

Though, I have seen a staggering number of physicians who get paged via SMS text messages that include patient ePHI... Again, eek.

But we know for a fact that doctors are using SMS and MMS to communicate with one another: patient information, pictures, all over unencrypted channels. That is why we are seeing a surge in "Secure Messaging" products (I think half the booths at the mHealthSummit were secure messaging platforms). In reality, doctors are trying to solve a real problem, and I doubt most of them realize that SMS and MMS are terribly unsafe. Then again, medicine still uses miserable fax machines to send ePHI, so are SMS and MMS any worse? No.

What does this mean?

I'll admit this article is a bit self serving. I am an advisor/partner in a mobile health startup and we have invested a lot of time into building a product that is rather tough to hack.
Note: The concept of an "unhackable" product is a farce. Even the Department of Defense and three-letter agencies know this. The idea is that you make your software so time consuming to hack that the juice isn't worth the squeeze.
But our investments in building the technology aren't self serving; they are what mobile health should be. If a doctor has his or her phone stolen, there should be minimal risk that anything could be compromised.

There are generally a few sets of people building mobile health software at the moment: indie developers, doctors who are learning how to program, and large companies entering the mobile market.

What most of the people writing mHealth software lack is any formal understanding of how to build secure software. This is where the vulnerabilities are introduced. Security is hard, it is expensive, and for most it seems like an afterthought. "The device takes care of it" is a preposterous line of thinking.

With all that in mind, let's get back to the figures...

A reported 4.3% of smartphones are lost each year, according to McAfee. Against those 400,000 physician devices, that means roughly 17,000 are lost every single year. Why is that scary? Because 57% of devices have no security features enabled: no PIN lock, no passcode, no encryption.

So that means potentially over 9,000 unencrypted and unprotected mobile phones are lost or stolen every single year. And since BYOD is in full swing (even though most organizations lack decent BYOD policies) and many docs aren't hospital employees themselves, business-owned and managed smartphones aren't really an option.
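The back-of-the-envelope math behind those numbers is simple enough to sketch; the inputs below are the post's own assumptions (500,000 docs, 80% phone use, McAfee's 4.3% loss rate, 57% unsecured), not independently verified figures:

```python
# Back-of-the-envelope figures; inputs are the post's assumptions.
physicians = 500_000        # rough pool of practicing docs
mobile_use_rate = 0.80      # share using phones while seeing patients
annual_loss_rate = 0.043    # McAfee's reported smartphone loss rate
unsecured_rate = 0.57       # share of devices with no PIN/encryption

devices = physicians * mobile_use_rate            # ~400,000 devices
lost_per_year = devices * annual_loss_rate        # ~17,200 lost annually
unsecured_lost = lost_per_year * unsecured_rate   # ~9,800 unprotected losses

print(round(devices), round(lost_per_year), round(unsecured_lost))
```

Rounding to the post's figures, that is about 400,000 devices in use, 17,000 lost per year, and well over 9,000 of those with no protection at all.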

Are you scared yet? You should be. A doctor gets a laptop stolen out of his car and reports are filed (as they should be). A doctor has his smartphone stolen, and he just buys a new one, and no one thinks twice.

What can be done?

First, hospitals should have strict BYOD policies in place. At a minimum, MDM should enforce strict passcode and encryption policies.
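As a sketch of what a "strict passcode policy" can look like in practice, here is a minimal passcode payload from an iOS configuration profile, the kind of thing an MDM pushes to enrolled devices. The payload type and key names follow Apple's configuration-profile format; the specific values and the identifier are illustrative, not a recommendation:

```xml
<!-- Illustrative iOS passcode payload; values are examples only. -->
<dict>
    <key>PayloadType</key>
    <string>com.apple.mobiledevice.passwordpolicy</string>
    <key>PayloadVersion</key>
    <integer>1</integer>
    <key>PayloadIdentifier</key>
    <string>com.example.hospital.passcode</string>
    <key>forcePIN</key>
    <true/>
    <key>allowSimple</key>
    <false/>
    <key>minLength</key>
    <integer>6</integer>
    <key>maxFailedAttempts</key>
    <integer>10</integer>
    <key>maxInactivity</key>
    <integer>5</integer>
</dict>
```

The nice side effect on iOS is that requiring a passcode also activates the device's built-in storage encryption, which is exactly the at-rest protection those 57% of unsecured devices are missing.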

But, the issue goes deeper than just that. We are still in the Wild Wild West of mHealth. How do hospitals and doctors offices know any mHealth App off the App Store is safe and secure? Even I don't know a good answer for that. Certification so far has been a complete farce and failure.

With MDM you should be able to view which Apps physicians are using and gauge from that where your risks are.

We have taken the approach of documenting all of our security practices, primarily those on the device, in a white paper we give to any client who asks. Our goal is to keep ahead of the curve in adding new security features, but it is an ever-evolving field. There are more gifted hackers trying to exploit things than there are people who can write good security software.

If you are a doctor or a hospital administrator, you need to start understanding which products you and your staff are using and validating internally whether they are safe and secure. You are at risk and you don't even realize it.

Lost and stolen mobile devices are a major threat to patient ePHI, and it is time people stopped sweeping the issue under the rug.
