Tuesday, December 10, 2013

Certification for the lack of certification

A big swing and a miss.

This is the second iteration of this blog post. I had originally written it and given the App developers involved the traditional "notify and wait before public disclosure" window that white hat hackers extend. Note: I don't consider myself a hacker by any means. I scrapped the entire article and this is a rewrite.

Happtique created a bunch of buzz earlier this year by pushing the digital health industry to self-regulate rather than be forced into regulation by the FDA, ONC, etc. A noble goal; I'd much rather have less regulation in the industry.

Happtique has now released their first class of certified mobile health products. They have certified 19 Apps against a variety of criteria, of which security interests me the most.

Because of my background developing secure software for the Department of Defense and various other government agencies, and building mobile apps for healthcare, security is priority number one in my mind.

Mobile devices used by physicians and end users hold an immense amount of ePHI. As often as cell phones are lost by the average consumer, logic dictates that a staggering number of iPhones and Android phones have been lost by physicians - all with varying levels of data protection for the ePHI stored within their Apps.

I cannot find a single instance of a HIPAA data breach involving a lost cell phone. Lost laptops seem to be the hot item that triggers notification of a data breach. Either entities are ignoring these breaches or they are ignorant of them.


Happtique's certification sells itself as:

Happtique believes that in order for mobile health apps to be widely adopted into routine clinical practice and care management, healthcare providers and patients must have confidence in the apps they are using. Happtique developed HACP to help providers, patients, and others easily identify medical, health, and fitness apps that deliver credible content, contain safeguards for User data, and function as described.

I also brought up a host of issues I found with Happtique earlier this year and communicated them to Happtique, and seemingly nothing has been fixed.

On December 2nd, Happtique announced the first series of Apps it certified. I picked MyNetDiary's Diabetes Tracker at random.

I installed the App on a jailbroken iPhone, which allows me to pull data off of the iPhone and see what I can find.

Case #1

I found two startling things. First, my username and password were stored in plain text, in a flat text file. Second, all the information I entered into the App was also stored in plain text, in a series of files containing JSON.

Today I dug into the App again to see what else I could find. I wanted to verify whether the App was vulnerable to a man-in-the-middle (MITM) attack.

A man-in-the-middle attack is where a nefarious party intercepts the communication between the App and the server. They decrypt the SSL connection, pull out your data, and forward it on to the server. When data comes back, it is decrypted by whomever is running the MITM attack, re-encrypted, and returned to the requesting App. Example: when you walk into Starbucks and connect to an open WiFi network, a nefarious agent could have set up a fake WiFi hub and can steal all of your data.

To my horror, this App, which lets you input glucose levels, was sending information to the server over HTTP - they didn't even use HTTPS. This means user glucose readings are stored and sent entirely in plain text, and anyone could steal the data by either stealing your phone or hijacking your WiFi connection.
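For illustration, here is a Python sketch of how trivially an eavesdropper can read credentials out of a captured plain-HTTP request. The host and field names are made up for the example, not MyNetDiary's actual API:

```python
# A raw HTTP POST, exactly as an eavesdropper on open WiFi would capture it.
# Host and field names are hypothetical, for illustration only.
captured = (
    "POST /api/login HTTP/1.1\r\n"
    "Host: health-api.example.invalid\r\n"
    "Content-Type: application/x-www-form-urlencoded\r\n"
    "\r\n"
    "username=alice&password=hunter2&glucose=112"
)

# No decryption needed: the body is sitting there in plain text.
body = captured.split("\r\n\r\n", 1)[1]
fields = dict(pair.split("=", 1) for pair in body.split("&"))
print(fields["password"], fields["glucose"])  # hunter2 112
```

That is the entire "attack" against HTTP: read the bytes off the wire.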

I reached out to MyNetDiary on the 2nd and then reached out to them on Twitter a bit later. I haven't heard a single peep from them.

Case #2

Next, on December 6th, I picked TactioHealth5 and performed the same series of tests on their App.

TactioHealth5 also stored the username and password in plain text. There was one differentiator: TactioHealth5 stored the credentials in the iPhone Keychain. But they also debug-logged the username and password to a plain text file in the file system.
Update 12/11/2013: Looking over the files again, Tactio stores the password in the keychain, but it also saves it in plain text to a plist file in the App Preferences. The password is still written out to a log file.

All of the user-entered ePHI was stored unencrypted, in plain text, in an SQLite database.

Today I also performed the MITM attack on the App. TactioHealth5 is open to a MITM attack (they do use HTTPS), and they send the username and password in plain text to the server to validate the user.

I also reached out to Tactio to inform them of these issues. I heard nothing back from them. I also ran into them at the mHealthSummit in DC and asked them why no one got back to me. Tactio's representative informed me that (summarizing) "The version in the App Store isn't the one certified and they are fixing some issues in January."

I call shenanigans on that, because Happtique's registry lists version 5.0 as the certified version of TactioHealth5, and TactioHealth5 is currently on 5.1. So why would version 5.1 have added credential logging, introduced the MITM vulnerability, and stored data in unencrypted formats?

The issues


Plain text credentials

The username and password are stored in plain text. They failed to even hash or encrypt the credentials server side.
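For comparison, proper server-side credential handling is only a few lines. Here is a minimal Python sketch using salted PBKDF2; the iteration count and hash are illustrative choices, not any vendor's actual scheme:

```python
import hashlib
import hmac
import os

ITERATIONS = 100_000  # illustrative work factor

def hash_password(password, salt=None):
    """Return (salt, digest); the server stores these, never the password itself."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, digest):
    """Constant-time comparison of a login attempt against the stored digest."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)
```

With a scheme like this, a database leak exposes salted digests instead of reusable passwords.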

The Keychain

In iOS, the keychain is the "safe" place to store data. It encrypts data with AES-256, keyed off the user's PIN. The issue with the keychain is that it is only as safe as the PIN the user sets on the phone. It takes around 20 minutes to brute force the PIN on an iPhone, at which point you can derive all the data in the user's keychain.
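Some rough arithmetic shows why a 4-digit PIN falls so quickly. The guess rate below is an assumption back-derived from the ~20-minute figure above, not a measurement:

```python
# Back-of-the-envelope numbers for the 4-digit PIN brute force described above.
KEYSPACE = 10 ** 4            # PINs 0000 through 9999
GUESSES_PER_SECOND = 8.3      # assumed rate implied by ~20 minutes for the full keyspace

worst_case_minutes = KEYSPACE / GUESSES_PER_SECOND / 60
print(round(worst_case_minutes))  # ~20
```

Even a modest guess rate exhausts ten thousand possibilities in minutes; the keyspace is simply too small.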


Data at rest

SQLite should be the file storage format of choice, and developers can leverage SQLCipher to provide AES-256 encryption for the data at rest. Generally, such Apps force the user to enter a PIN within the App every time they access it.
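A sketch of the idea in Python: derive the database key from the in-App PIN rather than hard-coding a key in the binary. The PBKDF2 parameters here are illustrative; the resulting hex string is what you would hand to SQLCipher on each launch, after the user re-enters the PIN:

```python
import hashlib

def derive_db_key(app_pin, salt, iterations=64_000):
    """Derive a 256-bit SQLCipher key from the in-App PIN and a per-install salt.

    Parameters are illustrative. SQLCipher would consume the hex string via
    its key PRAGMA; nothing reusable is stored in the App binary itself.
    """
    key = hashlib.pbkdf2_hmac("sha256", app_pin.encode(), salt, iterations)
    return key.hex()
```

Because the key only exists after the user enters the PIN, pulling the SQLite file off the device yields ciphertext, not plain text rows.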

Jailbreak detection

Apps should leverage jailbreak detection to prevent themselves from running on exploited devices, ever.

MITM attacks

Preventing MITM attacks takes a bit of work. There isn't a terribly easy way to do it, but generally it requires leveraging SSL pinning to validate the certificate returned by the server.
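The core of SSL pinning is a simple comparison: hash the certificate the server presents and check it against a fingerprint shipped inside the App. A Python sketch of that comparison (the certificate bytes here are placeholders standing in for real DER data):

```python
import hashlib

# Fingerprint of the legitimate server certificate, baked into the App at
# build time. The bytes below are a placeholder for real DER-encoded data.
PINNED_SHA256 = hashlib.sha256(b"placeholder-der-encoded-cert").hexdigest()

def connection_is_trusted(server_cert_der):
    """Reject the connection unless the presented cert matches the pinned one.

    A MITM proxy must present its own certificate, which fails this check
    even if the device has been tricked into trusting the proxy's CA.
    """
    return hashlib.sha256(server_cert_der).hexdigest() == PINNED_SHA256
```

On iOS, the equivalent check typically lives in the connection's authentication challenge delegate, before any request data is sent.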


I changed my mind about waiting to publicly disclose these issues. The fact that neither company has responded, and that Tactio chose to tell me what seems to be nothing more than misinformation, tells me that neither company cares.

I'll be very clear: this is a warning to anyone developing mobile products. If someone writes to say "Hey, I found some vulnerabilities in your product," you contact them as soon as you read the email. Everyone makes mistakes; it is just the way development is. It is how you mitigate those mistakes that matters.

Both of these products collect ePHI, and MyNetDiary advertises 2.7 million users. That is quite a bit of vulnerable ePHI sitting on many mobile devices.


Now we get to the bigger issue: Happtique. Happtique farms out the validation of the actual software to Intertek.

I cannot comprehend how both Happtique and Intertek failed to catch this litany of issues present in both products. Storing plain text passwords is unreal. Storing unencrypted ePHI is crazy. Sending ePHI over HTTP is inexcusable.

I had met and talked to Happtique's CEO this week at mHealth, but that was before I discovered the HTTP communication issue.

The industry, consumers, and anyone putting stock in Happtique's certification should understand these issues.

Happtique can, and I am sure will, try to correct these issues, but for the life of me - how did anyone miss such massive security holes? These aren't just things you miss.

Certifying your product just so you can have a shiny sticker that says you are certified is the worst possible reason to go through certification. Certification should mean something. Hasn't HIT learned enough from EHR certification?

Health IT doesn't need yet another entity siphoning off money and providing nothing in return for the investment. There is enough of that already.

I wouldn't put a single App I have developed through their certification process at this point.

Update 12/11/2013:
Files from the MyNetDiary App
Files from TactioHealth5


  1. Thanks for taking the time to survey for this stuff, and for writing about it. Count me in as interested in an updated post if you ever hear from either company.

  2. Tyler, no worries. I was on the fence about disclosing the names of the products, but their lack of responses and issues presented were pretty bad, IMHO - especially for personal health information. I have extended the courtesy of private disclosure plenty of times before - but those entities were very responsive.

    I'll update if I hear anything.

  3. Great piece, Tyler. We need you to write about these security problems on an ongoing basis. Thanks, DCK

  4. Harold,

    Thank you for the excellent article. I have two quick questions / comments with regard to your recommended security approach:

    1. Encryption of "at rest" data: iOS provides several data protection classes. The "NSFileProtectionComplete" class encrypts the data files unless the app is open. While it is still subject to brute force attacks, I thought it is simpler and more secure than SQLCipher? After all the SQLCipher key needs to be stored in the application itself.

    2. Preventing the app from running on Jailbroken devices. To me, this goes against the whole idea of "patient centeredness". Who are we to override patient preference, and dictate which device the patient must use? If a patient is willingly putting his own data at risk (she could write it on a paper journal after all), I am not sure the app developer should play "police man".

    Would love to get your thoughts on these. Thanks!


  5. #1 - NSFileProtectionComplete is subject to brute force which is the issue. Only 44% of people even use a PIN on their phone (I can't find the stats on people that use more than a 4 digit PIN) - so it leaves a huge vuln. With SQLCipher, you couple forcing a user to use a PIN within your product, hash+salt it and use that to secure your App with SQLCipher. It removes you from assuming the user has set a PIN for the device itself.

    Now, you could theoretically derive a key and store it in the keychain w/o forcing the user to enter the PIN. IMHO this is better than just relying on NSFileProtectionComplete, but you are still open to brute force attacks. At least with that, you wouldn't be open to people lazily pulling data from the phone unencrypted - at least your stored data is encrypted. You then have some protection at least.

    #2 - I am sort of on the fence about this one. I think I wrote this more towards the "doctor is storing patient data" on the phone - where they should be forced to not utilize a jailbroken device, because it opens up security holes for patient data.

    Perhaps you are correct that patients should have the right to choose if they want to risk their data. But, I guess, the question is, do people understand the risk of their data when they root or jailbreak. No simple answer.

    Great points!

  6. Thank you for your thoughtful response. I agree 100% that a clinician user must take all precautions to protect other people's data. A more difficult case is for apps we want patients to use themselves. As developers, we have to balance security, ease of use, and patient preference.

    I think a lot of these come down to educating the user about the risks. People do not set that 4-digit PIN for a reason -- they value convenience over security. In fact, the phone has a lot more sensitive data than PHIs (private photos, locations, cash cards in passbook, email etc). If the user does not care, it is difficult for the developer to do anything. ;) For instance, if you are forcing them to set a PIN to access an app, they will either use "0000" or simply do not use the app. Tough decisions. :)

    Speaking of which, I wish future iPhones will encrypt data using a hash from the fingerprint (if available). A hash from the fingerprint will be much harder to brute force.


  7. re: checking for root, it's trivial to check for root as best you can then fire a warning to the user that the app is not secure if the device is not adequately secured. Bear in mind you cannot always be sure the device isn't rooted, and you would be removing a lot of power users and potential early adopters from the market by shutting out rooted devices. And - in general - those who root are those who understand the dangers (or they quickly become those who own bricks and can't run any apps at all). That said, HTTPS _everything_, _always_. Encrypt at rest _everything_, _always_. Should be illegal not to, IMO.