Establishing Trust in the mHealth Marketplace

Consider this:

According to a recent Pew survey on the intersection of mobile and healthcare, 56% of adults use a smartphone, and close to half report living with one or more chronic conditions. Seventy percent of adults living with a single chronic condition engage in some sort of self-tracking behavior related to weight, diet, exercise, sleep, or health indicators such as blood pressure and glucose. This self-tracking behavior jumps to 80% among populations reporting two or more chronic health conditions.

In the mobile space, Gartner predicts that mobile app development projects will outnumber those for PCs by a 4-to-1 margin by 2015, and according to Juniper Research, cumulative global healthcare cost savings from mHealth monitoring and tracking are estimated to reach $36 billion between 2013 and 2018.

These numbers serve both as encouragement regarding the potential impact health applications can have on patient populations and as an exclamation point: users expect mobile technology to deliver real improvements to their individual health.

Now consider this:

According to the Sophos report on mobile security “42% of devices, lost or left in an unsecure place, had no active security measures. A fifth (20%) also contained sensitive personal information such as national insurance numbers, addresses and dates of birth, and over 10 percent could have revealed payment information such as credit card numbers and PINs. Over a third (35%) of the lost devices had access to social networking accounts via apps or web browser-stored cookies.”

As the mobile health market develops, it’s imperative that app developers and their companies earn the confidence of their user base, and that users (both end users and intermediaries) understand how companies store and use the personal data they provide. Currently, the majority of health apps in the mobile market fall into the “health and wellness” space outside FDA oversight, are largely of low-to-medium sophistication, and neglect to address privacy, security, and regulatory concerns. It’s very much the wild west of health innovation, as developers race to see what sticks.

Enter the rise of mHealth app certification programs, such as Happtique. Founded in 2010 as a subsidiary of GNYHA Ventures (the Greater New York Hospital Association’s for-profit arm), Happtique has developed a mobile health app store and, more recently, launched an initiative to become the trusted certifying body for the mHealth market, fostering confidence and safety. Last week it announced its inaugural class of Happtique Certified mobile health apps, 19 in total, which, according to the press release, underwent “both technical testing – the verification of privacy, security, and operability standards by global testing leader Intertek – and content testing, as completed by relevant, independent clinical experts.” Corry Ackerman, Happtique’s President and COO, states, “Happtique’s Health App Certification Program offers an objective way for users to determine if an app will protect personal information, operate as promised, and ensure that the clinical information included in the app has been documented and verified.”

Sounds great in theory, except that over the next few hours several of the Happtique Certified apps were found to fail basic Security 101 tests.

See: Happtique suspends mhealth app certification program after software developer exposes security shortcomings

Harold Smith, CEO of MonktonHealth, the developer in the above-mentioned article, quickly discovered that several of the certified apps ignored basic information security practices such as salting and hashing login credentials and encrypting user-generated electronic personal health information (ePHI) stored on the device or on application servers. As he shows in a video, more than one app stored usernames, passwords, emails, birth dates, PINs, and personal user data in unencrypted log files on the device, accessible after about three minutes of investigation. Furthermore, several of the apps failed to use SSL, HTTPS, or any sort of encryption during data transfers, and based their encryption keys on user-provided four-digit PINs, opening themselves up to man-in-the-middle (MITM) attacks. Considering that 55% of adults use the same password for everything, imagine the security, privacy, and identity ramifications if one of those users loses or misplaces a mobile device. Currently, Happtique does not certify at different levels, so the same quality and security criteria used to evaluate AmazingAbs are also used for Tactio, one of the enterprise-focused apps currently on the market.
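Salting and hashing credentials, the practice these apps skipped, takes only a few lines with standard tooling. The sketch below (Python standard library; the iteration count and function names are illustrative assumptions, not taken from any of the apps discussed) shows the idea: a random per-user salt, a slow key-derivation function, and a constant-time comparison.

```python
import hashlib
import hmac
import os

ITERATIONS = 100_000  # illustrative work factor; tune for your hardware


def hash_password(password, salt=None):
    """Return (salt, digest) using PBKDF2-HMAC-SHA256 with a per-user random salt."""
    if salt is None:
        salt = os.urandom(16)  # a unique salt per user defeats precomputed rainbow tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return salt, digest


def verify_password(password, salt, expected_digest):
    """Recompute the digest and compare in constant time to avoid timing side channels."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return hmac.compare_digest(candidate, expected_digest)


# Store (salt, digest) server-side; never the plaintext password.
salt, digest = hash_password("correct horse battery staple")
```

Note that two users with the same password end up with different digests, because each gets a fresh salt; this is exactly what a plaintext log file or an unsalted hash gives away.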

At its core, Happtique is comprised of clinicians and social media advisors, but lacks leadership in technology or infosec roles. It’s not surprising that this oversight happened, but in light of these observations, Happtique has done the right thing: it has suspended its certification program and is working with the community to develop a more robust infosec strategy. Kudos, Happtique. Accidents happen; it’s how we learn and grow as technologists, and how a company handles postmortems says a great deal about its culture and commitment.

What is particularly surprising in all this is the lack of interest in security practices within the health IT/mHealth community, and the lack of interest in peer review when it comes to public security audits. It’s interesting to note that Silicon Valley has cultivated a robust white-hat community, encouraging bug bounties for exploits, while enterprise health IT and the mHealth community seem entirely closed off to criticism. When approached with information about these security vulnerabilities, both companies dismissed the claims without even confirming what the holes were.

As Smith says, “Certification itself is not bad, but the last thing healthcare IT needs is another entity siphoning off more money and providing no ROI.” So, how do mHealth developers establish community trust and confidence in the wild west of mobile health development? First, following basic security practices is a great start. Salt and hash user login credentials, transmit data securely over SSL/TLS, and build in device data protection/encryption with the ability to remotely wipe if necessary. Use logging with caution. Maintain proper caching and session-handling techniques, and keep in mind the Principle of Least Privilege, requesting only those device permissions absolutely necessary for the app’s function. Design against man-in-the-middle attacks, and develop strong server-side controls and back-end API calls. Validate user input to combat code injection and SQL injection attacks, and use multi-factor authentication and password strength verification.
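To make the injection point above concrete, here is a minimal sketch of a parameterized query, the standard defense against SQL injection (Python with the standard-library sqlite3 module; the table, column names, and email addresses are made up for illustration). The driver binds user input as data rather than SQL, so a classic `' OR '1'='1` payload matches nothing.

```python
import sqlite3

# In-memory database standing in for an app's user store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (email TEXT, name TEXT)")
conn.execute("INSERT INTO users VALUES (?, ?)", ("alice@example.com", "Alice"))


def find_user(conn, email):
    """Parameterized lookup: `?` placeholders make the driver treat
    `email` as a bound value, never as executable SQL."""
    cur = conn.execute("SELECT name FROM users WHERE email = ?", (email,))
    return cur.fetchall()


print(find_user(conn, "alice@example.com"))  # [('Alice',)]
print(find_user(conn, "' OR '1'='1"))        # [] — the payload is just a weird email string
```

The unsafe alternative, building the query with string concatenation, would let that second input rewrite the WHERE clause and dump every row.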

This is not a comprehensive list, but it’s a good start. The key takeaway: as a mobile health developer, always be testing. Stay current on security technology and exploit development. Stay transparent.

Taking mobile health security a step further, there are a few trusted ways to have an application undergo third-party validation and certification. On the federal side, the Federal Information Processing Standards (FIPS) from the National Institute of Standards and Technology (NIST) are mandatory for all federal agencies that use cryptographic-based security systems, and have been widely adopted in the technology and financial sectors. The program that vets systems against FIPS standards is the Cryptographic Module Validation Program (CMVP); its certification process puts heavy emphasis on cryptographic modules, documentation, physical tamper resistance, and identity-based authentication. There are several different levels and focuses, depending on an application’s specific security needs. CMVP breaks re-certification and re-validation scenarios into five distinct categories, and re-certification is only required when there is a greater than 30 percent change in relevant security items such as the core crypto layer. In comparison, Happtique required re-certification for any update, even minor UI design changes.

In the private sector, viaForensics is an industry leader in mobile security, computer forensics, and electronic discovery. In addition to standard training, audit, and analysis services, they offer public certification specifically tailored to mobile app development, covering standard best practices for mobile security. Andrew Hoog, CEO and co-founder of viaForensics, recommends that developers and users alike look past boilerplate standards lists and instead give more weight to community involvement and references. Many of the well-respected players in the infosec field contribute significantly to the community through open source software and free information. viaForensics openly publishes a best practices guide, HOWTOs on running self-assessments and security audits with Santoku, all of their conference presentations, and many more resources for the community.

At last count, with more than 40,000 health, wellness, and fitness apps floating around the marketplace, isolating the signal from the noise is a huge hurdle for developers. With anticipated compound annual growth of 61% through 2017 and the increasing popularity of physician-prescribed apps, establishing community trust is going to be a huge factor when the dust starts to settle.

Developers: focus on clinical quality and real information security, and carefully evaluate the legitimacy and value of any certification you pursue. For those in the healthcare IT and mHealth communities, particularly in media and press: always be in the habit of fact-checking and peer review before blindly passing along press releases and link bait. Patients and consumers rely on your information and your stamp of approval, and it’s important for the success of the health innovation movement that members of our community turn a critical eye to claims from any organization. A little due diligence goes a long way.

Lauren Still

Lauren Still currently resides in the San Francisco Bay area and is Healthcare Policy Advisor for DICOM Grid. Her work focuses on security and policy consulting, business planning, social media strategy and technology development.