Aetna’s health platform CarePass, announced about three years ago, met its demise at the end of August, earlier than many expected, and the shutdown carries important signals for the “health platform” marketplace. Amid all the HealthKit and Apple Watch attention, it’s important to recall what really matters with health and health data: giving users value in a way they can trust.
According to HealthData Management:
“‘The closure of Aetna’s CarePass illustrates the struggles companies in the digital health space are experiencing and facing in developing and sustaining users, and business models to scale,’ said Frost & Sullivan’s mHealth/telehealth expert Daniel Ruppar.”
The closure highlights the difficulty that health insurers have in providing tools and solutions for users, and open platforms for developers. It shows there needs to be a bigger combined value proposition and strategy than providing sources of data for insurance companies. The user’s benefit must stay central and trust must be core to the products.
At the Healthcare Unbound conference in the summer of 2013, I asked a CarePass executive what they were going to do about trust, because healthcare insurers consistently rank near the bottom in consumer trust. The executive’s answer made it clear that the whole issue was a bit of an afterthought, which was predictive. After hearing about all of the great things the platform was going to do, I was left with, ‘Well, that sounds great for Aetna, but what does it do for patients beyond what other apps can already do?’
I could clearly see what CarePass as a platform did in combining all this data, but I couldn’t see an overwhelming value proposition that would convince me to share my information with an insurance company. The only way to truly get there is to pay for outcomes and for insurers to dig deep into how people want to be healthy, on the users’ terms.
The trust, while improving, is still low. These reactions may change, albeit slowly, if consumers become convinced that insurers are on their side with more accountable care models. It will take time and words backed up by actions and outcomes.
Along Comes HealthKit
The news on CarePass led VentureBeat to ask “is this a bad sign for Google (Google Fit) and Apple (HealthKit and Apple Watch)?”
My answer is maybe. As usual with data, particularly health data, and with building a health data economy (yes, we may have arrived), it comes down to trust in who’s holding the data, and what they can or can’t be trusted to do with it.
Interestingly, whether people trust a company to keep their data safe and whether they trust the company itself turn out to be flipped upside down.
On the bright side for insurers, people appear to trust health insurers to keep their personal data secure second-most by industry, though still low overall at 26%, behind financial services firms, according to Gallup. Social networks and applications came in dead last at 2%.
Overall, there are very few industries or organizations we trust with our data, but perhaps surprisingly, these numbers get flipped on their head when we talk about which industries we trust. Trust in data security and trust in what companies will do with that data are very different. You may trust a bank to keep your money safe, but maybe that’s because you see it as their job.
On the opposite side of the spectrum, people trust technology companies as entities at 79%, but they don’t trust many of the companies to protect their data—financial services companies come in dead last.
So these are very different studies, and I don’t want to draw too many conclusions, but to me it means there is a real opportunity for a technology company to manage health data in a way people trust. That’s the opportunity for Apple. It’s going to be difficult for Facebook and Google, whose business models depend on selling data.
While the timing could not have been worse, given the widespread publicity around hacked celebrity iPhones and iCloud accounts, there is an opportunity here for Apple to win or lose trust. Apple hasn’t been known for selling personal data to its own ends, even if it makes money off of apps that do. One convincing bit of information, in relation to Apple Watch and Apple Pay, comes via MobiHealthNews:
“With Apple Pay, users can make electronic payments without Apple seeing what they bought, who they bought it from, or how much they paid. And the cashier doesn’t see the user’s name, credit card number or security number. The promise had shades of Apple’s recent ban on future HealthKit developers from selling data collected via the platform to third party data aggregators or targeted advertising systems.”
Apple is not reliant on traffic flows or user data, except indirectly through app developer success and the sale of hardware that supports those developers. Selling data has never been part of Apple’s DNA. As mentioned above, Apple smartly announced plans to prohibit HealthKit app developers from selling or distributing data to advertisers, or even from storing health data in iCloud.
While that alone may not fill users with trust in the platform, it’s a smart move. If any company, historically, has been committed to (and has reaped the rewards from) giving consumers control over their environment and their data, it’s Apple. We’ll have to see if that’s enough trust and enough of a value proposition to get consumers to use HealthKit, but it’s a step in the right direction. Questions remain open, both in reality and in public perception, about whether Apple can control how the data are used.
Trust has three core elements: 1) consistency, 2) competency, and 3) alignment (do you have my back?). The problem for insurers has always been alignment. It has never been clear that health insurers are on the side of the members they serve, not through any moral decay, just misaligned incentives. When consumers need the company to pay for services, and the company would often prefer not to pay (or, historically, to delay payment), conflict and distrust develop.
It’s amazing how many companies forget trust when it comes to platforms, networks and user data, but it is fundamental. Facebook and Google may use my data, but the value is so high that we tend to forgive and try to forget what’s happening, or at least not to dwell on it. I’m not so cavalier with my health data, which is 1000-fold more valuable to a hacker than credit card data (it’s used for identity theft). We all weigh perceived risks and consequences, often trying to keep the perceived risk low in our mind’s eye, but that’s more difficult with health data, and it requires a deeper commitment to trust.
I’ve found it helpful to think about three fundamental trust requirements for platforms to connect with people or entities:
- There must be a shared resource. Information on this resource can be made available to those who join, like car space (Uber) or appointment availability (ZocDoc). Users can be expected to provide and share, but it must be clear what they’ll get in return.
- There must be trust mechanisms (like driver or physician ratings, or an app approval process) with clear motivations of the parties (provide ride, get paid). Apple has been very strong in ensuring that suspect apps are not allowed on the platform, to the chagrin of many developers.
- There must be an incentive – an overwhelmingly compelling value proposition – to connect the parties, a valuable offer at the right time and place.
Most organizations forget the importance of these fundamental pieces, thinking first about what users will provide to them. The parties have to know that the platform exists to be on ‘their side’ and can act as their agent. Google has elements of this (Google Maps); Apple even more so.
So ask yourself, “Will people trust me to be their agent? What do I need to do to encourage that trust?” or, alternatively, “What can I give away that will be so compelling users will forgive and try to forget?”