Sharing data is quickly becoming the norm in healthcare. Just in the past week, J&J opened its entire clinical trials database, including raw data. Meanwhile, major pharmacies are embracing Blue Button to give patients better access to their prescription histories.
The benefits of sharing this data, at both a personal and a population level, are too great to ignore, and an overwhelming majority of people now approve of sharing medical data to improve care.
According to InformationWeek: “A PatientsLikeMe survey showed that people with chronic conditions are willing to share their health information if it could help others. Of respondents, 94% would be willing to share to help doctors improve care; 94% would be willing to help other patients like them; and 92% would be willing to share to help researchers learn more about their disease.” The results of this and a similar Consumer Reports survey were reported in an Institute of Medicine (IOM) discussion paper.
I started this series relating our current predicament in sharing health info to the ingrained human nature E.O. Wilson describes in “The Social Conquest of Earth” to make the point that the culture of sharing is part of our genetic makeup. Humans are social animals and we owe a large part of our success to our ability to cooperate and communicate. Wilson writes, “Humans, it appears, are successful not because of an elevated general intelligence that addresses all challenges, but because we are born to be specialists in social skills. By cooperating through the communication and the reading of intention, groups accomplish far more than the effort by any one solitary person.”
This isn’t news, necessarily, but I use it as an example to show that our understanding of medicine is about to be revolutionized, driven by social tools, more open networks, and scalable communications. The 23andMe case, including the FDA and FTC actions, serves as a watershed moment to contemplate these issues and develop a plan going forward that minimizes individual risk and maximizes societal benefit. That’s why it’s still news, and why the New England Journal of Medicine ran a Perspective piece this week arguing that this information should be shared with patients “when it’s clinically validated.”
A few things are now clear:
We want to share for the benefit of medical science.
We have the opportunity to make this work with the right combination of privacy, security, transparency and consent.
There are specific steps needed to move our institutions in the right direction that consumers overwhelmingly support.
We have a unique opportunity to drive the Social Conquest of Medicine, to work together to solve big problems. The key will be to find the right middle ground between transparency and privacy.
I don’t have all the answers, but several things have become clear from exploring the 23andMe case. Here’s what I think we can do, largely in the context of genetic tests, though most of these concepts could be applied more broadly. Let’s take this as an opportunity to have more open discussions about how to move forward.
My top 10 list on how to enable the Social Conquest of Medicine
1. Make access to claims about health benefits open and transparent.
Ubiquitous connectivity and data gathering can allow us to develop and curate trusted information in new ways. Wikipedia, of course, is one such example, arguably more accurate than the Encyclopaedia Britannica (and at much lower cost). There are no longer knowledge providers and knowledge consumers. There are networks of knowledge. How we architect them will be key to driving innovation, new cures and new economies.
According to my colleague, David Maizenberg of Biology Partners,
“Companies like Sage Bionetworks are building out web-based platforms where people can donate health data. In genetics there are open source efforts, like Promethease, allowing users to map their SNP data against the universe of published research. There are many other examples. The trend is clear: consumers are taking control of their health data, and donating and sharing that data, for the betterment of their own clinical outcomes as well as all of mankind, is a fundamental part of that process.”
Accordingly, we should encourage more “ground up” (as opposed to top-down) information collections in healthcare in a transparent manner as validated sources. (Relating this to Wilson’s work, communicating intentions is an important part of the social process. For tech communities and platforms, transparency can make intentions clear, leading to higher levels of trust and coordination.)
There’s an army of genetic counselors that would be willing to lend their expertise and self-police such a site so that there’s less of an argument about what an accurate claim is around SNP data. This is the approach that Promethease with their SNPedia have taken, and it’s the right way to go. In fact, it does a great job of showing how inaccurate and untrustworthy a lot of SNP data is. In an open environment, the lack of evidence and meaning becomes clear as all sides of an analysis are presented. Promethease also has a clear disclaimer on the use of their data.
FDA and FTC should look to products and solutions like SNPedia for future guidance on accuracy, not double-blind clinical trials. A good dose of sunlight can kill many false claims.
2. The first amendment, with a disclaimer. In the United States free speech is protected under the First Amendment, but political speech is different from commercial speech. We as individuals can go online and spout our opinions about how some magical juice cures cancer, and our right to voice our opinions, even if factually incorrect, is protected.
When commercial healthcare-related operations “speak” – especially if it involves therapies they are selling or diagnostic assertions – we enter a regulated area where the First Amendment bumps up against the federal government’s authority to create laws and establish regulatory bodies to protect the population from false claims or dangerous drugs. Nevertheless, as a matter of law and policy, we allow some companies – marketers of nutritional supplements for example – to make assertions of health benefits so long as there is a prominent disclaimer that the statements have not been evaluated by FDA.
A similar approach may make sense for speakers within healthcare social media: deference to First Amendment rights coupled with an FDA disclaimer where appropriate. I, along with some colleagues, will be writing in more depth about this issue, in particular as it applies to personal genomics, next month.
3. Real people, well-enforced and well-understood privacy mechanisms. Successful communities have a life of their own, driven by community members. If we’re going to develop these kinds of communities around genetic tests and screening, let’s ensure they are opt-in and full of real people with verified identities, even if the community itself is anonymous. We can’t have trusted online community members exposed to marketing gimmicks and false stories created by people who aren’t who they say they are.
Let’s not get into a Facebook situation where choices about privacy change. In addition, let’s follow part of Dr. Bob Kocher and Dr. Ezekiel Emanuel’s transparency imperative, “to protect privacy, the federal government should substantially increase the penalties for inappropriate patient re-identification.”
4. Focusing on the real problems: selling stuff people don’t need, harmful products, genetic theft and discrimination.
Again, I’ll quote my colleague David Maizenberg: “Consider: In the 20th century medicine became both professional and paternalistic. Books have been written on this subject and, for the most part, this trend was positive. The FDA was born to protect the population from the harmful patent medicines of the late 19th and early 20th centuries. The AMA was an effort to professionalize the practice of medicine. Strong arguments have been made against medical guilds and medical paternalism. From our vantage point now it seems absurd that throughout the 20th century doctors routinely kept critical information from their patients and patients couldn’t even obtain their own lab results without their doctor’s permission. Thanks to the Internet and some legislative and regulatory changes, that is all disappearing. Patients are now in charge of their health and lab data, and clinical and therapeutic data is just a click away on the Internet.”
Information alone is not harmful. Prognoses and diagnoses are not on their own harmful. Taking action is harmful. Taking a medication I don’t need is a problem with real consequences, but we have mechanisms already in place to prevent these: they are called physicians and prescriptions.
Let’s continue using those same mechanisms. Knowing I might have a 30% increased risk of heart disease based on a study of one genetic marker is not likely to change much other than an increased motivation to diet and exercise. Anything medically related is sensitive, potentially scary, and people naturally become highly motivated to act on medical information and advice. But as long as we keep the current mechanisms that prevent DTC sales and we improve the transparency of information, people will have places to go to seek more accurate information, and we have to trust them to do so.
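To see why a figure like that rarely warrants drastic action on its own, it helps to walk through the arithmetic of relative versus absolute risk. A minimal sketch (the baseline risk here is an illustrative assumption, not from any study):

```python
# Illustrative only: baseline_risk is a made-up figure for the example,
# not a real population statistic.
baseline_risk = 0.10       # assumed 10% baseline chance of the condition
relative_increase = 0.30   # the "30% increase" tied to a genetic marker

# A 30% *relative* increase on a 10% baseline is a 3-point *absolute* change.
absolute_risk = baseline_risk * (1 + relative_increase)
print(f"Absolute risk: {absolute_risk:.0%}")  # moves 10% to 13%
```

Framed as "your risk goes from 10% to 13%," the same statistic feels very different from "a 30% increase," which is exactly why context and counseling matter more than the raw number.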
Genetic theft and discrimination: Meanwhile, all states should adopt laws against genetic theft (stealing someone’s DNA and analyzing information about them). There are real hazards to genetic data falling into the wrong hands. It’s valuable stuff, so let’s increase penalties and enforcement for genetic theft and under the Genetic Information Nondiscrimination Act (GINA).
5. Bill of rights around health data.
Like the US Bill of Rights, allowing someone to govern your country, or manage your health data, should come with guarantees and founding principles. Let’s follow the Health Data Consortium’s (HDC’s) Bill of Rights as it relates to genetic data. This is in DRAFT form and can be viewed in full on the HDC website.
“RESOLVED that health data is vital to improving health and healthcare for all Americans. Accordingly, we declare:
It shall be the obligation of institutions and data holders to protect every individual’s right to privacy.
It shall be the duty of institutions and the privilege of individuals to share health data when the greater good can be served.
Health data shall be accurate, securely maintained, and made available in a manner which promotes productive use by others and respects the privacy interests of individuals.
Institutions have the duty to provide patient identified information to individual patients upon request.
Institutions have the duty to make public, in a responsible and timely manner, important insights and discoveries which improve health policy and our understanding of health and healthcare issues.
Health data costs should not be a barrier to non-profits, foundations, government agencies, universities, researchers, and innovators who would use the information to maximize social benefit.
Health organizations should be encouraged to utilize data to improve their products and services for members and consumers.
Organizations shall be discouraged from limiting appropriate data access to others unless clear and tangible issues of patient privacy or competitiveness exist.
Institutions and data holders should be transparent to data subjects and the public about their uses of health data to contribute to health research.
Institutions shall strive to utilize best practices and standards in order to improve accessibility and portability of health data by and for patients.”
6. Standardized statistical language
Open standards pulled the internet together. To build a true network of health data, we also need a standard language to pull information together, to convey meaning, and to enable network effects. As a New York Times reporter recently showed, how data is described can lead even physicians and experts, not to mention consumers, to very different conclusions. An 80% survival rate and a 20% mortality rate describe the same outcome, yet they can lead to different decisions. Let’s find a voluntary and uniform way to describe health statistics. We can look to experts, like genetic counselors, to help establish standard language.
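The idea of a standard language can be made concrete: store one canonical statistic and always render both framings together, so no single framing biases the reader. A minimal sketch (the function name and output format are assumptions for illustration, not an existing standard):

```python
def describe_outcome(survival_rate: float) -> str:
    """Render one canonical statistic in both framings, so survival
    and mortality always appear side by side."""
    mortality_rate = 1.0 - survival_rate
    return (f"Survival: {survival_rate:.0%} "
            f"(equivalently, mortality: {mortality_rate:.0%})")

print(describe_outcome(0.80))
# Survival: 80% (equivalently, mortality: 20%)
```

The design point is simple: the data layer holds a single number, and the presentation layer is obligated to show the complementary framing, removing the framing choice from whoever is reporting the statistic.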
7. Better access to genetic counseling, better use of the standard of care.
The standard of care for understanding genetic tests already is genetic counseling. These are the experts in analyzing what data is relevant to what patients, expertise that even most physicians don’t have. Data is not of much value if you can’t make a decision from it, so let’s continue to include genetic counseling in clinical workflows and provide more access to patients through payers and providers alike.
8. Let’s encourage companies that are taking steps toward better transparency and opening up of data.
Regarding the recent J&J decision to release clinical trial data: “To be fair, the decision to share data is not easy. Companies worry that their competitors will benefit, that lawyers will take advantage, that incompetent scientists will misconstrue the data and come to mistaken conclusions. Researchers feel ownership of the data and may be reluctant to have others use it. So Johnson & Johnson, as well as companies like GlaxoSmithKline and Medtronic that have made more cautious moves toward transparency, deserve much credit. The more we share data, however, the more we find that many of these problems fail to materialize.” We need to encourage more companies to show that building a community around data is more important than the data itself.
9. Let’s be clear about member-driven communities and citizen science.
A lot of people use the term “community” as if everyone will get to share in the benefits, but often the community is private. There should be clear ground rules for declaring something a community: who has access to the data, disclosed before joining, and how the rules can be changed.
10. Thinking ahead: We need to start thinking hard about enabling machine learning.
We are entering an age where the experts will be algorithms. They can learn as fast as or faster than we can, and many AI researchers will tell you, “we don’t know what it knows.” Right now, AI as used in medicine is just a starting point, say, for suggesting what a correlation or causation might mean. Someday, we’ll be comfortable enough with AI and “Dr. Watson” to allow recommendations based on what he knows, not what we know he knows. How will we get there?
Finally, in regard to business models, in the near term, let’s self-police. While combining genetic information, online and real-world behavior, and all of the associated analysis can be of great value, it also comes with great responsibility. A company that controls this kind of information could quickly come under scrutiny from privacy advocates and antitrust regulators. Let’s be very careful about what business models are proposed, and how transparent those models are to the public.
Note: Special thanks to David Maizenberg, JD, for his thoughts and assistance on this piece.