Editor’s Note: The following article is a Guest Column from Edgar Wilson, an independent consultant who writes on and follows health, education, and environmental policy. You can contact Edgar on Twitter @edgartwilson.
Meaningful Use is nearing its end, EHRs approach ubiquity, and ICD-10 turned out to be, largely, a pain-free transition. A deluge of change has washed over the health sector, bringing it more fully into the digital age.
The buzz surrounding the potential applications and integrated security solutions in emerging healthcare technology may be distracting us from an important fact: most data breaches are still a result of human error.
The learning curve with respect to digital data management—from cloud storage to online patient portals to sharing information across the continuum of care—does not center on utilization alone. Integrating EHRs into existing workflows was a hallmark of the Meaningful Use era (and a headache for many), but just as critical is upgrading attitudes, behaviors, and policies with respect to security.
Although recognition of this problem is spreading, the focus is still on external solutions.
The government is considering how to amend HIPAA to accommodate cloud storage, anonymize PHI in support of research and population health data, and revise Bring Your Own Device (BYOD) standards across health systems. Programmers and designers are under the gun to implement more intuitive, automated security that takes the burden off end users.
In spite of all this turmoil, the best solution remains better education and emphasis on end-user behavior.
Data breaches are often as much the result of opportunity as any other crime, and when the path of least resistance goes through users rather than walls of encryption and privileged sharing channels, the infrastructure becomes irrelevant. A lazy password or ostentatious conversation can sink the whole health data ship just as effectively as a gap in the firewall—more so, in fact.
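That "lazy password" risk is easy to screen for in practice. As a rough sketch (the length threshold, character-class rule, and common-password list below are illustrative assumptions, not any official standard), a simple check can flag the weakest credentials before they ever protect patient data:

```python
import re

# Illustrative deny-list; real systems check against much larger breached-password sets.
COMMON_PASSWORDS = {"password", "123456", "letmein", "qwerty"}

def is_weak_password(pw: str) -> bool:
    """Flag passwords that are short, commonly used, or lack character variety."""
    if len(pw) < 12:
        return True
    if pw.lower() in COMMON_PASSWORDS:
        return True
    # Require at least three of: lowercase, uppercase, digit, symbol.
    classes = sum(bool(re.search(p, pw)) for p in (r"[a-z]", r"[A-Z]", r"\d", r"[^\w]"))
    return classes < 3

print(is_weak_password("password1"))       # True: too short and too common
print(is_weak_password("Tr0ub4dor&Xyz!"))  # False: long, varied character classes
```

Even a screen this crude closes the path of least resistance described above: the attacker is pushed back toward the hardened infrastructure instead of walking in through a guessable credential.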
Encryption, Role-Based Access Control, restrictions on sharing, and both internal and federal standards abound—yet basic training in security principles, behaviors, and standards remains rare. Among the distressingly few institutions that recognize the need to supplement existing rules and systems with more robust end-user education and oversight, there is disagreement over who should be targeted for such remediation.
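Role-Based Access Control is the least exotic item on that list, and its core idea fits in a few lines. The following is a minimal sketch (the role names and permissions are hypothetical, chosen only to evoke a clinical setting):

```python
# Map each role to the set of permissions it explicitly grants.
ROLE_PERMISSIONS = {
    "physician": {"read_chart", "write_chart", "order_labs"},
    "nurse":     {"read_chart", "record_vitals"},
    "billing":   {"read_billing_codes"},
}

def can_access(role: str, permission: str) -> bool:
    """Deny by default: allow only permissions the role explicitly grants."""
    return permission in ROLE_PERMISSIONS.get(role, set())

print(can_access("nurse", "read_chart"))     # True
print(can_access("billing", "write_chart"))  # False
```

The mechanism itself is trivial; the hard part, as the surrounding discussion argues, is deciding who gets which role and training staff not to share credentials in ways that make the role table meaningless.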
Should it fall to physicians to take the lead on understanding and implementing security standards? Should all clinicians? Should all staff in a health system? The lack of uniformity in the answers to these questions yields a lack of true standards in knowledge and behavior. With interoperability receiving renewed emphasis in the field, this kind of disorder around how to train and guide staff on basic security measures compounds risks and vulnerabilities.
The wearables market adds a new wrinkle. Sharing personal health metrics—and correlating them with personal identification data—puts more people in the data network. When data is stored and shared across multiple agents, each of those agents becomes a potential culprit in the event of a breach. In such a case, it is their behavior toward security—human systems, company policies, and a culture of vigilance—that determines who is held accountable, rather than a hunt for a hole in the system architecture.
The hype surrounding emerging health software and devices threatens to obscure the fact that how we use them is as important as what we use them for. End-user behavior, not the technology itself, is the single biggest determinant of security.
Flawed design and security gaps may make the news, but behind every headline about a health data breach is a human error that no amount of programming could prevent.
In the commercial sector, lagging security infrastructure led to the mass adoption of EMV chips in credit cards. Yet despite a burden of infrastructure upgrades and point-of-sale modifications that is still not entirely worked out (the anxiety and frustration of small business owners should be familiar to health staff who survived the recent Meaningful Use and ICD-10 roll-outs), the fundamental technology remains, flaws and all, and the best defense is still individual behavior.
Fortunately, adapting workflows to better accommodate security considerations need not be as onerous or disruptive as, say, navigating the one-way click-through screens of a new MU-certified EHR. It is a matter of integrating security-awareness into the operational culture of clinics and health systems of any size. The basics are the basics for a reason, and they eliminate that critical window of opportunity that makes health data today such a soft target for hackers and thieves. Shore up how individuals think and behave, and the whole system becomes that much more resilient.
Technology will certainly be at the center of improving care, engaging patients, and managing data. The analytic potential of EHRs, the shift toward personalized care and treatment using big data, and related causes that clinicians and bureaucrats alike are interested in pursuing all depend greatly on the sophistication and utilization of new and emerging technology.
Our frontline defense and last, best hope to guard against data breaches is still only human.