The headline over at Politico was designed to be provocative ‒ and it certainly was:
I saw a few commentaries shortly thereafter ‒ here are two examples.
Now, I love a good conspiracy theory, and the Politico headline obviously did elicit strong reactions, but what are these “gag clauses” and what are they doing in healthcare?
A Politico investigation found that some of the biggest firms marketing electronic record systems inserted “gag clauses” in their taxpayer-subsidized contracts, effectively forbidding health care providers from talking about glitches that slow their work and potentially jeopardize patients.
So, no less than a full investigation was required to uncover these “gag clauses”? In fairness, Politico did review actual contracts, but they never actually shared any of the legalese to substantiate their claim. Here’s how they characterized their “investigation.”
Politico obtained 11 contracts through public record requests from hospitals and health systems in New York City, California, and Florida that use six of the biggest vendors of digital record systems. With one exception, each of the contracts contains a clause protecting potentially large swaths of information from public exposure. This is the first time the existence of the gag clauses has been conclusively documented.
Really? This is the first time that “gag clauses” have been “conclusively documented”? It’s an interesting claim, but again, without the actual evidence, not very convincing. The reality is that “gag clauses” are everywhere. My only surprise was the lone exception.
So just what is a “gag clause?”
Intrigue aside, it’s a clause used extensively in many legal contracts ‒ including enterprise software ones. These enterprise software contracts represent technology purchases that are often in the tens or hundreds of millions of dollars. Some are in the billions.
The actual term used to legally bind (or “gag”) either side of a contract is typically called a Confidentiality or Non-Disclosure clause. In both cases, the term is often followed by “Agreement” because both parties to the contract must agree to the terms of the clause as part of the whole contract. The price for the whole contract is based on dozens of individual clauses ‒ including ones around confidentiality.
These confidentiality clauses are perfectly legal, well understood and not only a common practice in enterprise software licensing agreements, but a best practice. It’s provocative to refer to them as “gag” clauses because it creates the perception of a victim and a criminal ‒ as if someone were being bound against their will into contractual submission.
So just why are these confidentiality clauses baked into most enterprise software contracts? While most consumers (and corporate end users) are unaware of this, software (of all types) is very easy to steal, and software theft is a huge global problem. In its last annual study (2011), the Business Software Alliance estimated the value of stolen software globally at over $63 billion ‒ just for that one year.
In fairness, the vast majority of that theft is PC software, but enterprise software is also at risk. Here’s just one (strange but true) example.
An enterprise software company known as Apptricity filed suit against the U.S. Government ‒ for software piracy in 2012. The $50 million settlement by the Government in 2013 effectively acknowledged the theft ‒ even if it didn’t admit to the crime.
Apptricity claimed $224 million in damages for approximately 100 server and 9,000 device licenses the U.S. Army installed and fielded globally, but did not procure.
Also, there are legal safeguards built into all enterprise software license agreements ‒ specifically around bugs, design flaws and liability. As the licensee, companies have certain rights and obligations under the software license agreement to report these bugs and flaws. That path is well established, openly encouraged and direct ‒ typically through IT departments on both sides. That may include the CIO and CEO if and when a critical bug or flaw warrants the escalation. Most don’t ‒ but certainly some do.
The Politico headline ‒ and the discussion more broadly around the public’s right to know of these design flaws ‒ surfaced in large part because of one nearly fatal software design flaw that was publicly disclosed by a doctor. According to the story (and as appropriate), the doctor did secure an agreement from the vendor for the disclosure of this flaw in a book that he authored. The doctor then disclosed the design flaw online (as an excerpt from his book), and both the book and the online excerpt included screenshots highlighting the nature of the flaw.
Did the doctor violate either the terms of confidentiality or the exclusion that he obtained from the vendor for his book? Possibly, but we don’t know the terms of the agreement so we really can’t say. Unless a law has been broken, individual terms of an NDA are really between the principals to the contract. In this exact case, the contract between the hospital and their EHR vendor (with any addendums) should prevail.
As the story goes (and within their legal right), the vendor asked for the screenshot to be taken down from the online publication ‒ and the doctor did not comply with that request.
Will the vendor pursue criminal charges? Not likely ‒ for a host of reasons ‒ but they probably could.
Something similar happened earlier this year in another large industry ‒ automotive. In this case, hackers exploited a software vulnerability in a late-model Jeep Grand Cherokee and caused the car to come to a complete stop ‒ on the freeway ‒ with a fair amount of traffic. The YouTube video is chilling ‒ both in terms of its public disclosure and the dramatic effect. But the engineers who exploited the vulnerability didn’t disclose the technical details of the hack they used. Chrysler wound up recalling 1.4 million Jeeps in order to install a software patch.
So, are doctors really “barred” from disclosing software bugs, design flaws and vulnerabilities in EHR code? In the end, I don’t think that’s a fair characterization, in that they are well within their personal, professional and legal rights to report bugs and flaws through both their IT department (where it’s legally encouraged) and through any written description they may wish to publish in the public domain. As with the Jeep vulnerability, we’ve seen a fair number of these disclosures, and they are very effective. The cost to recall 1.4 million vehicles is considerable ‒ but it’s a critically important safety measure that no one would argue against.
The larger questions for the use and sale of EHR software are these:
Should public disclosure by employees be allowed to void clauses in contracts negotiated between their employer and another entity?
Should we outright ban Confidentiality or Non-Disclosure clauses in EHR contracts? Should Congress (who subsidized much of the EHR software purchased over the last few years) mandate their prohibition?
Perhaps, but like all contracts, don’t be surprised if removal of these rights results in higher priced software. If buyers want to negotiate the removal of those clauses ‒ they are perfectly free to do so. Most don’t, but either way, it’s not some mysterious “gag clause” conspiracy. It’s a common and customary legal clause in a negotiated contract between two companies at arms length.
The doctor in this case is Dr. Robert Wachter, and I cannot recommend his book ‒ The Digital Doctor ‒ highly enough. There are many great quotes in his book ‒ starting with this one in the preface.
The simple narrative of our age ‒ that computers improve the performance of every industry they touch ‒ turns out to have been magical thinking when it comes to healthcare. In our sliver of the world, we’re learning, computers make some things better, some things worse, and they change everything.