Culture does not evolve to accommodate law; typically, the opposite occurs. Consider Prohibition in the United States: popular movements grew and escalated until they achieved the passage of a Constitutional amendment—a rarity in the history of the country. Then popular opinion reversed, and a mass movement accomplished an unprecedented (and never repeated) feat: passing a new amendment to overturn Prohibition.
These transformations in law did little to disrupt the country’s relationship with alcohol; indirectly, of course, culture evolved around the law, giving rise to America’s organized crime syndicates and permanently establishing the speakeasy. The point is, the law is a very poor instrument to use when culture needs to change. The law can codify cultural norms, but it can hardly create them—not deliberately, anyway.
In recent years, the governmental approach to healthcare has been to legislate cultural change. As always, this leads to either tacit compliance or withdrawal. Doctors, unlike bootleggers, are not in the business of providing black market medical care, so their withdrawal tends to amount to retirement or burnout. Those who comply with the regimented changes have not undergone a cultural conversion so much as they practice self-preservation alongside medicine.
You can lead a horse to water…
…but you can’t make it drink. Applied to healthcare, the analogy loses some of its folksy appeal but captures several evolutionary challenges we face: you can put EHRs in clinics, but you can’t make doctors take full advantage of them or even learn how to use them beyond basic attestation requirements. You can mandate health data entry, but you can’t make users meaningfully share or exchange data. You can create patient portals, but you can’t make patients engage with them. You can write new quality metrics, but you can’t make caregivers realign their own sense of professional pride and worth overnight. You can say mental health matters, but you can’t make it a cultural norm by turning it into a covered benefit.
In medicine, and in society at large, mental health is a tertiary concern. Several laws have been passed with the aim of correcting this: the Parity Act, the ACA, and a swath of similar, supplementary, and sometimes contradictory state laws. The essence of all this lawmaking is to improve access and coverage—i.e., to lead more horses to water. The effect has been less than impressive, with actual utilization showing little improvement. The horses aren’t drinking.
Lawmaking doesn’t change a cultural standard that puts mental health behind the same glass as a fire extinguisher: break only in case of emergency. More often than not, emergencies are not identified proactively, but reactively.
And unlike so many of the pressing issues in health policy and the practice of medicine, mental health is one area where the interests and challenges of patients and providers happen to align. Doctors and patients alike are struggling with mental illness, and are either unwilling or unable (or both) to get the help they need.
Not to disparage doctors, but when your profession tops the list of those with the highest collective rates of depression and suicide, you are not an obvious first choice for identifying others at risk for mental illness. Yet the default pathway to mental care goes right through primary care.
Physicians make notoriously poor patients. Clearly, they need as much mental health attention – more, really, considering the stress burden of their roles – as any other patient. Yet a combination of professional, social, and cultural factors obstruct them from receiving that care.
For one thing, the very technological progress being made in healthcare is leaving mental health behind rather than integrating it. As doctors spend more time and attention on their EHRs, the design of those systems leads them away from incorporating better mental health screening, much less treatment. Of course, the stigma of mental illness among physicians is magnified starting in med school, training young doctors to bury their emotional needs and vulnerabilities or risk failure, falling behind, and the end of their medical ambitions.
Fighting the norm of neglect are popular figures in medicine like ZDoggMD and Pamela Wible, who understand that change doesn’t come from policy, but from cultural (or in their case, counter-cultural) dialogues and social movements.
Get ‘em while they’re young
Generally speaking, when we want to change the behavior of adults, we change laws. When we want to affect youth behavior, we look to our schools. We know laws don’t transform culture, but there may be hope that education can.
Data increasingly shows that America’s youth are facing an epidemic of mental illness that, without correction, becomes a burden carried with them into adulthood. This is, in part, why there is a growing movement to get more counselors into our schools. Adding counselors addresses the access problem that afflicts all ages, and helps ingrain the idea of mental health as a priority and a matter of routine maintenance—something to be addressed rather than stigmatized.
Our schools are microcosms of our society and underline the same problem we have in medicine with respect to identifying mental illness and intervening to prevent escalation. The cultural shift currently underway is one of growing awareness, tolerance, and integration, but is also devastatingly incremental. The most that can be said is that we recognize that mental health matters, but not to the extent that it is actually a top priority—at best it is a nice-to-have.
School-aged children are unlikely to self-select seeing a counselor or getting screened for a behavioral health problem. With more counselors and therapists in schools, however, both can be as common as checking for lice or getting aspirin from the nurse. It isn’t just the intervention that is important, but the normalization it instills.
When the presence of mental health discourse and treatment becomes conventional, their absence becomes objectionable. Then medical school, EHRs, primary care teams, insurance plans – all of the many places where mental health should be prominent, but is not – will be the cultural outliers.
It is difficult to know where to begin when the ultimate goal is broad cultural change. We can mandate having more counselors in schools, and we can mandate PCPs screening patients for mental illness by reading a questionnaire and checking a box in an EHR. But when laws are used to drive cultural change, it feeds bureaucracy and marginalizes conversation. Neither patient nor provider is served.
One thing is clear: lasting change is codified in law, not created by it. Penalizing stakeholders for not leading transformation seems less viable than providing support when and where something is working well. Before we go back to the drawing board to redesign health policy or press on with waves of revolution, we might reconsider whether our approach reflects, or contradicts, culture. Innovative solutions very often start with a simple conversation.
By Edgar Wilson