There is a lot of noise today about a skills gap: a void between the experience and education job seekers bring to the market, and what employers and companies expect or need from their human resources.
The disconnect is ripe fodder for headlines, analysis, and political jousting. It has been blamed on everything from generational differences to technological disruption, but the net effect is that the work that needs doing is running short of people fully qualified and capable of doing it. Needs have changed faster than abilities, both at the entry level and among educators charged with equipping future generations to carry their industries forward.
One doctor’s comment on the challenges we face in healthcare might well be applied to any industry grappling with disruptive change in standards, systems, and technology:
One of the challenges facing every industry that depends on state-of-the-art education and training is a corollary to Einstein’s definition of insanity. How do we guarantee state-of-the-art education and training when those providing the education are, in many cases, creators/products of the past?
Whether you are beginning your career or ending it, the skills gap will make itself felt.
Is There a Doctor in the House?
In healthcare, there is a similar disconnect, yet it seems largely to escape the “skills gap” brand. That may be because the skills gap has been a favorite reference point for advocates championing the value of STEM (Science, Technology, Engineering, and Mathematics) degrees and occupations. The logic here asserts that the jobs of the future (starting yesterday) require more students to forgo liberal arts and humanities degrees. Healthcare, however, is often excluded from definitions of STEM fields, yet it is also rarely measured alongside the humanities.
As for graduates still living with their parents, those trained in the medical arts fall right in the middle: not the highest rate, but far from the lowest. Psychology graduates take the top spot in this metric, which is interesting given the chronic deficiencies in mental and behavioral health care, but that is beside the point. The general trend in this kind of broad-strokes compartmentalization of degrees and jobs suggests that humanities careers offer less opportunity for financial independence, while STEM careers offer more. What healthcare occupations offer is more or less left out of the discussion.
That is particularly unfortunate considering that healthcare is facing not just a skills gap, but a human resources gap: numerous studies and projections anticipate a growing shortage of both physicians (especially in primary care) and nurses. On top of this, there is a lack of health professionals, with or without clinical experience, prepared to fill leadership roles: an acute need given the industry’s ongoing political change, technological transformation, and financial strain across the board.
Although the U.S. leads the world in physician compensation (varying by specialty and practice type), medical school is largely self-financed, so much of that income goes right back into the education system in the form of loan repayment.
The Future Ain’t What It Used to Be
Paradoxically, medical school has been even slower to change than legacy institutions like hospitals.
For their money, medical students get a curriculum that minimizes instruction in critical areas like diet and nutrition, mental health screening, hospital systems navigation and administrative functions, information technology, and the lifestyle diseases that are central to population health. Doctors today enter the field subject to compensation based on quality metrics in which they have little formal training, working under time and technology constraints that can cripple even seasoned professionals, and under the management of an administrative class whose backgrounds are divided between clinical and business experience, with distressingly little overlap.
2016 was a year ripe for labeling. The Year of the Trump. The Year of the Data Breach. The Year of the Tweet, of the Comic Book Superhero, of the Monkey, of Virtual Reality, of the Drone.
2017 may well turn out to be the Year of the Skills Gap: the critical inflection point at which both education and industry own their joint roles in matching the supply of talent with the demands of changing paradigms. As the new year unfolds, the challenge of matching ability with opportunity may prove even more acute than the already dire analyses have suggested.
Maybe Next Year
The skills gap is magnifying how we assign value and accountability everywhere.
In healthcare, we’ve finally begun to frame priorities in terms of value and shared responsibility. It is far from perfect, but it is a fork in the road we can hardly hope to turn away from. Use of medical resources is being tied to outcomes, with patients, as well as caregivers, getting more financial skin in the game. Broadly, use of human resources may be next: with automation infiltrating every industry and higher education costs continuing to outpace general inflation, we cannot afford to keep doing the same thing and expecting different results.
How we handle our current challenges in healthcare has the potential to guide transformation beyond the clinical space. Healthcare has not historically been progressive in balancing technology, administration, talent, time, and, of course, expense, but its tensions are far from unique. We need new ways to train, support, and distribute talent, and to better match needs with human resources.
2017 is primed to be a sandbox for solutions.
Edgar Wilson