Why has health care lagged so far behind information technology in terms of innovation? Focusing on that question will allow us to move beyond our endless squabble over how many people have an insurance card in their wallet. The obsession with coverage, on both sides of the aisle, has diverted us from the more important questions: How good is the care that those cards provide? How much health does that care produce? And how much more health could we produce if we allowed health care to innovate in the way that information technology has done so spectacularly?
These questions are the focus of my recent study, “Fortress and Frontier in American Health Care,” published by the Mercatus Center at George Mason University. The full 67-page study, a 4-page summary, and a Tribune Papers op-ed summarizing the paper are all available online.
If an encomium matters to you, Jeffrey Flier, dean of Harvard Medical School, just tweeted this about my paper: “Long read, must read - if you are interested in health and health reform. What we should be talking about.”
The study argues that the usual right-left divide in health care policy is an expensive and deadly red herring, stranding us between an unworkable Patient Protection and Affordable Care Act and a stack of thin-broth repeal-and-replace proposals. It argues that the real distinction between health care and information technology over the past quarter-century was that the two industries were governed by radically different worldviews, “the Fortress and the Frontier,” each enjoying a broad bipartisan consensus at the federal and state levels.
Health care existed in the Fortress, where public policy had two overarching goals. The first was to imagine every horrid thing that could happen in health care and to devote massive resources to preventing those hypothetical adverse events. The second was to build a legal and regulatory structure that protected established insiders from outside innovators who might threaten their turf.
During these same years, information technology flourished on the Frontier. Consumers and producers were allowed to take significant risks. And outsiders like Steve Jobs were free to challenge and defeat insiders like IBM.
My study summarizes the results through a thought experiment. Imagine taking a time machine back to 1989. Tell a crowd about the advances in medicine over the past 25 years: statins, new vaccines, face transplants. Very likely, they’ll be pleased but not particularly shocked. They’ll be most surprised, perhaps, by the unraveling of the human genome, but I would argue that’s more IT than it is health care.
In contrast, tell the 1989 crowd about the changes in IT: iPads, smartphones, Kindle, YouTube, Siri, Skype, Google, Google Translate, iTunes, bitcoin. Tell them how many of these services come for free or a small price. They will gasp in disbelief.
The reaction to this thought experiment is often a reflexive assertion that health care and IT are worlds apart ethically—that health care is about life, death, pain, and suffering, while IT is just machines on a desk or in a pocket. But I argue that IT conveys life, death, pain, and suffering just as much as health care does. On the downside, IT provides new and terrifying modes of financial fraud, stalking, bullying, destruction of reputations, identity theft, and privacy violations. Failures of telemetry can be devastating in a hospital or on an airliner. Cellphones and the Internet were crucial elements in improvised explosive devices in Iraq and in the devastation of 9/11. Conversely, they have given us previously unimaginable ways to save lives (e.g., hospital telemetry, OnStar, GPS-driven 911 services).
Once we shift our focus away from insurance cards and toward innovation, new worlds open up, as do new public policy questions that do not fall neatly into a left-versus-right pigeonhole. Note that in 25 years, IT has moved from the province of the wealthy to (borrowing a term from health care policy) universal coverage. Third World village children now carry smartphones whose power exceeds that of the world’s supercomputers of a generation ago.
Notice, too, that attaining universal coverage in IT did not involve decades of partisan rancor or a gigantic bureaucracy to oversee innovation. The iPad and smartphone were not victories for the left or right. They did not, by and large, come from old, established IT firms. The substantial dangers that they posed to people were dealt with quickly and efficiently as real problems arose.
Shift health care policy in this direction, and lots of near-term bipartisan initiatives present themselves. For what it’s worth, my study offers a few dozen places where policymakers can begin building these political bridges. These initiatives would help shift us toward a new goal for health care policy: “to provide better health to more people at lower cost, year after year.” To make that goal a reality, we need to make health care as innovative in the next 25 years as IT has been in the past 25 years.