EHRs won’t cut costs or improve care, med school professors say

Two medical school professors recently issued a warning about high expectations for health IT, claiming studies have shown technology won’t cut healthcare costs or improve care. 

All across the country, hospitals and physician practices are adopting electronic health records (EHRs) and other technology. At the end of 2011, more than half of physicians (55%) were using EHRs, according to a report from the Centers for Disease Control and Prevention. More recent estimates have put the number even higher, including a Medscape survey from August which found that 75% of doctors are currently using EHRs.

One reason EHR systems are being implemented is the incentive program created by the federal government to offset the costs of installing a system. Nearly half (44%) of doctors have applied for incentives for the meaningful use of EHRs, according to Medscape, and another 31% plan to apply within the next year.

Adoption rates are also likely to climb much higher over the next few years, as practices and hospitals without an EHR system will be penalized in the form of reduced Medicare and Medicaid reimbursements.

Much of the justification for the incentive program has been that health IT will lower the cost of healthcare for providers and patients. However, two medical school professors are questioning those expectations.

In a recent Wall Street Journal opinion piece, Stephen Soumerai of Harvard Medical School and Ross Koppel of the University of Pennsylvania claim that EHRs won’t cut costs or improve care and that the savings and other benefits touted by the government and software vendors are “little more than hype.”

The authors cite a recent report from McMaster University in Hamilton, Ontario, which looked at thousands of studies conducted over the last few decades about the economic impact of IT in healthcare. The researchers found that in most cases, adopting electronic records failed to cut costs, improve the quality of care or reduce patient safety issues.

Soumerai and Koppel point in particular to three large, randomized controlled studies, one of which found that EHRs actually increased costs by $2,200 per doctor per year. Another found no change, and a third saw an insignificant reduction of $22 per doctor per year. Likewise, they say, studies haven’t found noticeable improvements in the quality of care after adopting EHR systems.

The authors estimate that $1 trillion will be spent across the U.S. on health IT investments, driven by what they call false promises of cost savings.

Will IT cut healthcare costs?

Of course, not everyone agrees with Soumerai and Koppel, and the McMaster report is not exactly definitive. The researchers themselves noted that the “quality of the economic literature in this area is poor,” and that more study needs to be done.

Also, as Joseph Goedert of Health Data Management points out, researchers combed through several decades’ worth of studies. Their report contains data as old as 1992, and all but one of the studies cited were from before 2009. That’s the year the federal incentive programs were established, and EHRs and other health IT systems have evolved since then.

Goedert says it’s impossible to argue that the meaningful use program is a failure when it’s just two years old and only in its first stage. Stage 1 of meaningful use is focused on getting doctors acclimated to electronic records, while future stages will require more information sharing and communication between providers, which is meant to make care more effective and efficient.

What do you think? Are EHR systems and other IT tools a good investment for the healthcare industry? Or are expectations about cost savings unrealistic? Let us know your opinion in the comments section below.
