by Murray S. Feldstein, M.D.
June 6, 2018
A recent magazine article highlights what happens when automated algorithms go awry in the medical system. Hundreds of Medicaid beneficiaries in Arkansas had their state medical benefits drastically cut. No one, not even the state officials who implemented the algorithm, could easily explain how or why this happened, and a judge has since halted its use.
Algorithms are defined as “a process or set of rules to be followed in calculations or other problem-solving operations.” They have been used for centuries and are nothing more than a series of logical steps, each taken based on the results of the steps that came before.
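A minimal sketch can make this concrete. Euclid's algorithm for finding the greatest common divisor of two numbers, one of the oldest algorithms known, is exactly such a series of steps, where each step operates on the result of the one before:

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace the pair (a, b) with
    (b, a mod b) until the remainder is zero. Each step's input is
    the previous step's output."""
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 18))  # prints 6
```

Nothing about the procedure requires a computer; what automation changes is the speed and scale at which such rules are applied.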
As our medical system grows more complicated, there has been a natural tendency to automate various algorithms used for the diagnosis and treatment of disease, as well as for designing healthcare policies in an increasingly impersonal and centralized system.
The science of medicine is not the science of theoretical physics. Medicine is far more complex, and our crude analytic tools far too imprecise, to produce the kind of algorithms that reliably land rockets on Mars. Here are some caveats based on my own experience in following an algorithm, called a clinical guideline, in my specialty of urology.
Many people without any symptoms whatsoever are found to have such a small trace of blood in their urine that it can only be found by examining a sample under the microscope. This is called asymptomatic microscopic hematuria, or AMH. The overwhelming majority of people with the condition are normal and have no serious underlying disease. However, a very small percentage are found to have a serious cancer or perhaps a dangerous urinary stone.
There’s anxiety, risk, and discomfort associated with the diagnostic procedures. AMH is very common, so the annual total cost to insurance companies and taxpayers amounts to millions of dollars. Experts have disagreed about which people with AMH should undergo evaluation, and exactly what tests should be done on them.
In 2012 my specialty society, the American Urological Association, published a guideline to help physicians deal with AMH. My experience in attempting to apply the guideline’s algorithm is informative. During the latter part of my career, I gradually transitioned out of my private practice in rural northern Arizona and worked increasingly as a consultant in an academic position at the Mayo Clinic in Phoenix.
In the course of a week I might see patients with AMH in vastly different settings: at a world-famous teaching hospital, my private office in Flagstaff, or on the remote Navajo/Hopi reservation in Tuba City. In each location I would explain the recommendations based on the guidelines, as well as the risks and the likely outcome of testing. What patients ultimately decided had more to do with their cultural preferences and personal priorities than either the science or the availability of health insurance. The joy and challenge of medical practice is that each patient is different.
Automated algorithms such as the one used in the Arkansas Medicaid program are not at the stage where they can deal with problems at an individual level. An algorithm is no better than the premises that underlie its logic. As the saying goes, “garbage in, garbage out.”
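A hypothetical sketch illustrates the point (the scoring rule and thresholds here are invented for illustration, not drawn from any actual benefits program). The code below is logically flawless, yet if its underlying premise is wrong, that a single assessment score captures a person's need for care, it will confidently produce bad decisions:

```python
def allot_care_hours(assessment_score):
    """Map a single numeric score to weekly home-care hours.
    The one-score premise and the thresholds are hypothetical."""
    if assessment_score >= 80:
        return 40
    elif assessment_score >= 50:
        return 20
    else:
        return 8

# Two people with very different real circumstances can land on the
# same score and receive identical hours. The logic is sound; if the
# premise is not, the output is garbage, however cleanly it was coded.
print(allot_care_hours(49))  # prints 8
print(allot_care_hours(50))  # prints 20
```

Note that a one-point difference in score doubles the hours allotted, a cliff no human caseworker would defend, yet one the algorithm enforces without explanation.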
The “big data” used in developing healthcare policies are nothing more than statistical averages and correlations. They may have been derived from reports or documents that were never designed for the tasks to which they are now being put. The data themselves may have come from different sources with varying degrees of accuracy.
Until medical science achieves the precision of physics, we should be wary of placing too much reliance on such algorithms, remembering that the humans who devise them are only human, and so are the patients who live (or die) by them.
Murray S. Feldstein, M.D. is a visiting fellow in healthcare policy at the Goldwater Institute.