July 15, 2011
THOU ART HUMAN:
The Quagmire: How American medicine is destroying itself. (Daniel Callahan and Sherwin B. Nuland, May 19, 2011, New Republic)
In 1959, the great biologist René Dubos wrote a book called Mirage of Health, in which he pointed out that “complete and lasting freedom from disease is but a dream remembered from imaginings of a Garden of Eden.” But, in the intervening decades, his admonition has largely been ignored by both doctors and society as a whole. For nearly a century, but especially since the end of World War II, the medical profession has been waging an unrelenting war against disease—most notably cancer, heart disease, and stroke. The ongoing campaign has led to a steady and rarely questioned increase in the disease-research budget of the National Institutes of Health (NIH). It has also led to a sea change in the way Americans think about medicine in their own lives: We now view all diseases as things to be conquered. Underlying these changes have been several assumptions: that medical advances are essentially unlimited; that none of the major lethal diseases is in theory incurable; and that progress is economically affordable if well managed.

Posted by Orrin Judd at July 15, 2011 6:26 AM
But what if all this turns out not to be true? What if there are no imminent, much less foreseeable cures to some of the most common and most lethal diseases? What if, in individual cases, not all diseases should be fought? What if we are refusing to confront the painful likelihood that our biological nature is not nearly as resilient or open to endless improvement as we have long believed?
Let us begin by pointing to some unpleasant realities, starting with infectious disease. Forty years ago, it was commonly assumed that infectious disease had all but been conquered, with the eradication of smallpox taken as the great example of that victory. That assumption has been proved false—by the advent, for example, of HIV as well as a dangerous increase in antibiotic-resistant microbes. Based on what we now know of viral disease and microbial genetics, it is reasonable to assume that infectious disease will never be eliminated but only, at best, become less prevalent.
Then there are chronic diseases, now the scourge of industrialized nations. If the hope for eradication of infectious disease was misplaced, the hopes surrounding cures for chronic diseases are no less intoxicated. Think of the “war on cancer,” declared by Richard Nixon in 1971. Mortality rates for the great majority of cancers have fallen slowly over the decades, but we remain far from a cure. No one of any scientific stature even predicts a cure for heart disease or stroke. As for Alzheimer’s, not long before President Obama recently approved a fresh effort to find better treatments, a special panel of the NIH determined that little progress has been made in recent years toward finding ways to delay the onset of major symptoms. And no one talks seriously of a near-term cure.
One of the hardiest hopes in the chronic-disease wars has been that of a compression of morbidity—a long life with little illness followed by a brief period of disability and then a quick death. A concept first introduced by James Fries in 1980, it has had the special attraction of providing a persuasively utopian view of the future of medicine. And it has always been possible to identify very old people who seemed to have the good fortune of living such a life—a kind of end run on medicine—and then dying quickly. But a recent and very careful study by Eileen Crimmins and Hiram Beltran-Sanchez of the University of Southern California has determined that the idea has no empirical support. Most of us will contract one or more chronic diseases later in life and die from them, slowly. “Health,” Crimmins and Beltran-Sanchez write, “may not be improving with each generation” and “compression of morbidity may be as illusory as immortality. We do not appear to be moving to a world where we die without experiencing disease, functioning loss, and disability.”
Average life expectancy, moreover, after steadily increasing for many decades, now shows signs of leveling off. S. Jay Olshansky, a leading figure in longevity studies, has for some years expressed skepticism about the prospect of an indefinite increase in life expectancy. He calls his position a “realist” one, particularly in contending that it will be difficult to get the average beyond 85. He also writes that it is “biased” to assume that “only positive influences on health and longevity will persist and accelerate.” That view, he notes, encompasses a belief that science will surely keep moving on a forward track—a projection that is not necessarily true. Simply look at the “breakthroughs” that have been predicted for such scientific sure things as stem-cell technology and medical genetics—but have yet to be realized. These breakthroughs may eventually happen, but they are chancy bets. We have arrived at a moment, in short, where we are making little headway in defeating various kinds of diseases. Instead, our main achievements today consist of devising ways to marginally extend the lives of the very sick. [...]
[O]ur broader point is not really about policy changes such as rationing. It is, put simply, that substantial shifts will be needed in the way our culture thinks about death and aging. There is good evidence that if physicians talk candidly and with empathy to critically ill patients and their families, telling them what they are in for if they want a full-court press, minds can be changed. That, in turn, means that physicians themselves will have to acknowledge their limits, explore their own motivations, and be willing to face patients with bad news as a way of avoiding even worse treatment outcomes. The ethic of medicine has long been to inspire unbounded hope in the sick patient and the same kind of hope in medical research. Sobriety and prudence must now take their place.