I am slightly ashamed of how much I liked Burma when I visited it nearly a third of a century ago. What a delight it was to go to a country in which there had been no progress for 40 years! Of course it was a xenophobic, klepto-socialist, Buddho-Marxist military dictatorship run by the shadowy, sinister and corrupt General Ne Win, and so, in theory, I should have hated it. Instead, I loved it and wished I could have stayed.
Since then there has probably been some progress, no doubt to the detriment of the country’s charm. Burma (now Myanmar) is slowly rejoining the rest of the world, and one consequence of this will be the more rapid advance of treatment-resistant malaria.
A recent paper in the Lancet examined the proportion of patients in Burma with malaria in whom the parasite, Plasmodium falciparum, was resistant to what is now the mainstay of treatment, artemisinin, a derivative of a herbal remedy known for hundreds of years to Chinese medicine. The results are not reassuring.
There was a time, not so very long ago, when the global eradication of malaria was envisaged by the WHO, and it looked for a time as if it might even be achieved. The means employed to eradicate it was insecticide that killed the mosquitoes that transmitted the malarial parasites, but a combination of pressure from environmentalists who were worried about the effects of DDT on the ecosystem and mosquito resistance to insecticides led to a recrudescence of the disease.
At the same time, unfortunately, resistance to antimalarial drugs emerged. Control of malaria, not its eradication, became the goal; an insect and a protozoan had defeated the best efforts of mankind. And this is no small matter: the last time resistance to a mainstay of treatment for malaria, chloroquine, emerged in South-East Asia, millions of people in Africa died as a result, for lack of an alternative treatment.
What most surprised me about this paper was the method the authors used to determine the prevalence of resistance to artemisinin in the malarial parasites of Burma: for I remember the days when such prevalence was measured by the crude clinical method of giving patients chloroquine and estimating how many of them failed to get better.
The genetic mutations that make the parasite resistant to artemisinin have been recognized. The authors were able to estimate the percentage of patients with malarial parasites that had mutations associated with drug resistance. Nearly 40 percent of their sample had such mutations, and in the province nearest to India the figure was nearly half. The prospects for the geographical spread of resistance are therefore high.
Nor is this all. Artemisinin resistance was first recognized in Cambodia 10 years ago but the mutations in Burma were different, suggesting that resistance can arise spontaneously in different places at the same time. From the evolutionary point of view, this is not altogether surprising: selection pressure to develop resistance to artemisinin exists wherever the drug is widely used.
One way of reducing the spread of resistance is to use more than one antimalarial drug at a time in the treatment of malaria, but this will only retard the spread, not prevent it altogether. As with tuberculosis, it is likely that parasites resistant to all known drugs will emerge. The authors of the paper end on a pessimistic note:
The pace at which the geographical extent of artemisinin resistance is spreading is faster than the rate at which control and elimination measures are being developed and instituted, or new drugs being introduced.
In other words, deaths from malaria will increase rather than continue to decrease, which is what we have come to think of as the normal evolution of a disease.
Apparently, many Americans would trust strangers more than a doctor to make a medical diagnosis, according to an email I was sent today by a site called CrowdMed:
According to a recent study by CrowdMed [www.crowdmed.com] — a groundbreaking medical website that helps “crowdsource” solutions to the country’s most difficult medical mysteries — nearly one in five Americans (19%) has had to wait at least six months for a doctor to accurately diagnose a family member’s mysterious medical condition.
But what if you can’t wait that long? If you had a medical condition that baffled your doctor, would you be willing to get suggestions from perfect strangers?
According to the CrowdMed Medical Trust Census — a survey of 1,500 Americans on their attitudes toward traditional and nontraditional medical diagnosis — the vast majority of U.S. patients are interested in consulting others who are not necessarily practicing doctors. Noteworthy findings include:
>> 73% of Americans would trust a NURSE to suggest a diagnosis
>> 74% would trust an ALTERNATIVE MEDICINE PRACTITIONER
>> 84% would trust a RETIRED DOCTOR
>> 87% would trust a FORMER PATIENT WITH RELATED SYMPTOMS
>> 62% would trust a MEDICAL STUDENT
According to the site, you fill out a questionnaire and “collaborate With Medical Detectives” and then receive a report which includes the top diagnostic suggestions and solutions from the community. Given the problems with our healthcare system these days, it might be quicker to make a stop at this site than wait for ObamaCare to come through….
In the past, medical journals, pharmaceutical companies and researchers themselves have been criticized for publishing selectively only their positive results, that is to say, the results that they wanted to find. This is important because accentuation of the positive can easily mislead the medical profession into believing that a certain drug or treatment is much more effective than it really is.
On reading the New England Journal of Medicine and other medical journals, I sometimes wonder whether the pendulum has swung too far in the other direction, in accentuating the negative. To read of so many bright ideas that did not work could act as a discouragement to others and even lead to that permanent temptation of ageing doctors, therapeutic nihilism. But the truth is the truth, and we must follow it wherever it leads.
A recent edition of the NEJM, for example, reported on three trials, two with negative results and one with mildly positive ones. The trials involved the early treatment of stroke, the prophylaxis of HIV infection, and the treatment of angina refractory to normal treatment (a growing problem). Only the last was successful, and it involved 104 patients, as against 6729 patients in the two unsuccessful ones.
The successful trial involved the insertion of a device that increased pressure in the coronary sinus, the vein that drains the blood from the heart itself. For reasons not understood, this seems to redistribute the blood flow in the heart muscle, thus relieving angina. In the trial, the new device reduced angina symptoms and improved the quality of life in the patients who received it compared with those who underwent a placebo operation. The trial was too small, however, to determine whether the device improved survival, though even if it did not, a reduction of symptoms and an improvement in the quality of life are worthwhile in themselves.
The trial of chemoprophylaxis of HIV was, by contrast, a total failure. The trial recruited 5029 young women in Africa, some of whom were given an anti-HIV drug in tablet or cream form and others placebos. The rates at which they became infected with HIV were compared, and no difference was found.
In large part this was because the patients did not take or use the pills or cream, though they claimed to have done so. A drug that few take is not of much use however effective it might be in theory, especially in prophylaxis rather than treatment. And this points to another problem of pharmaceutical research: in drug trials that require patients’ compliance with a regime, that compliance may be high during the trial itself (thanks to the researchers’ vigilance and enthusiasm) but low in “natural” conditions, when the patients are left to their own devices.
The trial of magnesium sulphate in the early treatment of stroke was also a failure. Experiments on animals had suggested that this chemical protects brain cells from degeneration after ischaemic stroke. It stood to reason, then, that it might improve the outcome in humans with ischaemic stroke, at least if given as soon as the stroke was suspected.
Alas, it was not to be. The trial, involving 1700 patients, showed that the early administration of magnesium sulphate did not improve outcome in the slightest. At 90 days there was no difference between those who received it and those who had received placebo.
Is an idea bad just because it does not work? Could it be that those who discover something useful are just luckier than their colleagues? Perhaps there ought to be a Nobel Prize for failure, that is to say for the brightest idea that failed.
For a perfect example of how hysteria governs modern debates over complex issues, witness what happened yesterday morning to Governor Chris Christie. For the apparently unpardonable offense of offhandedly suggesting parents ought to have some freedom to decide how their kids are vaccinated, the governor’s political career was declared over. The instantaneous eruption from America’s self-deputized thought police had the governor — only hours later — meekly offering “clarification” of his earlier comments.
The debate over vaccines, itself nearing pandemic proportions in the U.S., is following a familiar pattern. People are either pro-science or anti-; in agreement with the “consensus” or crazy “conspiracists” and “deniers.” Much like the debate over global warming, there’s no room for middle ground; preaching prudence is basically blasphemous. And just as many are calling for climate “deniers” to be ostracized and even arrested, critics and parents who question the conventional wisdom on vaccines are likewise condemned as threats against civilization itself.
Like most everyone else, I am neither a doctor nor even a scientist. But I am smart enough to know there are perfectly valid reasons to question conventional wisdom.
Take the current controversy over measles. From the looks of my Twitter feed and the comments sections under just about any vaccine-related article, you’d think we were talking about the bubonic plague. In fact, measles, despite being highly contagious, isn’t particularly dangerous. So long as your immune system is in decent shape, you’ll be fine. Indeed, you might actually want it, as exposure leads to lifetime immunity.
Measles is basically a fever with an accompanying rash. It’s true that in the 1800s, outbreaks caused tragically large numbers of children to die — but these were concentrated in orphanages and hospital wards (places where malnutrition was rampant). As the world prospered, affluence spread, and health improved, in the U.S. the chances of dying after contracting measles dropped to 1-2 percent by the 1930s. By the time a vaccine was introduced in 1963, deaths from measles were virtually nonexistent. Asthma, according to “Vital Statistics of the United States, 1963,” claimed 56 times as many lives.
Today it’s popular to argue that measles would be totally defeated were it not for the Jenny McCarthys of the world. The only problem is that the MMR (measles-mumps-rubella) vaccine does not actually immunize — as most people understand the word — against measles. The most we can expect is temporary protection. That’s because vaccines are injected directly into the body, bypassing the body’s natural immune response. “Most disease-causing organisms enter your body through the mucous membranes of your nose, mouth, pulmonary system or your digestive tract – not through an injection,” explains Dr. Joseph Mercola. “These mucous membranes have their own immune system, called the IgA immune system.”
The vaccine was initially described as lifelong insurance, but health officials realized in the ’70s, when an uptick in measles diagnoses occurred among vaccinated high-school students, that it should probably be administered more regularly. The CDC now advises receiving the vaccine at 12-15 months, 4-6 years, and again as an adult. The U.S. is also using its third version of a measles vaccine, after the first two proved ineffective.
Which should probably make it no surprise that many of the people catching measles today were vaccinated. Today’s measles cases are occurring in heavily vaccinated populations. When a 2006 outbreak among college students in the Midwest struck, the fact that most of the affected were vaccinated seemingly made no difference. When an outbreak of the mumps hit the NHL this year, many reflexively blamed “anti-vaxxers.” Almost no one reported that every affected player appears to have received the MMR vaccine. The Penguins’ Sidney Crosby received not only the initial MMR, but also a booster just before the Sochi Olympics. The director of the Vaccine Education Center at the Children’s Hospital of Philadelphia, Paul Offit, would only say “we know that the short-term effectiveness of the mumps vaccine is excellent.”
Still, none of this would suggest there’s any reason to avoid regular vaccines — were it not for side effects. And here comes another wrinkle: The MMR vaccine can itself give you measles. In 2013, measles began spreading in British Columbia after a two-year-old girl contracted the virus from the vaccine, and then began spreading it to others. Though rare, there are other risks worth considering, too: According to the CDC, side effects to MMR can range from minor (fever, mild rash, swelling), to moderate (seizure, temporary low platelet count), to major (deafness, long-term seizures, permanent brain damage). Note that the latter two categories are worse than the disease itself. Perhaps a bigger problem is how these vaccines weaken the immune response among undernourished patients. “In developing countries, the use of high-titre vaccine at 4-6 months of age was associated with an unexpectedly high mortality in girls by the age of 2 years from infectious childhood illness,” a study reported in the British Medical Journal.
As recently as the 1970s, the CDC recommended children receive four vaccines. Today, per CDC protocol, children can receive around 40 shots between birth and the age of 6. What if that number grows to 100? 500? Will it always be unreasonable to ask, “Is all of this really necessary?”
Finally, this may come as a shock, but it’s actually possible for the government and the medical establishment to get things wrong. This year the CDC admitted its flu vaccine was created for the wrong strain — yet Americans are being instructed to get the shot anyway. Indeed, some parents are being threatened with having their children taken if they aren’t given this (almost certainly) useless flu vaccine. For more than a generation Americans were told to avoid as much as possible saturated fat, salt, and calories in general. More recent science shows that salt consumption has no causal relationship with blood pressure; eating healthy saturated fats like grass-fed butter is good for your heart, brain, and metabolism, and calories are actually a form of energy that gives us life.
Assigning responsibility for your children’s health and well-being to others — even “experts” — is precisely the opposite of parenting. Asking questions, educating yourself, soliciting more than one opinion: these aren’t the behaviors of people to be condemned and vilified. When someone insists you submit to the expertise of others, they’re actually asking you to stop thinking for yourself. And that’s a mistake. Vaccines, like so much of life, are more complex than a simple good-vs.-evil analysis affords. Universal solutions rarely work universally. Parents are right to do their homework.
Here’s Senator Rand Paul saying that most vaccines should be voluntary:
Late in the previous century, when the Toronto Star spiked my column debunking Kwanzaa — the editor scolded me for wanting to “ruin other people’s fun” by telling the truth, which in hindsight would make for an apt if ungainly personal motto on my (non-existent) coat of arms — I sent the piece to Canada’s only conservative magazine, the (since defunct) Alberta Report.
Link Byfield, the magazine’s publisher and editor, snapped it up, and asked for more.
I’d been a professional writer for years, but now my career as a right-wing writer had begun.
Byfield died of cancer this week, at 63.
My fellow AB contributor Colby Cosh was and is a libertarian (some might say craggily contrarian) atheist who was nevertheless embraced right out of grad school by the unabashedly Christian so-con Byfields.
Cosh — today, like many former Report writers, a star columnist at a national publication — quickly composed an obituary of Byfield that is, not surprisingly, insightful, elegant and stringently unsentimental.
(The Byfields have a keen eye for talent, if I do say so myself…)
Another longtime colleague, Peter Stockland, attended a tribute to Byfield last September, an event arranged after he was diagnosed with terminal cancer.
Stockland explained Link Byfield’s influence on recent Canadian history with this succinct formula, one that resembles the mnemonic verse British schoolchildren used to learn to keep their kings and queens straight.
No Byfields, no Alberta Report. No Alberta Report, no Reform Party as it was formed. No Reform Party, no [Progressive Conservative Party] collapse. No PC collapse, no [Conservative Party] Harper government.
Some perspective for American readers:
My husband and I attended a lecture about Israel by Melanie Phillips a few years back.
Phillips, while correct on so many issues, remained convinced that Europe’s “fringe” “right-wing” populist political leaders, while anti-sharia, were also racist, anti-Semitic losers and therefore unwelcome allies in the counter-jihad.
Afterwards, my husband took her aside and explained — to her visible surprise — that Canada’s “fringe right wing” populist Reform Party had once been condemned as backward, bigoted and doomed, too; yet one of its founders, Stephen Harper, was now the staunchly pro-Israel prime minister of Canada, having just won a second federal election.
Non-Canadians are, presumably, more familiar with our “free” “healthcare” system, as I call it.
On that topic, Mark Steyn once quoted a fictional Canadian — OK, Quebecois — character’s decision to die a principled death:
Sébastien wants his dad to go to Baltimore for treatment, but Remy roars that he’s the generation that fought passionately for socialized health care and he’s gonna stick with it even if it kills him.
“I voted for Medicare,” he declares. “I’ll accept the consequences.”
But Link Byfield was a real man, not an imaginary one.
That makes what follows all the more notable.
Yet what truly mattered to [Byfield] was having lived out, as far as possible in the midst of a train wreck, a principled reality.
I mentioned an e-mail he sent last summer explaining his choice to forgo chemotherapy because it would not save him, yet would cost taxpayers $100,000.
I said I could not imagine other Canadians who would factor such public policy considerations into their personal health care.
“But that would have been standard thinking among politically literate citizens 50 years ago,” he said. “People wouldn’t even articulate it. It would just be something they would think.”
When I asked his source for thinking that way, he said: “Thou shalt not steal.”
How informed is informed? What is the psychological effect of being told of every last possible complication of a treatment? Do all people react the same way to information, or does their reaction depend upon such factors as their intelligence, level of education, and cultural presuppositions, and if so does the informing doctor have to take account of them, and if so how and to what degree? An orthopedic surgeon once told me that obtaining informed consent from patients now takes him so long that he had had to reduce the number of patients that he treats.
An article in a recent edition of the New England Journal of Medicine extols the ethical glories of informed consent without much attention to its limits, difficulties and disadvantages.
It starts by referring to a trial of the level of oxygen in the air given to premature babies, of whom very large numbers are born yearly. Back in the 1940s it was thought that air rich in oxygen would compensate for premature babies’ immature respiratory systems, but early in the 1950s British doctors began to suspect, correctly, that these high levels of oxygen caused retinal damage leading to permanent blindness. Fifty years later, the optimal level of oxygen is still not known with certainty, and a trial was conducted that showed that while higher levels of oxygen caused an increased frequency of retinopathy, lower levels resulted in more deaths. The authors of the trial have been criticized because they allegedly did not inform the parents of the possibility that lower levels of oxygen might lead to decreased survival, which was reasonably foreseeable.
How reasonable does reasonability have to be? Many of the most serious consequences of a treatment are totally unexpected and not at all foreseeable (no one suspected that high levels of oxygen for premature babies would result in blindness, for example, and it took many years before this was realized). Ignorance is, after all, the main reason for conducting research.
But suppose parents of premature babies had been asked to participate in a trial in which their offspring were to be allocated randomly to an increased risk of blindness or an increased risk of death. Surely this frankness would have been cruel, all the more so as the precise risks could not have been known in advance. Parents would have felt guilty whether their babies died or went blind.
Now that the answer is known, more or less, parents can be asked to choose in the light of knowledge: but their informed consent will be agonizing because there is no correct answer. Personally, I would rather trust the doctor sufficiently to act in my best interests in the light of his knowledge and experience. So far in life I have not had reason to regret this attitude, though I am aware that it has its hazards also. But
…why should they know their fate?
Since sorrow never comes too late,
And happiness too swiftly flies.
Thought would destroy their paradise.
No more; where ignorance is bliss,
‘Tis folly to be wise.
And I have often thought what medical ethicists would have made of the pioneers of anesthesia. They did not seek the informed consent of their patients, in part, but only in part, because they hadn’t much information to give. What moral irresponsibility, giving potentially noxious and even fatal substances to unsuspecting experimental subjects without warning them of the dangers!
And there are even some medical ethicists who think we should not take advantage of knowledge gained unethically. All operations should henceforth be performed without anesthesia, therefore.
Medical history is instructive, if for no other reason than that it might help to moderate somewhat the medical profession’s natural inclination to arrogance, hubris and self-importance. But the medical curriculum is now too crowded to teach it to medical students and practicing doctors are too busy with their work and keeping up-to-date to devote any time to it. It is only when they retire that doctors take an interest in it, as a kind of golf of the mind, and by then it is too late: any harm caused by their former hubris has already been done.
Until I read an article in a recent edition of the Lancet, I knew of only one eminent doctor who had been shot by his patient or a patient’s relative: the Nobel Prize-winning Portuguese neurologist Egas Moniz, who was paralyzed by a bullet in the back. It was he who first developed the frontal lobotomy, though he was also a pioneer of cerebral arteriography. As he was active politically during Salazar’s dictatorship, I am not sure whether his patient shot him for medical or political reasons, or for some combination of the two.
Of late the New England Journal of Medicine has seemed like the burial ground of good ideas. Researchers follow a promising lead only to find that their new idea fails the crucial test of experience: and the difference between success and failure in research is made to appear as much a matter of chance or luck as of brilliance or skill.
In the first issue of the Journal for 2015, Dutch researchers from 16 different hospitals report an unequivocal success in the treatment of ischemic stroke.
Until now the only proven worthwhile treatment of patients with the kind of stroke that results from the blockage of a cerebral artery is the infusion within four and a half hours of the drug called alteplase, which dissolves thrombus (and which is produced in cultured Chinese hamster ovary cells). But even with the use of this drug the prognosis is not very good, and there are several contra-indications to its use.
It is easier to advise than to have or to retain a sense of proportion, especially when it is most needed. I have never known anyone genuinely comforted by the idea that others were worse off than he, which perhaps explains why complaint does not decrease in proportion to improvement in general conditions. And he would be a callow doctor who tried to console the parents of a dead child with the thought that, not much more than a century ago, an eighth of all children died before their first birthday.
Still, it is well that from time to time medical journals such as the Lancet should carry articles about medical history, for otherwise we might take our current state of knowledge for granted. Ingratitude, after all, is the mother of much discontent. To know how much we owe to our forebears keeps us from imagining that our ability to diagnose and cure is the consequence of our own peculiar brilliance, rather than simply because we came after so much effort down the ages.
A little article published recently in the Lancet was written by two historians who are in the process of analyzing the results of 9000 coroners’ inquests into accidental deaths in Tudor England. It seems astonishing to me not only that such records should have survived for more than four centuries, but also that the state should have cared enough about the deaths of ordinary people to hold such inquests (coroners’ inquests had already been established for 400 years by the time of the Tudors). In other words, an importance was given to individual human life even before the doctrines of the Enlightenment took root: the soil was already fertile.
“The due process clause of the fourteenth amendment guarantees, protects the rights of parents but the fact is that we have to put it in law. You wouldn’t think we have to go here. What we’re seeing in our country today leads us to believe that if we don’t put this stuff into law then we are behind the eight ball and we find ourselves with these kinds of situations. I’m just afraid, down the road, we’re going to see more and more cases like [the Isaiah Rider case].” — Ken Wilson (R-MO)
We’re farther “down the road” than most dare to imagine.
The bill Rep. Wilson introduced states that a parent cannot be charged with medical child abuse for disagreeing with medical advice and choosing treatment from another doctor. Yeah. We’re there.
You might remember the well-publicized ordeal of Justina Pelletier. It seemed like a fluke of injustice, an isolated case, so far beyond right that it was easy to assume there was more to the story. In the Pelletier case, rather than receiving discharge papers, the parents were charged with “medical child abuse,” the new term that has replaced Munchausen syndrome by proxy (MSbP). Mr. Pelletier was surrounded by agents of the Massachusetts Department of Children and Families (DCF) and hospital security and ushered off the premises. Justina became a ward of the state for 16 months, and her health deteriorated.
In a press conference, Reverend Patrick Mahoney, director of the Christian Defense Coalition in Washington, D.C., and spokesperson for the Pelletier family, made a remarkable statement that became a mirror reflecting an unsettling image of a dangerous mindset:
“It’s easier for us to want to believe, or wrap our brains around the fact, that a family is mistreating their child than the alternative to that; and the alternative to that is what happened in this case: with impunity, government agencies and courts have removed a child from the loving care of their parents. And so that’s the obstacle: no one wants to believe that reality.
“That reality” is the last thing parents think of when they have a chronically ill child or have taken a holistic path to health.
Michelle Rider, the 34-year-old registered nurse and single mother of Isaiah Rider, the boy in the above video, told PJ Lifestyle just why we have a hard time accepting this is happening:
We are taught that hospitals are safe, that doctors are safe, and that DCFS intervenes when intervention is needed. So when we accept the fact that this is really happening — we are accepting that we are not safe, and our children are not safe.
While President Barack Obama asks the nation if we will accept the “cruelty of ripping children from their parents’ arms,” it’s blatantly apparent to parents like Michelle that he isn’t talking about sick children like Isaiah. Agents of the state — with calculated impunity — take their children.
On the very day a law was introduced in his name, his worst fears came true.
Low back pain is a condition so common that, intermittently, I suffer from it myself. It comes and goes for no apparent reason, lasting a few days at a time. Nearly 40 years ago I realized that, though I had liked to think of myself as nearly immune from nervous tension, anxiety could cause it.
I was in a far distant country and I had a problem with my return air ticket. At the same time I suffered agonizing low back pain, which I did not connect with the problem of my ticket. When the problem was sorted out, however, my back pain disappeared within two hours.
In general, low back pain is poorly correlated with X-ray and MRI findings. Epidemiological research shows that the self-employed are much less prone to it than employees, and also that those higher in the hierarchy suffer it less than those lower – and not because they do less physical labor. Now comes evidence, in a recent paper from Australia published in the Lancet, that the recommended first treatment usually given for such pain, acetaminophen, also known as paracetamol, is useless, or at least no better than placebo (which is not quite the same thing, of course).
I return to work today after a week recovering from a major procedure. I underwent gastric bypass surgery to treat, among other things, my adult onset type 2 diabetes.
While no surgery occurs without pain, discomfort, disorientation, and some period of recovery, I can say that my experience has been as good as it could have been given the circumstances. My doctors, their staff, the insurance company, and the healthcare provider have all performed professionally and effectively.
That said, as a guy daily occupied with the effect of government upon the human experience, I certainly perceived areas where the healthcare system would undoubtedly improve if less encumbered by government. First, I noted inefficient compartmentalization.
To give you an idea of what I mean, consider the path taken to get this surgery done. First, I needed to see my primary care physician for a referral. Then I needed a consult at a weight loss clinic. Then I spent three months checking off a long list of labs, dietitian visits, psychological evaluation, and preparatory classes and consults. Despite the fact that nearly all this occurred under the umbrella of the same healthcare provider, every single time I saw a different person — even within the same clinic — it was as if I were being seen for the very first time. I had to answer the same questions, fill out the same forms, tell the same story, over and over again. I can only imagine how frustrating this is for patients dealing with chronic illness.
To a certain extent, this redundancy can be justified. Some of it no doubt serves patient privacy and security. For instance, asking for my birthdate or address could be a verification check to ensure I am the right patient. However, I have a hard time believing that explains most of the redundancy. Most of it seems to be a product of compartmentalization, a lack of access to information previously disclosed. Other industries model customer service solutions which could easily be applied to healthcare.
When you go to the airport in any major city, you can check in at a kiosk and get your boarding pass without seeing a clerk. You can even check in online ahead of time, from your phone while in transit if necessary. Why can’t we do this in healthcare? I get to an appointment on time, but have to wait ten minutes in line behind other patients with more complex needs, and end up checked in late. There’s no need for that.
There is no new thing under the sun, least of all panic at the approach of an epidemic of a deadly disease. In 1720, the preface to Loimologia, Nathaniel Hodges’ account of the Great Plague of London in 1665, first published in Latin in 1672, referred to the outbreak of plague in Marseilles:
The Alarm we have of late been justly under from a most terrible Destroyer in a neighbouring Kingdom, very naturally calls for all possible Precautions against its Invasion and Progress here…
In fact, though no one was to know it, no epidemic of plague was ever to occur in Western Europe again; and it is doubtful whether the precautions referred to made much difference.
The death rate from the Ebola virus is probably greater than that from bubonic plague, though of course the plague spread much faster and killed far more people in total than Ebola ever has: and at least we, unlike our plague-ridden ancestors, know the causative organism of the Ebola disease, even if we are not certain how the virus first came to infect Mankind.
You might have supposed that trust in the medical profession would have risen as medicine became more effective at warding off death and disease, but you would have been mistaken. In fact, precisely the reverse has happened throughout the western world, but particularly in the United States. Half a century ago, nearly three quarters of Americans had confidence in the medical profession qua profession; now only about a third do so.
According to international surveys reported in an article in a recent New England Journal of Medicine, Americans are among the most mistrustful of doctors of any western people. Asked whether, all things considered, doctors in their country could be trusted, 58 percent of Americans answered in the affirmative; by contrast, 83 percent of the Swiss answered positively, as did 79, 78, and 76 percent of the Danish, Dutch, and British respectively. Americans ranked 24th of the 29 nations polled in their trust of doctors. Furthermore, just under half of Americans in the lowest third of the income range thought that doctors in general could be trusted, and younger Americans were less likely than older ones to trust their doctors.
Curiously enough, though, Americans were among the most satisfied of nations with their last encounter with their doctor. Only the Swiss and Danes were more satisfied than they, and then not by very much (64, 61 and 56 percent respectively). In other countries, then, people were more likely to trust doctors in general than be satisfied by their last visit to the doctor; in America, it was about the same proportion.
What, if anything, does this mean?
My PJ colleague Walter Hudson published a compelling argument regarding physician-assisted suicide in response to the ongoing dialogue surrounding terminal cancer patient Brittany Maynard. His is a well-reasoned argument regarding the intersection of theology and politics, written in response to Matt Walsh’s Blaze piece titled “There is Nothing Brave About Suicide.” Both pieces are a reminder that, in the ongoing debate over whether or not Maynard has the right to schedule her own death, little has been said regarding the role the medical profession plays in the battle to “Die with Dignity.” Walsh argues:
None of us get to die on our own terms, because if we did then I’m sure our terms would be a perfect, happy, and healthy life, where pain and death never enter into the picture at all.
It’s a simplistic comment that ignores a very real medical fact: Death can come on your own terms. And that doesn’t have to mean suicide.
My mother was a nurse for 20 years. During that time she worked in a variety of settings, from hospitals, to private practice, to nursing homes. Much like Jennifer Worth, the nurse and author of the Call the Midwife series, my mother practiced at the end of the era of Victorian bedside nursing and the dawn of Medicare. As a result, the abuses she witnessed in the name of insurance claims were grotesque. For instance, if a patient required one teaspoon of medication, an entire bottle would be poured into the sink and charged to that patient’s insurance company. This was just the tip of the iceberg of unethical practices that took priority in the name of the almighty “billing schedule.”
Screenwriters are not known for being sticklers for facts. And when it comes to disasters, writes University of Texas Professor David A. McEntire, “many of Hollywood’s portrayals are based on myths and exaggerations….” That’s certainly the case when it comes to disease disaster films. Here are 10 “fun” movies that are of no use whatsoever in terms of helping viewers respond wisely to a pandemic.
10. Panic in the Streets (1950)
“Patient Zero” is carrying the pulmonary version of bubonic plague. A public official (played by Richard Widmark) has 48 hours to find him before the disease spreads throughout the city. Director Elia Kazan delivers a moody, atmospheric, underappreciated film. But if this is how the police, public health officials and reporters will really act during a crisis, well, we’re all doomed.
We live in the age of acronym. To read a medical journal is sometimes like trying to decipher a code; once, when I was a judge in a competition of medical poetry, I read a poem composed entirely of figures and acronyms:
RTA [road traffic accident]
ETA [expected time of arrival] 13.20 hrs
GCS [Glasgow Coma Scale] 3…
The last line of the poem, inevitably, was:
Sometimes one has the impression that the acronym has been devised before the thing that it is attached to has been decided. In a recent paper in the New England Journal of Medicine, for example, I came across the acronym SWEDEHEART. It stood for the Swedish Web System for Enhancement and Development of Evidence-based Care in Heart Disease Evaluated According to Recommended Therapies. If the web system came before the acronym, however, you could see why the latter was necessary, the former’s full name being longer than the average tin-pot dictator’s list of honorific titles.
The paper in which the acronym occurred was yet another in which a common medical practice was shown to be valueless, or very nearly so. It turns out yet again that doctors do things not because they do the patients any good, but because they can do them.
The question is important because public health emergencies allow governments to ignore the usual restrictions or restraints upon their actions. In public health emergencies, governments can override property rights and abrogate all kinds of civil liberties, such as freedom of movement. They can seize our goods and tell us where to go and where to stay. They do so only for our own good: health being the highest good, of course.
A recent edition of the New England Journal of Medicine discusses the issue in the context of the declaration of a public health emergency in Massachusetts by the governor of that state, Deval Patrick.
In most people’s minds, no doubt, a public health emergency would be something like the Black Death, the epidemic of plague that wiped out a third of Europe’s population in the fourteenth century. A natural disaster of large proportions might also count, not only because of the death and injury caused directly by the disaster, but also because of the epidemics which often follow such disasters.
What, then, was the public health emergency that “obliged” Patrick to declare that it existed and that he could and should take uncontrolled administrative measures to halt it?
For several years now the inimitable Theodore Dalrymple has provided PJ Media and PJ Lifestyle with erudite, witty commentaries on controversies in the worlds of health, drugs, and disease, as well as their impact on culture. Here’s a collection featuring links to many of the questions he’s addressed, often in response to some shaky thinking in a new study or an ideologically slanted medical journal article.
What health and medical questions would you like to see him and other writers explore in the future? Please leave your suggestions in the comments.
2011 and 2012
- Is Salt Really Bad for Your Heart?
- Are There Health Effects Due to the Financial Crisis?
- Should the ‘Morning After’ Pill Be Available to All Ages?
- Can Children Be Manipulated into Eating Their Veggies?
- Should We Be Worried about Bird Flu?
- Is Surgery Not Always Necessary for Appendicitis?
- Genomic Medicine: A Great Leap Forward?
- Aspirin: The Elixir of Life?
- Do Nicotine Patches Actually Work?
- Does ‘Good Cholesterol’ Really Help Prevent Heart Attacks?
- Should Women’s High School Soccer Be Banned To Reduce Knee Injuries?
- Is Grief Always Depression?
- Does Fish Oil Prevent Alzheimer’s Disease?
- Do Proactive Measures by Doctors Aid in Smoking Cessation?
- Can Dark Chocolate Reduce High Blood Pressure?
- How Come People Rarely Die of Dementia in Poor Countries?
- Should You Take Antibiotics?
- Is Obesity a Disease or a Moral Failing?
- Are Obese Kids Victims of Child Abuse?
- Need A Few Arguments Against Tattoos?
- Are the Treatment and Prevention of Obesity Different Problems?
- Why Are Psychiatric Disorders Not the Same as Physical Diseases?
- Do Today’s Medical Ethics Prevent New Breakthroughs?
- Should We Be Worried About Parasites from Cats?
- Do Doctors Turn Their Patients into Drug Addicts?
- Should Doctors Lie to Their Patients About Their Survival Chances?
- As Life Expectancy Increases Will the Elderly Become a Greater ‘Burden on Society’?
- What Is the Best Way to Treat Diabetes?
- What Can Be Done to Reduce Post-Hospital Syndrome?
- How Can a Mammogram Kill You?
- Human Feces as Medicine?
- What Will Happen if I Consume Too Much Calcium?
- Is Marijuana a Medicine?
- Why Is Immunization so Controversial?
- Is America at the Point Where HIV Testing Should Be Routine?
- Is Physical Therapy Overrated?
- How Many Smokers Could Quit If Someone Paid Them $10 Million?
- Is Nutrition Really the Key to Good Health?
- Is It Even Possible to Accurately Measure Physical Pain?
- Can Doctors Determine Who Should Be Allowed to Carry a Concealed Gun?
- Does Practice Really Make Perfect for Doctors?
- Should Doctors Be Allowed to Choose Not to Treat Fat People?
- Should Pre-Term Infants Receive Risky Oxygen Treatments?
- We Mock Prudish Victorian Euphemisms, But Are We Really Any Better?
- How Often Do Medical Emergencies Occur on Flights?
- What Is the Safest Day of the Week for Surgery?
- Are Antibiotic-Resistant Diseases Mother Nature’s Revenge?
- How Dangerous Is Obstructive Sleep Apnea During Surgery?
- Can Advances in Medical Technology Make Us Less Healthy?
- Does Badgering Patients to Exercise and Eat Better Actually Work?
- Should an Alcoholic Be Allowed to Get a Second Liver Transplant?
- Can Living With Chickens Protect Against Face-Eating Bacteria?
- Does the Sleep Aid Zolpidem Impair Driving the Next Day?
- Does Too Much Sugar Increase the Risk of Dementia?
- Men: Need Another Excuse to Put Off That Prostate Exam?
- Is Drug Addiction Really Like ‘Any Other Chronic Illness’?
- How Many Doctors Support Suicide for the Terminally Ill?
- What Are the Dangers in Screening for Diseases?
- Was Sir Winston Churchill Right About Exercise?
- Should Doctors Relax the ‘Dead-Donor Rule’ to Increase Organ Transplants?
- Is Living Near an Airport Dangerous for Your Health?
- Can Money Become Medicine?
- Gastric Bypass or Laparoscopic Gastric Band?
- How Do You Measure a Good Doctor Vs a Bad One?
- Should You Eat Lots of Nuts?
- As More People Live Longer Why Are Rates of Dementia Falling?
- Should Treatment of Obesity Begin Before Birth?
- Why Is It So Difficult to Translate Genetic Breakthroughs into Clinical Benefits?
- Can Scientists Create a Cure for Pain From Scorpions, Spiders, and Centipedes?
- Should the Age to Buy Cigarettes Be 21?
- Should You Vaccinate Your Children?
- Is Your Heart Attack More Likely to Kill You at Night or During the Day?
- A Cure For Peanut Allergies?
- Should Taxpayers Pay for the Junky’s Substitute Smack?
- Who Pays for Illegal Immigrant Tetraplegics’ Treatment?
- How Much Would You Pay to Survive Four Months Longer with a Terminal Disease?
- Euthanasia for the Insane?
- Does Valium Increase Your Chances of An Early Death?
- Are Diet Supplements Dangerous?
- Did Flu Drug Companies Perpetuate a Billion Dollar Scam Around the World?
- What is One of the Most Dangerous Ideas in All of Medicine?
- Why Do Some Mothers Induce Illness in Their Own Children?
- Why Might a Doctor Be Relieved When a New Study Fails To Reduce Deaths?
- Should Prisoners Receive Better Health Care Than the General Population?
- Is Ebola the World’s Most Terrifying Disease?
- What Can Happen to Your Lungs if You Smoke 20 Cigarettes Every Day for A Decade?
- Is This the End of Mammograms to Screen for Breast Cancer?
- Do Medical Experiments on Animals Really Yield Meaningful Results?
- Will Legal Marijuana Be a Bonanza for Trial Lawyers?
- Should You Get Your DNA Tested to See if You’re More Likely to Get Cancer?
- What to Do When the Risk of Treatment Outweighs the Benefits?
- Why Must We Take One Step Forward, Two Steps Back in the Battle Against Tuberculosis?
- Is Ignorance Really Bliss? What Is the ‘Nocebo’ Effect?
- What Does Moral Narcissism Look Like in the Medical World?
- Why Is Treating Statistical Markers of Disease Not the Same as Treating Disease Itself?
- Heterochronic Parabiosis: Reversing Aging With Young Blood?
- Should Everyone Consume Less Sodium?
- Does a Popular Antibiotic Raise the Risk of Heart Attack?
- Do You Really Need That Colonoscopy?
- Is It ‘Unjust’ for Doctors to Die from Ebola?
When I visited the John F. Kennedy hospital in Monrovia during the long Liberian Civil War, it had been destroyed by a kind of conscientious vandalism. Every last piece of hospital furniture and equipment had been disabled, the wheels sawn off trolleys and gurneys, the electronics smashed. This was not just the result of bombardment during the war but of willful and thorough dismantlement.
There were no patients and no staff in the hospital; it was a ghost establishment, completely deserted. I was severely criticized for suggesting in a book that the painstaking destruction of the hospital, which shortly before had performed open heart surgery, was of symbolic significance.
I was pleased to see from an article in a recent edition of the New England Journal of Medicine that it had re-opened, but saddened to see that its problems were now of an even more terrifying nature than those it encountered during the civil war: for the hospital is at the center of the epidemic of Ebola virus disease, and two of its senior physicians, Sam Brisbane and Abraham Borbor, have recently died of it. The article in the journal lamented their passing and praised them for their bravery in not deserting their posts; both knew the dangers of staying.
Readers may recall my description of aortic valve replacement last year and a warning of the importance of treating heart disease seriously. Here’s another lesson from medical misadventure: if you are going to have a stroke, it is best to have it on an operating room table. And best of all, avoid strokes if at all possible. As miserable and expensive as last year’s surgery and recovery was, I think I would make that trade.
This started as the classic symptoms of a heart attack on the evening of August 2: chest pain, pressure, left-arm pain, a sense of confusion. So I had my wife drive me to an urgent care facility, where the staff decided that my condition was beyond what they could treat; they gave me aspirin and nitroglycerin and called the Ada County Paramedics to transport me to St. Alphonsus hospital. In retrospect, my wife could have driven me there directly in less time, and saved the insurance company $1,300.
Every few months I receive a computerized invitation from my doctor asking me to have a colonoscopy to screen for polyps in my bowel. I always tell myself that I am too busy just now, I will have it another time. But really I don’t want to have it at all, and I know that when the next invitation comes I will be too busy then as well.
I am also eager to find a rational reason, or at least a rationalization, for my refusal. I thought I found it in a paper from Norway in a recent edition of the New England Journal of Medicine.
The authors examined the death rate from colorectal cancer in Norway among the 40,826 patients between 1993 and 2007 who had had polyps removed at colonoscopy in that country (the records are more or less complete). They compared the number of deaths in that population with the expected death rate from the disease in the population the same age as a whole. The paper reports that 398 deaths were expected and 383 deaths were observed.
This small difference does not mean that colonoscopy does not work in preventing death from colorectal cancer, of course. This is because the relevant comparison is with people who had polyps not removed by colonoscopy rather than with the population as a whole.
The 40,826 patients who had polyps removed at colonoscopy, however, were not a random sample of the adult population because Norway does not have a screening program for colonic polyps. The patients had colonoscopy in the first place because they were symptomatic, for example bleeding per rectum. They were therefore much more likely to suffer from polyps or cancer in the first place than the rest of the population.
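To see how close the observed and expected figures are, the comparison can be expressed as a standardized mortality ratio. This is a minimal sketch of my own, not a calculation from the paper; it assumes the observed death count can be treated as Poisson for a rough 95 percent confidence interval:

```python
# Sketch of the comparison reported in the Norwegian study:
# 398 colorectal-cancer deaths expected vs. 383 observed among
# 40,826 patients. The Poisson approximation below is my own
# illustration, not the paper's method.
import math

observed = 383    # deaths observed in the polyp-removal cohort
expected = 398.0  # deaths expected from population rates

smr = observed / expected
# Rough 95% CI, treating the observed count as Poisson
se = math.sqrt(observed) / expected
ci_low, ci_high = smr - 1.96 * se, smr + 1.96 * se

print(f"SMR = {smr:.3f} (95% CI roughly {ci_low:.2f} to {ci_high:.2f})")
```

The ratio comes out just under 1, with a confidence interval that comfortably straddles 1: exactly the "small difference" the paper reports, and a reminder of why it cannot be read as evidence either way without the right comparison group.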
Many people have been misinformed regarding human-to-human transmission of Ebola. The Canadian health department states that airborne transmission of Ebola is strongly suspected, and the CDC admits that Ebola can be transmitted in situations where there is no physical contact between people, i.e., via airborne droplets inhaled into the lungs or entering the eyes of individuals separated by as little as three feet. That helps explain why 81 doctors, nurses, and other healthcare workers have died in West Africa to date. These courageous health care providers use careful CDC-level barrier precautions such as gowns, gloves, and head cover, but it appears they have inadequate respiratory and eye protection. Dr. Michael V. Callahan, an infectious disease specialist at Massachusetts General Hospital who has worked in Africa during Ebola outbreaks, said that minimum CDC-level precautions “led to the infection of my nurses and physician co-workers who came in contact with body fluids.”
Currently the CDC advises health care workers to use goggles and simple face masks for respiratory and eye protection, and a fitted N-95 mask during aerosol-generating medical procedures. Since so many doctors and nurses are dying in West Africa, it is clear that this level of protection is inadequate. Full face respirators with P-100 replacement filters would provide greater airway and eye protection, and I believe this would save the lives of many doctors, nurses and others who come into close contact with, or in proximity to, Ebola victims.
It is apparent that the primary mode of person-to-person Ebola transmission is through direct contact with the body or bodily fluids of Ebola victims, but it is unwise to ignore the airborne mode. I believe the current evidence supports healthcare workers using a higher level of airway and eye protection than is currently recommended. Since CDC level respiratory/eye precautions for Ebola are inadequate for healthcare workers in West Africa, I assume they will also be inadequate in the United States.
What stands to reason is not always borne out by facts, for reality is often refractory to human wishes. There was a good illustration of this unfortunate principle in a recent edition of the New England Journal of Medicine.
It has long been known that low concentrations of high-density lipoproteins (HDL) and high concentrations of low-density lipoproteins (LDL) are associated, in a more or less linear fashion, with cardiovascular disease such as strokes and heart attacks. It would seem to stand to reason, therefore, that raising the HDL and lowering the LDL would lead to fewer cardiovascular “events,” as strokes and heart attacks are called.
One way to achieve this wished-for biochemical change is to treat patients at risk of such events with niacin, a B vitamin, in addition to the statins that they are already taking. The largest placebo-controlled trial of niacin ever undertaken, with 25,673 patients who had already had a stroke or heart attack, has shown that the addition of niacin, though it does indeed increase HDL and decrease LDL, has no effect on the rate of heart attack or stroke. Worse still, it gave rise to serious side effects, such as worsening of diabetes and unpleasant gastrointestinal, musculoskeletal and dermatological effects. One of the most unexpected findings of the trial was the excess of infections in people treated by niacin. If anything, the overall death rate in the niacin-treated group was higher than that in the placebo control group, though the difference was not statistically significant (which is not quite the same thing as saying that it was not real). The patients were followed up, on average, for nearly four years and at no time was treatment with niacin superior to that with placebo.
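The parenthetical point about statistical significance deserves a quick illustration. Using invented numbers of roughly the trial's scale (these are not the actual trial data), a perfectly real difference in death rates can still fall short of the conventional p < 0.05 threshold:

```python
# Hypothetical illustration (counts invented, not from the niacin
# trial): a real 6.0% vs 5.5% difference in death rates can fail
# to reach statistical significance even in two arms of ~12,800.
import math

def two_proportion_z(deaths_a, n_a, deaths_b, n_b):
    """Standard two-proportion z-test; returns the z statistic."""
    p_a, p_b = deaths_a / n_a, deaths_b / n_b
    p_pool = (deaths_a + deaths_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

z = two_proportion_z(768, 12800, 704, 12800)  # 6.0% vs 5.5% mortality
print(f"z = {z:.2f}")  # |z| < 1.96, so not significant at p < 0.05
```

The difference here is "real" by construction, yet the test cannot distinguish it from chance, which is precisely why "not statistically significant" is not the same as "not there."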