Low back pain is a condition so common that, intermittently, I suffer from it myself. It comes and goes for no apparent reason, lasting a few days at a time. Nearly 40 years ago I realized that, though I had liked to think of myself as nearly immune from nervous tension, anxiety could cause it.
I was in a far distant country and I had a problem with my return air ticket. At the same time I suffered agonizing low back pain, which I did not connect with the problem of my ticket. When the problem was sorted out, however, my back pain disappeared within two hours.
In general, low back pain is poorly correlated with X-ray and MRI findings. Epidemiological research shows that the self-employed are much less prone to it than employees, and also that those higher in the hierarchy suffer it less than those lower – and not because they do less physical labor. Now comes evidence, in a recent paper from Australia published in the Lancet, that acetaminophen, also known as paracetamol, the usual first-line treatment for such pain, is useless, or at least no better than placebo (which is not quite the same thing, of course).
I return to work today after a week recovering from a major procedure. I underwent gastric bypass surgery to treat, among other things, my adult onset type 2 diabetes.
While no surgery occurs without pain, discomfort, disorientation, and some period of recovery, I can say that my experience has been as good as it could have been given the circumstances. My doctors, their staff, the insurance company, and the healthcare provider have all performed professionally and effectively.
That said, as a guy daily occupied with the effect of government upon the human experience, I certainly perceived areas where the healthcare system would undoubtedly improve if less encumbered by government. First, I noted inefficient compartmentalization.
To give you an idea of what I mean, consider the path taken to get this surgery done. First, I needed to see my primary care physician for a referral. Then I needed a consult at a weight loss clinic. Then I spent three months checking off a long list of labs, dietitian visits, psychological evaluation, and preparatory classes and consults. Despite the fact that nearly all of this occurred under the umbrella of the same healthcare provider, every single time I saw a different person, and even within the same clinic it was as if I were being seen for the very first time. I had to answer the same questions, fill out the same forms, tell the same story, over and over again. I can only imagine how frustrating this is for patients dealing with chronic illness.
To a certain extent, this redundancy can be justified. Some of it no doubt serves patient privacy and security. For instance, asking for my birthdate or address could be a verification check to ensure I am the right patient. However, I have a hard time believing that explains most of the redundancy. Most of it seems to be a product of compartmentalization, a lack of access to information previously disclosed. Other industries model customer service solutions which could easily be applied to healthcare.
When you go to the airport in any major city, you can check in at a kiosk and get your boarding pass without seeing a clerk. You can even check in online ahead of time, from your phone while in transit if necessary. Why can’t we do this in healthcare? I get to an appointment on time, but have to wait ten minutes in line behind other patients with more complex needs, and end up checked in late. There’s no need for that.
There is no new thing under the sun, least of all panic at the approach of an epidemic of a deadly disease. In 1720, the preface to Loimologia, Nathaniel Hodges’ account of the Great Plague of London in 1665, first published in Latin in 1672, referred to the outbreak of plague in Marseilles:
The Alarm we have of late been justly under from a most terrible Destroyer in a neighbouring Kingdom, very naturally calls for all possible Precautions against its Invasion and Progress here…
In fact, though no one was to know it, no epidemic of plague was ever to occur in Western Europe again; and it is doubtful whether the precautions referred to made much difference.
The death rate from the Ebola virus is probably greater than that from bubonic plague, though of course the plague spread much faster and killed far more people in total than Ebola ever has: and at least we, unlike our plague-ridden ancestors, know the causative organism of the Ebola disease, even if we are not certain how the virus first came to infect Mankind.
You might have supposed that trust in the medical profession would have risen as medicine became more effective at warding off death and disease, but you would have been mistaken. In fact, precisely the reverse has happened throughout the western world, but particularly in the United States. Half a century ago, nearly three quarters of Americans had confidence in the medical profession qua profession; now only about a third do so.
According to international surveys reported in an article in a recent New England Journal of Medicine, Americans are among the most mistrustful of doctors of any western people. Asked whether, all things considered, doctors in their country could be trusted, 58 percent of Americans answered in the affirmative; by contrast, 83 percent of the Swiss answered positively. Positive answers were returned by 79, 78, and 76 percent of the Danish, Dutch and British respectively. Americans were 24th of 29 nations polled in their trust of doctors. Furthermore, just under half of Americans in the lowest third of the income range thought that doctors in general could be trusted, and younger Americans were also less likely to trust their doctors than older ones.
Curiously enough, though, Americans were among the most satisfied of nations with their last encounter with their doctor. Only the Swiss and Danes were more satisfied than they, and then not by very much (64, 61 and 56 percent respectively). In other countries, then, people were more likely to trust doctors in general than be satisfied by their last visit to the doctor; in America, it was about the same proportion.
What, if anything, does this mean?
My PJ colleague Walter Hudson published a compelling argument regarding physician-assisted suicide in response to the ongoing dialogue surrounding terminal cancer patient Brittany Maynard. His is a well-reasoned argument regarding the intersection of theology and politics, written in response to Matt Walsh’s Blaze piece titled “There is Nothing Brave About Suicide.” Both pieces are a reminder that, in the ongoing debate over whether or not Maynard has the right to schedule her own death, little has been said regarding the role the medical profession plays in the battle to “Die with Dignity.” Walsh argues:
None of us get to die on our own terms, because if we did then I’m sure our terms would be a perfect, happy, and healthy life, where pain and death never enter into the picture at all.
It’s a simplistic comment that ignores a very real medical fact: Death can come on your own terms. And that doesn’t have to mean suicide.
My mother was a nurse for 20 years. During that time she worked in a variety of settings, from hospitals, to private practice, to nursing homes. Much like Jennifer Worth, the nurse and author of the Call the Midwife series, my mother practiced at the end of Victorian bedside nursing and the dawn of Medicare. As a result, the abuses she witnessed in the name of insurance claims were grotesque. For instance, if a patient required one teaspoon of medication, an entire bottle would be poured into the sink and charged to that patient’s insurance company. This was just the tip of the iceberg of the unethical practices that took priority in the name of the almighty “billing schedule.”
Screenwriters are not known for being sticklers for facts. And when it comes to disasters, writes University of Texas Professor David A. McEntire, “many of Hollywood’s portrayals are based on myths and exaggerations….” That’s certainly the case when it comes to disease disaster films. Here are 10 “fun” movies that are of no use whatsoever in terms of helping viewers respond wisely to a pandemic.
10. Panic in the Streets (1950)
“Patient Zero” is carrying the pulmonary version of bubonic plague. A public official (played by Richard Widmark) has 48 hours to find him before the disease spreads throughout the city. Director Elia Kazan delivers a moody, atmospheric, underappreciated film. But if this is how the police, public health officials and reporters will really act during a crisis, well, we’re all doomed.
We live in the age of acronym. To read a medical journal is sometimes like trying to decipher a code; once, when I was a judge in a competition of medical poetry, I read a poem composed entirely of figures and acronyms:
RTA [road traffic accident]
ETA [expected time of arrival] 13.20 hrs
GCS [Glasgow Coma Scale] 3…
The last line of the poem, inevitably, was:
Sometimes one has the impression that the acronym has been devised before the thing that it is attached to has been decided. In a recent paper in the New England Journal of Medicine, for example, I came across the acronym SWEDEHEART. It stood for the Swedish Web System for Enhancement and Development of Evidence-based Care in Heart Disease Evaluated According to Recommended Therapies. If the web system came before the acronym, however, you could see why the latter was necessary, the former being longer than the average tin-pot dictator’s list of honorific titles.
The paper in which the acronym occurred was yet another in which a common medical practice was shown to be valueless, or very nearly so. It turns out yet again that doctors do things not because they do the patients any good, but because they can do them.
The question is important because public health emergencies allow governments to ignore the usual restrictions or restraints upon their actions. In public health emergencies, governments can override property rights and abrogate all kinds of civil liberties such as freedom of movement. They can commandeer our goods and tell us where to go and where to stay. They do so only for our own good: health being the highest good, of course.
A recent edition of the New England Journal of Medicine discusses the issue in the context of the declaration of a public health emergency in Massachusetts by the governor of that state, Deval Patrick.
In most people’s minds, no doubt, a public health emergency would be something like the Black Death, the epidemic of plague that wiped out a third of Europe’s population in the fourteenth century. A natural disaster of large proportions might also count, not only because of the death and injury caused directly by the disaster, but also because of the epidemics which often follow such disasters.
What, then, was the public health emergency that “obliged” Patrick to declare that it existed and that he could and should take uncontrolled administrative measures to halt it?
For several years now the inimitable Theodore Dalrymple has provided PJ Media and PJ Lifestyle with erudite, witty commentaries on controversies in the worlds of health, drugs, and disease, as well as their impact on culture. Here’s a collection featuring links to many of the questions he’s addressed, often in response to some shaky thinking in a new study or an ideologically slanted medical journal article.
What health and medical questions would you like to see him and other writers explore in the future? Please leave your suggestions in the comments.
2011 and 2012
- Is Salt Really Bad for Your Heart?
- Are There Health Effects Due to the Financial Crisis?
- Should the ‘Morning After’ Pill Be Available to All Ages?
- Can Children Be Manipulated into Eating Their Veggies?
- Should We Be Worried about Bird Flu?
- Is Surgery Not Always Necessary for Appendicitis?
- Genomic Medicine: A Great Leap Forward?
- Aspirin: The Elixir of Life?
- Do Nicotine Patches Actually Work?
- Does ‘Good Cholesterol’ Really Help Prevent Heart Attacks?
- Should Women’s High School Soccer Be Banned To Reduce Knee Injuries?
- Is Grief Always Depression?
- Does Fish Oil Prevent Alzheimer’s Disease?
- Do Proactive Measures by Doctors Aid in Smoking Cessation?
- Can Dark Chocolate Reduce High Blood Pressure?
- How Come People Rarely Die of Dementia in Poor Countries?
- Should You Take Antibiotics?
- Is Obesity a Disease or a Moral Failing?
- Are Obese Kids Victims of Child Abuse?
- Need A Few Arguments Against Tattoos?
- Are the Treatment and Prevention of Obesity Different Problems?
- Why Are Psychiatric Disorders Not the Same as Physical Diseases?
- Do Today’s Medical Ethics Prevent New Breakthroughs?
- Should We Be Worried About Parasites from Cats?
- Do Doctors Turn Their Patients into Drug Addicts?
- Should Doctors Lie to Their Patients About Their Survival Chances?
- As Life Expectancy Increases Will the Elderly Become a Greater ‘Burden on Society’?
- What Is the Best Way to Treat Diabetes?
- What Can Be Done to Reduce Post-Hospital Syndrome?
- How Can a Mammogram Kill You?
- Human Feces as Medicine?
- What Will Happen if I Consume Too Much Calcium?
- Is Marijuana a Medicine?
- Why Is Immunization so Controversial?
- Is America at the Point Where HIV Testing Should Be Routine?
- Is Physical Therapy Overrated?
- How Many Smokers Could Quit If Someone Paid Them $10 Million?
- Is Nutrition Really the Key to Good Health?
- Is It Even Possible to Accurately Measure Physical Pain?
- Can Doctors Determine Who Should Be Allowed to Carry a Concealed Gun?
- Does Practice Really Make Perfect for Doctors?
- Should Doctors Be Allowed to Choose Not to Treat Fat People?
- Should Pre-Term Infants Receive Risky Oxygen Treatments?
- We Mock Prudish Victorian Euphemisms, But Are We Really Any Better?
- How Often Do Medical Emergencies Occur on Flights?
- What Is the Safest Day of the Week for Surgery?
- Are Antibiotic-Resistant Diseases Mother Nature’s Revenge?
- How Dangerous Is Obstructive Sleep Apnea During Surgery?
- Can Advances in Medical Technology Make Us Less Healthy?
- Does Badgering Patients to Exercise and Eat Better Actually Work?
- Should an Alcoholic Be Allowed to Get a Second Liver Transplant?
- Can Living With Chickens Protect Against Face-Eating Bacteria?
- Does the Sleep Aid Zolpidem Impair Driving the Next Day?
- Does Too Much Sugar Increase the Risk of Dementia?
- Men: Need Another Excuse to Put Off That Prostate Exam?
- Is Drug Addiction Really Like ‘Any Other Chronic Illness’?
- How Many Doctors Support Suicide for the Terminally Ill?
- What Are the Dangers in Screening for Diseases?
- Was Sir Winston Churchill Right About Exercise?
- Should Doctors Relax the ‘Dead-Donor Rule’ to Increase Organ Transplants?
- Is Living Near an Airport Dangerous for Your Health?
- Can Money Become Medicine?
- Gastric Bypass or Laparoscopic Gastric Band?
- How Do You Measure a Good Doctor Vs a Bad One?
- Should You Eat Lots of Nuts?
- As More People Live Longer Why Are Rates of Dementia Falling?
- Should Treatment of Obesity Begin Before Birth?
- Why Is It So Difficult to Translate Genetic Breakthroughs into Clinical Benefits?
- Can Scientists Create a Cure for Pain From Scorpions, Spiders, and Centipedes?
- Should the Age to Buy Cigarettes Be 21?
- Should You Vaccinate Your Children?
- Is Your Heart Attack More Likely to Kill You at Night or During the Day?
- A Cure For Peanut Allergies?
- Should Taxpayers Pay for the Junky’s Substitute Smack?
- Who Pays for Illegal Immigrant Tetraplegics’ Treatment?
- How Much Would You Pay to Survive Four Months Longer with a Terminal Disease?
- Euthanasia for the Insane?
- Does Valium Increase Your Chances of An Early Death?
- Are Diet Supplements Dangerous?
- Did Flu Drug Companies Perpetuate a Billion Dollar Scam Around the World?
- What is One of the Most Dangerous Ideas in All of Medicine?
- Why Do Some Mothers Induce Illness in Their Own Children?
- Why Might a Doctor Be Relieved When a New Study Fails To Reduce Deaths?
- Should Prisoners Receive Better Health Care Than the General Population?
- Is Ebola the World’s Most Terrifying Disease?
- What Can Happen to Your Lungs if You Smoke 20 Cigarettes Every Day for A Decade?
- Is This the End of Mammograms to Screen for Breast Cancer?
- Do Medical Experiments on Animals Really Yield Meaningful Results?
- Will Legal Marijuana Be a Bonanza for Trial Lawyers?
- Should You Get Your DNA Tested to See if You’re More Likely to Get Cancer?
- What to do when the Risk of Treatment Outweighs the Benefits?
- Why Must We Take One Step Forward, Two Steps Back in the Battle Against Tuberculosis?
- Is Ignorance Really Bliss? What Is the ‘Nocebo’ Effect?
- What Does Moral Narcissism Look Like in the Medical World?
- Why Is Treating Statistical Markers of Disease Not the Same as Treating Disease Itself?
- Heterochronic Parabiosis: Reversing Aging With Young Blood?
- Should Everyone Consume Less Sodium?
- Does a Popular Antibiotic Raise the Risk of Heart Attack?
- Do You Really Need That Colonoscopy?
- Is It ‘Unjust’ for Doctors to Die from Ebola?
When I visited the John F. Kennedy hospital in Monrovia during the long Liberian Civil War, it had been destroyed by a kind of conscientious vandalism. Every last piece of hospital furniture and equipment had been disabled, the wheels sawn off trolleys and gurneys, the electronics smashed. This was not just the result of bombardment during the war but of willful and thorough dismantlement.
There were no patients and no staff in the hospital; it was a ghost establishment, completely deserted. I was severely criticized for suggesting in a book that the painstaking destruction of the hospital, which shortly before had performed open heart surgery, was of symbolic significance.
I was pleased to see from an article in a recent edition of the New England Journal of Medicine that it had re-opened, but saddened to see that its problems were now of an even more terrifying nature than those it encountered during the civil war: for the hospital is at the center of the epidemic of Ebola virus disease, and two of its senior physicians, Sam Brisbane and Abraham Borbor, have recently died of it. The article in the journal lamented their passing and praised them for their bravery in not deserting their posts; they both knew of the dangers of refusing to do so.
Readers may recall my description of aortic valve replacement last year and a warning of the importance of treating heart disease seriously. Here’s another lesson from medical misadventure: if you are going to have a stroke, it is best to have it on an operating room table. And best of all, avoid strokes if at all possible. As miserable and expensive as last year’s surgery and recovery was, I think I would make that trade.
This started as the classic symptoms of a heart attack on the evening of August 2: chest pain, pressure, left-arm pain, a sense of confusion. So I had my wife drive me to an urgent care facility, where they decided that I was beyond the level that they could treat other than giving me aspirin and nitroglycerin, and calling the Ada County Paramedics to transport me to St. Alphonsus hospital. In retrospect, my wife could have driven me there directly in less time, and saved the insurance company $1300.
Every few months I receive a computerized invitation from my doctor asking me to have a colonoscopy to screen for polyps in my bowel. I always tell myself that I am too busy just now, I will have it another time. But really I don’t want to have it at all, and I know that when the next invitation comes I will be too busy then as well.
I am also eager to find a rational reason, or at least a rationalization, for my refusal. I thought I found it in a paper from Norway in a recent edition of the New England Journal of Medicine.
The authors examined the death rate from colorectal cancer in Norway among the 40,826 patients between 1993 and 2007 who had had polyps removed at colonoscopy in that country (the records are more or less complete). They compared the number of deaths in that population with the number expected at the death rate from the disease in the general population of the same age. The paper reports that 398 deaths were expected and 383 were observed.
This small difference does not mean that colonoscopy does not work in preventing death from colorectal cancer, of course. This is because the relevant comparison is with people whose polyps were not removed by colonoscopy, rather than with the population as a whole.
The 40,826 patients who had polyps removed at colonoscopy, however, were not a random sample of the adult population because Norway does not have a screening program for colonic polyps. The patients had colonoscopy in the first place because they were symptomatic, for example bleeding per rectum. They were therefore much more likely to suffer from polyps or cancer in the first place than the rest of the population.
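The comparison the authors make amounts to a standardized mortality ratio (SMR): observed deaths divided by expected deaths. A minimal sketch, using the figures quoted above; the Poisson confidence interval is a standard textbook approximation of my own adding, not necessarily the method the paper used:

```python
from math import sqrt

observed = 383   # colorectal-cancer deaths among the 40,826 polypectomy patients
expected = 398   # deaths expected at the age-matched population rate

smr = observed / expected
# Approximate 95% confidence interval, treating the observed count as Poisson.
lo = (observed - 1.96 * sqrt(observed)) / expected
hi = (observed + 1.96 * sqrt(observed)) / expected
print(f"SMR = {smr:.2f} (95% CI roughly {lo:.2f} to {hi:.2f})")
```

The ratio is just below 1, but the interval comfortably spans 1, which is why the small shortfall in observed deaths proves nothing either way.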
Many people have been misinformed regarding human-to-human transmission of Ebola. The Canadian health department states that airborne transmission of Ebola is strongly suspected, and the CDC admits that Ebola can be transmitted in situations where there is no physical contact between people, i.e., via airborne inhalation into the lungs or into the eyes when individuals are separated by three feet or less. That helps explain why 81 doctors, nurses and other healthcare workers have died in West Africa to date. These courageous health care providers use careful CDC-level barrier precautions such as gowns, gloves and head cover, but it appears they have inadequate respiratory and eye protection. Dr. Michael V. Callahan, an infectious disease specialist at Massachusetts General Hospital who has worked in Africa during Ebola outbreaks, said that minimum CDC-level precautions “led to the infection of my nurses and physician co-workers who came in contact with body fluids.”
Currently the CDC advises health care workers to use goggles and simple face masks for respiratory and eye protection, and a fitted N-95 mask during aerosol-generating medical procedures. Since so many doctors and nurses are dying in West Africa, it is clear that this level of protection is inadequate. Full face respirators with P-100 replacement filters would provide greater airway and eye protection, and I believe this would save the lives of many doctors, nurses and others who come into close contact with, or in proximity to, Ebola victims.
It is apparent that the primary mode of person-to-person Ebola transmission is through direct contact with the body or bodily fluids of Ebola victims, but it is unwise to ignore the airborne mode. I believe the current evidence supports healthcare workers using a higher level of airway and eye protection than is currently recommended. Since CDC level respiratory/eye precautions for Ebola are inadequate for healthcare workers in West Africa, I assume they will also be inadequate in the United States.
What stands to reason is not always borne out by facts, for reality is often refractory to human wishes. There was a good illustration of this unfortunate principle in a recent edition of the New England Journal of Medicine.
It has long been known that low concentrations of high-density lipoproteins (HDL) and high concentrations of low-density lipoproteins (LDL) are associated, in a more or less linear fashion, with cardiovascular disease such as strokes and heart attacks. It would seem to stand to reason, therefore, that raising the HDL and lowering the LDL would lead to fewer cardiovascular “events,” as strokes and heart attacks are called.
One way to achieve this wished-for biochemical change is to treat patients at risk of such events with niacin, a B vitamin, in addition to the statins that they are already taking. The largest placebo-controlled trial of niacin ever undertaken, with 25,673 patients who had already had a stroke or heart attack, has shown that the addition of niacin, though it does indeed increase HDL and decrease LDL, has no effect on the rate of heart attack or stroke. Worse still, it gave rise to serious side effects, such as worsening of diabetes and unpleasant gastrointestinal, musculoskeletal and dermatological effects. One of the most unexpected findings of the trial was the excess of infections in people treated by niacin. If anything, the overall death rate in the niacin-treated group was higher than that in the placebo control group, though the difference was not statistically significant (which is not quite the same thing as saying that it was not real). The patients were followed up, on average, for nearly four years and at no time was treatment with niacin superior to that with placebo.
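The parenthetical point, that a statistically non-significant excess is not the same as no excess, can be made concrete with a two-proportion test. The counts in the sketch below are hypothetical, invented purely for illustration; the trial's actual death counts are not reproduced here:

```python
from math import sqrt, erf

# Hypothetical counts (NOT the trial's actual figures): a slightly
# higher death rate in the treated arm than in the placebo arm,
# with the trial's roughly 25,673 patients split between the arms.
deaths_t, n_t = 450, 12837   # treated group
deaths_p, n_p = 420, 12836   # placebo group

p_t, p_p = deaths_t / n_t, deaths_p / n_p
p_pool = (deaths_t + deaths_p) / (n_t + n_p)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_t + 1 / n_p))
z = (p_t - p_p) / se

# Two-sided p-value from the normal approximation.
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
print(f"excess rate: {p_t - p_p:.4%}, z = {z:.2f}, p = {p_value:.2f}")
```

With these made-up figures there are 30 extra deaths in the treated arm, yet the p-value falls well above the conventional 0.05 threshold: the data neither establish a real excess nor rule one out.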
All medical journals these days feel the compulsion to be high-minded, but none is as high-minded as the Lancet. It is as if the editors had taken lessons both in moral philosophy and rhetoric from Mr. Pecksniff himself.
Mr. Pecksniff, you may remember, was the preposterous hypocrite in Dickens’ Martin Chuzzlewit, who introduces his daughters, Charity and Mercy, by adding “Not unholy names, I hope?” As Know thyself was inscribed over the entrance to the temple to Apollo at Delphi, and Abandon hope, all ye who enter here over the entrance to Dante’s hell, so Mr. Pecksniff’s words, Let us be moral, must be inscribed over the entrance to the offices of the Lancet, figuratively if not literally.
In the week before a Malaysian Airlines plane, taking many AIDS doctors and activists from Amsterdam to Melbourne for an international conference on AIDS, was shot down over eastern Ukraine, the Lancet published a statement called the Declaration of Melbourne, a typically sickly and nauseatingly unctuous statement of ethical principles. It began by saying something that, if not a lie exactly, was certainly not a truth:
We gather in Melbourne, the traditional meeting place of the Wurundjeri, Boonerwrung, Taungurong, Djajawurrung and the Wathaurung people, the original and enduring custodians of the lands that make up the Kulin Nation, to assess progress on the global HIV response and its future direction, at the 20th International AIDS Conference, AIDS 2014.
This, of course, is the purest 21st century Pecksniffery; and unless the signers of the declaration (who look extremely self-congratulatory in photos accompanying the article) can each and severally explain in what sense the Djajawurrung are the custodians of the lands on which the city of Melbourne is built, I suggest that they be banished to the outback for five years to live as pre-contact Australian Aborigines lived.
One of my first medical publications was on the nocebo effect, the unpleasant symptoms patients may suffer as a result of being made aware of potential side effects of a treatment they are about to receive or a procedure they are to undergo. Thus patients who were having a lumbar puncture were either told or not told they might suffer a headache afterwards; and lo and behold, those who were told that they might get headaches duly got headaches while those who were not told didn’t.
On the whole, as an article in a recent edition of the Journal of the American Medical Association points out, doctors are well aware of the placebo effect, that is to say the good that their treatment may do patients by means of mere suggestion, but have little awareness of the opposite nocebo effect, the harm that their treatment may do their patients by mere suggestion.
The nocebo effect poses an ethical dilemma for doctors, say authors of the article. On the one hand, doctors are supposed to do their patients no harm; on the other, they are supposed to be open and honest with their patients about the potential harms of drugs and other treatments. The dilemma is this: foreknowledge of those harms can harm some patients. Should the need for honesty trump the ethical injunction to do no harm?
When my PJ Media editor suggested that I write about having lupus, I almost said no.
I was diagnosed with SLE in 1991 and have been in remission since around 1995. My book about living with this chronic illness came out two years later. Like most writers, by the time a book comes out, I’m so sick – pun intended – of its topic that I dread having to revisit it.
Having been in remission for almost 20 years, I can honestly make the rather unusual claim that not even the perspective of hindsight has changed my ideas or feelings about what being a pain-wracked invalid was like. Not even a little bit.
I feel like I’m supposed to say the opposite: that looking back, I could have “handled” my disease differently, or learned other, “better” lessons from it, and so forth.
But then, from the very beginning, I didn’t fit the mold of the “disease of the week” TV movie heroine, or some “poster child” for lupus.
Here are some things I learned (or, perhaps more accurately, some pre-conceived ideas I had reinforced) when I was at my very sickest.
Warning: What follows is NOT inspirational. At all.
Is there ever any good news without bad? Good and bad seem to be inextricably locked in a Hegelian dialectic, or perhaps Manichaean struggle would be a more accurate way of putting it. For example, tuberculosis became the captain of the men of death, the white plague, between the seventeenth and the nineteenth centuries. Then it began its long decline, accelerated by the discovery of the first effective anti-tuberculous drugs. Then, just as large numbers of people became more susceptible to tuberculosis because of the spread of the human immunodeficiency virus, the germ of tuberculosis developed resistance to the most effective drugs against it. It seemed that the disease might once more become what it had been not so very long before. But then, for the first time in 40 years, a new anti-tuberculous drug, bedaquiline, was developed by the pharmaceutical company Janssen. Good news has not retained the upper hand for very long, however. An article in a recent edition of the Lancet suggests that bedaquiline is not the answer to Mankind’s prayers, at least where tuberculosis is concerned.
The distinction between what the law permits and what the law enjoins is often blurred. An absence of proscription is sometimes mistaken for prescription. The more the law interferes in our lives, the more it becomes the arbiter of our morality. When someone behaves badly, therefore, he is nowadays likely to defend himself by saying that there is no law against what he has done, as if that were a sufficient justification.
The recent Supreme Court decision in the cases of Burwell v. Hobby Lobby Stores and Conestoga Wood Specialties Corp. v. Burwell illustrates the difficulties when two or more rights clash irreconcilably. The complex issues involved were the subject of an article in a recent edition of the New England Journal of Medicine. The matter is still far from settled. It seems to me likely that the Supreme Court will one day reverse itself when its philosophical (or ideological) composition has changed.
The two corporations were owned by strongly religious people. Corporations of their size were enjoined by the government to provide their staff with health insurance which would cover contraceptive services. However, some contraceptive methods violated the religious beliefs of the owners of the companies. Did the companies have the right to except these methods from the policies that they offered to their staff (who, incidentally, numbered thousands, many of whom would not be of the same religious belief)?
In principle medical research is supposed to result in unequivocal guidance to doctors as to how to treat their patients. As often as not, however, the waters are muddied as much as cleared. Two papers in a recent edition of the New England Journal of Medicine about atrial fibrillation and the cause of stroke illustrate this.

It has long been known that people with a clinically-detected chaotic heart rhythm called atrial fibrillation (AF) have an increased incidence of stroke by embolism; and likewise that no cause of such stroke can be found in up to 40 percent of patients who suffer from one. Their strokes are called cryptogenic. The two papers addressed the question whether, if you monitor patients with cryptogenic stroke for long enough, some or many of them will turn out to suffer from AF. This is important, because it is generally agreed that, in patients with clinically detected and symptomatic AF, anti-coagulation reduces the subsequent risk of stroke.

AF, however, is not an all or none phenomenon. Some people suffer it continuously, but others only occasionally and for only a few seconds at a time. The additional risk of stroke in the latter is unknown, but is an important question because the anticoagulation designed to reduce the risk of stroke is not itself without risk, including that of another kind of stroke, the haemorrhagic kind. In other words, the risk caused by treatment could outweigh its benefits.
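The trade-off in that last sentence is, at bottom, a simple expected-value calculation. The annual rates below are hypothetical, chosen only to show the arithmetic; they are not taken from the papers discussed:

```python
# Hypothetical annual per-patient rates (illustrative only, not from the papers):
ischemic_stroke_risk    = 0.040  # untreated embolic-stroke risk with symptomatic AF
relative_risk_reduction = 0.60   # fraction of that risk removed by anticoagulation
hemorrhagic_stroke_risk = 0.005  # bleeding-stroke risk added by anticoagulation

benefit = ischemic_stroke_risk * relative_risk_reduction  # strokes prevented per year
harm    = hemorrhagic_stroke_risk                         # strokes caused per year
net     = benefit - harm
print(f"net annual benefit per patient: {net:.3f} strokes avoided")
```

If the underlying embolic risk is much lower, as it may be for patients with only a few seconds of AF, the benefit term shrinks toward the fixed bleeding risk and the net effect can turn negative, which is exactly the open question the papers raise.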
If the future were knowable, would we want to know it? When I was young, a fortune teller who predicted several things in my life that subsequently came true predicted my age at death. At the time it seemed an eternity away, so I thought no more of it, but now it is not so very long away at all. If I were more disposed to believe the fortune teller’s prediction than I am, would I use my remaining years more productively or would I be paralyzed with fear?
In a recent edition of the New England Journal of Medicine a question was posed about a 45-year-old man in perfect health (insofar as health can ever be described as perfect) who asked for genetic testing about his susceptibility to cancer, given a fairly strong family history of it. Should he have his genome sequenced?
A geneticist answered that he should not: to have his entire genome sequenced would lead to a great deal of irrelevant and possibly misleading information. But if the family history were of cancers that themselves were of the partially inherited type – more factors than genetics are involved in the development of most cancers – then the man might well consider having the relevant part of his genome, namely that part with a known predisposing connection to the cancers from which his family had suffered, sequenced.
This is not a complete answer, however. Two obvious questions arise: is the additional risk clinically as well as statistically significant, and if the risk is known, can anything practicable and tolerable be done to reduce it? There is no point in avoiding a risk if doing so makes your life a misery in other respects. You can altogether avoid the risk of a road traffic accident, or of being mugged on the street, by never leaving your house, but few people would recommend such drastic avoidance.
There are few phrases more dangerous in medicine than "it stands to reason," because what stands to reason may in fact not be a good idea, however brilliant it may once have seemed. This is because reality is always more complex than our theories about it; grey is all theory, said Goethe, but green the golden tree of life.
Perhaps the greatest single intellectual advance in the medicine of the last century was the realization that “it stands to reason” is no reason at all; everything must be studied in the light of experience. There was a good example of this necessity in a recent edition of the New England Journal of Medicine, which studied the effect of giving patients doses of aspirin or clonidine before and after undergoing non-cardiac surgery.
One of the most serious and feared complications of such surgery is heart attack, especially as the age at which people are operated on has increased. There are good theoretical reasons for believing that either aspirin or clonidine, or both, given peri-operatively might reduce the rate of heart attack in the first month after operation. Aspirin prevents the blood platelets from sticking to one another and the lining of the blood vessels, agglomeration of platelets being one of the mechanisms of heart attack; clonidine blocks the activity of the sympathetic nervous system whose overactivity is thought to be another such mechanism. Therefore it stands to reason, if anything does, that making the platelets less “sticky,” or the sympathetic nervous system less active, before, during and after operation might reduce the rate of post-operative heart attack. But does it?
A large trial was conducted in 135 hospitals in 23 countries, comparing the rates of heart attack among patients given aspirin, clonidine, or placebo before and after operation. In all, 10,010 patients were recruited, and the rate of follow-up was so high (99.9 percent) that it resembled the results of a Soviet-era election. Surprisingly, more than 40 percent of the patients were already taking low-dose prophylactic aspirin when they entered the trial; but they were treated in exactly the same fashion as their peers who were not on aspirin.
One of the few laws of political science is that when governments make mistakes, they tend to be whoppers. Luckily for them, the public’s memory is short, and the outrage of today soon declines into the apathy of tomorrow.
From several articles published in a recent edition of the British Medical Journal, it appears that many governments around the world, including those of Britain and the U.S., may have made such a mistake in stockpiling billions of dollars’ worth of anti-flu medications, bought principally from Roche, the largest pharmaceutical company in the world as measured by capitalization.
First the governments overestimated the virulence of the new flu epidemic the drugs were supposed to counter, no doubt a forgivable mistake in the circumstances; but then they stockpiled the supposedly anti-flu drugs on the basis of inadequate evidence. They took published studies at face value without apparently realizing that the drug companies had withheld a great deal of data – 150,000 pages of it, as it turned out. When, after what seems like a rear-guard action to prevent it, the drug companies released all the data, re-calculation showed that the drugs were not quite useless, but had practically no value from the public health point of view. At best they reduced the duration of symptoms by a few hours and in some cases prevented the development of symptomatic disease. But they also caused serious side effects, and they prevented neither deaths nor serious complications, nor did they reduce the rate of hospitalization. They did not prevent the spread of the infection either.
My home state of Colorado is a guinea pig for the pros and cons of marijuana legalization. Other states are observing closely to see if they should move down the path towards legalization.
There’s plenty of bad news to go around. Police in other states are pulling over Colorado drivers with no justification other than the green license plate. (We’re all stoners now, I guess.) A college student named Levy Thamba fell to his death from a high balcony during spring break after eating a marijuana cookie. And last week a Denver man who ate pot-infused candy became incoherent and paranoid and shot his wife to death.
Is there good news? Turns out there is. Colorado Springs is the source of the Charlotte’s Web strain of medical marijuana that has sent parents with gravely ill children flocking to the city for treatment.
The strain was developed by Joel Stanley and his brothers in their Colorado Springs medical marijuana facility. They'd read that marijuana strains high in a chemical called CBD can help to shrink tumors and prevent seizures. The chemical in marijuana that gets users high is called THC, and since it has an adverse effect on seizures, the Stanleys bred it out of the plant.
Their first patient, 5-year-old Charlotte Figi, was so affected by a genetic seizure condition called Dravet syndrome that she was not expected to live much longer. Today, she's almost seizure free. The Stanley brothers named the strain after their first little patient, and it's showing the world what medical uses marijuana can offer.
Today there are nearly a hundred families with gravely ill children who have relocated to Colorado Springs, purchasing a treatment for their children that would have landed them in prison just a few years ago. Medical marijuana is well known to help in the treatment of nausea in cancer and AIDS patients, but the strains now being investigated may uncover new lifesaving medicines such as Charlotte's Web.
The recreational use of marijuana is proving to be the problem it was predicted to be, but while the stoners fill the headlines the researchers in medical marijuana are quietly making amazing advances in the treatment of illnesses. That’s some very good news indeed.