For a perfect example of how hysteria governs modern debates over complex issues, witness what happened yesterday morning to Governor Chris Christie. For the apparently unpardonable offense of offhandedly suggesting parents ought to have some freedom to decide how their kids are vaccinated, the governor’s political career was declared over. The instantaneous eruption from America’s self-deputized thought police had the governor — only hours later — meekly offering “clarification” of his earlier comments.
The debate over vaccines, itself nearing pandemic proportions in the U.S., is following a familiar pattern. People are either pro-science or anti-; in agreement with the “consensus” or crazy “conspiracists” and “deniers.” Much like the debate over global warming, there’s no room for middle ground; preaching prudence is basically blasphemous. And just as many are calling for climate “deniers” to be ostracized and even arrested, critics and parents who question the conventional wisdom on vaccines are likewise condemned as threats against civilization itself.
Like most everyone else, I am neither a doctor nor even a scientist. But I am smart enough to know there are perfectly valid reasons to question conventional wisdom.
Take the current controversy over measles. From the looks of my Twitter feed and the comments sections under just about any vaccine-related article, you’d think we were talking about the bubonic plague. In fact, measles, despite being highly contagious, isn’t particularly dangerous. So long as your immune system is in decent shape, you’ll be fine. In fact, you might actually want it, as exposure leads to lifetime immunity.
Measles is basically a fever with an accompanying rash. It’s true that in the 1800s, outbreaks caused tragically large numbers of children to die — but these were concentrated in orphanages and hospital wards (places where malnutrition was rampant). As the world prospered, affluence spread, and health improved, in the U.S. the chances of dying after contracting measles dropped to 1-2 percent by the 1930s. By the time a vaccine was introduced in 1963, deaths from measles were virtually nonexistent. Asthma, according to “Vital Statistics of the United States, 1963,” claimed 56 times as many lives.
Today it’s popular to argue that measles would be totally defeated were it not for the Jenny McCarthys of the world. The only problem is that the MMR (measles-mumps-rubella) vaccine does not actually immunize — as most people understand the word — against measles. The most we can expect is temporary protection. That’s because vaccines are injected directly into the body, bypassing the body’s natural immune response. “Most disease-causing organisms enter your body through the mucous membranes of your nose, mouth, pulmonary system or your digestive tract – not through an injection,” explains Dr. Joseph Mercola. “These mucous membranes have their own immune system, called the IgA immune system.”
Though the vaccine was initially described as lifelong insurance, health officials realized in the ’70s, after an uptick in measles diagnoses among vaccinated high-school students, that it should probably be administered more regularly. The CDC now advises receiving the vaccine at 12-15 months, 4-6 years, and again as an adult. The U.S. is also using its third version of a measles vaccine, after the first two proved ineffective.
Which should probably make it no surprise that many of the people catching measles today were vaccinated. Today’s measles cases are occurring in heavily vaccinated populations. When a 2006 outbreak among college students in the Midwest struck, the fact that most of the affected were vaccinated seemingly made no difference. When an outbreak of the mumps hit the NHL this year, many reflexively blamed “anti-vaxxers.” Almost no one reported that every affected player appears to have received the MMR vaccine. The Penguins’ Sidney Crosby received not only the initial MMR, but also a booster just before the Sochi Olympics. The director of the Vaccine Education Center at the Children’s Hospital of Philadelphia, Paul Offit, would only say “we know that the short-term effectiveness of the mumps vaccine is excellent.”
Still, none of this would suggest there’s any reason to avoid regular vaccines — were it not for side effects. And here comes another wrinkle: The MMR vaccine can itself give you measles. In 2013, measles began spreading in British Columbia after a two-year-old girl contracted the virus from the vaccine and then began spreading it to others. Though rare, there are other risks worth considering, too: According to the CDC, side effects of the MMR vaccine can range from minor (fever, mild rash, swelling), to moderate (seizure, temporary low platelet count), to major (deafness, long-term seizures, permanent brain damage). Note that the latter two categories are worse than the disease itself. Perhaps a bigger problem is how these vaccines weaken the immune response among undernourished patients. “In developing countries, the use of high-titre vaccine at 4-6 months of age was associated with an unexpectedly high mortality in girls by the age of 2 years from infectious childhood illness,” a study in the British Medical Journal reported.
As recently as the 1970s, the CDC recommended children receive four vaccines. Today, per CDC protocol, children can receive around 40 shots between birth and the age of 6. What if that number grows to 100? 500? Will it always be unreasonable to ask, “Is all of this really necessary?”
Finally, this may come as a shock, but it’s actually possible for the government and the medical establishment to get things wrong. This year the CDC admitted its flu vaccine was created for the wrong strain — yet Americans are being instructed to get the shot anyway. Indeed, some parents are being threatened with having their children taken away if they aren’t given this (almost certainly) useless flu vaccine. For more than a generation, Americans were told to avoid saturated fat, salt, and calories in general as much as possible. More recent science shows that salt consumption has no causal relationship with blood pressure; that eating healthy saturated fats like grass-fed butter is good for your heart, brain, and metabolism; and that calories are simply a form of energy that gives us life.
Assigning responsibility for your children’s health and well-being to others — even “experts” — is precisely the opposite of parenting. Asking questions, educating yourself, soliciting more than one opinion: these aren’t the behaviors of people to be condemned and vilified. When someone insists you submit to the expertise of others, they’re actually asking you to stop thinking for yourself. And that’s a mistake. Vaccines, like so much of life, are more complex than a simple good-vs.-evil analysis affords. Universal solutions rarely work universally. Parents are right to do their homework.
Here’s Senator Rand Paul saying that most vaccines should be voluntary:
It is easier to advise than to have or to retain a sense of proportion, especially when it is most needed. I have never known anyone genuinely comforted by the idea that others were worse off than he, which perhaps explains why complaint does not decrease in proportion to improvement in general conditions. And he would be a callow doctor who tried to console the parents of a dead child with the thought that, not much more than a century ago, an eighth of all children died before their first birthday.
Still, it is well that from time to time medical journals such as the Lancet should carry articles about medical history, for otherwise we might take our current state of knowledge for granted. Ingratitude, after all, is the mother of much discontent. To know how much we owe to our forebears keeps us from imagining that our ability to diagnose and cure is the consequence of our own peculiar brilliance, rather than simply because we came after so much effort down the ages.
A little article in a recent Lancet was written by two historians who are in the process of analyzing the results of 9,000 coroners’ inquests into accidental deaths in Tudor England. It seems astonishing to me not only that such records should have survived for more than four centuries, but also that the state should have cared enough about the deaths of ordinary people to hold such inquests (coroners’ inquests had already been established for 400 years by the time of the Tudors). In other words, an importance was given to individual human life even before the doctrines of the Enlightenment took root: the soil was already fertile.
There is no new thing under the sun, least of all panic at the approach of an epidemic of a deadly disease. In 1720, the preface to Loimologia, Nathaniel Hodges’ account of the Great Plague of London in 1665, first published in Latin in 1672, referred to the outbreak of plague in Marseilles:
The Alarm we have of late been justly under from a most terrible Destroyer in a neighbouring Kingdom, very naturally calls for all possible Precautions against its Invasion and Progress here…
In fact, though no one was to know it, no epidemic of plague was ever to occur in Western Europe again; and it is doubtful whether the precautions referred to made much difference.
The death rate from the Ebola virus is probably greater than that from bubonic plague, though of course the plague spread much faster and killed far more people in total than Ebola ever has: and at least we, unlike our plague-ridden ancestors, know the causative organism of the Ebola disease, even if we are not certain how the virus first came to infect Mankind.
My PJ colleague Walter Hudson published a compelling argument regarding physician-assisted suicide in response to the ongoing dialogue surrounding terminal cancer patient Brittany Maynard. His is a well-reasoned argument regarding the intersection of theology and politics, written in response to Matt Walsh’s Blaze piece titled “There is Nothing Brave About Suicide.” Both pieces are a reminder that, in the ongoing debate over whether or not Maynard has the right to schedule her own death, little has been said regarding the role the medical profession plays in the battle to “Die with Dignity.” Walsh argues:
None of us get to die on our own terms, because if we did then I’m sure our terms would be a perfect, happy, and healthy life, where pain and death never enter into the picture at all.
It’s a simplistic comment that ignores a very real medical fact: Death can come on your own terms. And that doesn’t have to mean suicide.
My mother was a nurse for 20 years. During that time she worked in a variety of settings, from hospitals to private practice to nursing homes. Much like Jennifer Worth, the nurse and author of the Call the Midwife series, my mother practiced at the end of the era of Victorian bedside nursing and the dawn of Medicare. As a result, the abuses she witnessed in the name of insurance claims were grotesque. For instance, if a patient required one teaspoon of medication, an entire bottle would be poured into the sink and charged to that patient’s insurance company. And this was just the tip of the iceberg of the unethical practices that took priority in the name of the almighty “billing schedule.”
Screenwriters are not known for being sticklers for facts. And when it comes to disasters, writes University of Texas Professor David A. McEntire, “many of Hollywood’s portrayals are based on myths and exaggerations….” That’s certainly the case when it comes to disease disaster films. Here are 10 “fun” movies that are of no use whatsoever in terms of helping viewers respond wisely to a pandemic.
10. Panic in the Streets (1950)
“Patient Zero” is carrying the pulmonary version of bubonic plague. A public official (played by Richard Widmark) has 48 hours to find him before the disease spreads throughout the city. Director Elia Kazan delivers a moody, atmospheric, underappreciated film. But if this is how the police, public health officials and reporters will really act during a crisis, well, we’re all doomed.
For several years now the inimitable Theodore Dalrymple has provided PJ Media and PJ Lifestyle with erudite, witty commentaries on controversies in the worlds of health, drugs, and disease, as well as their impact on culture. Here’s a collection featuring links to many of the questions he’s addressed, often in response to some shaky thinking in a new study or an ideologically slanted medical journal article.
What health and medical questions would you like to see him and other writers explore in the future? Please leave your suggestions in the comments.
2011 and 2012
- Is Salt Really Bad for Your Heart?
- Are There Health Effects Due to the Financial Crisis?
- Should the ‘Morning After’ Pill Be Available to All Ages?
- Can Children Be Manipulated into Eating Their Veggies?
- Should We Be Worried about Bird Flu?
- Is Surgery Not Always Necessary for Appendicitis?
- Genomic Medicine: A Great Leap Forward?
- Aspirin: The Elixir of Life?
- Do Nicotine Patches Actually Work?
- Does ‘Good Cholesterol’ Really Help Prevent Heart Attacks?
- Should Women’s High School Soccer Be Banned To Reduce Knee Injuries?
- Is Grief Always Depression?
- Does Fish Oil Prevent Alzheimer’s Disease?
- Do Proactive Measures by Doctors Aid in Smoking Cessation?
- Can Dark Chocolate Reduce High Blood Pressure?
- How Come People Rarely Die of Dementia in Poor Countries?
- Should You Take Antibiotics?
- Is Obesity a Disease or a Moral Failing?
- Are Obese Kids Victims of Child Abuse?
- Need A Few Arguments Against Tattoos?
- Are the Treatment and Prevention of Obesity Different Problems?
- Why Are Psychiatric Disorders Not the Same as Physical Diseases?
- Do Today’s Medical Ethics Prevent New Breakthroughs?
- Should We Be Worried About Parasites from Cats?
- Do Doctors Turn Their Patients into Drug Addicts?
- Should Doctors Lie to Their Patients About Their Survival Chances?
- As Life Expectancy Increases Will the Elderly Become a Greater ‘Burden on Society’?
- What Is the Best Way to Treat Diabetes?
- What Can Be Done to Reduce Post-Hospital Syndrome?
- How Can a Mammogram Kill You?
- Human Feces as Medicine?
- What Will Happen if I Consume Too Much Calcium?
- Is Marijuana a Medicine?
- Why Is Immunization so Controversial?
- Is America at the Point Where HIV Testing Should Be Routine?
- Is Physical Therapy Overrated?
- How Many Smokers Could Quit If Someone Paid Them $10 Million?
- Is Nutrition Really the Key to Good Health?
- Is It Even Possible to Accurately Measure Physical Pain?
- Can Doctors Determine Who Should Be Allowed to Carry a Concealed Gun?
- Does Practice Really Make Perfect for Doctors?
- Should Doctors Be Allowed to Choose Not to Treat Fat People?
- Should Pre-Term Infants Receive Risky Oxygen Treatments?
- We Mock Prudish Victorian Euphemisms, But Are We Really Any Better?
- How Often Do Medical Emergencies Occur on Flights?
- What Is the Safest Day of the Week for Surgery?
- Are Antibiotic-Resistant Diseases Mother Nature’s Revenge?
- How Dangerous Is Obstructive Sleep Apnea During Surgery?
- Can Advances in Medical Technology Make Us Less Healthy?
- Does Badgering Patients to Exercise and Eat Better Actually Work?
- Should an Alcoholic Be Allowed to Get a Second Liver Transplant?
- Can Living With Chickens Protect Against Face-Eating Bacteria?
- Does the Sleep Aid Zolpidem Impair Driving the Next Day?
- Does Too Much Sugar Increase the Risk of Dementia?
- Men: Need Another Excuse to Put Off That Prostate Exam?
- Is Drug Addiction Really Like ‘Any Other Chronic Illness’?
- How Many Doctors Support Suicide for the Terminally Ill?
- What Are the Dangers in Screening for Diseases?
- Was Sir Winston Churchill Right About Exercise?
- Should Doctors Relax the ‘Dead-Donor Rule’ to Increase Organ Transplants?
- Is Living Near an Airport Dangerous for Your Health?
- Can Money Become Medicine?
- Gastric Bypass or Laparoscopic Gastric Band?
- How Do You Measure a Good Doctor Vs a Bad One?
- Should You Eat Lots of Nuts?
- As More People Live Longer Why Are Rates of Dementia Falling?
- Should Treatment of Obesity Begin Before Birth?
- Why Is It So Difficult to Translate Genetic Breakthroughs into Clinical Benefits?
- Can Scientists Create a Cure for Pain From Scorpions, Spiders, and Centipedes?
- Should the Age to Buy Cigarettes Be 21?
- Should You Vaccinate Your Children?
- Is Your Heart Attack More Likely to Kill You at Night or During the Day?
- A Cure For Peanut Allergies?
- Should Taxpayers Pay for the Junky’s Substitute Smack?
- Who Pays for Illegal Immigrant Tetraplegics’ Treatment?
- How Much Would You Pay to Survive Four Months Longer with a Terminal Disease?
- Euthanasia for the Insane?
- Does Valium Increase Your Chances of An Early Death?
- Are Diet Supplements Dangerous?
- Did Flu Drug Companies Perpetrate a Billion Dollar Scam Around the World?
- What is One of the Most Dangerous Ideas in All of Medicine?
- Why Do Some Mothers Induce Illness in Their Own Children?
- Why Might a Doctor Be Relieved When a New Study Fails To Reduce Deaths?
- Should Prisoners Receive Better Health Care Than the General Population?
- Is Ebola the World’s Most Terrifying Disease?
- What Can Happen to Your Lungs if You Smoke 20 Cigarettes Every Day for A Decade?
- Is This the End of Mammograms to Screen for Breast Cancer?
- Do Medical Experiments on Animals Really Yield Meaningful Results?
- Will Legal Marijuana Be a Bonanza for Trial Lawyers?
- Should You Get Your DNA Tested to See if You’re More Likely to Get Cancer?
- What to Do When the Risk of Treatment Outweighs the Benefits?
- Why Must We Take One Step Forward, Two Steps Back in the Battle Against Tuberculosis?
- Is Ignorance Really Bliss? What Is the ‘Nocebo’ Effect?
- What Does Moral Narcissism Look Like in the Medical World?
- Why Is Treating Statistical Markers of Disease Not the Same as Treating Disease Itself?
- Heterochronic Parabiosis: Reversing Aging With Young Blood?
- Should Everyone Consume Less Sodium?
- Does a Popular Antibiotic Raise the Risk of Heart Attack?
- Do You Really Need That Colonoscopy?
- Is It ‘Unjust’ for Doctors to Die from Ebola?
When I visited the John F. Kennedy hospital in Monrovia during the long Liberian Civil War, it had been destroyed by a kind of conscientious vandalism. Every last piece of hospital furniture and equipment had been disabled, the wheels sawn off trolleys and gurneys, the electronics smashed. This was not just the result of bombardment during the war but of willful and thorough dismantlement.
There were no patients and no staff in the hospital; it was a ghost establishment, completely deserted. I was severely criticized for suggesting in a book that the painstaking destruction of the hospital, which shortly before had performed open heart surgery, was of symbolic significance.
I was pleased to see from an article in a recent edition of the New England Journal of Medicine that it had re-opened, but saddened to see that its problems were now of an even more terrifying nature than those it encountered during the civil war: for the hospital is at the center of the epidemic of Ebola virus disease, and two of its senior physicians, Sam Brisbane and Abraham Borbor, have recently died of it. The article in the journal lamented their passing and praised them for their bravery in not deserting their posts; they both knew of the dangers of refusing to do so.
Readers may recall my description of aortic valve replacement last year and a warning about the importance of treating heart disease seriously. Here’s another lesson from medical misadventure: if you are going to have a stroke, it is best to have it on an operating room table. And best of all, avoid strokes if at all possible. As miserable and expensive as last year’s surgery and recovery were, I think I would make that trade.
This started as the classic symptoms of a heart attack on the evening of August 2: chest pain, pressure, left-arm pain, a sense of confusion. So I had my wife drive me to an urgent care facility, where they decided that I was beyond the level that they could treat other than giving me aspirin and nitroglycerin, and calling the Ada County Paramedics to transport me to St. Alphonsus hospital. In retrospect, my wife could have driven me there directly in less time, and saved the insurance company $1300.
When my PJ Media editor suggested that I write about having lupus, I almost said no.
I was diagnosed with SLE in 1991 and have been in remission since around 1995. My book about living with this chronic illness came out two years later. Like most writers, by the time a book comes out, I’m so sick – pun intended – of its topic that I dread having to revisit it.
Having been in remission for almost 20 years, I can honestly make the rather unusual claim that not even the perspective of hindsight has changed my ideas or feelings about what being a pain-wracked invalid was like. Not even a little bit.
I feel like I’m supposed to say the opposite: that looking back, I could have “handled” my disease differently, or learned other, “better” lessons from it, and so forth.
But then, from the very beginning, I didn’t fit the mold of the “disease of the week” TV movie heroine, or some “poster child” for lupus.
Here are some things I learned (or, perhaps more accurately, some pre-conceived ideas I had reinforced) when I was at my very sickest.
Warning: What follows is NOT inspirational. At all.
In principle medical research is supposed to result in unequivocal guidance to doctors as to how to treat their patients. As often as not, however, the waters are muddied as much as cleared. Two papers in a recent edition of the New England Journal of Medicine about atrial fibrillation and the cause of stroke illustrate this. It has long been known that people with a clinically-detected chaotic heart rhythm called atrial fibrillation (AF) have an increased incidence of stroke by embolism; and likewise that no cause of such stroke can be found in up to 40 percent of patients who suffer from one. Their strokes are called cryptogenic. The two papers addressed the question whether, if you monitor patients with cryptogenic stroke for long enough, some or many of them will turn out to suffer from AF. This is important, because it is generally agreed that, in patients with clinically detected and symptomatic AF, anti-coagulation reduces the subsequent risk of stroke. AF, however, is not an all or none phenomenon. Some people suffer it continuously, but others only occasionally and for only a few seconds at a time. The additional risk of stroke in the latter is unknown, but is an important question because the anticoagulation designed to reduce the risk of stroke is not itself without risk, including that of another kind of stroke, the haemorrhagic kind. In other words, the risk caused by treatment could outweigh its benefits.
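The tradeoff described here, anticoagulation reducing embolic stroke while adding its own risk of haemorrhagic stroke, can be made concrete with a back-of-the-envelope expected-harm comparison. A minimal sketch, with the caveat that every rate below is a hypothetical placeholder of my own, not a figure from the papers:

```python
# Hedged sketch: expected annual serious events with and without
# anticoagulation for a patient with occasional, brief AF.
# ALL rates are hypothetical placeholders for illustration only.

def expected_annual_harm(embolic_risk, bleed_risk):
    """Total expected serious events per year: embolic strokes plus
    treatment-induced haemorrhagic strokes or major bleeds."""
    return embolic_risk + bleed_risk

# Untreated: some embolic stroke risk, no treatment-related bleeding.
untreated = expected_annual_harm(embolic_risk=0.020, bleed_risk=0.000)

# Treated: anticoagulation cuts embolic risk (say, by two-thirds)
# but adds its own risk of haemorrhagic stroke.
treated = expected_annual_harm(embolic_risk=0.020 / 3, bleed_risk=0.015)

print(f"untreated: {untreated:.4f}, treated: {treated:.4f}")
```

With these made-up numbers the treated patient is slightly worse off, which is exactly the possibility the article raises: whether the benefit outweighs the harm turns entirely on how much extra stroke risk brief, occasional AF actually carries, and that is the unknown quantity.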
If the future were knowable, would we want to know it? When I was young, a fortune teller who predicted several things in my life that subsequently came true predicted my age at death. At the time it seemed an eternity away, so I thought no more of it, but now it is not so very long away at all. If I were more disposed to believe the fortune teller’s prediction than I am, would I use my remaining years more productively or would I be paralyzed with fear?
In a recent edition of the New England Journal of Medicine a question was posed about a 45-year-old man in perfect health (insofar as health can ever be described as perfect) who asked for genetic testing about his susceptibility to cancer, given a fairly strong family history of it. Should he have his genome sequenced?
A geneticist answered that he should not: to have his entire genome sequenced would lead to a great deal of irrelevant and possibly misleading information. But if the family history were of cancers that themselves were of the partially inherited type – more factors than genetics are involved in the development of most cancers – then the man might well consider having the relevant part of his genome, namely that part with a known predisposing connection to the cancers from which his family had suffered, sequenced.
This is not a complete answer, however. Two obvious questions arise: is additional risk clinically as well as statistically significant, and if the risk is known can anything practicable and tolerable be done to reduce it? There is no point in avoiding a risk if to do so makes your life a misery in other respects. You can avoid the risk altogether of a road traffic accident or being mugged on the street by never leaving your house, but few people would recommend such drastic avoidance.
My home state of Colorado is a guinea pig for the pros and cons of marijuana legalization. Other states are observing closely to see if they should move down the path towards legalization.
There’s plenty of bad news to go around. Police in other states are pulling over Colorado drivers with no justification other than the green license plate. (We’re all stoners now, I guess.) A college student named Levy Thamba fell to his death from a high balcony during spring break after eating a marijuana cookie. And last week a Denver man who ate pot-infused candy became incoherent and paranoid and shot his wife to death.
Is there good news? Turns out there is. Colorado Springs is the source of the Charlotte’s Web strain of medical marijuana that has sent parents with gravely ill children flocking to the city for treatment.
The strain was developed by Joel Stanley and his brothers in their Colorado Springs medical marijuana facility. They’d read that marijuana strains high in a chemical called CBD can help to shrink tumors and prevent seizures. The chemical in marijuana that gets users high is called THC, and since it has an adverse effect on seizures, the Stanleys bred it out of the plant.
Their first patient, five-year-old Charlotte Figi, was so affected by a genetic seizure condition called Dravet syndrome that she was not expected to live much longer. Today, she’s almost seizure-free. The Stanley brothers named the strain after their first little patient, and it’s showing the world what medical uses marijuana can offer.
Today there are nearly a hundred families with gravely ill children who have relocated to Colorado Springs, purchasing a treatment for their children that would have landed them in prison just a few years ago. Medical marijuana is well known to help in the treatment of nausea in cancer and AIDS patients, but the strains now being investigated may yield new lifesaving medicines such as Charlotte’s Web.
The recreational use of marijuana is proving to be the problem it was predicted to be, but while the stoners fill the headlines the researchers in medical marijuana are quietly making amazing advances in the treatment of illnesses. That’s some very good news indeed.
Good and bad news often go together, for what is good news for some is bad for others. Shareholders in pharmaceutical companies that produce statins will have been heartened (no pun intended) by a paper in a recent edition of the New England Journal of Medicine in which the authors calculated that, under the new guidelines of the American College of Cardiology and the American Heart Association with regard to lipid levels in the blood, 12.8 million more adults in the United States alone would be “eligible” for (i.e. ought ideally to have) treatment with statins. In fact, very nearly half the population older than 40 ought to take them, and seven eighths of the population over 60. As a man over sixty who never has any blood tests done, my heart sinks (again no pun intended). We are all guilty of illness until proven healthy: not good news.
The authors compared the therapeutic consequences of the old guidelines with the new. In effect the new guidelines lowered the threshold for treatment. According to these guidelines, anyone over 40 with known cardiovascular disease should receive statins, irrespective of their level of Low Density Lipoprotein (LDL); while anyone with a level of 70 milligrams per decilitre or more and who has diabetes or a statistical risk of a heart attack of more than 7.5 percent within the next ten years should also receive them.
Taking a rather small sample of adults over 40 from the National Health and Nutrition Examination Survey whose blood lipids were measured and extrapolating it to the U.S. population as a whole, the authors conclude that, if the new guidelines were put into practice rather than the old, 14.4 million adults in the U.S. who would not have been “eligible” for treatment under the old guidelines would now be “eligible” for it, while 1.6 million who would have been “eligible” under the old guidelines would no longer be “eligible.”
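The article's summary of the new guideline amounts to a simple decision rule, which can be sketched as a function. To be clear, the function and field names here are my own, and the real ACC/AHA guidelines contain more branches (age caps, LDL of 190 mg/dL or higher, and so on); this only encodes the criteria as the article states them:

```python
# Hedged sketch of the new-guideline statin eligibility rule, as
# summarized in the article. Hypothetical names; not the full guideline.

def statin_eligible_new(age, has_cvd, ldl_mg_dl, has_diabetes, ten_year_risk):
    """Return True if a patient is 'eligible' for statins under the
    article's summary of the new ACC/AHA guidelines."""
    if age <= 40:
        return False
    if has_cvd:
        # Known cardiovascular disease: eligible regardless of LDL level.
        return True
    if ldl_mg_dl >= 70 and (has_diabetes or ten_year_risk > 0.075):
        # LDL at or above 70 mg/dL plus diabetes, or a >7.5 percent
        # ten-year statistical risk of a heart attack.
        return True
    return False

print(statin_eligible_new(62, False, 95, False, 0.09))  # True: LDL >= 70, risk > 7.5%
print(statin_eligible_new(45, False, 65, True, 0.02))   # False: LDL below the 70 threshold
```

Lowering thresholds in a rule like this is exactly how the "eligible" population can swell by millions without anyone's blood chemistry changing at all.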
Week 1 — Something’s Got To Give
As part of my “taking it easier” with my blog, over at According To Hoyt I’ve been running ‘blasts from the past’ (posts a year or more old) at least one day a week. (For instance, on Tuesday I posted Jean Pierre Squirrel, from February 2011.)
The interesting thing in going through the blog is seeing how many days I curtailed posting, or posted briefer or weirder material, because I was ill.
Now, I was aware of having been in indifferent health for the last ten years or so. It’s nothing really bad or spectacularly interesting, which is part of the issue, because if it were, I could take time off and not feel guilty. I confess I have found myself on various occasions fantasizing about a stay in the hospital. Which is stupid, because no one rests in the hospital. (What I need, of course, is a stay in a remote cottage for a few days. Even if I’m writing.) And I knew that my health had gotten much worse in the last year. 2013 was the pits, at least since August or so. But it is not unusual for me to spend every third week “down,” usually with an ear infection, a throat thingy, or some kind of stomach bug.
My friends have said for years that this is because I don’t listen to my body’s signals to slow down or stop, so it has to bring me to a complete stop by making me too sick to work.
This is part of the reason Charlie Martin and I (in collaboration) are doing a series on taming the work monster. Part of it is that I have way too much to do, and part of it is that it’s really hard to compartmentalize things when you work from home. Eventually when we sell the house and move, we’d like to get a place where the office is a distinct area. It was pretty much all of the attic in our last house, which meant if I came downstairs for dinner (which I did) I didn’t go up again. But now my office is half of the bedroom (and before someone imagines me cramped in a corner, the bedroom runs the full front of the house. We just couldn’t figure out what to do with a room that size. We don’t sleep that much.) This is convenient in terms of my getting up really early to work, or of my going to bed way after my husband, because I’m right there… It’s also contributing to a 24/7 work schedule, because I can think “Oh, I should write about that” and roll out of bed, and do so. There is no “I have to be dressed, as the sons might be roaming the house” and there isn’t (as in the other house) “the attic will be cold.”
Today is 19 October. Yeah, I know, you can see it at the top of the article, but that’s an important date, because it’s now exactly a year since I determined I had to take some actions about my weight and glucose. (I came out about it in my first 13 Weeks post, “A Fat Nerd Does Diet,” on 28 October last year.)
The results overall have been good. I had several different issues when I started.
- I weighed 301.5 on the 19th.
- My A1c was 7.5. Although I struggled with admitting it, that’s real no-kidding diabetes mellitus. For me it appears to be type 2, (T2DM) characterized by lowered sensitivity to insulin. That was on a pretty much maximum dose of metformin, 2500 mg/day; if I were depending on drug treatment alone, I was heading for insulin.
- I had a long-term problem with gastric reflux (GERD) and irritable bowel syndrome (IBS); I was on omeprazole every day and had been since a severe esophageal spasm put me into the ER with chest pain two years before.
- My total lipids were reasonable on 20mg/day of simvastatin but my high-density lipoproteins (HDL) were low, and my low-density (LDL) were high.
- I also had a long-term problem with depression, although I hadn’t had a really acute episode in some years.
Now, a year later:
- I’m down nearly 40 pounds; my recent low was 264.
- My A1c has been between 5.9 and 6.4. The T2DM appears to be under control. I’m down to 1000 mg/day of metformin, and did a long stretch at 500 mg/day.
- My lipids have improved enough that I’m off statins, at least for this 13-week period.
- The IBS no longer troubles me — I can’t say it’s completely resolved because, frankly, how would I know? But I certainly haven’t had a painful episode in almost a year. The GERD is also considerably better, and I’m slowly weaning myself off the omeprazole.
- I think I can say the depression is significantly better. I haven’t had an acute episode this year, but then I hadn’t had a really acute episode in some years. But I had also been chronically dysthymic, which in combination with acute depression is called “double depression.” I really feel like that’s significantly better. I plan to write more about depression in the coming months; there are interesting suggestions that there may be some physiology that connects depression, obesity, and T2DM.
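For readers who want to translate those A1c percentages into the mg/dL units a home glucose meter reports, the standard ADAG conversion is eAG = 28.7 × A1c − 46.7. A quick sketch (the function name is mine, not from any library):

```python
# Estimated average glucose (eAG) from an A1c percentage, using the
# standard ADAG study formula: eAG (mg/dL) = 28.7 * A1c - 46.7
def eag_mg_dl(a1c: float) -> float:
    return 28.7 * a1c - 46.7

for a1c in (7.5, 6.4, 5.9):
    print(f"A1c {a1c}% -> ~{eag_mg_dl(a1c):.0f} mg/dL average glucose")
```

So the drop from 7.5 to the 5.9–6.4 range corresponds to average glucose falling from roughly 169 mg/dL to the 123–137 mg/dL range.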
What did I do?
- I’ve adopted a consistently low-carb, high-fat diet. I’ve played around with variants, and right now I’m around 50g carbs a day, with most of the carbs coming from fruits and yoghurt.
- I’ve nearly completely eliminated wheat. Occasionally eating wheat seems to result in immediate exacerbation of the GERD and possibly of the IBS.
- I’ve experimented with high-intensity interval training and high-intensity strength training, although I’ve had trouble making that a consistent practice.
- I recently tried a broad-spectrum probiotic, which seems to have had very good effects.
- I’ve largely structured these changes into a series of 13-week-long experiments, which appears to be a sufficiently powerful model that a number of other people have adopted it for their own changes.
What have I learned in this year? It’s complicated.
A good friend of mine had a heart attack the other day. She did everything right: she went to the ER right away when she had the first mild angina, and she’d been taking care of herself with exercise and by controlling her weight. In other words, pretty much the opposite of what my mother did two years ago.
It turned out to be mild, and she was given a stent and is rehabbing now. There were, however, two things that very possibly contributed: her blood sugar was elevated into “pre-diabetic” ranges and had been for years, and her blood lipids, cholesterol and the like, were pretty elevated.
So, now as well as doing the cardiac rehab routine of mild exercise, she’s starting to manage her blood sugar, and she’s on a statin drug for the lipids.
So we were talking about it this morning and she said something that struck a chord.
I bet you will identify with how much I cringe at the word diabetic. It is so associated with not taking care of yourself because of the media.
That really struck me, because I have noticed the same thing: I’ve found it very difficult to come out and say “I am a diabetic.”
Movies and fiction about people who recover from alcoholism or drugs usually have this dramatic, climactic scene where, after hitting bottom in some dramatic and more or less disgusting way, the main character has the “moment of clarity” and stands up in a meeting and says “I am an alcoholic.” (Two great examples, by the way, are the under-appreciated Michael Keaton film Clean and Sober, and the Matthew Scudder books by Lawrence Block.) It’s an important moment in recovery because it marks the point at which you are — at the risk of sounding like I live in Boulder — taking ownership of the problem. Your wife isn’t driving you to drink, if it’s genetic it’s still your problem, and however you got there, that’s where you are now and you have to deal with it.
It’s also really hard to say because of the social stigma: socially, we see drunks as morally flawed. Same thing with obesity, with depression, and with drug addiction. Theodore Dalrymple has an instructive, if in my opinion mistaken, piece on this in PJM, where he questions whether we’d think of having “Arthritics Anonymous” where someone stands up and says “I am arthritic.”
Even non-hypochondriacs such as I sometimes worry fleetingly about their health when, having reached a certain age, some of their friends and acquaintances fall foul of a disease, namely (in this case) cancer of the prostate. But my anxiety does not last long and so far I have managed successfully to resist all attempts by my medical colleagues to measure my prostate specific antigen (PSA). I want to have as little to do with doctors as possible, other than socially of course, and there is nothing quite like a high PSA level to provoke doctors’ interference in a man’s life.
Would this interference, though, prolong my life if I allowed it to take place? A recent paper in the New England Journal of Medicine starts optimistically and ends pessimistically. It draws attention to the fact that mortality from prostate cancer has fallen drastically and attributes this to improvement both in early diagnosis of the cancer by means of screening and in treatment once diagnosed.
The body of the paper, however, is less sanguine. First, 18,880 elderly men were divided into those who were given finasteride, a drug that, it was hoped, would prevent cancer, and those given placebo. Some years later it was discovered that finasteride did indeed reduce the numbers of patients who developed cancer, in fact by nearly a third.
So far so good: but this is not the end of the story. Unfortunately, prostate cancer is a very variable disease such that, while some men die of it, many more men die with it than of it. And while finasteride seems to have prevented many low-grade cancers, those that would not have killed the men in any case, it seems also to have increased both the number and proportion of the more serious kind.
Nothing gave me a better glimpse of the Father’s love for me than looking into the face of my newborn. Only then did I understand what it meant to love someone else more than myself.
The first time my boy got sick it broke my heart. Lying in my arms, his limp little body radiated heat. His eyes seemed glazed over with a sheet of pink glass. I thought to myself, “I wish it were me and not him.” At that moment, I realized I would gladly give my life for his. Almost instantly, I understood why God described Himself to us as our Father, and why Christ would die for us — unconditional love.
Then came the toddler years. Although my love never changed, how I expressed it sure did. I made rules. Most of the time, he really couldn’t understand why I said no. That was perfectly fine with me. I didn’t need him to understand that the big brown “boat” swirling in the water was not put there for him to play with. He’ll get it later when he discovers the meaning of gross, and eventually he’ll understand the concept of germs. Until then, I just expected obedience.
He’s 35 now. It’s not an issue. Although he’s never thanked me, I’m pretty sure he’s glad I never let him splash in the toilet, or eat everything he found on the floor.
In The Maker’s Diet, author Jordan S. Rubin makes a strong case that the dietary laws given to God’s chosen people are His hand of protection. Apparently God knew that with enough barbecue sauce we would happily lick a toilet.
The major medical journals of the world receive far more papers than they can ever publish, and so it is rather surprising when dull, trivial or bad work appears in them. This must mean either that the editors of the journals, like Homer, sometimes nod, or that the general standard of the work submitted for publication is lower than one might hope or suppose.
A recent paper in the New England Journal of Medicine, entitled “Glucose Levels and Risk of Dementia,” by no fewer than fourteen authors, is a case in point. They repeatedly measured the blood glucose levels of 2067 people aged on average 76 at the start of the study, followed them up for a median length of 6.8 years, and correlated the levels with the patients’ chances of developing dementia.
It was already known that diabetics are at increased risk of developing dementia, not surprisingly in view of the damage that diabetes does to small blood vessels in the brain. But the authors of the paper put forward the hypothesis that higher levels of glucose even in non-diabetics would increase the risk of developing dementia.
They indeed found that non-diabetic patients with a blood sugar level of 115 milligrams per decilitre were more likely to develop dementia than those with a level of 100 milligrams. However, the extra chance, 1.18 times, though statistically significant, was so small that its significance in any other sense must be doubted. Generally speaking, epidemiological surveys which find such small differences are not of much value from the point of view of elucidation of the causation of diseases. If you trawled through a hundred factors – coffee consumption, number of begonias in the garden, subscription to a newspaper, etc. – you would probably find five such factors with odds ratios as large (or as small).
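The point about small but “statistically significant” effects can be made concrete. Here is a minimal sketch, using entirely made-up counts (not the study’s data), of the standard two-proportion z-test: with groups large enough, a relative risk of 1.18 clears the conventional 1.96 significance threshold even though the effect is modest.

```python
import math

# Two-proportion z-test: is the difference in event rates between two
# groups larger than chance would explain?
def two_proportion_z(events_a, n_a, events_b, n_b):
    p_a, p_b = events_a / n_a, events_b / n_b
    pooled = (events_a + events_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical cohort: 5000 people per group, 21.24% vs. 18.00%
# developing the condition -- a relative risk of exactly 1.18.
rr = (1062 / 5000) / (900 / 5000)
z = two_proportion_z(1062, 5000, 900, 5000)
print(f"relative risk {rr:.2f}, z = {z:.2f} (above 1.96, so 'significant')")
```

With the same 1.18 ratio but only a few hundred people per group, z would fall well below 1.96 — which is why very large surveys can make very small effects look important.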
One of the nice things about getting older is that you begin to understand, and learn to appreciate, the fact that you’re not alone. Your health crisis, while a tragedy, is still not unique to you. That is comforting and extremely empowering. Especially when it comes to taking control of our health and healing. We can learn from each other’s hard-won knowledge.
When Mike’s health deteriorated, and our medical options ran out, we turned to a “radical” nutritionist. We based our decision to go the nutritional route on this simple premise: God created us and He wants us healthy and whole. He also created the food we eat to give our bodies what they need to heal themselves. We’ve dubbed our journey back to health and its new lifestyle as “Kosher-Christian.” You can read more about our story here.
It’s Jordan Rubin’s story that has captured my interest today. Although his is quite different from ours, the essence is the same. When medical options failed, he found a nutritionist who believed the Creator has already given us the resources for healing. His health was restored, better than before, and he found a new life and purpose.
But that’s where our similarities end. You could almost say our situations stand at opposite ends of the spectrum. At the young age of 19, Rubin went from a strong, healthy college student to a fragile patient with Crohn’s disease. Over the course of two years, he traveled oceans and saw seventy doctors in search of help. He tried everything that offered even a sliver of hope. Nothing helped. Not until he met a nutritionist who told him bluntly why he was sick. He was not eating “the diet of the Bible.”
Rubin dubbed his journey back to health The Maker’s Diet. The New York Times bestseller encompasses a complete lifestyle change.
After two years of suffering, his turnaround came in the first 40 days. If you need encouragement, and want to borrow someone else’s faith for your own journey back to health, I found a good place for you to start.
Take a peek at Jordan Rubin’s “after” picture.
In private health-care systems, rationing of health care is by price; in public health care it is by waiting lists and administrative fiat. Both have their defenders, usually ferocious and bitterly opposed, but the fact remains that there are some treatments that have to be rationed however much money is available for health care: as when, for example, there are more people needing organ transplants than there are organs to be transplanted. Few people would be entirely happy to allocate organs merely to the highest bidder.
A recent article in the New England Journal of Medicine tackles the problem of allocation of lung transplants. A system was in place in the United States that excluded children under 12 years of age from receiving adult lungs as transplants, an exclusion that parents of a child with cystic fibrosis challenged in the courts. The problem for children under the age of 12 requiring lung transplants is that there are very few child donors, so in effect the system discriminated against them.
The reason for the exclusion was that most children for whom lung transplants are considered have cystic fibrosis, a condition for which the results of such transplants are equivocal given the constantly improving medical treatment of the disease. Moreover, children are especially liable to complications from the procedure, though these can be partially overcome by using not whole adult lungs for transplant but only resected lobes of them.
The American system of allocation of lungs for transplant into adults takes into account various factors, such as years of potential benefit from transplant, the imminence of death without transplant, the statistical chance of success of transplant, and so forth. Ability to pay does not come into it; in other words it is a socialized system, but there is a mechanism of appeal for those relegated to low priority, which the more educated and wealthier are better able to take advantage of. No explicit judgment is made about the relative social or economic worth of the individual, however, for that way madness, or at least extreme nastiness, lies. And the authors of the article think that, on the whole, the system works well, for it seems to stand to reason that those who would benefit most should go to the top of the waiting list.
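To see how factors like these trade off, here is a purely illustrative priority score — emphatically not the real U.S. Lung Allocation Score formula, just a toy weighting of the three factors named above (expected benefit, urgency, and probability of success), with weights and numbers invented for the example:

```python
# A hypothetical priority score combining expected years of benefit,
# urgency (how soon the patient would die without a transplant), and
# the probability the transplant succeeds. Weights are illustrative.
def priority_score(years_benefit, days_survivable_without, p_success):
    urgency = 1.0 / max(days_survivable_without, 1)  # sooner death -> higher
    return years_benefit * p_success + 1000 * urgency * p_success

candidates = {
    "A": priority_score(years_benefit=10, days_survivable_without=60, p_success=0.8),
    "B": priority_score(years_benefit=4, days_survivable_without=20, p_success=0.9),
}
# Candidate B tops the list: less expected benefit, but far more urgent.
print(max(candidates, key=candidates.get))
```

Even this toy version shows why such systems are contentious: the ranking flips depending entirely on how the committee weights urgency against expected benefit.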
I once had a small transplant dilemma of my own…
In recent posts I revealed a few personal pieces of our lives, mostly focusing on the economic impact of a health crisis. However, life-changing events such as these seldom come in isolation. This perfect storm arose out of our lifestyle and diet, devastating my husband’s health and testing our faith.
In the span of a weekend my hard-working husband Mike went from a “Top Gun” insurance-fraud investigator to a bedridden patient, while I morphed into little more than a trembling caregiver. Without our realizing it, his lifestyle of constant traveling and eating on the road along with my budget-conscious (rather than health-conscious) efforts at home created unthinkable consequences.
Without any real symptoms, over a period of years he quietly developed chronic deep vein thrombosis. After a stint in critical care, surgery, and high-power medications, we exhausted all medical avenues to dissolve the clot.
The surgeon came in sporting a “you-did-this-to-yourself-big-guy” attitude and handed us a one-way ticket into a nursing facility. He declared that nothing more, medically, could be done. He explained, in a clear “good-luck-with-that” tone, that Mike’s body had to heal itself. He needed to “forge new veins.”
The finest health-care system in the world could only stop the progression of the clotting — which, arguably, is profound. Nonetheless, medicine had nothing further to offer us other than opiates, Warfarin, insulin, and around-the-clock, skilled care.
No cure, not even an injection of hope.
The fluid in his legs wasn’t going away “any time soon,” which translated to him not getting out of bed any time soon. Whatever fluid remained after six months, they said, would become permanent — an inconceivable thought.
My oldest daughter developed a theory and a plan. In the process we discovered these simple principles that had a profound impact on Mike’s recovery and my life.
Resistance to antibiotics is often described by neo-pagans as Mother Nature’s vengeance on Man for having had the temerity to interfere in her natural biological processes. According to the neo-pagans, this vengeance has left Man (deservedly) worse off than if he had never discovered antibiotics at all. I do not see the logic of this.
There is no doubt, however, that bacterial resistance to antibiotics is a serious problem worldwide. It is particularly serious in hospitals, where patients may pick up infections that they never had before admission. Many patients die from these infections, which may be of epidemic proportions.
The most important such infection is MRSA, methicillin-resistant Staphylococcus aureus. (Methicillin is a semi-synthetic penicillin that was developed when the Staphylococcus first became resistant to ordinary penicillin, and soon met with resistance itself.) MRSA accounts for most post-surgical infections; the proportion of patients infected by it is often taken in research as a measure of a hospital’s hygiene.
An important paper in a recent edition of the New England Journal of Medicine compares various strategies for reducing the spread of MRSA in intensive care units, a common place for patients to become infected.
The method of control usually employed is to screen patients for MRSA on admission to the ICU and to institute special precautions such as isolation and barrier nursing if they test positive. The authors compared this method with attempts by means of antibacterial products at “decolonization” of those who tested positive, and similar “decolonization” practiced on every patient admitted to an ICU irrespective of whether or not he tested positive for MRSA.
For a long time doctors were subject to contradictory imperatives with regard to AIDS. On the one hand they were enjoined to treat it as they would treat any other disease, without animadversion on the way in which the patient had caught it; on the other hand they had, before testing for the presence of HIV, to seek special permission of the patient and to ensure that he or she had had counseling before the test was taken – quite unlike the testing for any other disease, syphilis for example. So AIDS was at the same time a disease like any other and also in a completely different category from all other diseases.
It cannot be said that pre-test counseling is universally popular among patients. There was an Australian clinic that famously offered the test with “guaranteed no counseling” and it did not lack for clients. For quite a number of years, however, HIV-test counseling has provided a living for the kind of people who like to hover around the edges of human catastrophe.
However, the recommendation by the United States Preventive Services Task Force (USPSTF), reported in an article in a recent edition of the New England Journal of Medicine, that henceforth the screening of adults for HIV infection should be routine will, if adopted, put paid to all such pre-test counseling. One cannot counsel scores or hundreds of millions of people.
Seven years ago the USPSTF came to a different conclusion on the question of screening for HIV, believing that the benefits were insufficient to recommend it. Since then, however, evidence has accumulated that treating people early in the course of their infection not only prolongs their life but reduces spread of the infection.