I am slightly ashamed of how much I liked Burma when I visited it nearly a third of a century ago. What a delight it was to go to a country in which there had been no progress for 40 years! Of course it was a xenophobic, klepto-socialist, Buddho-Marxist military dictatorship run by the shadowy, sinister and corrupt General Ne Win, and so, in theory, I should have hated it. Instead, I loved it and wished I could have stayed.
Since then there has probably been some progress, no doubt to the detriment of the country’s charm. Burma (now Myanmar) is slowly rejoining the rest of the world, and one consequence of this will be the more rapid advance of treatment-resistant malaria.
A recent paper in the Lancet examined the proportion of patients in Burma with malaria in whom the parasite, Plasmodium falciparum, was resistant to what is now the mainstay of treatment, artemisinin, a derivative of a herbal remedy known for hundreds of years to Chinese medicine. The results are not reassuring.
There was a time, not so very long ago, when the global eradication of malaria was envisaged by the WHO, and it looked for a time as if it might even be achieved. The means employed to eradicate it was insecticide that killed the mosquitoes that transmitted the malarial parasites, but a combination of pressure from environmentalists who were worried about the effects of DDT on the ecosystem and mosquito resistance to insecticides led to a recrudescence of the disease.
At the same time, unfortunately, resistance to antimalarial drugs emerged. Control of malaria, not its eradication, became the goal; an insect and a protozoan had defeated the best efforts of mankind. And this is no small matter: the last time resistance to a mainstay of treatment for malaria, chloroquine, emerged in South-East Asia, millions of people died as a result in Africa for lack of an alternative treatment.
What most surprised me about this paper was the method the authors used to determine the prevalence of resistance to artemisinin in the malarial parasites of Burma: for I remember the days when such prevalence was measured by the crude clinical method of giving patients chloroquine and estimating how many of them failed to get better.
The genetic mutations that make the parasite resistant to artemisinin have been recognized. The authors were able to estimate the percentage of patients with malarial parasites that had mutations associated with drug resistance. Nearly 40 percent of their sample had such mutations, and in a province nearest to India the figure was nearly half. The prospects for the geographical spread of resistance are therefore high.
Nor is this all. Artemisinin resistance was first recognized in Cambodia 10 years ago, but the mutations in Burma were different, suggesting that resistance can arise spontaneously in different places at the same time. From the evolutionary point of view, this is not altogether surprising: selection pressure to develop resistance to artemisinin exists wherever the drug is widely used.
One way of reducing the spread of resistance is to use more than one antimalarial drug at a time in the treatment of malaria, but this will only retard the spread, not prevent it altogether. As with tuberculosis, it is likely that parasites resistant to all known drugs will emerge. The authors of the paper end on a pessimistic note:
The pace at which the geographical extent of artemisinin resistance is spreading is faster than the rate at which control and elimination measures are being developed and instituted, or new drugs being introduced.
In other words, deaths from malaria will increase rather than continue to decrease, which is what we have come to think of as the normal evolution of a disease.
The Wall Street Journal is covering the latest trend in rejuveniling among the Millennial set: preschool for adults, where “play is serious business.” Six adults pay anywhere from $300 to $1,000 to crowd into a Brooklyn duplex on Tuesday nights from 7–10 p.m. and participate in everything from nap time to envisioning themselves as superheroes.
Student Amanda Devereux detailed her reasons for enrolling in the Pre-K at Cosmo:
The self-help and goal-setting aspects were new, but welcome. I can use all the help I can get in making it to the gym, even if it means creating a superhero to get me there. I’m looking forward to seeing whether the preschool experience changes me over the next month, and I’m excited to see where Miss Joni and Miss CanCan take us on our class field trip. Mostly though, I’m excited about the snacks.
Is this latest trend in seeking eternal youth another glorified self-help program, or a sign that our traditional cultural institutions aren’t filled with hope and change? Is there a solution to be found in regressive creativity, or is this just another attempt at blissful ignorance? If you enrolled in preschool today, what would you learn?
When I was a boy in London I used to love what we called pea-soupers, that is to say fogs so thick that you couldn’t see your hand in front of your face at midday. They came every November and buses, with a man walking slowly before them to guide them, would loom up suddenly out of the gloom with their headlights like the glowing eyes of monsters. It took my father so long to drive to work that by the time he arrived it was time for him to come home again. I loved those fogs, but then the government went and spoiled the fun by passing the Clean Air Act. They never returned, those wonderful, exciting fogs.
Little did I know (or care) that those wonderful, exciting fogs killed thousands by bronchitis. But many years later I got bronchitis for the first and only time in my life from breathing the polluted winter air of Calcutta. I have also traveled in Communist countries where it seemed that the only thing the factories produced was pollution. I don’t need persuading that clear air is a good thing, not only aesthetically but also from the health point of view.
Southern California used to have some of the worst air pollution in the United States, but the quality of the air in Los Angeles has improved over the last two or three decades. Researchers who reported their findings in a recent edition of the New England Journal of Medicine conducted what is called a natural experiment: they estimated the pulmonary capacity of children who grew up as the level of pollution declined.
Most research on the health effects of air pollution has concentrated on deaths from cardiovascular disease among adults, usually of a certain age. But it is known that relatively poor lung function among younger people predicts cardiovascular disease later in life quite well. There is also an association between air pollution and early death from cardiovascular disease, though of course an association does not by itself prove causation. Does air pollution cause poor lung function in children?
The researchers measured lung function in three cohorts of children, 2120 in all, aged 11 to 15, studied in 1994–1998, 1997–2001, and 2007–2011. During this period, atmospheric pollution in Los Angeles declined markedly, as measured by levels of nitrogen dioxide, ozone and particulate matter.
Lung function, estimated by forced expiratory volume, improved (or at any rate increased) as air pollution declined. The proportion of children with lower than predicted function fell from 7.9 percent to 6.3 percent to 3.6 percent across the three cohorts. The improvement occurred among whites and Hispanics, boys and girls; even the asthmatics were less incapacitated.
The authors thought that the improvement in lung function was likely to persist into adulthood or, to put it in a slightly less cheerful way, damage done in childhood by air pollution might be permanent. This is not quite so pessimistic as it sounds, for there is probably no age at which an improvement in the quality of the air is not capable of producing an improvement in health.
The main drawback of the study was that there was no control group, that is to say a population whose cohorts of children experienced no improvement in the quality of the air they breathed. Perhaps the function of their lungs would have shown the same improvement, though I rather doubt it.
One little semantic point about the paper: children aged 11 were referred to as students rather than as pupils. Perhaps this is because we nowadays expect people to grow up very quickly, but not very far.
Wayne Goss is a 37-year-old makeup artist with 15 years of experience and nearly a million YouTube followers. Lately he’s been receiving a lot of requests from female clients to make them up drag queen style, in large part due to the popularity of the drag queen look on television and social media. As Goss illustrates, drag queens use makeup to manufacture the feminine look that female faces already possess naturally. Essentially, he’s been asked to mask natural femininity with a false face, leading him to question how we interpret the female look and our concepts of natural female beauty.
Few things could be simpler: use a few exercises that work as much of the body at one time as possible, find out how strong you are now on these exercises, and next time you train, lift a little heavier weight. Just a little. It’s the same process you used to learn to read, to play the guitar, to get a suntan, and to finish your master’s thesis. It’s the same process used to build an airplane or to evolve a more complex organism. It’s the accumulation of adaptation – the enemy of entropy – and it can be done by quite literally everybody.
The ability to adapt to stress is a trait common to all living things. A physical stress is a change in the physical conditions under which an organism, like you, lives. If the conditions stay the same, you stay the same. If the conditions change, you have two choices: adapt, so that the new conditions aren’t a stress anymore, or fail to adapt, and perhaps die.
It is also important to understand that adaptation is specific to the stress that causes it.
The calluses on your hands from the shovel grow on the palms, where the shovel handle rubs, not on your face. You don’t learn to play the piano by playing the clarinet.
At its most elemental reduction, this is the situation. The ability to adapt to physical stress is built into our DNA, and it’s kept us alive for a long time. Training is the systematic and intentional application of progressively increasing specific stress – enough to make you adapt, not enough to kill you. It’s just simple arithmetic.
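The “simple arithmetic” of progressively increasing stress can be sketched in a few lines of code. This is only an illustration of the idea, not anything prescribed in the text; the starting weight and the 5-pound increment are hypothetical numbers:

```python
# A minimal sketch of progressive loading: begin at a weight you can
# already handle and add a small fixed increment each training session.
def progression(start_weight, increment, sessions):
    """Planned working weight for each of `sessions` workouts."""
    return [start_weight + increment * s for s in range(sessions)]

# Hypothetical example: a squat starting at 135 lb, adding 5 lb per workout.
print(progression(135, 5, 6))  # [135, 140, 145, 150, 155, 160]
```

The point of the sketch is that nothing more exotic than addition is involved: find today’s weight, add a little, repeat.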
So what’s the problem? If this process is so simple, both logistically and philosophically, then why in the hell is there so much pointless confusion about what, how, and why?
I’ll tell you: because it suits the purposes of lots of people to make you think it’s complicated. Ever heard the term “muscle confusion”? It was popularized decades ago by the Weider organization, publishers of Muscle Builder magazine, as one of the famous “Weider Principles” of bodybuilding. Along with several other fabulously screwed-up ideas, such as the “Retro-Gravity Principle,” the “Partial Reps Principle,” and the “Triple Split Principle,” the idea that things have to be complicated to be effective was planted in millions of young minds. Trainee confusion, actually.
We grew up, some of us got into the business ourselves, and many of us clung to the idea that effectiveness requires complexity. Sometimes it does; usually it doesn’t. If you are playing the piano at the level of Glenn Gould, and you want to get even a little better, the process will be complex. It will involve a high level of tortuosity, relying on constantly varying tempo, difficulty, precision, and musical style.
Last week social media jumped on the story of a woman who supposedly decided to have a late-term abortion specifically because she found out she was having a boy. Based on a near-anonymous comment posted on an Internet forum, the story is highly questionable at best. Nevertheless, both pro- and anti-abortion advocates pounced on the missive. The dialogue generated took on a life of its own, inspiring the following comment from feminist site Jezebel:
“The virality of this story is sort of a nice reminder about confirmation bias: when something fits our preferred narrative just a little too snugly, it’s probably time for skepticism,” wrote Jezebel’s Anna Merlan.
How, exactly, does gendercide “fit our narrative” in the West, especially in relation to boys?
You’re reading a post for Preparedness Week, a weeklong series of blogs about disaster and emergency preparation inspired by the launch of Freedom Academy’s newest e-book, Surviving the End: A Practical Guide for Everyday Americans in the Age of Terror by James Jay Carafano, Ph.D. You can download the e-book exclusively at the PJ Store here.
People who go overboard to prepare for disaster scenarios are easy targets. I think back to 1999 during the whole Y2K scare, when the pastor of our church at the time held a seminar about what to stock up on when all the computers failed on New Year’s Eve at the stroke of midnight. I’ll never forget grown men arguing over who had the bigger food stash. My own personal stash consisted of two cans of green beans, and those cans helped me survive the crisis of what to serve with pork chops one day in January 2000.
National Geographic’s Doomsday Preppers series brought the eccentricities of modern disaster preppers to light in an entertaining way, showing us what some otherwise normal Americans do to prepare for “when the s*** hits the fan,” as so many of them were apt to say. These folks could have been your neighbors, except unlike you they were also worried about implausible scenarios like the super-volcano underneath Yellowstone Park erupting and throwing New York City into chaos. We’re talking about people who make plans to live off bathtub water or stockpile liquor to use as barter — people whose endearing wackiness packs a perverse fascination.
But the reality is that we do have genuine threats to worry about and ways to prepare for the worst without going off the deep end. That’s the point that national security expert and my PJ Lifestyle colleague James Jay Carafano, Ph.D., makes in his brand new book Surviving the End: A Practical Guide for Everyday Americans in the Age of Terror. Nowhere in this book will you find advice on how to create the ideal liquor stockpile or how to “bug out” to the wilderness, and you won’t read about an eruption at Yellowstone Park. What you will find is sober-minded advice on how to prepare for real, plausible scenarios that threaten the American way of life.
Carafano writes not with a Chicken Little doomsday mentality but with an eye toward clear thinking and calm judgment in a crisis (and with just the right amount of humor). His solutions are not over the top or prohibitively expensive; his ideas require only reasonable amounts of time and money. Most simply put, Carafano boils his philosophy of preparedness down to health, faith, family, and education.
In Surviving the End, Carafano looks at five distinct threats: epidemic disease, nuclear explosions, terrorism in its many forms, EMPs (electromagnetic pulses), and cyber attacks. Each of these scenarios is quite real, and each carries its own scariness and far-reaching consequences. With each threat, Carafano examines the potential danger and fallout (no pun intended) and looks at practical and reasonable ways to ensure safety and long-term survival.
One theme that emerges throughout the book is that we should be proactive as families and communities to prepare for the worst, rather than relying on the federal government to help us out in a crisis. While he admits that Uncle Sam does provide some good resources and gets responses right once in a while, Carafano goes to great lengths to point out the failures of federal authorities no matter which party is in charge. Glaring recent examples like Hurricane Katrina and the Fukushima nuclear disaster stand alongside historical records like the 1918 Spanish Flu epidemic to warn all of us that governments rarely have the answers in a crisis.
Carafano’s recommendations in the book are always practical and doable. Some of them require investments of time and money, of course, but so do most worthwhile pursuits. Nothing the author suggests requires the odd leaps of faith that eccentric preppers promote. The fact that Carafano recommends so many well-researched and sensible responses to worst-case scenarios lends a genuine credibility to his writing. Surviving the End is no doomsday manual — it’s a guidebook for practical preparedness.
When all is said and done, Carafano has brought a new attitude to the arena of disaster prep — neither the quasi-Biblical urgency of a Glenn Beck nor the smug fatalism of reality show preppers, but a common-sense, can-do approach to readiness. And in the end, Carafano encourages us to realize that being sensibly prepared is the American way.
This guide has given you the best there is to offer of simple, practical, useful measures you can take to keep your loved ones safe. But there is another important message in the guide as well. We all will survive better if we pull together – not as mindless lemmings following Washington, but as free Americans who fight together for the future of freedom.
As terrible as the terrors we have talked about here are, they are no worse than the suffering at Valley Forge, the slaughter of Gettysburg, the crushing Great Depression, the tragedy of Pearl Harbor, or the terror of the Cuban Missile Crisis. This generation of Americans is every bit as capable of besting the worst life has to offer. If we do that together, our odds are more than even.
You know, he’s right. I really only had to read this book for the sake of this review, but I’ve already begun making a list of things I want to do to become more prepared (including getting in shape, as if I needed another reminder), and I’ll recommend that my loved ones do the same. For this kind of sober-minded preparation boils down to common sense, plain and simple.
Carafano suggests that we all become preppers, and if we take the advice we read in Surviving the End, we can do so. We won’t turn into the kind of weirdos who are ready to off the pets and high tail it out to the wilderness or move to a bunker with more canned food than a Super Walmart “when the s*** hits the fan,” but we’ll be the kind of people who embody the robust, enterprising American spirit that has made our nation so great. And we’ll do our part to help ensure that America survives just as much as our families survive.
Apparently, many Americans would trust strangers more than a doctor to make a medical diagnosis, according to an email I was sent today by a site called CrowdMed:
According to a recent study by CrowdMed [www.crowdmed.com] — a groundbreaking medical website that helps “crowdsource” solutions to the country’s most difficult medical mysteries — nearly one in five Americans (19%) has had to wait at least six months for a doctor to accurately diagnose a family member’s mysterious medical condition.
But what if you can’t wait that long? If you had a medical condition that baffled your doctor, would you be willing to get suggestions from perfect strangers?
According to the CrowdMed Medical Trust Census — a survey of 1,500 Americans on their attitudes toward traditional and nontraditional medical diagnosis — the vast majority of U.S. patients are interested in consulting others who are not necessarily practicing doctors. Noteworthy findings include:
>> 73% of Americans would trust a NURSE to suggest a diagnosis
>> 74% would trust an ALTERNATIVE MEDICINE PRACTITIONER
>> 84% would trust a RETIRED DOCTOR
>> 87% would trust a FORMER PATIENT WITH RELATED SYMPTOMS
>> 62% would trust a MEDICAL STUDENT
According to the site, you fill out a questionnaire and “collaborate With Medical Detectives” and then receive a report which includes the top diagnostic suggestions and solutions from the community. Given the problems with our healthcare system these days, it might be quicker to make a stop at this site than wait for ObamaCare to come through….
I suppose it was inevitable that we’d learn that Jonathan Gruber wants to tax body weight:
“Ultimately, what may be needed to address the obesity problem are direct taxes on body weight,” Gruber wrote in an essay for the National Institute for Health Care Management in April 2010, just months after helping design ObamaCare with the president in the Oval Office and during the period in which he was under contract as an Obama administration consultant.
“While it is hard to conceive of this approach being a common public policy tool in the near term, such taxation may be happening indirectly through health insurance surcharges,” he wrote. “Currently, employers may charge up to 20 percent higher health insurance premiums for employees who fail to meet certain health-related standards, such as attaining a healthy BMI.”
A couple of things.
The first is that BMI is a BS way to determine obesity, or much of anything else, really. But it’s easy to measure, especially if you just line up bunches of mostly-naked American Serfs™ for their annual IRS weigh-in. You might think I’m kidding, but if we’ve got to tax fat people, then we’ve got to weigh and measure them, and with 315 million Americans, the logistics get… busy. The Nazis used cattle cars, but I’m sure our tender IRS thugs would come up with something more humane.
The second is that if ♡bamaCare!!! saves us all this money, why do we have to keep coming up with new and ridiculous ways to finance it?
In the past, medical journals, pharmaceutical companies and researchers themselves have been criticized for publishing selectively only their positive results, that is to say, the results that they wanted to find. This is important because accentuation of the positive can easily mislead the medical profession into believing that a certain drug or treatment is much more effective than it really is.
On reading the New England Journal of Medicine and other medical journals, I sometimes wonder whether the pendulum has swung too far in the other direction, in accentuating the negative. To read of so many bright ideas that did not work could act as a discouragement to others and even lead to that permanent temptation of ageing doctors, therapeutic nihilism. But the truth is the truth, and we must follow it wherever it leads.
A recent edition of the NEJM, for example, reported on three trials, two with negative results and one with mildly positive ones. The trials involved the early treatment of stroke, the prophylaxis of HIV infection, and the treatment of angina refractory to normal treatment (a growing problem). Only the last was successful, but it involved 104 patients as against 6729 patients in the two unsuccessful ones.
The successful trial involved the insertion of a device that increases pressure in the coronary sinus, the vein that drains blood from the heart itself. For reasons not understood, this seems to redistribute blood flow in the heart muscle, thus relieving angina. In the trial, the new device reduced angina symptoms and improved the quality of life of the patients who received it compared with those who underwent a placebo operation. The trial was too small, however, to determine whether the device improved survival, though even if it did not, a reduction of symptoms and an improvement in the quality of life are worthwhile.
The trial of chemoprophylaxis of HIV was, by contrast, a total failure. The trial recruited 5029 young women in Africa who were given an anti-HIV drug in tablet or cream form, and others who were given placebos. The rate at which they became infected with HIV was compared, and no difference was found.
In large part this was because the patients did not take or use the pills or cream, though they claimed to have done so. A drug that few take is not of much use however effective it might be in theory, especially in prophylaxis rather than treatment. And this points to another problem of pharmaceutical research: in drug trials that require patients’ compliance with a regime, that compliance may be high during the trial itself (thanks to the researchers’ vigilance and enthusiasm) but low in “natural” conditions, when the patients are left to their own devices.
The trial of magnesium sulphate in the early treatment of stroke was also a failure. Experiments on animals had suggested that this chemical protects brain cells from degeneration after ischaemic stroke. It stood to reason, then, that it might improve the outcome of ischaemic stroke in humans, at least if given as soon as the stroke was suspected.
Alas, it was not to be. The trial, involving 1700 patients, showed that the early administration of magnesium sulphate did not improve outcome in the slightest. At 90 days there was no difference between those who received it and those who had received placebo.
Is an idea bad just because it does not work? Could it be that those who discover something useful are just luckier than their colleagues? Perhaps there ought to be a Nobel Prize for failure, that is to say for the brightest idea that failed.
image illustration via shutterstock / PathDoc
Things looked pretty darn good in the middle of the twentieth century. We split the atom, using its energy for power and to send the most dead-end of the Axis dead-enders scurrying. The Green Revolution saved a billion people from starving to death. On the micro level, we developed vaccines for polio, mumps, measles, and rubella.
In other words, we had the future and it was so bright, the world had to wear shades.
Fast forward another half-century.
In January 2015, we have at least 91 people infected in a measles outbreak linked to Disneyland. School districts are quarantining some students. The disease has spread from the happiest place on earth to other states and beyond our borders.
To keep this in perspective, we had 644 cases of measles in the United States in 2014. That was a record year.
But hey, these things happen. After all, President Obama made our border easier to crack than a high school kegger and invited an unprecedented surge of illegal alien kids to crash that party. So an uptick of children’s diseases makes sense, right?
The disease is hitting unvaccinated Americans, and those unvaccinated aren’t born in East L.A.
According to the National Institutes of Health,
“[u]nvaccinated children tended to be white, to have a mother who was married and had a college degree, to live in a household with an annual income exceeding 75,000 dollars, and to have parents who expressed concerns regarding the safety of vaccines and indicated that medical doctors have little influence over vaccination decisions for their children” (emphasis added).
So it’s not the poor and ignorant who avoid vaccines. It’s the Real Housewives of Orange County.
Well, in their defense, they have Jenny McCarthy on their side. And Jenny McCarthy went on both Oprah and Larry King.
The reality is that a significant subset of our population has bought, hook, line, and sinker, the claim that vaccines cause autism. They even had a study that purported to show the link between vaccines and autism.
Of course, that study has been discredited, not as an error but as actual fraud, committed by a man paid by the lawyers suing vaccine manufacturers. Its author lost his medical license. His coauthors removed their names from the study. The Lancet, which carried the fraudulent data, retracted it.
Yet the non-vaccinated children still come from educated households.
Okay, that’s just one crazy superstition that can kind of make sense because a washed-up Playboy model glommed onto a fraudulent study.
That’s no reason to see a trend, right?
Well, look at the case of manmade global warming.
Well, wait. Here the elites have science. After all, didn’t President Obama point out that 2014 was the hottest year in human history?
If you can’t trust a president who just had his butt handed to him in the midterms, whom can you trust?
In ancient days, when life was nasty, brutish, and short, people looked for any sort of advantage to reach the ripe old age of 30. First there was fire, and with it came cool things like keeping the animals at bay and not having bleeding runs every time you ate your latest kill. Then came the wheel, an easier way to get that steaming carcass of meat from here to there.
But let’s face it. In the game of survival, there’s no better way to get an edge on the local saber-toothed tiger — or your annoying neighbor — than seeing the future.
Thus we have the casting of bones because everybody knows that if anything is linked to the future, it’s chicken bones.
I mean, that’s just logic.
Global warming alarmists have their own version of chicken bones, in the form of computer climate models:
Problem: When compared to what is actually observed in the real world, the climate models fail to make accurate predictions. And this is a consistent problem.
You have to think that if our chicken-bone-throwing ancestors noticed that none of their throws matched up to actual events, they’d realize something was wrong. Perhaps they might not give up on the enterprise of chicken-bone throwing altogether – after all, who can deny chicken bones? – but they might decide that they’d killed a defective chicken.
Today’s educated savages can’t even make that leap. An honest man would say since the models don’t figure in things like water vapor – just a small part of the atmosphere, after all – and don’t actually predict the future, let’s try something else.
Instead, the educated savages award the computer modelers the Nobel Prize.
Primitive superstition is also strong in Leftist economics.
In World War II, the tribes of Papua New Guinea saw vast amounts of wealth coming into the Pacific on both the Allied and Axis sides. They had no way to comprehend the power of industrialized economies fully mobilized and dedicated to the largest war the world had ever seen. The natives made the natural assumption that spirits sent cargo to the earth and the evil outsiders jacked the loot.
So they built fake airplanes. They erected structures in the jungle and filled them with fake cash, sometimes even making fake suitcases.
Hmmm. Make work projects paid for with worthless currency. Doesn’t that sound like Obama’s stimulus plan or Paul Krugman – another educated savage Nobel laureate – looking for an alien threat in order to create demand to boost the economy?
Yes, Keynesian economic theory is a cargo cult, dressed up in suits and the flowery rhetoric of the university. Unfortunately, it shows the same effectiveness.
Welcome to the new Dark Ages, a time of policy based on superstitions easily recognized by savages sitting around the campfire. They might not understand the terms of the new cargo cults that have risen but they’d understand that old time religion.
image illustration via shutterstock / maximillion
As we get older, many of us go to the doctor more than we should. We ask the doctor about things doctors don’t really know much about, like diet and exercise. Doctors – having had no institutional training in diet and exercise while at the same time feeling as though they must maintain their authority over all things physical – usually provide advice about these things anyway. They advise you to eat less fat and go walking every once in a while.
If you ask about strength training – since you have heard that it was a good idea and you know that walking is not strength training – their advice will be to just lift lighter weights and do more reps. Lighter weights and higher reps, that’s the ticket, right? Same effect, less risk, lighter is safer and more reps make up for the lighter weight, right?
It could be that doctors tell older people to just lift lighter weights because they have a genuine interest in not hurting older people, and they perceive that heavier weight is more dangerous than lighter weight. If they didn’t tell this to everybody else too, I might believe this was their intent. Hell, if they didn’t tell this to everybody, I wouldn’t be writing about it.
You have never seen an article here that I have written about diet, because that is not my field of either expertise or experience. I know something about it, most likely more than your doctor, but I reserve my public opinions on things about which I am not qualified to opine. When your doctor tells you to just use lighter weights and higher reps, he is wrong. Just as I refrain from writing about brain surgery, he should refrain from giving advice about exercise. Here’s why.
Strength, as I have said many times, is merely the production of force by your muscles. The more weight you lift, the more force you produce. Since you can’t lift as much weight 10 times as you can 5 times, 5 reps allows you to use a heavier weight than 10 reps. Therefore, 5 reps with a heavier weight makes you stronger than 10 reps with a lighter one.
And that’s really all you need to know, because it really is this simple. The more weight you can lift, the stronger you are, and the heavier the weight you use in your training, the stronger you will become. Even you. A heavy set of 10 is mathematically lighter than a heavy set of 5. And there you have it.
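The arithmetic above can be sketched with the Epley one-rep-max estimate (1RM ≈ w × (1 + reps/30)), a common gym rule of thumb; the 300-lb max below is a purely hypothetical figure for illustration, not a number from this article:

```python
def working_weight(one_rep_max: float, reps: int) -> float:
    """Heaviest weight (same units as the 1RM) a lifter could handle
    for the given rep count, per the Epley estimate."""
    return one_rep_max / (1 + reps / 30)

one_rm = 300.0  # hypothetical lifter's one-rep max, in pounds

# A heavy set of 5 permits roughly 14% more weight than a heavy set of 10:
w5 = working_weight(one_rm, 5)    # roughly 257 lb
w10 = working_weight(one_rm, 10)  # roughly 225 lb
print(round(w5), round(w10))
```

Whatever the exact coefficients, any such estimate shows the same thing: the set of 10 is necessarily done with a lighter bar than the set of 5.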
But more importantly, sets of 10 are not just inefficient for building strength – they are counterproductive in a couple of ways. First, fatigue is the result of more repetitions of a weight, even a lighter weight. You know this yourself from working with your body. Any task repeated many times produces fatigue, and the heavier the task the more rapidly fatigue sets in. Walking doesn’t count because walking isn’t hard. Shoveling snow is a better example, and it’s easy to get pretty tired pretty quick with a big shovel.
Here’s the critical point: fatigue produces sloppy movement, and sloppy movement produces injuries. A set of 10 gets sloppy at about rep number 8 or 9, unless you’re an experienced lifter, and even then it’s damned hard to hold good form on the last reps of a high-rep set. A set of 5 ends before you get fatigued – 5 reps is an interesting compromise between heavy weight and work volume. Unless you’re a heart/lung patient, 5 reps won’t elevate your breathing rate until after the set is over, but a set of 10 will have your respiration rate elevated before the end of the set.
For a perfect example of how hysteria governs modern debates over complex issues, witness what happened yesterday morning to Governor Chris Christie. For the apparently unpardonable offense of offhandedly suggesting parents ought to have some freedom to decide how their kids are vaccinated, the governor’s political career was declared over. The instantaneous eruption from America’s self-deputized thought police had the governor — only hours later — meekly offering “clarification” of his earlier comments.
The debate over vaccines, itself nearing pandemic proportions in the U.S., is following a familiar pattern. People are either pro-science or anti-; in agreement with the “consensus” or crazy “conspiracists” and “deniers.” Much like the debate over global warming, there’s no room for middle ground; preaching prudence is basically blasphemous. And just as many are calling for climate “deniers” to be ostracized and even arrested, critics and parents who question the conventional wisdom on vaccines are likewise condemned as threats against civilization itself.
Like most everyone else, I am neither a doctor nor even a scientist. But I am smart enough to know there are perfectly valid reasons to question conventional wisdom.
Take the current controversy over measles. From the looks of my Twitter feed and the comments sections under just about any vaccine-related article, you’d think we were talking about the bubonic plague. In fact, measles, despite being highly contagious, isn’t particularly dangerous. So long as your immune system is in decent shape, you’ll be fine. Indeed, you might actually want it, as exposure leads to lifetime immunity.
Measles is basically a fever with an accompanying rash. It’s true that in the 1800s, outbreaks caused tragically large numbers of children to die — but these were concentrated in orphanages and hospital wards (places where malnutrition was rampant). As the world prospered, affluence spread, and health improved, in the U.S. the chances of dying after contracting measles dropped to 1-2 percent by the 1930s. By the time a vaccine was introduced in 1963, deaths from measles were virtually nonexistent. Asthma, according to “Vital Statistics of the United States, 1963,” claimed 56 times as many lives.
Today it’s popular to argue that measles would be totally defeated were it not for the Jenny McCarthys of the world. The only problem is that the MMR (measles-mumps-rubella) vaccine does not actually immunize — as most people understand the word — against measles. The most we can expect is temporary protection. That’s because vaccines are injected directly into the body, bypassing the body’s natural immune response. “Most disease-causing organisms enter your body through the mucous membranes of your nose, mouth, pulmonary system or your digestive tract – not through an injection,” explains Dr. Joseph Mercola. “These mucous membranes have their own immune system, called the IgA immune system.”
Though the vaccine was initially described as lifelong insurance, health officials realized in the ’70s, when an uptick in measles diagnoses occurred among vaccinated high-school students, that it should probably be administered more regularly. The CDC now advises receiving the vaccine at 12-15 months, 4-6 years, and again as an adult. The U.S. is also using its third version of a measles vaccine, after the first two proved ineffective.
Which should probably make it no surprise that many of the people catching measles today were vaccinated. Today’s measles cases are occurring in heavily vaccinated populations. When a 2006 outbreak struck college students in the Midwest, the fact that most of those affected were vaccinated seemingly made no difference. When an outbreak of the mumps hit the NHL this year, many reflexively blamed “anti-vaxxers.” Almost no one reported that every affected player appears to have received the MMR vaccine. The Penguins’ Sidney Crosby received not only the initial MMR, but also a booster just before the Sochi Olympics. The director of the Vaccine Education Center at the Children’s Hospital of Philadelphia, Paul Offit, would only say “we know that the short-term effectiveness of the mumps vaccine is excellent.”
Still, none of this would suggest there’s any reason to avoid regular vaccines — were it not for side effects. And here comes another wrinkle: The MMR vaccine can itself give you measles. In 2013, measles began spreading in British Columbia after a two-year-old girl contracted the virus from the vaccine and then began spreading it to others. Though rare, there are other risks worth considering, too: According to the CDC, side effects of MMR can range from minor (fever, mild rash, swelling), to moderate (seizure, temporary low platelet count), to major (deafness, long-term seizures, permanent brain damage). Note that the latter two categories are worse than the disease itself. Perhaps a bigger problem is how these vaccines weaken the immune response among undernourished patients. “In developing countries, the use of high-titre vaccine at 4-6 months of age was associated with an unexpectedly high mortality in girls by the age of 2 years from infectious childhood illness,” reported a study in the British Medical Journal.
As recently as the 1970s, the CDC recommended children receive four vaccines. Today, per CDC protocol, children can receive around 40 shots between birth and the age of 6. What if that number grows to 100? 500? Will it always be unreasonable to ask, “Is all of this really necessary?”
Finally, this may come as a shock, but it’s actually possible for the government and the medical establishment to get things wrong. This year the CDC admitted its flu vaccine was created for the wrong strain — yet Americans are being instructed to get the shot anyway. Indeed, some parents are being threatened with having their children taken if they aren’t given this (almost certainly) useless flu vaccine. For more than a generation Americans were told to avoid as much as possible saturated fat, salt, and calories in general. More recent science shows that salt consumption has no causal relationship with blood pressure; eating healthy saturated fats like grass-fed butter is good for your heart, brain, and metabolism, and calories are actually a form of energy that gives us life.
Assigning responsibility for your children’s health and well-being to others — even “experts” — is precisely the opposite of parenting. Asking questions, educating yourself, soliciting more than one opinion: these aren’t the behaviors of people to be condemned and vilified. When someone insists you submit to the expertise of others, they’re actually asking you to stop thinking for yourself. And that’s a mistake. Vaccines, like so much of life, are more complex than a simple good-vs.-evil analysis affords. Universal solutions rarely work universally. Parents are right to do their homework.
Here’s Senator Rand Paul saying that most vaccines should be voluntary:
Late in the previous century, when the Toronto Star spiked my column debunking Kwanzaa — the editor scolded me for wanting to “ruin other people’s fun” by telling the truth, which in hindsight would make for an apt if ungainly personal motto on my (non-existent) coat of arms — I sent the piece to Canada’s only conservative magazine, the (since defunct) Alberta Report.
Link Byfield, the magazine’s publisher and editor, snapped it up, and asked for more.
I’d been a professional writer for years, but now my career as a right-wing writer had begun.
Byfield died of cancer this week, at 63.
My fellow AB contributor Colby Cosh was and is a libertarian (some might say craggily contrarian) atheist who was nevertheless embraced right out of grad school by the unabashedly Christian so-con Byfields.
Cosh — today, like many former Report writers, a star columnist at a national publication — quickly composed an obituary of Byfield that is, not surprisingly, insightful, elegant and stringently unsentimental.
(The Byfields have a keen eye for talent, if I do say so myself…)
Another longtime colleague, Peter Stockland, attended a tribute to Byfield last September, an event arranged after he was diagnosed with terminal cancer.
Stockland explained Link Byfield’s influence on recent Canadian history with this succinct formula, one that resembles the mnemonic verse British schoolchildren used to learn to keep their kings and queens straight.
No Byfields, no Alberta Report. No Alberta Report, no Reform Party as it was formed. No Reform Party, no [Progressive Conservative Party] collapse. No PC collapse, no [Conservative Party] Harper government.
Some perspective for American readers:
My husband and I attended a lecture about Israel by Melanie Phillips a few years back.
Phillips, while correct on so many issues, remained convinced that Europe’s “fringe” “right-wing” populist political leaders, while anti-sharia, were also racist, anti-Semitic losers and therefore unwelcome allies in the counter-jihad.
Afterwards, my husband took her aside and explained — to her visible surprise — that Canada’s “fringe right wing” populist Reform Party had once been condemned as backward, bigoted and doomed, too; yet one of its founders, Stephen Harper, was now the staunchly pro-Israel prime minister of Canada, having just won a second federal election.
Non-Canadians are, presumably, more familiar with our “free” “healthcare” system, as I call it.
On that topic, Mark Steyn once quoted a fictional Canadian — OK, Quebecois — character’s decision to die a principled death:
Sébastien wants his dad to go to Baltimore for treatment, but Remy roars that he’s the generation that fought passionately for socialized health care and he’s gonna stick with it even if it kills him.
“I voted for Medicare,” he declares. “I’ll accept the consequences.”
But Link Byfield was a real man, not an imaginary one.
That makes what follows all the more notable.
Yet what truly mattered to [Byfield] was having lived out, as far as possible in the midst of a train wreck, a principled reality.
I mentioned an e-mail he sent last summer explaining his choice to forgo chemotherapy because it would not save him, yet would cost taxpayers $100,000.
I said I could not imagine other Canadians who would factor such public policy considerations into their personal health care.
“But that would have been standard thinking among politically literate citizens 50 years ago,” he said. “People wouldn’t even articulate it. It would just be something they would think.”
When I asked his source for thinking that way, he said: “Thou shalt not steal.”
How informed is informed? What is the psychological effect of being told of every last possible complication of a treatment? Do all people react the same way to information, or does their reaction depend upon such factors as their intelligence, level of education, and cultural presuppositions, and if so does the informing doctor have to take account of them, and if so how and to what degree? An orthopedic surgeon once told me that obtaining informed consent from patients now takes him so long that he has had to reduce the number of patients he treats.
An article in a recent edition of the New England Journal of Medicine extols the ethical glories of informed consent without much attention to its limits, difficulties and disadvantages.
It starts by referring to a trial of the level of oxygen in the air given to premature babies, of whom very large numbers are born yearly. Back in the 1940s it was thought that air rich in oxygen would compensate for premature babies’ poor respiratory system, but early in the 1950s British doctors began to suspect, correctly, that these high levels of oxygen caused retinal damage leading to permanent blindness. Fifty years later, the optimal level of oxygen is still not known with certainty, and a trial was conducted that showed that while higher levels of oxygen caused an increased frequency of retinopathy, lower levels resulted in more deaths. The authors of the trial have been criticized because they allegedly did not inform the parents of the possibility that lower levels of oxygen might lead to decreased survival, which was reasonably foreseeable.
How reasonable does reasonability have to be? Many of the most serious consequences of a treatment are totally unexpected and not at all foreseeable (no one suspected that high levels of oxygen for premature babies would result in blindness, for example, and it took many years before this was realized). Ignorance is, after all, the main reason for conducting research.
But suppose parents of premature babies had been asked to participate in a trial in which their offspring were to be allocated randomly to an increased risk of blindness or an increased risk of death. Surely this frankness would have been cruel, all the more so as the precise risks could not have been known in advance. Parents would feel guilty whether their babies died or went blind.
Now that the answer is known, more or less, parents can be asked to choose in the light of knowledge: but their informed consent will be agonizing because there is no correct answer. Personally, I would rather trust the doctor sufficiently to act in my best interests in the light of his knowledge and experience. So far in life I have not had reason to regret this attitude, though I am aware that it has its hazards also. But
…why should they know their fate?
Since sorrow never comes too late,
And happiness too swiftly flies.
Thought would destroy their paradise.
No more; where ignorance is bliss,
‘Tis folly to be wise.
And I have often thought what medical ethicists would have made of the pioneers of anesthesia. They did not seek the informed consent of their patients, in part, but only in part, because they hadn’t much information to give. What moral irresponsibility, giving potentially noxious and even fatal substances to unsuspecting experimental subjects without warning them of the dangers!
And there are even some medical ethicists who think we should not take advantage of knowledge gained unethically. All operations should henceforth be performed without anesthesia, therefore.
I can’t decide if this is troubling or decent advice: “The enemy within: People who hear voices in their heads are being encouraged to talk back:”
Research suggests that up to one in 25 people hears voices regularly and that up to 40 per cent of the population will hear voices at some point in their lives. But many live healthy and fulfilling lives despite those aural spectres.
Recently, Waddingham and more than 200 other voice-hearers from around the world gathered in Thessaloniki, Greece, for the sixth annual World Hearing Voices Congress, organised by Intervoice, an international network of people who hear voices and their supporters. They reject the traditional idea that the voices are a symptom of mental illness. They recast voices as meaningful, albeit unusual, experiences, and believe that potential problems lie not in the voices themselves but in a person’s relationship with them.
“If people believe their voices are omnipotent and can harm and control them, then they are less likely to cope and more likely to end up as psychiatric patients,” says Eugenie Georgaca, a senior lecturer at the Aristotle University of Thessaloniki and the organiser of this year’s conference. “If they have explanations of voices that allow them to deal with them better, that is a first step toward learning to live with them.”
The road to this form of recovery often begins in small support groups run by the worldwide Hearing Voices Network (HVN). Founded in the Netherlands in 1987, it allows members to share their stories and coping mechanisms – for example, setting appointments to talk with the voices, so that the voice-hearer can function without distraction the rest of the day – and above all gives voice-hearers a sense of community, as people rather than patients.
Here are the basic assumptions of INTERVOICE, from their website:
Hearing voices is a normal though unusual and personal variation of human experience.
Hearing voices makes sense in relation to personal life experiences.
The problem is not hearing voices but the difficulty to cope with the experience.
People who hear voices can cope with these experiences by accepting and owning their voices.
A positive attitude by society and its members towards people hearing voices increases acceptance of voices and people who hear voices. Discrimination and excluding of people hearing voices must stop.
I am leaning towards troubling…
Some of the happiest afternoons of my childhood were passed in the company of a guy named Norm Breyfogle.
Norm was the artist on Detective Comics back in the late ‘80s. But it might be more accurate to say he was the window through which I got to see Batman patrolling the rooftops of Gotham, beating the ever-loving hell out of drug dealers and triumphing over crazed killers. For me and many other late-Generation Xers, Norm was the definitive Batman artist. It was his version of the character (along with writer Alan Grant) that my generation grew up with.
It’s probably hard to appreciate now how innovative Norm’s style was at the time. I couldn’t have explained it back then, of course—I just liked the artwork’s energy and storytelling—but looking back, his style was much more expressionistic than his contemporaries’. Perspective and shadow were distorted to amplify every panel’s mood.
But it wasn’t just a scene’s feel that he cared about. There was so much energy in Norm’s action scenes as he showed heartbeat-by-heartbeat how Batman defeated a given bad guy:
Long after I’d stopped reading comics, Christopher Nolan’s Batman movies would occasionally make me nostalgic for the version I grew up with. It was a pleasant surprise to discover online that I was just one of many impressed, and grateful, for Norm’s years on the character. It made me happy to know that, even decades later, his work on Batman was remembered as one of the best runs in comics history.
My generation’s Batman, still one of the best. Cool.
So it came as a shock to learn that Norm Breyfogle, just 54 years old, suffered a stroke in mid-December.
He’s expected to recover eventually, but in the meantime the stroke has paralyzed his left side, which is especially heartbreaking considering Norm is a left-handed artist. It’s also put him in the hole for $200K on medical expenses. His family has turned to crowdfunding to help with the costs, and has set up a contribution site here.
I gave, and have since been watching the funds-raised bar, hoping it will make it to $200K. It hovers at $70K as of this writing. There are only 7 days left in the drive.
The comic book blogosphere has covered it, trying to spread word about the crowdfunding effort. But the guys reading those sites will probably skew younger. They’re not of the generation that grew up reading Norm’s Batman. They don’t owe childhood memories to him like I do.
Most guys my age don’t follow comic book news anymore. I only learned about what happened to Norm myself because a friend who has kept a hand in the comics posted it on Facebook. With the clock ticking down, it’s time to get word out to other corners of the internet where his old fans may now be.
Which is why I’m here now. To get word out that a man who brought a lot of happiness to a generation of kids needs help. To let all the people that grew up enjoying Norm’s work know that he could use some of your help now.
If you’re able to, please consider contributing.
Medical history is instructive, if for no other reason than that it might help to moderate somewhat the medical profession’s natural inclination to arrogance, hubris and self-importance. But the medical curriculum is now too crowded to teach it to medical students and practicing doctors are too busy with their work and keeping up-to-date to devote any time to it. It is only when they retire that doctors take an interest in it, as a kind of golf of the mind, and by then it is too late: any harm caused by their former hubris has already been done.
Until I read an article in a recent edition of the Lancet, I knew of only one eminent doctor who had been shot by his patient or a patient’s relative: the Nobel Prize-winning Portuguese neurologist Egas Moniz, who was paralyzed by a bullet in the back. It was he who first developed the frontal lobotomy, though he was also a pioneer of cerebral arteriography. As he was active politically during Salazar’s dictatorship, I am not sure whether his patient shot him for medical or political reasons, or for some combination of the two.
I have worked in the fitness industry since 1978, and have owned a gym since 1984. Since I went into business for myself, I have approached the teaching of strength training from a completely different perspective than the industry’s standard model — I have taught all my members to lift barbells, as opposed to the machine-based exercise paradigm used by the commercial fitness industry at large.
During my time as a gym owner I have made several mistakes, none of which had anything to do with my decision to teach everybody how to use barbells safely, efficiently, and productively. Rather, my biggest regret was not doing so, once, when I should have.
Dr. Coleman came to the gym on the advice of his doctor. He was in his late 60s at the time, still a working cardiologist, but he was not terribly robust even for a guy his age. He was a very nice man, excruciatingly polite to everyone and generous to a fault. I remember the first question I asked him, he being one of the first doctors we’d had in the gym and me being curious about lots of things: “How is it, Dr. Coleman, that a dog can drink nasty water out of a puddle in the road and be perfectly fine, but if I did that I’d get sick — as a dog? Haha.” He regarded me momentarily, as if deciding how to respond to a curious but dull child (not an altogether inappropriate assessment), and calmly explained that there were profound differences between my digestive environment and that of my little bulldog girlfriend Dumplin. He was a patient man as well.
My friend Cardell ended up with Dr. Coleman as his personal training client. Cardell and I had trained together for years, starting at the YMCA in downtown Wichita Falls, Texas, in the early ‘80s. This was the same weight room in which Bill Starr, former editor of York Barbell’s Strength and Health and one of the first strength coaches in the world, had started out in the late ‘50s – the room had history. It was important to us too, as a place where we honed our skills and grew as lifters and men. When I bought Anderson’s Gym in 1984, we moved our training headquarters to the renamed Wichita Falls Athletic Club, and I began the task of applying barbell training to a commercial gym’s clientele.
Following the prescribed industry methodology we had both been taught by the then-becoming-mainstream National Strength and Conditioning Association, Cardell used a machine-based approach in his work with Dr. Coleman. It was perfectly congruent with the thinking at the time, and it still is: the client was old, free weights are dangerous, we mustn’t hurt old people — we mustn’t even entertain the possibility of hurting old people — and Dr. Coleman skated through his workouts with Cardell unscathed.
He also failed to make any significant progress toward a more robust physical capacity. Dr. Coleman joined the gym as a frail older man, never walking with the aggressive, confident stride of a fit person, and never sitting down, standing back up, or getting in and out of the car without carefully and deliberately measuring his position. He left the gym many years later a still-frail, even-older man.
And I let it happen. My fault for standing there, watching but paying no attention, as the potential for reversing the effects of age and a sedentary lifestyle slipped through our fingers.
Of late the New England Journal of Medicine has seemed like the burial ground of good ideas. Researchers follow a promising lead only to find that their new idea fails the crucial test of experience: and the difference between success and failure in research is made to appear as much a matter of chance or luck as of brilliance or skill.
In the first issue of the Journal for 2015, Dutch researchers from 16 different hospitals report an unequivocal success in the treatment of ischemic stroke.
Until now the only proven worthwhile treatment for patients with the kind of stroke that results from the blockage of a cerebral artery is the infusion, within four and a half hours of symptom onset, of the drug called alteplase, which dissolves thrombus (and which is manufactured in Chinese hamster ovary cells). But even with the use of this drug the prognosis is not very good, and there are several contra-indications to its use.
This year has been a strange one in terms of celebrity behavior, some of which was concerning if not entirely disturbing, and apparently contagious as well. Examples of skin selfies and exchanges that were once considered private are posted all over the internet. Active participants are all ages, shapes and sizes: beware the visuals of regular people (generally females) sharing their cups overflowing or unsuspecting panties being eaten alive by a ravenous pair of robust cheeks. Who’d have guessed that plumber’s crack would be exalted to such artistic (albeit unsavory) exhibitionist displays?
Yet for some unknowable reason, fans can’t seem to get enough lifestyle advice from entertainers, emulating even the most bizarre spectacles, especially when it comes to diet and beauty.
Female celebs in particular offer infinite health counsel for the masses. And women of all walks eat it up, the more peculiar, the better. Such odd “healthy” behaviors include January Jones ingesting her own dried and encapsulated placenta or Lady Gaga touting her revolutionary “Hangover Diet” consisting of nothing but whiskey… Then of course there is the explosive “Fermented Foods Diet” that Madonna uses to keep her colon free from debris. Sounds delicious.
Everyone knows the pleasures of having his prejudices confirmed by the evidence. The pleasures of changing one’s mind because of the evidence are somewhat less frequently experienced, though none the less real. Among those pleasures is that of self-congratulation on one’s own open-mindedness and rationality. It would therefore delight me to learn that my prejudice about obesity — that it is a natural consequence of overeating, which is to say of human weakness and self-indulgence — was false.
I therefore read with interest and anticipation a recent article in the New England Journal of Medicine with the title “Microbiota, Antibiotics, and Obesity.” The connection of antibiotics with obesity had not previously occurred to me; perhaps the real reason why so many people now have the appearance of beached whales was about to be revealed to me.
It is easier to advise than to have or to retain a sense of proportion, especially when it is most needed. I have never known anyone genuinely comforted by the idea that others were worse off than he, which perhaps explains why complaint does not decrease in proportion to improvement in general conditions. And he would be a callow doctor who tried to console the parents of a dead child with the thought that, not much more than a century ago, an eighth of all children died before their first birthday.
Still, it is well that from time to time medical journals such as the Lancet should carry articles about medical history, for otherwise we might take our current state of knowledge for granted. Ingratitude, after all, is the mother of much discontent. To know how much we owe to our forebears keeps us from imagining that our ability to diagnose and cure is the consequence of our own peculiar brilliance, rather than simply because we came after so much effort down the ages.
A little article recently appeared in the Lancet, written by two historians who are in the process of analyzing the results of 9,000 coroners’ inquests into accidental deaths in Tudor England. It seems astonishing to me not only that such records should have survived for more than four centuries, but also that the state should have cared enough about the deaths of ordinary people to hold such inquests (coroners’ inquests had already been established for 400 years by the time of the Tudors). In other words, an importance was given to individual human life even before the doctrines of the Enlightenment took root: the soil was already fertile.