Few things could be simpler: use a few exercises that work as much of the body at one time as possible, find out how strong you are now on these exercises, and next time you train, lift a little heavier weight. Just a little. It’s the same process you used to learn to read, to play the guitar, to get a suntan, and to finish your master’s thesis. It’s the same process used to build an airplane or to evolve a more complex organism. It’s the accumulation of adaptation – the enemy of entropy – and it can be done by quite literally everybody.
The ability to adapt to stress is a trait common to all living things. A physical stress is a change in the physical conditions under which an organism, like you, lives. If the conditions stay the same, you stay the same. If the conditions change, you have two choices: adapt, so that the new conditions aren’t a stress anymore, or fail to adapt, and perhaps die.
It is also important to understand that adaptation is specific to the stress that causes it.
The calluses on your hands from the shovel grow on the palms, where the shovel handle rubs, not on your face. You don’t learn to play the piano by playing the clarinet.
At its most elemental reduction, this is the situation. The ability to adapt to physical stress is built into our DNA, and it’s kept us alive for a long time. Training is the systematic and intentional application of progressively increasing specific stress – enough to make you adapt, not enough to kill you. It’s just simple arithmetic.
So what’s the problem? If this process is so simple, both logistically and philosophically, then why in the hell is there so much pointless confusion about what, how, and why?
I’ll tell you: because it suits the purposes of lots of people to make you think it’s complicated. Ever heard the term “muscle confusion”? It was popularized decades ago by the Weider organization, publishers of Muscle Builder magazine, as one of the famous “Weider Principles” of bodybuilding. Along with several other fabulously screwed-up ideas, such as the “Retro-Gravity Principle,” the “Partial Reps Principle,” and the “Triple Split Principle,” the idea that things have to be complicated to be effective was planted in millions of young minds. Trainee confusion, actually.
We grew up, some of us got into the business ourselves, and many of us clung to the idea that effectiveness requires complexity. Sometimes it does, usually it doesn’t. If you are playing the piano at the level of Glenn Gould, and you want to get even a little better, the process will be complex. It will involve a high level of tortuosity, relying on constantly-varying tempo, difficulty, precision, and musical style.
Last week social media jumped on the story of a woman who supposedly decided to have a late-term abortion specifically because she found out she was having a boy. Based on a near-anonymous comment posted on an Internet forum, the story is highly questionable at best. Nevertheless, both pro- and anti-abortion advocates pounced on the missive. The dialogue generated took on a life of its own, inspiring the following comment from feminist site Jezebel:
“The virality of this story is sort of a nice reminder about confirmation bias: when something fits our preferred narrative just a little too snugly, it’s probably time for skepticism,” wrote Jezebel’s Anna Merlan.
How, exactly, does gendercide “fit our narrative” in the West, especially in relation to boys?
You’re reading a post for Preparedness Week, a weeklong series of blogs about disaster and emergency preparation inspired by the launch of Freedom Academy’s newest e-book, Surviving the End: A Practical Guide for Everyday Americans in the Age of Terror by James Jay Carafano, Ph.D. You can download the e-book exclusively at the PJ Store here.
People who go overboard to prepare for disaster scenarios are easy targets. I think back to 1999 during the whole Y2K scare, when the pastor of our church at the time held a seminar about what to stock up on when all the computers failed on New Year’s Eve at the stroke of midnight. I’ll never forget grown men arguing over who had the bigger food stash. My own personal stash consisted of two cans of green beans, and those cans helped me survive the crisis of what to serve with pork chops one day in January 2000.
National Geographic’s Doomsday Preppers series brought the eccentricities of modern disaster preppers to light in an entertaining way, showing us what some otherwise normal Americans do to prepare for “when the s*** hits the fan,” as so many of them were apt to say. These folks could have been your neighbors, except unlike you they were also worried about implausible scenarios like the super-volcano underneath Yellowstone Park erupting and throwing New York City into chaos. We’re talking about people who make plans to live off bathtub water or stockpile liquor to use as barter — people whose endearing wackiness packs a perverse fascination.
But the reality is that we do have genuine threats to worry about and ways to prepare for the worst without going off the deep end. That’s the point that national security expert and my PJ Lifestyle colleague James Jay Carafano, Ph.D., makes in his brand-new book Surviving the End: A Practical Guide for Everyday Americans in the Age of Terror. Nowhere in this book will you find advice on how to create the ideal liquor stockpile or how to “bug out” to the wilderness, and you won’t read about an eruption at Yellowstone Park. What you will find is sober-minded advice on how to prepare for real, plausible scenarios that threaten the American way of life.
Carafano writes not with a Chicken Little doomsday mentality but with an eye toward clear thinking and calm judgment in a crisis (and with just the right amount of humor). His solutions are not over the top or prohibitively expensive — rather, his ideas only require reasonable amounts of time and money. Most simply put, Carafano drills down his philosophy of preparedness to health, faith, family, and education.
In Surviving the End, Carafano looks at five distinct threats: epidemic disease, nuclear explosions, terrorism in its many forms, EMPs (electromagnetic pulses), and cyber attacks. Each of these scenarios is quite real, carries its own scariness, and has its own far-reaching consequences. With each threat, Carafano examines the potential danger and fallout (no pun intended) and looks at practical and reasonable ways to ensure safety and long-term survival in each situation.
One theme that emerges throughout the book is that we should be proactive as families and communities to prepare for the worst, rather than relying on the federal government to help us out in a crisis. While he admits that Uncle Sam does provide some good resources and gets responses right once in a while, Carafano goes to great lengths to point out the failures of federal authorities no matter which party is in charge. Glaring recent examples like Hurricane Katrina and the Fukushima nuclear disaster stand alongside historical records like the 1918 Spanish Flu pandemic to warn all of us that governments rarely have the answers in a crisis.
Carafano’s recommendations in the book are always practical and doable. Some of them require investments of time and money, of course, but so do most worthwhile pursuits. Nothing the author suggests requires the odd leaps of faith that eccentric preppers promote. The fact that Carafano recommends so many well-researched and sensible responses to worst-case scenarios lends a genuine credibility to his writing. Surviving the End is no doomsday manual — it’s a guidebook for practical preparedness.
When all is said and done, Carafano has brought a new attitude to the arena of disaster prep — neither the quasi-Biblical urgency of a Glenn Beck nor the smug fatalism of reality show preppers, but a common-sense, can-do approach to readiness. And in the end, Carafano encourages us to realize that being sensibly prepared is the American way.
This guide has given you the best there is to offer of simple, practical, useful measures you can take to keep your loved ones safe. But there is another important message in the guide as well. We all will survive better if we pull together – not as mindless lemmings following Washington, but as free Americans who fight together for the future of freedom.
As terrible as the terrors we have talked about here are, they are no worse than the suffering at Valley Forge, the slaughter of Gettysburg, the crushing Great Depression, the tragedy of Pearl Harbor, or the terror of the Cuban Missile Crisis. This generation of Americans is every bit as capable of besting the worst life has to offer. If we do that together, our odds are more than even.
You know, he’s right. I really only had to read this book for the sake of this review, but I’ve already begun making a list of things I want to do to become more prepared (including getting in shape — as if I needed another reason to remind me), and I’ll recommend that my loved ones do the same. For this kind of sober-minded preparation boils down to common sense, plain and simple.
Carafano suggests that we all become preppers, and if we take the advice we read in Surviving the End, we can do so. We won’t turn into the kind of weirdos who are ready to off the pets and high tail it out to the wilderness or move to a bunker with more canned food than a Super Walmart “when the s*** hits the fan,” but we’ll be the kind of people who embody the robust, enterprising American spirit that has made our nation so great. And we’ll do our part to help ensure that America survives just as much as our families survive.
Apparently, many Americans would trust strangers to make a medical diagnosis more than they would trust a doctor, according to an email I was sent today by a site called CrowdMed:
According to a recent study by CrowdMed (www.crowdmed.com) — a groundbreaking medical website that helps “crowdsource” solutions to the country’s most difficult medical mysteries — nearly one in five Americans (19%) has had to wait at least six months for a doctor to accurately diagnose a family member’s mysterious medical condition.
But what if you can’t wait that long? If you had a medical condition that baffled your doctor, would you be willing to get suggestions from perfect strangers?
According to the CrowdMed Medical Trust Census — a survey of 1,500 Americans on their attitudes toward traditional and nontraditional medical diagnosis — the vast majority of U.S. patients are interested in consulting others who are not necessarily practicing doctors. Noteworthy findings include:
>> 73% of Americans would trust a NURSE to suggest a diagnosis
>> 74% would trust an ALTERNATIVE MEDICINE PRACTITIONER
>> 84% would trust a RETIRED DOCTOR
>> 87% would trust a FORMER PATIENT WITH RELATED SYMPTOMS
>> 62% would trust a MEDICAL STUDENT
According to the site, you fill out a questionnaire and “collaborate With Medical Detectives” and then receive a report which includes the top diagnostic suggestions and solutions from the community. Given the problems with our healthcare system these days, it might be quicker to make a stop at this site than wait for ObamaCare to come through….
I suppose it was inevitable that we’d learn that Jonathan Gruber wants to tax body weight:
“Ultimately, what may be needed to address the obesity problem are direct taxes on body weight,” Gruber wrote in an essay for the National Institute for Health Care Management in April 2010, just months after helping design ObamaCare with the president in the Oval Office and during the period in which he was under contract as an Obama administration consultant.
“While it is hard to conceive of this approach being a common public policy tool in the near term, such taxation may be happening indirectly through health insurance surcharges,” he wrote. “Currently, employers may charge up to 20 percent higher health insurance premiums for employees who fail to meet certain health-related standards, such as attaining a healthy BMI.”
A couple of things.
The first is that BMI is a BS way to determine obesity, or much of anything else, really. But it’s easy to measure, especially if you just line up bunches of mostly-naked American Serfs™ for their annual IRS weigh-in. You might think I’m kidding, but if we’ve got to tax fat people, then we’ve got to weigh and measure them, and with 315 million Americans, the logistics get… busy. The Nazis used cattle cars, but I’m sure our tender IRS thugs would come up with something more humane.
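To see just how blunt an instrument BMI is, here’s a minimal sketch. The formula itself is the standard one (weight in kilograms divided by height in meters squared, with the usual WHO-style cutoffs), but the sample lifter is a hypothetical of mine, not anyone’s real data:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index: weight divided by height squared."""
    return weight_kg / height_m ** 2

def category(b: float) -> str:
    # Standard cutoffs -- note they ignore body composition entirely.
    if b < 18.5:
        return "underweight"
    if b < 25:
        return "normal"
    if b < 30:
        return "overweight"
    return "obese"

# A hypothetical 90 kg (about 198 lb), 1.80 m lifter carrying mostly muscle:
b = bmi(90, 1.80)
print(round(b, 1))    # ~27.8
print(category(b))    # "overweight" -- the same label a sedentary person gets
```

The point of the sketch: a weight and a height are all the formula sees, so a muscular athlete and a couch potato at the same numbers get the same label. Which is exactly why it makes for an easy, and dumb, basis for a tax.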
The second is that if ♡bamaCare!!! saves us all this money, why do we have to keep coming up with new and ridiculous ways to finance it?
In the past, medical journals, pharmaceutical companies, and researchers themselves have been criticized for selectively publishing only their positive results – that is to say, the results that they wanted to find. This is important because accentuating the positive can easily mislead the medical profession into believing that a certain drug or treatment is much more effective than it really is.
On reading the New England Journal of Medicine and other medical journals, I sometimes wonder whether the pendulum has swung too far in the other direction, in accentuating the negative. To read of so many bright ideas that did not work could act as a discouragement to others and even lead to that permanent temptation of ageing doctors, therapeutic nihilism. But the truth is the truth, and we must follow it wherever it leads.
A recent edition of the NEJM, for example, reported on three trials, two with negative results and one with mildly positive ones. The trials involved the early treatment of stroke, the prophylaxis of HIV infection, and the treatment of angina refractory to normal treatment (a growing problem). Only the last was successful, but it involved 104 patients, as against 6729 patients in the two unsuccessful trials.
The successful trial involved the insertion of a device that increased pressure in the coronary sinus, the vein that drains blood from the heart itself. For reasons not understood, this seems to redistribute blood flow in the heart muscle, thus relieving angina. In the trial, the new device reduced angina symptoms and improved quality of life in the patients who received it, compared with those who underwent a placebo operation. The trial was too small, however, to determine whether the device improved survival, though even if it did not, a reduction in symptoms and an improvement in quality of life are worthwhile.
The trial of chemoprophylaxis of HIV was, by contrast, a total failure. The trial recruited 5029 young women in Africa, some of whom were given an anti-HIV drug in tablet or cream form and others a placebo. The rates at which the two groups became infected with HIV were compared, and no difference was found.
In large part this was because the patients did not take or use the pills or cream, though they claimed to have done so. A drug that few take is not of much use however effective it might be in theory, especially in prophylaxis rather than treatment. And this points to another problem of pharmaceutical research: in drug trials that require patients’ compliance with a regime, that compliance may be high during the trial itself (thanks to the researchers’ vigilance and enthusiasm) but low in “natural” conditions, when the patients are left to their own devices.
The trial of magnesium sulphate in the early treatment of stroke was also a failure. It had been suggested by experiments on animals that this chemical protects brain cells from degeneration after ischaemic stroke. It stood to reason, then, that it might improve the outcome in humans in ischaemic stroke, at least if given as soon as suspected.
Alas, it was not to be. The trial, involving 1700 patients, showed that the early administration of magnesium sulphate did not improve outcome in the slightest. At 90 days there was no difference between those who received it and those who had received placebo.
Is an idea bad just because it does not work? Could it be that those who discover something useful are just luckier than their colleagues? Perhaps there ought to be a Nobel Prize for failure, that is to say for the brightest idea that failed.
image illustration via shutterstock / PathDoc
Things looked pretty darn good in the middle of the twentieth century. We split the atom, using its energy for power and to send the most dead-end, dead-enders of the Axis scurrying. The Green Revolution saved a billion people from starving to death. On the micro level, we developed vaccines for polio, mumps, measles, and rubella.
In other words, we had the future and it was so bright, the world had to wear shades.
Fast forward another half-century.
In January 2015, we have at least 91 people infected in an outbreak linked to Disneyland. School districts are quarantining some students. The disease has spread from the happiest place on earth to other states and beyond our borders.
To keep this in perspective, we had 644 cases of measles in the United States in 2014. That was a record year.
But hey, these things happen. After all, President Obama made our border easier to crack than a high school kegger and invited an unprecedented surge of illegal alien kids to crash that party. So an uptick of children’s diseases makes sense, right?
The disease is hitting unvaccinated Americans, and those unvaccinated aren’t being born in East LA.
According to the National Institutes of Health,
“[u]nvaccinated children tended to be white, to have a mother who was married and had a college degree, to live in a household with an annual income exceeding 75,000 dollars, and to have parents who expressed concerns regarding the safety of vaccines and indicated that medical doctors have little influence over vaccination decisions for their children” (emphasis added).
So it’s not the poor and ignorant who avoid vaccines. It’s the Real Housewives of Orange County.
Well, in their defense, they have Jenny McCarthy on their side. And Jenny McCarthy went on both Oprah and Larry King.
The reality is that a significant subset of our population has bought, hook, line, and sinker, the claim that vaccines cause autism. They even had a study that showed a link between vaccines and autism.
Of course, that study has been discredited, not as an error but as outright fraud, perpetrated by a man paid by the lawyers suing vaccine manufacturers. Its author lost his medical license. His coauthors removed their names from the study. The Lancet, which carried the fraudulent data, retracted it.
Yet the non-vaccinated children still come from educated households.
Okay, that’s just one crazy superstition that can kind of make sense because a washed-up Playboy model glommed onto a fraudulent study.
That’s no reason to see a trend, right?
Well, look at the case of manmade global warming.
Well, wait. Here the elites have science. After all, didn’t President Obama point out that 2014 was the hottest year in human history?
If you can’t trust a president who just had his butt handed to him in the midterms, whom can you trust?
In ancient days, when life was nasty, brutish and short, people looked for any sort of advantage to reach the ripe old age of 30. First, there was fire and with it came cool things like keeping the animals at bay and not having bleeding runs every time you ate your latest kill. Then came the wheel, an easier way to get that steaming carcass of meat from here to there.
But let’s face it. In the game of survival, there’s no better way to get an edge on the local saber-toothed tiger — or your annoying neighbor — than seeing the future.
Thus we have the casting of bones because everybody knows that if anything is linked to the future, it’s chicken bones.
I mean, that’s just logic.
Global warming alarmists have their own version of chicken bones, in the form of computer climate models:
Problem: When compared to what is actually observed in the real world, the climate models fail to make accurate predictions. And this is a consistent problem.
You have to think that if our chicken-bone-throwing ancestors noticed that none of their throws matched up to actual events, they’d realize something was wrong. Perhaps they might not give up on the enterprise of chicken-bone throwing altogether – after all, who can deny chicken bones? – but they might decide that they’d killed a defective chicken.
Today’s educated savages can’t even make that leap. An honest man would say since the models don’t figure in things like water vapor – just a small part of the atmosphere, after all – and don’t actually predict the future, let’s try something else.
Instead, the educated savages award the computer modelers the Nobel Prize.
Primitive superstition is also strong in Leftist economics.
In World War II, the tribes of Papua New Guinea saw vast amounts of wealth coming into the Pacific on both the Allied and Axis sides. They had no way to comprehend the power of industrialized economies fully mobilized and dedicated to the largest war the world had ever seen. The natives made the natural assumption that spirits sent cargo to the earth and the evil outsiders jacked the loot.
So they built fake airplanes. They erected structures in the jungle and filled them with fake cash, sometimes even making fake suitcases.
Hmmm. Make work projects paid for with worthless currency. Doesn’t that sound like Obama’s stimulus plan or Paul Krugman – another educated savage Nobel laureate – looking for an alien threat in order to create demand to boost the economy?
Yes, Keynesian economic theory is a cargo cult, dressed up in suits and the flowery rhetoric of the university. Unfortunately, it shows the same effectiveness.
Welcome to the new Dark Ages, a time of policy based on superstitions easily recognized by savages sitting around the campfire. They might not understand the terms of the new cargo cults that have risen but they’d understand that old time religion.
image illustration via shutterstock / maximillion
As we get older, many of us go to the doctor more than we should. We ask the doctor about things doctors don’t really know much about, like diet and exercise. Doctors – having had no institutional training in diet and exercise while at the same time feeling as though they must maintain their authority over all things physical – usually provide advice about these things anyway. They advise you to eat less fat and go walking every once in a while.
If you ask about strength training – since you have heard that it was a good idea and you know that walking is not strength training – their advice will be to just lift lighter weights and do more reps. Lighter weights and higher reps, that’s the ticket, right? Same effect, less risk, lighter is safer and more reps make up for the lighter weight, right?
It could be that doctors tell older people to just lift lighter weights because they have a genuine interest in not hurting older people, and they perceive that heavier weight is more dangerous than lighter weight. If they didn’t tell this to everybody else too, I might believe this was their intent. Hell, if they didn’t tell this to everybody, I wouldn’t be writing about it.
You have never seen an article here that I have written about diet, because that is not my field of either expertise or experience. I know something about it, most likely more than your doctor, but I reserve my public opinions on things about which I am not qualified to opine. When your doctor tells you to just use lighter weights and higher reps, he is wrong. Like when I refrain from writing about brain surgery, he should refrain from giving this advice about exercise. Here’s why.
Strength, as I have said many times, is merely the production of force by your muscles. The more weight you lift, the more force you produce. Since you can’t lift as much weight 10 times as you can 5 times, 5 reps allow you to use a heavier weight than 10 reps do. Therefore, sets of 5 with that heavier weight make you stronger than sets of 10 with a lighter one.
And that’s really all you need to know, because it really is this simple. The more weight you can lift, the stronger you are, and the heavier the weight you use in your training, the stronger you will become. Even you. A heavy set of 10 is mathematically lighter than a heavy set of 5. And there you have it.
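The arithmetic can be sketched with the Epley formula, a common one-rep-max estimator (my choice of illustration, not something the author cites, and the 300 lb lifter is hypothetical):

```python
def rep_max(one_rm: float, reps: int) -> float:
    """Estimated max weight liftable for `reps` reps.

    Epley estimator: 1RM ~= w * (1 + reps/30), solved here for w.
    """
    return one_rm / (1 + reps / 30)

# Hypothetical lifter with a 300 lb one-rep max:
five = rep_max(300, 5)    # ~257 lb for a heavy set of 5
ten = rep_max(300, 10)    # ~225 lb for a heavy set of 10
print(round(five), round(ten))
```

Under this estimator, the same lifter handles roughly 14% more weight in sets of 5 than in sets of 10 — which is the whole point: the heavier set is the stronger stimulus.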
But more importantly, sets of 10 are not just inefficient for building strength – they are counterproductive in a couple of ways. First, fatigue is the result of more repetitions of a weight, even a lighter weight. You know this yourself from working with your body. Any task repeated many times produces fatigue, and the heavier the task the more rapidly fatigue sets in. Walking doesn’t count because walking isn’t hard. Shoveling snow is a better example, and it’s easy to get pretty tired pretty quick with a big shovel.
Here’s the critical point: fatigue produces sloppy movement, and sloppy movement produces injuries. A set of 10 gets sloppy at about rep number 8 or 9, unless you’re an experienced lifter, and even then it’s damned hard to hold good form on the last reps of a high-rep set. A set of 5 ends before you get fatigued – 5 reps is an interesting compromise between heavy weight and work volume. Unless you’re a heart/lung patient, 5 reps won’t elevate your breathing rate until after the set is over, but a set of 10 will have your respiration rate elevated before the end of the set.
For a perfect example of how hysteria governs modern debates over complex issues, witness what happened yesterday morning to Governor Chris Christie. For the apparently unpardonable offense of offhandedly suggesting parents ought to have some freedom to decide how their kids are vaccinated, the governor’s political career was declared over. The instantaneous eruption from America’s self-deputized thought police had the governor — only hours later — meekly offering “clarification” of his earlier comments.
The debate over vaccines, itself nearing pandemic proportions in the U.S., is following a familiar pattern. People are either pro-science or anti-; in agreement with the “consensus” or crazy “conspiracists” and “deniers.” Much like the debate over global warming, there’s no room for middle ground; preaching prudence is basically blasphemous. And just as many are calling for climate “deniers” to be ostracized and even arrested, critics and parents who question the conventional wisdom on vaccines are likewise condemned as threats against civilization itself.
Like most everyone else, I am neither a doctor nor even a scientist. But I am smart enough to know there are perfectly valid reasons to question conventional wisdom.
Take the current controversy over measles. From the looks of my Twitter feed and the comments sections under just about any vaccine-related article, you’d think we were talking about the bubonic plague. In fact, measles, despite being highly contagious, isn’t particularly dangerous. So long as your immune system is in decent shape, you’ll be fine. In fact, you might actually want it, as exposure leads to lifetime immunity.
Measles is basically a fever with an accompanying rash. It’s true that in the 1800s, outbreaks caused tragically large numbers of children to die — but these were concentrated in orphanages and hospital wards (places where malnutrition was rampant). As the world prospered, affluence spread, and health improved, in the U.S. the chances of dying after contracting measles dropped to 1-2 percent by the 1930s. By the time a vaccine was introduced in 1963, deaths from measles were virtually nonexistent. Asthma, according to “Vital Statistics of the United States, 1963,” claimed 56 times as many lives.
Today it’s popular to argue that measles would be totally defeated were it not for the Jenny McCarthys of the world. The only problem is that the MMR (measles-mumps-rubella) vaccine does not actually immunize — as most people understand the word — against measles. The most we can expect is temporary protection. That’s because vaccines are injected directly into the body, bypassing the body’s natural immune response. “Most disease-causing organisms enter your body through the mucous membranes of your nose, mouth, pulmonary system or your digestive tract – not through an injection,” explains Dr. Joseph Mercola. “These mucous membranes have their own immune system, called the IgA immune system.”
The vaccine was initially described as lifelong insurance, but in the ’70s, when an uptick in measles diagnoses occurred among vaccinated high-school students, health officials realized it should probably be administered more regularly. The CDC now advises receiving the vaccine at 12-15 months, 4-6 years, and again as an adult. The U.S. is also using its third version of a measles vaccine, after the first two proved ineffective.
Which should probably make it no surprise that many of the people catching measles today were vaccinated. Today’s measles cases are occurring in heavily vaccinated populations. When a 2006 outbreak struck college students in the Midwest, the fact that most of the affected were vaccinated seemingly made no difference. When an outbreak of the mumps hit the NHL this year, many reflexively blamed “anti-vaxxers.” Almost no one reported that every affected player appears to have received the MMR vaccine. The Penguins’ Sidney Crosby received not only the initial MMR, but also a booster just before the Sochi Olympics. The director of the Vaccine Education Center at the Children’s Hospital of Philadelphia, Paul Offit, would say only that “we know that the short-term effectiveness of the mumps vaccine is excellent.”
Still, none of this would suggest there’s any reason to avoid regular vaccines — were it not for side effects. And here comes another wrinkle: The MMR vaccine can itself give you measles. In 2013, measles began spreading in British Columbia after a two-year-old girl contracted the virus from the vaccine and then began spreading it to others. Though rare, there are other risks worth considering, too: According to the CDC, side effects to MMR can range from minor (fever, mild rash, swelling), to moderate (seizure, temporary low platelet count), to major (deafness, long-term seizures, permanent brain damage). Note that the latter two categories are worse than the disease itself. Perhaps a bigger problem is how these vaccines weaken the immune response among undernourished patients. “In developing countries, the use of high-titre vaccine at 4-6 months of age was associated with an unexpectedly high mortality in girls by the age of 2 years from infectious childhood illness,” a study reported in the British Medical Journal.
As recently as the 1970s, the CDC recommended children receive four vaccines. Today, per CDC protocol, children can receive around 40 shots between birth and the age of 6. What if that number grows to 100? 500? Will it always be unreasonable to ask, “Is all of this really necessary?”
Finally, this may come as a shock, but it’s actually possible for the government and the medical establishment to get things wrong. This year the CDC admitted its flu vaccine was created for the wrong strain — yet Americans are being instructed to get the shot anyway. Indeed, some parents are being threatened with having their children taken if they aren’t given this (almost certainly) useless flu vaccine. For more than a generation Americans were told to avoid as much as possible saturated fat, salt, and calories in general. More recent science shows that salt consumption has no causal relationship with blood pressure; eating healthy saturated fats like grass-fed butter is good for your heart, brain, and metabolism, and calories are actually a form of energy that gives us life.
Assigning responsibility for your children’s health and well-being to others — even “experts” — is precisely the opposite of parenting. Asking questions, educating yourself, soliciting more than one opinion: these aren’t the behaviors of people to be condemned and vilified. When someone insists you submit to the expertise of others, they’re actually asking you to stop thinking for yourself. And that’s a mistake. Vaccines, like so much of life, are more complex than a simple good-vs.-evil analysis affords. Universal solutions rarely work universally. Parents are right to do their homework.
Here’s Senator Rand Paul saying that most vaccines should be voluntary:
Late in the previous century, when the Toronto Star spiked my column debunking Kwanzaa — the editor scolded me for wanting to “ruin other people’s fun” by telling the truth, which in hindsight would make for an apt if ungainly personal motto on my (non-existent) coat of arms — I sent the piece to Canada’s only conservative magazine, the (since defunct) Alberta Report.
Link Byfield, the magazine’s publisher and editor, snapped it up, and asked for more.
I’d been a professional writer for years, but now my career as a right-wing writer had begun.
Byfield died of cancer this week, at 63.
My fellow AB contributor Colby Cosh was and is a libertarian (some might say craggily contrarian) atheist who was nevertheless embraced right out of grad school by the unabashedly Christian so-con Byfields.
Cosh — today, like many former Report writers, a star columnist at a national publication — quickly composed an obituary of Byfield that is, not surprisingly, insightful, elegant and stringently unsentimental.
(The Byfields have a keen eye for talent, if I do say so myself…)
Another longtime colleague, Peter Stockland, attended a tribute to Byfield last September, an event arranged after he was diagnosed with terminal cancer.
Stockland explained Link Byfield’s influence on recent Canadian history with this succinct formula, one that resembles the mnemonic verse British schoolchildren used to learn to keep their kings and queens straight.
No Byfields, no Alberta Report. No Alberta Report, no Reform Party as it was formed. No Reform Party, no [Progressive Conservative Party] collapse. No PC collapse, no [Conservative Party] Harper government.
Some perspective for American readers:
My husband and I attended a lecture about Israel by Melanie Phillips a few years back.
Phillips, while correct on so many issues, remained convinced that Europe’s “fringe” “right-wing” populist political leaders, while anti-sharia, were also racist, anti-Semitic losers and therefore unwelcome allies in the counter-jihad.
Afterwards, my husband took her aside and explained — to her visible surprise — that Canada’s “fringe right wing” populist Reform Party had once been condemned as backward, bigoted and doomed, too; yet one of its founders, Stephen Harper, was now the staunchly pro-Israel prime minister of Canada, having just won a second federal election.
Non-Canadians are, presumably, more familiar with our “free” “healthcare” system, as I call it.
On that topic, Mark Steyn once quoted a fictional Canadian — OK, Quebecois — character’s decision to die a principled death:
Sébastien wants his dad to go to Baltimore for treatment, but Remy roars that he’s the generation that fought passionately for socialized health care and he’s gonna stick with it even if it kills him.
“I voted for Medicare,” he declares. “I’ll accept the consequences.”
But Link Byfield was a real man, not an imaginary one.
That makes what follows all the more notable.
Yet what truly mattered to [Byfield] was having lived out, as far as possible in the midst of a train wreck, a principled reality.
I mentioned an e-mail he sent last summer explaining his choice to forgo chemotherapy because it would not save him, yet would cost taxpayers $100,000.
I said I could not imagine other Canadians who would factor such public policy considerations into their personal health care.
“But that would have been standard thinking among politically literate citizens 50 years ago,” he said. “People wouldn’t even articulate it. It would just be something they would think.”
When I asked his source for thinking that way, he said: “Thou shalt not steal.”
How informed is informed? What is the psychological effect of being told of every last possible complication of a treatment? Do all people react the same way to information, or does their reaction depend upon such factors as their intelligence, level of education, and cultural presuppositions, and if so does the informing doctor have to take account of them, and if so how and to what degree? An orthopedic surgeon once told me that obtaining informed consent from patients now takes him so long that he has had to reduce the number of patients he treats.
An article in a recent edition of the New England Journal of Medicine extols the ethical glories of informed consent without much attention to its limits, difficulties and disadvantages.
It starts by referring to a trial of the level of oxygen in the air given to premature babies, of whom very large numbers are born yearly. Back in the 1940s it was thought that air rich in oxygen would compensate for premature babies’ poor respiratory system, but early in the 1950s British doctors began to suspect, correctly, that these high levels of oxygen caused retinal damage leading to permanent blindness. Fifty years later, the optimal level of oxygen is still not known with certainty, and a trial was conducted that showed that while higher levels of oxygen caused an increased frequency of retinopathy, lower levels resulted in more deaths. The authors of the trial have been criticized because they allegedly did not inform the parents of the possibility that lower levels of oxygen might lead to decreased survival, which was reasonably foreseeable.
How reasonable does reasonability have to be? Many of the most serious consequences of a treatment are totally unexpected and not at all foreseeable (no one suspected that high levels of oxygen for premature babies would result in blindness, for example, and it took many years before this was realized). Ignorance is, after all, the main reason for conducting research.
But suppose parents of premature babies had been asked to participate in a trial in which their offspring were to be allocated randomly to an increased risk of blindness or an increased risk of death. Surely this frankness would have been cruel, all the more so as the precise risks could not have been known in advance. Parents would feel guilt either way, whether their babies died or went blind.
Now that the answer is known, more or less, parents can be asked to choose in the light of knowledge: but their informed consent will be agonizing because there is no correct answer. Personally, I would rather trust the doctor sufficiently to act in my best interests in the light of his knowledge and experience. So far in life I have not had reason to regret this attitude, though I am aware that it has its hazards also. But
…why should they know their fate?
Since sorrow never comes too late,
And happiness too swiftly flies.
Thought would destroy their paradise.
No more; where ignorance is bliss,
‘Tis folly to be wise.
And I have often thought what medical ethicists would have made of the pioneers of anesthesia. They did not seek the informed consent of their patients, in part, but only in part, because they hadn’t much information to give. What moral irresponsibility, giving potentially noxious and even fatal substances to unsuspecting experimental subjects without warning them of the dangers!
And there are even some medical ethicists who think we should not take advantage of knowledge gained unethically. All operations should henceforth be performed without anesthesia, therefore.
I can’t decide if this is troubling or decent advice: “The enemy within: People who hear voices in their heads are being encouraged to talk back:”
Research suggests that up to one in 25 people hears voices regularly and that up to 40 per cent of the population will hear voices at some point in their lives. But many live healthy and fulfilling lives despite those aural spectres.
Recently, Waddingham and more than 200 other voice-hearers from around the world gathered in Thessaloniki, Greece, for the sixth annual World Hearing Voices Congress, organised by Intervoice, an international network of people who hear voices and their supporters. They reject the traditional idea that the voices are a symptom of mental illness. They recast voices as meaningful, albeit unusual, experiences, and believe that potential problems lie not in the voices themselves but in a person’s relationship with them.
“If people believe their voices are omnipotent and can harm and control them, then they are less likely to cope and more likely to end up as psychiatric patients,” says Eugenie Georgaca, a senior lecturer at the Aristotle University of Thessaloniki and the organiser of this year’s conference. “If they have explanations of voices that allow them to deal with them better, that is a first step toward learning to live with them.”
The road to this form of recovery often begins in small support groups run by the worldwide Hearing Voices Network (HVN). Founded in the Netherlands in 1987, it allows members to share their stories and coping mechanisms – for example, setting appointments to talk with the voices, so that the voice-hearer can function without distraction the rest of the day – and above all gives voice-hearers a sense of community, as people rather than patients.
Here are the basic assumptions of INTERVOICE, from their website:
Hearing voices is a normal though unusual and personal variation of human experience.
Hearing voices makes sense in relation to personal life experiences.
The problem is not hearing voices but the difficulty to cope with the experience.
People who hear voices can cope with these experiences by accepting and owning their voices.
A positive attitude by society and its members towards people hearing voices increases acceptance of voices and people who hear voices. Discrimination and excluding of people hearing voices must stop.
I am leaning towards troubling….
Some of the happiest afternoons of my childhood were passed in the company of a guy named Norm Breyfogle.
Norm was the artist on Detective Comics back in the late ‘80s. But it might be more accurate to say he was the window through which I got to see Batman patrolling the rooftops of Gotham, beating the ever-loving hell out of drug dealers and triumphing over crazed killers. For me and many other late-Generation Xers, Norm was the definitive Batman artist. It was his version of the character (along with writer Alan Grant) that my generation grew up with.
It’s probably hard to appreciate now how innovative Norm’s style was at the time. I couldn’t have explained it back then, of course—I just liked the artwork’s energy and story-telling—but looking back his style was much more expressionistic than his contemporaries’. Perspective and shadow were distorted to amplify every panel’s mood.
But it wasn’t just a scene’s feel that he cared about. There was so much energy in Norm’s action scenes as he showed heartbeat-by-heartbeat how Batman defeated a given bad guy:
Long after I’d stopped reading comics, Christopher Nolan’s Batman movies would occasionally make me nostalgic for the version I grew up with. It was a pleasant surprise to discover online that I was just one of many impressed, and grateful, for Norm’s years on the character. It made me happy to know that, even decades later, his work on Batman was remembered as one of the best runs in comics history.
My generation’s Batman, still one of the best. Cool.
So it came as a shock to learn that Norm Breyfogle, just 54 years old, suffered a stroke in mid-December.
He’s expected to recover eventually, but in the meantime the stroke has paralyzed his left side, which is especially heartbreaking considering Norm is a left-handed artist. It’s also put him in the hole for $200K in medical expenses. His family has turned to crowdfunding to help with the costs, and has set up a contribution site here.
I gave, and have since been watching the funds-raised bar, hoping it will make it to $200K. It hovers at $70K as of this writing. There are only seven days left in the drive.
The comic book blogosphere has covered it, trying to spread word about the crowdfunding effort. But the guys reading those sites will probably skew younger. They’re not of the generation that grew up reading Norm’s Batman. They don’t owe childhood memories to him like I do.
Most guys my age don’t follow comic book news anymore. I only learned about what happened to Norm myself because a friend who has kept a hand in the comics posted it on Facebook. With the clock ticking down, it’s time to get word out to other corners of the internet where his old fans may now be.
Which is why I’m here now. To get word out that a man who brought a lot of happiness to a generation of kids needs help. To let all the people who grew up enjoying Norm’s work know that he could use some help now.
If you’re able to, please consider contributing.
Medical history is instructive, if for no other reason than that it might help to moderate somewhat the medical profession’s natural inclination to arrogance, hubris and self-importance. But the medical curriculum is now too crowded to teach it to medical students and practicing doctors are too busy with their work and keeping up-to-date to devote any time to it. It is only when they retire that doctors take an interest in it, as a kind of golf of the mind, and by then it is too late: any harm caused by their former hubris has already been done.
Until I read an article in a recent edition of the Lancet, I knew of only one eminent doctor who had been shot by his patient or a patient’s relative: the Nobel Prize-winning Portuguese neurologist Egas Moniz, who was paralyzed by a bullet in the back. It was he who first developed the frontal lobotomy, though he was also a pioneer of cerebral arteriography. As he was active politically during Salazar’s dictatorship, I am not sure whether his patient shot him for medical or political reasons, or for some combination of the two.
I have worked in the fitness industry since 1978, and have owned a gym since 1984. Since I went into business for myself, I have approached the teaching of strength training from a completely different perspective than the industry’s standard model — I have taught all my members to lift barbells, as opposed to the machine-based exercise paradigm used by the commercial fitness industry at large.
During my time as a gym owner I have made several mistakes, none of which had anything to do with my decision to teach everybody how to use barbells safely, efficiently, and productively. Rather, my biggest regret was not doing so, once, when I should have.
Dr. Coleman came to the gym on the advice of his doctor. He was in his late 60s at the time, still a working cardiologist, but he was not terribly robust even for a guy his age. He was a very nice man, excruciatingly polite to everyone and generous to a fault. I remember the first question I asked him, he being one of the first doctors we’d had in the gym and me being curious about lots of things: “How is it, Dr. Coleman, that a dog can drink nasty water out of a puddle in the road and be perfectly fine, but if I did that I’d get sick — as a dog? Haha.” He regarded me momentarily, as if deciding how to respond to a curious but dull child (not an altogether inappropriate assessment), and calmly explained that there were profound differences between my digestive environment and that of my little bulldog girlfriend Dumplin. He was a patient man as well.
My friend Cardell ended up with Dr. Coleman as his personal training client. Cardell and I had trained together for years, starting at the YMCA in downtown Wichita Falls, Texas, in the early ‘80s. This was the same weight room in which Bill Starr, former editor of York Barbell’s Strength and Health and one of the first strength coaches in the world, had started out in the late ‘50s – the room had history. It was important to us too, as a place where we honed our skills and grew as lifters and men. When I bought Anderson’s Gym in 1984, we moved our training headquarters to the renamed Wichita Falls Athletic Club, and I began the task of applying barbell training to a commercial gym’s clientele.
Following the prescribed industry methodology we had both been taught by the then-becoming-mainstream National Strength and Conditioning Association, Cardell used a machine-based approach in his work with Dr. Coleman. It was perfectly congruent with the thinking at the time, and it still is: the client was old, free weights are dangerous, we mustn’t hurt old people — we mustn’t even entertain the possibility of hurting old people — and Dr. Coleman skated through his workouts with Cardell unscathed.
He also failed to make any significant progress toward a more robust physical capacity. Dr. Coleman joined the gym as a frail older man, never walking with the aggressive, confident stride of a fit person, and never sitting down, standing back up, or getting in and out of the car without carefully and deliberately measuring his position. He left the gym many years later a still-frail, even-older man.
And I let it happen. My fault for standing there, watching but paying no attention, as the potential for reversing the effects of age and a sedentary lifestyle slipped through our fingers.
Of late the New England Journal of Medicine has seemed like the burial ground of good ideas. Researchers follow a promising lead only to find that their new idea fails the crucial test of experience: and the difference between success and failure in research is made to appear as much a matter of chance or luck as of brilliance or skill.
In the first issue of the Journal for 2015, Dutch researchers from 16 different hospitals report an unequivocal success in the treatment of ischemic stroke.
Until now the only proven worthwhile treatment of patients with the kind of stroke that results from the blockage of a cerebral artery is the infusion within four and a half hours of the drug called alteplase, which dissolves thrombus (and which is manufactured from the ovaries of Chinese hamsters). But even with the use of this drug the prognosis is not very good, and there are several contra-indications to its use.
This year has been a strange one in terms of celebrity behavior, some of which was concerning if not entirely disturbing, and apparently contagious as well. Examples of skin selfies and exchanges that were once considered private are posted all over the internet. Active participants are all ages, shapes and sizes: beware the visuals of regular people (generally females) sharing their cups overflowing or unsuspecting panties being eaten alive by a ravenous pair of robust cheeks. Who’d have guessed that plumber’s crack would be exalted to such artistic (albeit unsavory) exhibitionist displays?
Yet for some unknowable reason, fans can’t seem to get enough lifestyle advice from entertainers, emulating even the most bizarre spectacles, especially when it comes to diet and beauty.
Female celebs in particular offer infinite health counsel for the masses. And women of all walks eat it up, the more peculiar, the better. Such odd “healthy” behaviors include January Jones ingesting her own dried and encapsulated placenta or Lady Gaga touting her revolutionary “Hangover Diet” consisting of nothing but whiskey… Then of course there is the explosive “Fermented Foods Diet” that Madonna uses to keep her colon free from debris. Sounds delicious.
Everyone knows the pleasures of having his prejudices confirmed by the evidence. The pleasures of changing one’s mind because of the evidence are somewhat less frequently experienced, though none the less real. Among those pleasures is that of self-congratulation on one’s own open-mindedness and rationality. It would therefore delight me to learn that my prejudice about obesity — that it is a natural consequence of overeating, which is to say of human weakness and self-indulgence — was false.
I therefore read with interest and anticipation a recent article in the New England Journal of Medicine with the title “Microbiota, Antibiotics, and Obesity.” The connection of antibiotics with obesity had not previously occurred to me; perhaps the real reason why so many people now have the appearance of beached whales was about to be revealed to me.
It is easier to advise than to have or to retain a sense of proportion, especially when it is most needed. I have never known anyone genuinely comforted by the idea that others were worse off than he, which perhaps explains why complaint does not decrease in proportion to improvement in general conditions. And he would be a callow doctor who tried to console the parents of a dead child with the thought that, not much more than a century ago, an eighth of all children died before their first birthday.
Still, it is well that from time to time medical journals such as the Lancet should carry articles about medical history, for otherwise we might take our current state of knowledge for granted. Ingratitude, after all, is the mother of much discontent. To know how much we owe to our forebears keeps us from imagining that our ability to diagnose and cure is the consequence of our own peculiar brilliance, rather than simply because we came after so much effort down the ages.
A little article in the Lancet was recently written by two historians who are in the process of analyzing the results of 9000 coroners’ inquests into accidental deaths in Tudor England. It seems astonishing to me not only that such records should have survived for more than four centuries, but also that the state should have cared enough about the deaths of ordinary people to hold such inquests (coroners’ inquests had already been established for 400 years at the time of the Tudors). In other words, an importance was given to individual human life even before the doctrines of the Enlightenment took root: the soil was already fertile.
When I was working in Africa I read a paper that proved that intravenous corticosteroids were of no benefit in cerebral malaria. Soon afterwards I had a patient with that foul disease whom I had treated according to the scientific evidence, but who failed to respond, at least as far as his mental condition was concerned – which, after all, was quite important. To save the body without the mind is of doubtful value.
I gave the patient an injection of corticosteroid and he responded as if by miracle. What was I supposed to conclude? That, according to the evidence, it was mere coincidence? This I could not do: and I have retained a healthy (or is it unhealthy?) skepticism of large, controlled trials ever since. For in the large numbers of patients who take part in such trials there may be patients who react idiosyncratically, that is to say, differently from the rest.
A paper in a recent edition of the New England Journal of Medicine brought back my experience with cerebral malaria. Animal experimentation had shown that progesterone, one of the class of steroids produced naturally by females, protected against the harmful effects of severe brain injury. The paper does not specify what exactly it was necessary to do to experimental animals to reach this conclusion, but it does say that it has been proven in several species. What is not said is often as eloquent as what is said.
There’s no shortage of media representations of childbirth on television and in the movies. The scene, which has played out for as long as babies have been “born” on television, is fairly cookie-cutter: the woman’s water breaks and there’s a mad dash to the hospital — otherwise the baby will be born in a stalled elevator. The woman screams in pain, begging for drugs, and then out comes a beautiful, usually clean baby who cries immediately before being wrapped and placed in mom’s arms.
As with all mainstream media representations of real-life events, writers and producers take a lot of liberties with the scene and how it plays out in real life. Since having a child myself, I often wonder if anyone on the writing or producing staff has ever been present for the birth of a child, given how diametrically different these moments are in real life.
The way childbirth is portrayed isn’t just inaccurate, but also fuels a false perception in our society of childbirth as scary, dangerous, and often negative. Several aspects of how childbirth plays out on screen also affect how real life couples may process their own experience in the moment. So what can a couple expect out of the birth of their child? What does the media get wrong? This list is just a start:
1. Babies come out pink
One of the scariest moments for any parent whose expectations were set by babies being born on television is the color their child comes out. While some people may be ready for the goop and slime that coat a baby’s skin, the color of that skin usually comes as a total shock, even if intellectually one has been made aware that babies often don’t come out flesh-colored or pink right out of the womb.
On the series Parenthood, which, unsurprisingly, has seen quite a few births over the course of the last six seasons, the youngest son of the clan, Crosby Braverman, had a daughter with his wife Jasmine. She came out looking like this:
In the very first moments a baby comes into the world, before it has had an opportunity to get oxygen into its body, a baby’s skin tone, regardless of race, is often a deep shade of purple, which can be petrifying if you are unprepared, as most parents are. Those first fleeting moments are usually forgotten in the haze of new parenthood, but it’s a shame that most first-time parents find themselves scared for their child’s safety and well-being before the cord has even been cut. Better images would go a long way in changing our image of brand-new human beings, highlighting what can be normal in healthy childbirth.
The truth, the whole truth, and nothing but the truth: that is what one swears to tell in a court of law. One lies there and then. It is a noble ideal that one swears to, but one that in practice is impossible to live up to. Not only is the truth rarely pure and never simple, as Oscar Wilde said, but it is never whole, even in the most rigorous of scientific papers.
Not that scientific papers are often as rigorous as they could or should be. This is especially so in trials of drugs or procedures, the kind of investigation that is said to be the gold standard of modern medical evidence.
Considering how every doctor learns that the most fundamental principle of medical ethics is primum non nocere, first do no harm, it is strange how little interest doctors often take in the harms that their treatment does. Psychologically, this is not difficult to understand: every doctor wants to think he is doing good, and therefore has a powerful motive for disregarding or underestimating the harm that he does. But in addition, trials of drugs or procedures often fail to mention the harms caused by the drug or procedure that they uncover.
This is the royal road to over-treatment: it encourages doctors to be overoptimistic on their patients’ behalf. It also skews or makes impossible so-called informed consent: for if the harms are unknown even to the doctor, how can he inform the patient of them? The doctor becomes more a propagandist than an informant, and the patient cannot give his informed consent, because such consent involves weighing a known against an unknown.
A paper in a recent edition of the British Medical Journal examined a large series of papers to see whether they had fully reported adverse events caused by the drug or procedure under trial. It found that, even where a specific harm was anticipated and looked for, the reporting was inadequate in the great majority of cases.
“The due process clause of the fourteenth amendment guarantees, protects the rights of parents but the fact is that we have to put it in law. You wouldn’t think we have to go here. What we’re seeing in our country today leads us to believe that if we don’t put this stuff into law then we are behind the eight ball and we find ourselves with these kinds of situations. I’m just afraid, down the road, we’re going to see more and more cases like [the Isaiah Rider case].” — Ken Wilson (R-MO)
We’re farther “down the road” than most dare to imagine.
The bill Rep. Wilson introduced states that a parent cannot be charged with medical child abuse for disagreeing with medical advice and seeking treatment from another doctor. Yeah. We’re there.
You might remember the well-publicized ordeal of Justina Pelletier. It seemed like a fluke of injustice, an isolated case, so far from right that it was easy to assume there must be more to the story. In the Pelletier case, rather than receiving discharge papers, the parents were charged with “medical child abuse,” the new term that has replaced Munchausen by proxy (MSbP). Mr. Pelletier was surrounded by agents of the Massachusetts Department of Children and Families (DCF) and hospital security and ushered off the premises. Justina became a ward of the state for 16 months, and her health deteriorated.
In a press conference, Reverend Patrick Mahoney, director of the Christian Defense Coalition in Washington, D.C., and spokesperson for the Pelletier family, made a remarkable statement that became a mirror reflecting an unsettling image of a dangerous mindset:
“[I]t’s easier for us to want to believe, or wrap our brains around the fact that a family is mistreating their child, than the alternative to that, and the alternative to that, is what happened in this case and that is, with impunity government agencies and courts have removed a child from the loving care of their parents—and so that’s that obstacle that no one wants to believe that reality.”
“That reality” is the last thing parents think of when they have a chronically ill child or have taken a holistic path to health.
Michelle Rider, the 34-year-old registered nurse and single mother of Isaiah Rider, the boy in the above video, told PJ Lifestyle just why we have a hard time accepting this is happening:
We are taught that hospitals are safe, that doctors are safe, and DCFS intervenes when intervention is needed. So when we accept the fact that this is really happening– we are accepting that we are not safe, and our children are not safe.
While President Barack Obama asks the nation if we will accept the “cruelty of ripping children from their parents’ arms,” it’s blatantly apparent to parents like Michelle that he isn’t talking about sick children like Isaiah. Agents of the state — with calculated impunity — take their children.
On the very day a law was introduced in his name, his worst fears came true.
The “New Year’s Resolution” must be one of the most ridiculous of human customs. You identify a problem you’re having, and then you wait until January 1 of the next year to address it, in the spirit of a group-participation event that nobody completes and nobody approaches seriously. You decide that you’re going to quit eating chocolate or stop scratching your feet. You stop until January 5. You’re typical.
In the gym business, New Year’s Resolution business used to be a bigger factor than it is now. Twenty-five years ago, fewer people participated in the fitness industry during the regular course of the year, so more people were free to buy memberships in January they weren’t going to use. Back then, New Year’s business was a significant percentage of the year’s gross, and the leveling off of this spike is really a good thing for everybody. The gym isn’t as crowded with amateurs for the three weeks after their hangovers are gone, and more people are using the gym more of the year.
But if you fall into the category of die-hard NYRers that insist on giving it a shot this year — again — let me suggest a different approach this time: strength training.
Training is the systematic approach a person employs to improve a physical ability. Preparing for a marathon, a football season, or a weightlifting meet are examples of training. They require an analysis of the specifics of the task, an assessment of where you are now in relation to where you want to be, and a plan for getting there. The plan and its constituent components are the training. The constituent components are the workouts, and each workout is important because together they produce an accumulation of increasing physical capacity. The plan that controls and directs the process is what makes training different than what you did last year.
Exercising is what you did last year.