There is a lot of misinformation circulating in mainstream nutrition.
I have listed the worst examples in this article, but unfortunately this is just the tip of the iceberg.
Here are the 11 biggest lies, myths and misconceptions of mainstream nutrition.
1. Eggs Are Unhealthy
There’s one thing that nutrition professionals have had remarkable success with… and that is demonizing incredibly healthy foods.
The worst example of that is eggs, which happen to contain a large amount of cholesterol and were therefore considered to increase the risk of heart disease.
But research has since shown that cholesterol in the diet doesn’t really raise the cholesterol in the blood. In fact, eggs primarily raise the “good” HDL cholesterol and are NOT associated with an increased risk of heart disease (1, 2).
What we’re left with is one of the most nutritious foods on the planet. They’re high in all sorts of nutrients along with unique antioxidants that protect our eyes (3).
To top it all off, despite being a “high fat” food, eating eggs for breakfast has been shown to cause significant weight loss compared to eating bagels for breakfast (4, 5).
Bottom Line: Eggs do not cause heart disease and are among the most nutritious foods on the planet. Eggs for breakfast can help you lose weight.
2. Saturated Fat is Bad For You
A few decades ago it was decided that the epidemic of heart disease was caused by eating too much fat, in particular saturated fat.
This was based on highly flawed studies and political decisions that have now been proven to be completely wrong.
A massive review article published in 2010 looked at 21 prospective epidemiological studies with a total of 347,747 subjects. Their result: absolutely no association between saturated fat and heart disease (6).
The idea that saturated fat raised the risk of heart disease was an unproven theory that somehow became conventional wisdom (7).
Eating saturated fat raises the amount of HDL (the “good”) cholesterol in the blood and changes the LDL from small, dense LDL (very bad) to large LDL, which is benign (8, 9).
Meat, coconut oil, cheese, butter… there is absolutely no reason to fear these foods.
Related at PJ Lifestyle:
When it comes to adding a shot of alcohol to your cold or flu remedy, it’s hard not to wish those boozy concoctions are doing some good for your health. As it turns out, they are.
Drinks like hot toddies, which traditionally contain whiskey, lemon and honey, can actually give cold and flu patients relief from their symptoms, said Dr. William Schaffner, chair of preventive medicine at Vanderbilt University Medical Center in Nashville, Tenn.
It just can’t prevent or cure a cold or flu virus.
“It would not have an effect on the virus itself, but its effect on the body can possibly give you some modest symptom relief,” Schaffner said. “The alcohol dilates blood vessels a little bit, and that makes it easier for your mucus membranes to deal with the infection.”
Since Sept. 30, more than 5,100 influenza cases have been reported to the Centers for Disease Control and Prevention, including 40 cases of H1N1.
Schaffner said warm moisture from a steaming mug of any beverage can offer symptom relief.
“That’s part of why chicken soup is thought to work,” he said.
Related at PJ Lifestyle:
The 79 million boomers alive today make up over a quarter of the entire American population. Last year, the oldest members of the generation turned 65. For the next 18 years, 10,000 boomers will turn 65 each day, according to the Pew Research Center. Today, the average life expectancy for women in America is 81 years old. For men, it is 76 years old. According to Gallup, the expected retirement age in the United States is 67. So, as Boomers enter into the retirement that precedes the end of their lives, will they find meaning and satisfaction as they age? Will they thrive, flourish, take a slow ride off into the sunset?
This is an enormously important question not just because of the implications it has on the happiness of real people, but also for the consequences it will have on society, social services, and our culture as a whole. As Pew points out, “By force of numbers alone, they almost certainly will redefine old age in America, just as they’ve made their mark on teen culture, young adult life and middle age.”
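The demographic arithmetic quoted above can be sanity-checked in a few lines. This is a rough back-of-the-envelope sketch, not part of Pew’s reporting; the inputs are the figures cited in the text:

```python
# Rough sanity check of the Pew figures quoted above.
per_day = 10_000          # boomers turning 65 each day
years = 18                # length of the retirement wave
turning_65 = per_day * 365 * years

print(f"{turning_65:,}")  # 65,700,000
# Consistent in magnitude with the 79 million cohort, allowing for
# deaths and for boomers who reached 65 before the wave began.
```

The check simply confirms that the per-day and total figures describe the same population, within rounding.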
The baby boomers are becoming characterized by startlingly high rates of depression and pessimism. Boomers are more depressed and less satisfied with their lives than both older and younger Americans, according to a study published in the American Sociological Review in 2008.
Women, in particular, are suffering. In the American population generally, women tend to be more depressive than men, and this is true of the boomers as well. In 2008, the Centers for Disease Control and Prevention found that between 1999 and 2004, rates of suicide increased by 20 percent for 45-to-54-year-olds, a far greater increase than that experienced in nearly every other age group. Among women who were 45 to 54 years old, the increase was a staggering 31 percent. Suicide aside, boomers have found another way to cope with their doldrums: according to the National Institutes of Health, between 2002 and 2011, the number of illicit drug users aged 50 to 59 tripled.
What is going on? This is a generation that is better educated, more successful, and has better access to health care than the generations that directly preceded it. This is the generation whose women benefitted from the gains of second wave feminism.
Experts on aging, depression, and happiness are at a loss for what is causing the boomers’ funk. One explanation is stress. “Much of the research is pointing to daily stress as a precipitator of their depression,” according to Donald A. Malone, Jr., the director of the Mood and Anxiety Clinic in the department of psychiatry and psychology at the Cleveland Clinic.
Related at PJ Lifestyle:
As recently as a decade ago, clinicians believed that only 5 percent of anorexics were male. Current estimates suggest it’s closer to 20 percent and rising fast: More men are getting ill, and more are being diagnosed. (One well-regarded Canadian study puts the number at 30 percent.) It’s unclear why, but certainly twenty years of lean, muscular male physiques in advertising, movies, sports, and of course, magazines like GQ—from Marky Mark to Brad Pitt to David Beckham—have changed the way both men and women regard the male body. And thanks to the web, those images are easy to seek out and collect. For American men, the chiseled six-pack has become the fetishized equivalent of bigger breasts. Like all fetish objects, it stands for something deeply desired: social acceptance, the love of a parent or partner, happiness.
But many afflicted men feel too stigmatized to go to a doctor—and many doctors don’t recognize the early, ambiguous symptoms. “It is not what a primary-care physician will consider at first glance,” says Mark Warren, founder of the Cleveland Center for Eating Disorders. “Often it won’t be what they consider at fourth or fifth glance.”
Diagnosis is hard. Finding treatment is even harder. Many residential centers don’t admit men, out of a belief that treatment should be sex-specific. There is no data to support this belief, though clinicians think that certain gender-specific issues are best addressed in therapy or in single-sex groups within a larger coed facility. Some centers prefer not to treat men, because they may inadvertently remind female clients of the trauma they have endured at the hands of abusive fathers, husbands, or lovers. Of the fifty-eight residential treatment centers listed in the Alliance for Eating Disorder Awareness’s 2011-12 guide, only twenty-five admit men. “Most men with eating disorders are living with them quietly and painfully,” says Warren. “I would guess at least three-quarters of them don’t get any treatment. They’re suffering without help.”
More on health at PJ Lifestyle:
(AP) Researchers have identified a mysterious new disease that has left scores of people in Asia and some in the United States with AIDS-like symptoms even though they are not infected with HIV.
The patients’ immune systems become damaged, leaving them unable to fend off germs as healthy people do. What triggers this isn’t known, but the disease does not seem to be contagious.
This is another kind of acquired immune deficiency that is not inherited and occurs in adults, but doesn’t spread the way AIDS does through a virus, said Dr. Sarah Browne, a scientist at the National Institute of Allergy and Infectious Diseases.
She helped lead the study with researchers in Thailand and Taiwan, where most of the cases have been found since 2004. Their report is in Thursday’s New England Journal of Medicine.
“This is absolutely fascinating. I’ve seen probably at least three patients in the last 10 years or so” who might have had this, said Dr. Dennis Maki, an infectious disease specialist at the University of Wisconsin in Madison.
It’s still possible that an infection of some sort could trigger the disease, even though the disease itself doesn’t seem to spread person-to-person, he said.
The disease develops around age 50 on average, but does not run in families, which makes it unlikely that a single gene is responsible, Browne said.
Image courtesy shutterstock / Sebastian Kaulitzki
More on health and diseases at PJ Lifestyle:
Last summer, a former schoolteacher from Georgia named Besse Cooper became the world’s oldest living human. She was 114 years old—the same age at which nearly everyone earns the distinction, and an age that only a few titleholders ever surpass. Exploring that apparent age barrier in a Slate piece at the time, I wrote that “if historical trends hold, (Cooper) will likely be dead within a year.”
But historical trends did not hold. In defiance of the odds, Cooper, who was born in 1896, was alive and smiling on Sunday to celebrate her 116th birthday. She became just the eighth person in human history to verifiably reach that age.
More at PJ Lifestyle on living longer:
via “Universal Mediocrity” by Theodore Dalrymple – City Journal.
In April, the British Medical Journal published “How the NHS Measures Up to Other Health Systems,” a report about two studies conducted by the New York–based Commonwealth Fund that compared the health-care systems of 14 advanced countries. On the 20 measures of comparison, Britain’s famous or infamous centralized system, the National Health Service, performed well in 13, indifferently in two, and badly in five. Was this a cause for national rejoicing?
If popular satisfaction is the aim of a health-care system, the answer must be yes. According to the report, the British were the most satisfied with their health care of all the populations surveyed; they were the most confident that in the event of illness, they would receive the best and most up-to-date treatment; and they were the least anxious that their personal finances would prevent them from receiving proper treatment. One could doubtless raise objections to these measures of comparison, but let us for the sake of argument take the results at face value. Subjective satisfaction and relief of anxiety are not minor achievements. Indeed, though the free market’s ability to satisfy more needs and desires than any other system is usually cited as one of its principal advantages, here was an apparent instance of the contrary: a nonmarket health-care system that yielded the most satisfaction.
Still, the studies contained a paradox that the authors of the BMJ article failed to notice or, at any rate, to remark upon. On several measures of actual achievement, rather than subjective assessment, the NHS came out the worst of all the systems examined. For example, it ranked worst for five-year survival rates in cervical, breast, and colon cancer. It was also worst for 30-day mortality rates after admission to a hospital for either hemorrhagic or ischemic stroke. On only one clinical measure was it best: the avoidance of amputation of the foot in diabetic gangrene. More than one reason for this outcome is possible, but the most likely is that foot care for diabetics—a matter of no small importance—is well arranged in Britain; the amputation rate is four times higher in the United States.
Related at PJ Lifestyle:
The list of effective antibiotics has been dwindling as the bacteria became resistant, and now it’s down to one. Five years ago, the CDC said fluoroquinolones were no longer effective, but oral cephalosporins were still a common, easy treatment. Now injected ceftriaxone is the only recommended effective drug we have left, and it has to be given along with either azithromycin or doxycycline, according to the CDC. So, yes, getting gonorrhea now means that you have to go in and get antibiotics through a needle. And then everyone with whom you’ve had sex in the last 60 days has to get tested, too.
Once gonorrhea becomes resistant to the last of our cephalosporin antibiotics — “it’s only a matter of time,” according to Dr. Gail Bolan, Director of STD Prevention at the CDC, in today’s announcement — we will have no treatment. Then when it gets into your bloodstream, it will be lethal.
I always have this sense that someone will figure it out before that time comes, but there is very little research and development going on right now in this area. Dr. Bolan mentioned one set of ongoing clinical trials.
Image courtesy shutterstock / Arkady
More on health and medicine at PJ Lifestyle:
Theodore Dalrymple: Is Obesity a Disease or a Moral Failing?
Dr. Helen Smith: Fruit Flies Give Clues to Why Women Live Longer Than Men
Dr. Peter Weiss: Defensive Medicine Kills
The critics of genetic engineering in agriculture—also known as “genetic modification” (GM) or gene-splicing—for decades have relied upon and promulgated The Big Lie: that food from genetically engineered crops is untested, unsafe, unwanted, and unneeded. All of these assertions, made by radical anti-technology organizations such as the Natural Resources Defense Council (NRDC), Environmental Defense, the Center for Science in the Public Interest, the Center for Food Safety, the Union of Concerned Scientists, and Greenpeace are demonstrably false.
The benefits of genetically engineered crops are proven. According to the International Service for the Acquisition of Agri-Biotech Applications, from 1996 to 2010, the use of modern genetic engineering technology increased crop production and value by $78 billion; it obviated the need to apply 443 million kg of pesticide active ingredients to crops; in 2010 alone, it reduced CO2 emissions by 19 billion kg, the equivalent of taking approximately 9 million cars off the road; it conserved biodiversity by saving 91 million hectares of land; and it helped alleviate poverty by increasing the agricultural productivity and food security of 15 million small farmers who are some of the poorest people in the world.
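As a rough, hedged check on the car-equivalence claim above: the two figures are from the text, and the per-car number below is only what that comparison implies, not an independently sourced emissions statistic:

```python
# Back-of-the-envelope: what annual per-car emissions does the
# "19 billion kg of CO2 = 9 million cars" equivalence imply?
co2_saved_kg = 19_000_000_000
cars_removed = 9_000_000

kg_per_car = co2_saved_kg / cars_removed
print(round(kg_per_car))  # 2111
```

The implied figure of roughly 2,100 kg of CO2 per car per year is the assumption baked into the report’s comparison.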
The Journal of Nutrition reports that the superfood contains the antioxidants epicatechin and catechin, which shield skin from the sun, but only when you eat it. Women who consumed 326 milligrams of high-flavanol cocoa per day for 12 weeks had decreased sensitivity to UV light versus the participants who ingested just 27 milligrams per day. Another bonus for chocolate lovers? The study also showed that the women who ate more chocolate had smoother, more hydrated skin. (A note of caution: the study doesn’t claim that chocolate will provide adequate UV protection all on its own, so for now we’ll be slathering on our broad-spectrum SPF while we stuff our faces with the sweet stuff.)
(CBS) Olympic champion Michael Johnson believes descendants of West African slaves have a “superior athletic gene” that gives black American and Caribbean sprinters an advantage.
“Over the last few years, athletes of Afro-Caribbean and Afro-American descent have dominated athletics finals,” Johnson told The Daily Mail in the United Kingdom. “It’s a fact that hasn’t been discussed openly before. It’s a taboo subject in the States but it is what it is. Why shouldn’t we discuss it?”
You know how we’re supposed to need “peace and quiet” to be able to concentrate? Maybe that’s true if the task at hand is paying bills or something else similarly straightforward, but according to a new study, “moderate ambient noise” is more likely to trigger “creative, innovative and abstract” thoughts than silence. So what qualifies as moderate ambient noise?
Think Starbucks on an average afternoon — busy but not packed, buzzing but not booming. Because when that moderate noise gets to the not-so-moderate level, that creative thought thing goes right out the window. Much like a recent not-so-average afternoon I spent on my laptop at a Starbucks where the very enthusiastic young barista pumped the music up really loud and sang along really really loud to each and every song.
A friend recently said to me, “I often feel like my thoughts are hammers, and they keep on hammering down on this thing known as my brain.” This is a pretty apt description of how a lot of people feel every day; I’ve certainly experienced it. I’ve written, here and there, about how the brain copes and how to dial down the background level of stress. But an even more fundamental question is how to deal with the negative, or otherwise undesirable, thoughts we have, on a moment-to-moment basis. In other words, when just you and your brain are alone together, how do you get it to quit assaulting you and just let you be?
In principle, the answer is beautifully simple – thoughts don’t have to be believed. You can just acknowledge the ridiculous or negative thoughts that pop into your head, chuckle at them, and then release them. This is the essence of mindfulness.
The problem is that, depending on your outlook or your level of stubbornness, this practice doesn’t always work so easily, at least in the beginning. To help try to figure out how to quell the thoughts that we don’t want to have, I turned to neuroscientist and mindfulness expert Judson Brewer, MD, PhD, who has done some beautiful work in his lab at Yale on the neural (and behavioral) changes that come from mindfulness practice. Not only does he study it, he’s lived it: He knows firsthand both how challenging it can be to bring attention away from the negative chatter, and, fairly recently, how rewarding it can be when it does work.