I just saw this article linked on Drudge stating that older Americans are hooked on vitamins even though there is scant evidence that they work:
More than half of Americans take vitamin supplements, including 68 percent of those age 65 and older, according to a 2013 Gallup poll. Among older adults, 29 percent take four or more supplements of any kind, according to a Journal of Nutrition study published in 2017.
Often, preliminary studies fuel irrational exuberance about a promising dietary supplement, leading millions of people to buy in to the trend. Many never stop. They continue even though more rigorous studies — which can take many years to complete — almost never find that vitamins prevent disease, and in some cases cause harm.
“The enthusiasm does tend to outpace the evidence,” said Dr. JoAnn Manson, chief of preventive medicine at Boston’s Brigham and Women’s Hospital.
There’s no conclusive evidence that dietary supplements prevent chronic disease in the average American, Manson said. And while a handful of vitamin and mineral studies have had positive results, those findings haven’t been strong enough to recommend supplements to the general U.S. public, she said….
People who take vitamins tend to be healthier, wealthier and better educated than those who don’t, Kramer said. They are probably less likely to succumb to heart disease or cancer, whether they take supplements or not. That can skew research results, making vitamin pills seem more effective than they really are.
I used to be skeptical of vitamins, but my primary care physician told me to take CoQ10, and I take these from Nature Made. I have to admit that they have really worked for me: my energy and endurance are much better on them than off. I also feel better with vitamin D and magnesium.
I think excessive vitamin use might be harmful, but certain supplements seem very helpful, at least in my experience. What do readers find: do vitamins make you feel better?