I’ve recently read articles saying that over-the-counter vitamins are not good for you and may, in fact, be bad for you. A FRONTLINE program that aired a few months back used testing done at Children’s Hospital of Philadelphia to make the case that vitamins, as we know them, are dangerous. Any thoughts?
My thought is that if we eat the vegetables from our garden, the fruit from our trees, and the meat we raise, why would we need a vitamin created mainly for someone’s profit? The same goes for most modern medications.