Should schools teach healthy eating?
I know schools in some areas teach a bit about health. But overall there seems to be a severe lack of education about healthy eating.
I remember having a discussion on here a while back with a woman who insisted that Froot Loops are healthier than an apple. It's that sort of lack of education about nutrition that baffles me!
Do you think this is something we should teach our children in school? Or is it something people should go out and learn about on their own?