What the Health?! 🐄
Last night my husband and I were flipping through Netflix searching for something to watch. I came across a documentary called "What the Health". I've always found documentaries about the food industry and its effect on our health very interesting. I found this one particularly interesting because it focuses on a plant-based diet, which my husband and I have recently taken on due to health issues. (We're not 100% plant-based, or vegan, but we definitely have restricted our diet. We're essentially paleo.)
So we went ahead and watched the movie, and it was very insightful. Personally, I felt it was exaggerated, and I doubt that 100% of the information shared was totally true. BUT I see a lot of validity in the point they were trying to convey. If you haven't seen it, I suggest you do.
It was basically about how the animal product industry (meat, dairy, eggs, etc.) and the government (driven by greed and money) are taking advantage of the American people and promoting unhealthy lifestyles as healthy, and as a result causing an epidemic of diseases (diabetes, high blood pressure, high cholesterol, asthma, cancer, etc.). Their main point was to show that a plant-based (vegan) diet is far better for you, and prevents or even heals diseases.
I'd love to know if anyone else has seen this documentary and what your thoughts are on it. Or even if you haven't seen it, what's your opinion on this subject?