Benefits in America
It's no secret that the US lags seriously behind other developed nations when it comes to healthcare and employee benefits like maternity/paternity leave and paid sick leave. Most developed nations, especially in Europe, offer much better benefits. Germany, for example, requires employers to pay up to six weeks of sick leave, and I believe maternity leave there is full pay for six weeks before the birth and twelve weeks after, with 65% pay for up to a year or two beyond that. (I used to live there, though I was never eligible for these benefits myself; I did have friends who were.)
It seems like common sense that a "rich" nation like the US would offer these same benefits, especially considering the taxes we pay, but obviously we don't. What confuses me is the attitude of "if you can't pay your own way, that's your fault; no one is responsible for you but you." I understand that to an extent, but I would argue that a healthy society is partly responsible for your well-being, for example by keeping you from having to choose between sending your kid to school sick and not being able to pay the bills. Yet many Americans will look down on you for suggesting that the government is responsible for creating this safety net for its citizens. It's interesting to me, because the US is one of the only developed nations where people believe they should be on their own, with no support from their society. I wonder if this attitude is what's preventing us from moving forward with better benefits, because people just don't feel like they're owed them.
So my question is: do you think a "rich" nation like the US should be responsible for creating policy that supports its citizens and encourages healthier communities?
Vote below to see results!