Do you think women were forced into the workforce and now we basically have to work and do a majority of things in hetero relationships too?
So I'm looking into this, but I often think women's suffrage was co-opted by capital interests: once capitalists realized that places where more women work see higher economic growth, they wanted women in the workforce (women were also lower-cost labor back then), after many years in which women were barred from working or could only hold certain jobs.
I think it's wonderful that women can work in any field now and go to school for anything. It's good that women aren't treated as live-in housekeepers or the property of men... because, ew, none of us really like that...
However, now wages are so low and there's so little social safety net that a lot of women can't even choose to be a stay-at-home mother or wife. I also think the social expectation that all women work (or else they "aren't independent") devalues care work even more: raising children, caring for elderly or sick relatives, managing a household, etc. This is important work for society, and women still do the majority of it, particularly in heterosexual relationships.
This work being unseen and devalued is itself an effect of misogyny, but also of capitalism, because unpaid work often isn't even seen as work, no matter how difficult, time-consuming, or important it may be. People who do this work deserve respect.
At the same time, I'm reading and seeing that the expectation that women do this work has been forcing them OUT of the paid workforce, especially mothers, because childcare costs are too high.
I don't know. Women deserve choices, not to be forced into or out of the workforce because the US can't get its act together. Wages are too low and childcare is too unaffordable. (I don't know how it is elsewhere in the world in much detail.)
Thoughts?