Being a woman is terrifying
Anyone else think that being a woman is absolutely terrifying? I'm watching The Handmaid's Tale and I'm like shitting my pants terrified. (For those of you who don't know what it is, it's a book adapted into a show, and it depicts a dystopian society where fertile women are stolen away from their families and forced to bear the children of the country's leaders.) It draws many comparisons to real-world events, which makes it that much scarier. Has anyone else watched/read it? Anyone have any thoughts on it?