So tired of “woke” culture
I can’t watch any of the TV shows I used to enjoy. They have all been ruined by going woke. I can’t watch football, Grey’s Anatomy, The Bachelor, This is Us, etc. It’s so freaking cringey these days.
TV is supposed to be a way to escape from the world for a bit. To enjoy something. Now it’s all pandemics and if you’re white you’re a racist. 🤦‍♀️