Has feminism lost its grit?
I’ve noticed myself cringing at women saying things like ‘down with the patriarchy’ etc., and it’s disappointing to see serious concepts trivialised by people who don’t actually know what they’re arguing for. Do you think companies can profit off modern-day feminism by selling products with popular feminist slogans on them which then lose their meaning? In turn, they benefit from trivialising a movement and making all women involved appear trend-following and mindless. I think feminism would retain its dignity if the cause had a more precise intention. To me, feminism has been lost to the spectacle, and at this point it’s a little embarrassing. Thoughts?
EDIT: I would consider my beliefs to be aligned with feminism. I am critiquing the movement — read it a couple of times if you don’t understand.