Do you feel the media and society romanticize toxic relationships?
I was on AskReddit and one of the posts there asked about things that people need to stop romanticizing. A good few of the answers mentioned abusive and toxic relationships (I've included some of the responses that I feel best capture the thread's general opinion; the usernames are blocked out for their privacy).

1. Do you agree with these comments? Do you think people romanticize and normalize dysfunctional relationships? Why? Do you think this warps people's ideas of what makes a good relationship? Do you think the media portrays enough healthy relationships in books, TV shows, movies, etc.?
2. What do you think should be done to promote healthier relationship behavior, if the media isn't going to do it? One user recommended classes for young people that would teach them what behavior is and isn't acceptable toward a partner, what the red flags are, and what makes a healthy and happy relationship (think of it as sex and relationships ed, instead of just sex ed). Do you like that idea? Got any more?