Something needs to change
Don't know if this is the right place to post this, but something needs to change in the United States. From my own experiences alone: I've been told by family members that I should be "sterilized" because I'm dating a man of a different race. My best friend was taken advantage of at a party and nobody believed her. I was discouraged from going to medical school because "that's not a job for a woman." I've been asked at jobs if I have plans to get pregnant soon. My other best friend committed suicide after she was taken advantage of. Another was kicked out of his home for coming out to his parents. I carry pepper spray on my college campus because I'm afraid.

That's wrong, and that's not even bringing up the school shootings or the violence against LGBT+ individuals and people of other races. America seems to be moving backwards, and it worries me. I live in a small town with conservative views, but I've always known you should treat people how you wanna be treated; they teach us that in elementary school. I'm disgusted by how people who are "different" are treated and how women are treated nowadays. Something needs to change. Sorry for the rant, I just needed to get this off my chest.