Pissed
I'm so sick of how every time I turn around, whether I'm in town or just sitting around watching something, there's always a woman showing all her skin. Do people even know the meaning of self-respect anymore? I'm sick of looking at ass and titts. Sure, maybe some people enjoy seeing that everywhere, but I don't. I'd like to be able to go out to eat for once and not look at that shit. Something really does need to change.