Don't tell me to smile
I fucking hate when men or boys tell me to smile. It'll be anybody from someone I know to a stranger in the street, telling me "oh you should smile more," "smile," "smile," "smile!!" It's like, nigga, ain't nobody need to smile, especially around dudes like you. If a bitch don't wanna smile, then she's not going to. Get the fuck over it and don't ever speak to me about my appearance. It shouldn't have any effect on you how I walk around!