America isn't that bad.

The media makes people in other countries believe that if you come to America, you have a high chance of being shot and killed. Truthfully, you don't. Unfortunately, in areas where racism is more prevalent, there is a higher chance of minorities being murdered, mistreated by law enforcement, or harassed by white Republican citizens. But you don't just come here and get shot at random. Guns aren't out in the open everywhere you go, at least in the western U.S. (California, Nevada, Arizona, Oregon). Racism is lower in these areas, and guns simply aren't visible everywhere. I've seen a gun exactly once, and it was in its holster on a man's waist.
So don't be "scared" to come to the United States. You just have to go to the right places; there are dangerous areas everywhere in the world. The U.S. is a place you should come and visit, and it's great. Don't let the media convince you that you have to watch out for gunfire the moment you step outside your hotel room. That's just unrealistic.
I really don't want anyone to take offense at this post. I'm fully aware of the racism and poor gun control in the United States. It's appalling, and we should all fight to change it. Please give honest but respectful opinions, because I haven't said anything to deserve disrespect (: