Guns make you safe... Is the U.S. not safe?

I keep seeing these recurring comments from Americans. Do you really think this? Is this a cultural fear that is being instilled? I keep reading people say they "need" a gun to prevent rape, burglary, murder, etc., from happening to them at virtually any second. Do you seriously feel, every day, that there is a chance of all of that happening to you? It's almost like a fear of the monster in your closet you've been told about since you were a kid... It's just crazy to me that you can't just enjoy a walk down the street without being armed... Especially with your baby! Is America really that dangerous?

This is not a post to bash Americans; it's an honest question. I know there are less safe places in the world. I'm just wondering why Americans feel so unsafe where they live.