Serious question...what do you think?

Is the world really getting worse and more dangerous these days? Or does it only seem worse than ever before because almost everyone has social media and Internet access, so we can see everything going on in our hometown and the world around us? Or is it ACTUALLY getting worse? Examples: racism, police brutality, terrorism, politics, child and animal abusers, etc.
I'm just scared knowing I have to raise my kid in this scary world, but was it always this way and we just didn't know it because all we had was the newspaper and television?