Religious Rant!

I'm not shaming anyone; I'm actually trying to better my relationship with God, so it's nothing like that. But it drives me so crazy when something natural happens only in the US and all the Christians start screaming that it's the end times! Do you really think only the US would be affected and warned?? It makes absolutely no sense to me. The solar eclipse? "Oh, it's a sign." The hurricanes? "Oh, that's also a sign." No matter what it is. Don't they stop and think for a minute that God doesn't prioritize the US? I can't stand getting on social media during any kind of natural disaster anymore!!!