White people vs. white Americans

Keitu

Okay, so I saw something on Facebook that's been going on for a while, and I'm genuinely curious.

Why are "white people" and "white Americans" so often used as synonyms?

Ex. "White people ruined America."

I know it doesn't happen all the time, but I've seen similar things a lot lately. People are blaming white people all around the world for things only white Americans are responsible for (ex. electing Trump).

Am I overthinking this? Is this just something that's common in the States?