White people vs white Americans

Keitu

Okay, so I saw something on Facebook which has been going on for a while and I’m genuinely curious

Why are "white people" and "white Americans" so often used as synonyms?

Ex. White people ruined America

I know it doesn’t happen all the time but I’ve seen similar things a lot lately. People are blaming white people all around the world for things only white Americans are responsible for (ex. electing Trump)

Am I overthinking this? Is this just something that's common in the States?
