Should America be known for two official races, Native American and "White"?
Our holidays are white-centered: a white Santa, the Fourth of July marking independence from Britain, Thanksgiving with white Pilgrims, and so on.
Our currency features white men, aside from the planned Harriet Tubman $20 bill.
Our presidents have all been white for more than 200 years, with the exception of Obama's eight years in office.
All of these things and more make America known for being white. Everything is white-centered. The British colonized America, established power, and made everything since then white-centered.
What is the ideal state of America? (Equal diversity?)
How do you yourself picture an American?
How should America be represented?
Vote below to see results!