Should America be known for two official races, Native American and "White"?

Native Americans are the original, rightful people of North America (the US). But because of history, population, and power in government, the default race that the rest of the world, and America itself, associates with America is white. But should it be? When you think of an American, how do you picture them?
Our holidays are white-centered: white Santa, the Fourth of July celebrating independence from Britain, Thanksgiving with white Pilgrims, stuff like that.
Our currency has white men on it, with the exception of the planned Harriet Tubman bill.
Our presidents have all been white for over 200 years, except for Obama during the most recent eight.
All of these things and more make America known for being white. Everything is white-centered. The British colonized America and established power, and everything since then has been built around that.
What is the ideal state of America? (Equal diversity?)
How do you picture an American yourself? 
How should America be represented?

Vote below to see results!