Foreigner or American

I am American, but I trust a foreign doctor more than an American one. I feel like they take the job more seriously and aren't just there for the money. I'm just curious who else feels the same way? No offense to any American doctors on here, and I know a lot of doctors take the job seriously regardless of nationality. Just judging from personal experience.

Vote below to see results!