Do all or a majority of Americans have health insurance? Why is it a bad thing if you don't have it?

I'm from Australia, so insurance is basically non-existent unless you pay for private health insurance every month. The people who don't have insurance just use the public medical system.