Why do people think that America is the ‘greatest country on earth’?

Genuinely curious, because from the outside looking in, it seems like a truly horrible place to live.

Edit: we have freedom in other countries too lol