Is it normal to hate Americans who think their country is the absolute best?
I'm so tired of Americans who seem to think their country is the best thing that ever existed, as if it had been pooped out by God himself. So many Americans act like they invented freedom, technology, morality, law, and a ton of other fundamental stuff, as if nothing existed before 1776. They're convinced that every other country is a stinky hellhole filled with misguided primitives, that the US is a shining beacon of hope, and that everybody secretly wants to become American.
The US is a pretty cool place, and better than average in plenty of ways, but it's far from perfect, just like any other country. I don't understand why so many Americans have such a self-centered worldview.