Is it normal to think that the USA is the greatest country in the world?
Hey everyone, I think the USA is the best country in the world, because we have the strongest army, the proudest people, the most awesome flag, and we have never lost a war.
We are also the richest and the smartest, because we invented most of the things we use today.
We destroyed the British Empire when it tried to colonize us, and we almost single-handedly destroyed the Nazi and Japanese empires.
You could say the world speaks American, because they learn American from American TV, not British.
I am making this post because I just came home from a trip to Europe, and people didn't show me much respect for being an American. Even in England, I was in a bar and some asshole called me a yank, and when I called him a piece of shit motherfucker, he told me to go back to McDonald's. I'm not even that fat; he only said that because I'm American.
But to no one's surprise, when I told him to come at me, he didn't and told me to calm down. Yeah, he chickened out.
This person obviously isn't even grateful that we saved their asses in WW2.
It may have something to do with the USA being about 80% Christian and God blessing our country; that's why we are so advanced compared to the rest of the world.
Some people say China is going to be the new superpower. Yeah, right. The Chinese aren't even smart enough. What did they ever do for the world? Nothing.
If China is ever a threat to America, we will nuke it to the ground. That's why China won't attack America.
Just like Europe, China can learn from America. Don't even get me started on the rest of the world; those people are even more backward.
Is it normal to think like this? I think so.