Why do Americans think that they won World War 2?
Yet again I hear intelligent young American journalists mentioning WW2 and talking about how they won it.
The USA only got involved in the war years after it started.
Is it just because these people are of a generation too far removed from that time, or is it that in American schools they are taught that the USA alone fought Germany in WW2?