Why do Americans think that they won World War 2?

Yet again I hear intelligent young American journalists mentioning WW2 and commenting on how they won it.
The USA only got involved in the war years after it started.
Is it just because these people are of a generation too far removed from the time, or is it that in American schools they are taught that the USA fought Germany in WW2?

Voting Results
75% Normal
Based on 4 votes (3 yes)
Comments ( 3 )
  • techpc

    ??? They did... fight the Germans...? What are you talking about?

    I've heard of Holocaust deniers, but Americans-Fighting-In-The-War Deniers is a new one.

    Who the fuck taught you about WW2 if they didn't mention Americans?

  • olderdude-xx

    The United States did most of the fighting that turned the tide and finished WW II.

    Also, the United States supplied almost all of the military supplies that allowed the other Allied nations to continue fighting, as most of their factories had been destroyed.

    Without US equipment, supplies, and troops, Europe and Russia would have been totally conquered by Germany and Italy. Most of the Pacific would have been conquered by Japan.

    • Good to hear some people know our history. To listen to most (maybe it's just the younger generation) in America, the war was a matter of the U.S. kicking Hitler's ass (they think Hitler was the only opponent). I hope others have read your reply and learnt something.
