May 13, 2023
As of the end of 1940, Hitler had essentially won WWI: he had accomplished the war aims Germany had set for itself in that war (as this article points out, a couple of decades or so too late). Still, had he not attacked the U.S.S.R., the world probably would have grudgingly allowed those conquests to stand.