So my father and I were talking a few weeks ago and he brought up a question he couldn't answer. My father knows a ridiculous amount about history, but this one seemed to stump him. Germany caused the two deadliest wars in history, wars that led to the destruction of an entire continent twice, yet when all the dust had settled, Germany was allowed to remain a nation. Sure, they were banned from having a standing army, but why were they allowed to exist at all after World War 2? Even today, if, say, North Korea invaded South Korea and the North were destroyed (as one would expect at this point), I don't think anyone would expect the North to be allowed to continue as a sovereign nation once the dust settled. So what was the deal with Germany? I would have thought the land would simply have been parceled out to neighboring countries.