History
ReannaBassler114

Can you explain the political and social changes that occurred in the United States during WWI?

(1) Answer
graveschristi3

Socially, with the newfound use of the airplane in the war, more and more countries turned aviation to their advantage. For Americans, the war generally lifted spirits because the wartime economy boomed, while the countries that lost the war fell into economic depression. After the war ended, many countries kept a wary eye on one another, unwilling to turn their backs. Politically, the Treaty of Versailles forced Germany to make significant territorial concessions. Near the end of WWI, Germany experienced a socialist revolution (1918-1919), which resulted in the establishment of the left-leaning Weimar Republic, which lasted until Hitler's Nazi Party took control.

Hope this helps you out. If you have other questions, leave a comment and I'd be glad to help you out.
