Essay: World War I Continued in America After the War Ended

After the United States' involvement in World War I, Americans emerged stronger economically and diplomatically, but their country had been radically disrupted. Conflicts over race, national origin, labor, and equal rights for women arose throughout America. The war may have ended abroad, but at home a struggle continued socially, politically, and economically for women, immigrants, African Americans, and American men. World War I affected African Americans socially through the Great Migration to the North, women politically as they fought for their rights and the ability to work, immigrants who were excluded based on their origin, and men economically as they were conscripted and their previous jobs became vacant. These issues could no longer be overlooked; they stuck out like a sore thumb, damaging America's reputation.

As men were conscripted for war, women took over work in factories to help with production for the war effort. Seattle journalist Jean Godden wrote on June 30, 1918, patriotically exclaiming that women had won the right to work in various shipyard shops, and depicted women eager to take part in producing goods for their troops (Doc. 5). Women jumped at the chance to earn money while their “breadwinner” was fighting in the war, because it gave them a taste of what it meant to be considered a working citizen. Having taken this big step toward equal rights with men, they would not let it go. The women pushed further, and on May 19, 1919, Congress passed the Joint Resolution extending the right to vote to women, making their dream of full American citizenship a reality (Doc. 6). They ...... middle of paper ...... used because of what they did. American men were called to war after America broke with its isolationism, and they left their old jobs behind to display their patriotism.

In America, even though World War I was fought abroad, many Americans had to fight necessary battles at home. African Americans fought to work in the North to improve their lives; women fought for the right to vote and to help by working in the war effort; German Americans and other immigrants were suppressed so that no radical uprisings occurred; and the men fought for their country. Overall, America experienced political, economic, and social changes, but it also showed its patriotism and how it coped with involvement in foreign affairs, an experience that later led it to refuse to endorse the League of Nations because of the pain and change internationalism had caused.