Was the United States justified in entering WW1?
I have to write a paper on whether I think the US was justified in entering WW1. I know that if it hadn't, Germany might have taken over Britain and then defeated the US. But I'm still not sure whether the US had the right to get involved.
Please state your opinion and the reasons for it. It would help me a lot. Thank you.