How did US trade with England and France bring the US into WW1?

US trade with England and France did not, by itself, bring the US into World War I. While that trade deepened economic ties with the Allies, the United States did not enter the war until April 1917, after Germany's resumption of unrestricted submarine warfare led to the sinking of several American ships and the deaths of American citizens.