Some Myths about the American Civil War
The US Civil War is the pivotal event in American history. It transformed the United States from an agricultural society into an industrial nation.
In addition, the Civil War made the United States a fully capitalist republic by extending mass democracy to the South. The Union achieved that goal by smashing both slavery and the plutocracy that profited from it.