Did America Ever Really Work?
umair haque

I kind of get your point, but I think you’re way off the mark about what made America economically prosperous and what has led to the decline of wages.

Admittedly, racial politics does play some role in the economy (especially during the era of slavery), but there were much bigger forces at play.

Post-WW2, the United States was practically the only industrialized nation that escaped the war unscathed. Europe was in rubble, as were Japan and Russia, and China was still in the midst of modernization.

As a result, America was uniquely positioned to produce the goods and materials needed for reconstruction, leading to a boom in US manufacturing jobs.

But by the 1970s, much of Europe had rebuilt itself and Asia had begun rapid industrialization, increasing demand for cheap foreign goods and labor while decreasing demand for expensive American manufacturing.

Integration had nothing to do with declining wages; inflation, globalization, and poor public policy are what led us to this point.
