Neoliberalism
Simply explained
Neoliberalism isn’t a word for the incompetent Left in the U.S., as some people tend to think. It describes a much more sinister force in politics, one that has penetrated both parties.
Neoliberalism is the belief that everything in our lives should be framed as a market interaction. It’s the ubiquitous ideology holding that profitability matters more than anything…