Harbingers of a New Age
What publicly available data can tell us about how life has changed in 21st century America
21st century America has already seen major changes in technology, culture, and economics. With these changes have come a wealth of contradictions. We live in an age of unprecedented social connectivity and access to information, and yet we’ve seen increasing social isolation and the formation of self-contained echo chambers. We’ve lived through the largest economic recession in generations, and yet the data shows that we have an ever-increasing GDP and a historically low unemployment rate. We’ve been at war for virtually the entirety of the 21st century, and yet the experts claim that violence is at an all-time low. It’s the best of times, the worst of times, and above all, the most jumbled and confusing of times.
With all of these seemingly contradictory effects at play, it can be hard to get a big-picture understanding of how life has actually changed for the people who live in America. In this essay, I’ll present data on macro-level trends relating to how quality of life has changed for Americans over the past 15 or so years. I’ll caution that these data are drawn from a wide variety of sources. The start and end dates of the data sets I reference vary, as do the methodologies and sample sizes. Accordingly, I’ll avoid making any claims about what’s causing the trends that I’ll discuss in this article, and I’ll also avoid any probability-based statistical analysis. Instead, I’ll just show you the data and provide a bit of context. It’s up to you to draw your own conclusions as to what these trends mean.
Scarcity and Debt
In an economic sense, the two key variables that most influence your quality of life are 1) how much money you make, and 2) how much everything you need to pay for costs. The economic necessities I’ll discuss in this article are healthcare, housing, food, and education, but I’ll acknowledge up front that there’s a legitimate debate to be had about what “everything you need to pay for” actually includes.
To get an idea of how much money people are making these days and how this compares to historical trends, I referred to this article from Business Insider. It charts inflation-adjusted incomes over time, broken down by income group. I’ve included this chart below for you to review. In the late 20th century, real incomes steadily increased for all social strata, although the increase was greater for the rich than for the poor.
Since the beginning of the 21st century (which I’ve indicated with a vertical red line, a convention I’ll keep throughout this article), real incomes have been volatile, showing no real increase or decrease for the wealthier 40% or so of the population and a slight decline for the poorer 60% or so. The very rich (as represented by the top 5% line in this graph) do not appear to have been exempt from the stagnation and/or decline of American incomes. These data suggest that the earnings increases of the late 20th century halted at the beginning of the 21st century, and most people are now earning slightly less in real terms than they used to.
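Since every chart in this section hinges on the idea of “real” (inflation-adjusted) income, here’s a minimal sketch of how that adjustment works. The function and numbers below are illustrative placeholders of my own, not figures from the Business Insider data.

```python
# Minimal sketch of converting nominal dollars into "real" (inflation-adjusted)
# dollars using a price index. All values here are hypothetical placeholders,
# not numbers from the Business Insider chart.

def to_real_dollars(nominal, cpi_that_year, cpi_base_year):
    """Express a nominal amount in base-year dollars via the CPI ratio."""
    return nominal * (cpi_base_year / cpi_that_year)

# Hypothetical example: $60,000 earned in a year when the CPI sits at 250,
# restated in dollars of a base year when the CPI was 200.
print(to_real_dollars(60_000, cpi_that_year=250, cpi_base_year=200))
# -> 48000.0, i.e. a lower "real" income because prices rose 25% in between.
```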


The same Business Insider article includes a chart comparing the inflation-adjusted changes in medical costs and college tuition and fees alongside trends in income over the same time period. I’ve included this chart below. In the late 20th century, education and healthcare costs were rising steadily, and already faster than incomes. Since the beginning of the 21st century, the increase in healthcare costs has not slowed, and the increase in education costs has accelerated. In the context of stagnating or declining incomes, this means that education and healthcare have grown much less affordable so far in the 21st century.


Of course, it wouldn’t be terribly concerning for some necessities to grow less affordable if other necessities were becoming much cheaper. In that case the net change in the affordability of living might be neutral. So I decided to look at housing costs as well. Based on the data I could find, home prices have been highly volatile (partly thanks to the 2007–2008 housing bubble and crash), and they’re also confounded by interest rates and by changes in what’s being bought, which makes prices difficult to interpret (if people move from mansions with 5 acres to 1-bedroom condos with no lawns and the price stays the same, affordability has actually gone down). Trying to make sense of real estate prices turned out to be over my head.
So I gave up on home prices and instead decided to use median rent as an index of housing costs. The rental market has fewer confounding variables, and apartments are generally more uniform than purchased real estate. It took a bit of searching, but I eventually found an article from Apartment List analyzing trends in median rent based on Census data. The story these data tell is very similar to the one I saw with healthcare and education costs. Rent increased steadily in the late 20th century, even faster than incomes, and this increase has continued so far in the 21st century despite slightly decreasing incomes. Housing has grown gradually less affordable so far in the 21st century.
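To make “affordability” a bit more concrete, here’s a small illustrative calculation: rent as a share of income, a common rent-burden measure. The numbers are hypothetical and are not taken from the Apartment List or Census figures.

```python
# Illustrative rent-burden calculation: rent as a share of income. Higher
# values mean housing is less affordable. Inputs are hypothetical, not the
# Apartment List / Census numbers discussed above.

def rent_burden(annual_rent, annual_income):
    """Fraction of income spent on rent."""
    return annual_rent / annual_income

# If real rent keeps climbing while real income stays flat or dips slightly,
# the burden creeps up even though neither number moves dramatically:
print(rent_burden(12_000, 60_000))  # 0.20
print(rent_burden(13_500, 59_000))  # ~0.229
```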


Another basic necessity of life is food. It’s hard to quantify the “cost of food” as such, but I did find good data on how the prices of certain kinds of food have changed. I found the following chart in this article by Ezra Klein, showing the relative price changes in fresh fruits and vegetables as compared to beer, butter, and sodas. Starting around 1985, healthy foods have grown gradually less affordable, while junk food has grown gradually more affordable. This trend has continued so far in the 21st century. Thus, while the affordability of getting enough calories has remained fairly consistent, the affordability of a healthy diet has declined.


With the affordability of healthy food, housing, education, and healthcare all declining, you may also expect that an increasing number of people are in an increasing amount of debt. You would be correct. I found the following chart in this article from The Atlantic. In the late 20th century, federal debt, private debt, and GDP all increased more or less in tandem. From 2000–2008, however, private debt far outpaced the increase in GDP. With the financial crisis of 2008 (indicated in this chart by a vertical black line), the increase in private debt appears to have been offloaded to the federal government, but the overall increase in debt does not seem to have slowed. Without commenting on the relative merits of private vs federal debt, one thing is clear: we’ve been going deeper into debt so far in the 21st century.


Losing Our Places in the World
Besides simple material needs, most of us also have an emotional need for “belonging” as a member of society. This sense of belonging is difficult to quantify, but there are certain elements of “belonging” for which publicly available, objective statistics exist. For purposes of this article, I’ll look at the labor force participation rate, the percentage of young adults living with their parents, and the number of people on food stamps as benchmarks. I’ll focus this section of the article on those objective metrics, and let you draw your own conclusions as to how they may relate to the average person’s subjective sense of belonging.
The official unemployment rate is quite low at the moment, but that statistic doesn’t quite capture the essence of “how many people are left out?” given that it doesn’t include people who are disabled or who have completely given up looking for work. To include those people, you have to look at the labor force participation rate, for which I found data on the Bureau of Labor Statistics’ website. After remaining relatively stable in the 1990s, labor force participation has dropped substantially so far in the 21st century. This trend accelerated with the economic crisis in 2008, but even before that, labor force participation was already trending downward.
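Because the distinction between the unemployment rate and the labor force participation rate carries most of the weight here, the sketch below spells out both definitions with hypothetical head counts (these are not BLS figures).

```python
# Why the unemployment rate can look fine while participation falls: a
# discouraged worker who stops job-hunting drops out of the labor force
# entirely. Head counts below are hypothetical, not BLS data.

def unemployment_rate(employed, unemployed_looking):
    """Unemployed job-seekers as a share of the labor force."""
    return unemployed_looking / (employed + unemployed_looking)

def participation_rate(employed, unemployed_looking, working_age_pop):
    """Labor force as a share of the working-age population."""
    return (employed + unemployed_looking) / working_age_pop

# 1,500 working-age people: 950 employed, 50 still looking, the rest not looking.
print(unemployment_rate(950, 50))          # 0.05
print(participation_rate(950, 50, 1_500))  # ~0.667

# Ten job-seekers give up entirely: the unemployment rate "improves" while
# participation falls, even though nobody actually found work.
print(unemployment_rate(950, 40))          # ~0.040
print(participation_rate(950, 40, 1_500))  # 0.66
```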


Another quantifiable metric of a person’s success in finding their place in the world is whether or not they succeed in establishing an independent household as a young adult. Pew Research has compiled data on the percentage of 18–34 year-olds living in their parents’ homes, and although there is clear evidence of cultural influences (e.g. ethnic differences in trends over time), it is also clear that more young adults live with their parents than at any time since the Great Depression:


Finally, thanks to the federal Supplemental Nutrition Assistance Program (food stamps), we can roughly quantify the number of people who are unable to provide for themselves. Food stamps are highly politicized, so it can be difficult to find data that includes 1) a long enough time frame to distinguish real trends from noise, and 2) a high enough time resolution to visualize short-term fluctuations. The most useful visualization I could find (shown below) came from Matt Trivisonno’s blog, which relied on Department of Agriculture statistics. During the late 20th century, the number of people on food stamps remained relatively stable. Since the year 2000, however, the number of people on food stamps has increased dramatically.


Going a Little Bit Mad
In a time of increasing costs and declining wages, along with increasing numbers of people out of work, on food stamps, or living in their parents’ basements, you might expect that we’re all a little more stressed out than we used to be. Some people may even experience outright mental illness as a result of this stress. Although I was unable to find a line graph similar to the rest of the graphics in this article, the World Health Organization’s recent report, titled America’s State of Mind, shows how the number of people using psychiatric medications changed between 2000 and 2010. All gender and age groups have increased their use of psychiatric medications so far in the 21st century, consistent with an increase in mental illness.


Of course, there have also been major changes in the prescription drugs available, and in the practices for prescribing them, over the past 20 or so years, so the number of people using psychiatric drugs might paint a misleading picture. One metric that may be more objective, albeit only relevant to extreme cases, is the suicide rate. The CDC website shows the following graphic for how the suicide rate has changed so far in the 21st century. Both men and women have grown somewhat more likely to kill themselves in recent years.


Another quantifiable, prescription-independent metric of poor mental health, and one that’s more applicable to the majority of people who don’t kill themselves, is drug abuse. The National Institute on Drug Abuse’s website tracks trends in illicit drug use over time. Overall drug use has increased slightly, consistent with the increase in psychiatric drug use and suicides, and marijuana appears to be the most commonly used drug. Looking at some of the other charts they report, which I have not included in this article, it looks like drug use among young people has remained relatively consistent, and that middle-aged to elderly adults account for most of the increase.


I should note that increasing illicit drug use, and especially increasing marijuana use, doesn’t necessarily indicate worsening mental health. It may just mean that people are more apt to party than they used to be. However, in the context of the rest of the trends discussed in this article, my judgement is that the mental health-based explanation is somewhat more plausible than the party-based explanation. Furthermore, even if increasing illicit drug use isn’t a symptom of worsening mental health, increasing psychiatric drug use and increasing suicide rates almost certainly are.
Fatter and Sicker, but Not Dying
Besides being able to afford the necessities of life, finding a place in society, and maintaining sanity, physical health is another important and measurable attribute of a person’s well-being. Although modern times have seen major advances in medical technology, we also live in an era of obesity, sedentary living, and all the diseases that come along with them. The CDC website includes the following graphic for how obesity rates have changed in recent years. Although this particular graph doesn’t include data from the late 20th century, the increase so far in the 21st century is actually slightly slower than the increases during the late 20th century. In any case, we’re already very fat and growing fatter still:


Although the increases in the obesity rate have slowed somewhat, it takes time for obesity-related diseases to develop. I found a good article on yourdoctorsorders.com, which included the following CDC visualization of diabetes rates over time. The prevalence of diabetes increased gradually throughout the late 20th century, and has increased dramatically so far in the early 21st century. We’ve now reached the point where half of adults in the United States have diabetes or prediabetes.


Similar trends are present, unsurprisingly enough, for other metabolic diseases and for cardiovascular disease. What may still surprise you is that the incidence of certain cancers (read: cancers not caused by cigarettes or asbestos) is also increasing. The American Cancer Society has an excellent article detailing current cancer trends in the United States as of 2016. Shown below is data on changes in the incidence of thyroid cancer, as a reasonably representative example of non-tobacco, non-asbestos cancers. Incidence of thyroid cancer was largely stable throughout the late 20th century, but has increased dramatically so far in the 21st century.


There are many diseases I could have chosen to visualize trends in human health, however, and it’s entirely possible that the examples I’ve chosen are misleading. Although there’s no rigorous way to quantify everything that falls under the umbrella of “disease” with a single number, what we can quantify is the number of people using prescription medications. According to the CDC website, the percentage of people using at least one prescription drug increased from 43.5% to 48.3% between 2000 and 2008. Although I couldn’t find a graphic extending all the way to the present, this trend does not seem to have slowed. Quite the opposite, in fact. Now nearly 60% of Americans use at least one prescription drug.


After all of this bad health news, here’s a bit of a reason to be optimistic. According to the same American Cancer Society article I referenced above, cancer mortality rates are in decline despite the increase in incidence of most cancers. I also happen to know from my job that diabetes and cardiovascular diseases are gradually becoming less deadly. It looks like what medical science can do, it does quite well. On the flip side (and speaking from experience), making diseases less deadly without curing them may well result in increased human suffering. In 21st century America, we’re living longer, sicker lives.


Summary
As individuals in America, we’ve grown a bit poorer so far in the 21st century. Meanwhile, the costs of housing, healthcare, and education have increased substantially. We’re in more debt, less likely to have a job, more likely to live with our parents, and more likely to be dependents of the federal government. We’re slightly more likely to have mental health issues, commit suicide, or turn to drugs. We’re also fatter and more likely to have a chronic disease, but not to die from it. Compared to the late 20th century, the early 21st century is a time of increasing scarcity, debt, dependency, and disease superimposed upon a backdrop of spectacular technological advances and social upheaval. Many of us are going a little bit mad as a result. These issues should not be ignored, but they also should not be overstated. And where things are trending in a bad direction, it’s up to us as Americans to address those problems.