photo credit: https://flickr.com/photos/jondoeforty1/

Homelessness: When Data Doesn’t Match the Public Perception

Ryan P. Dalton
Oct 29

Los Angeles, along with much of the urban west, is experiencing a growing crisis in homelessness. As of last count, there are roughly 36,000 people experiencing homelessness in the city of Los Angeles — nearly 1% of the city’s population.

One of the most powerful tools available for understanding this crisis is outreach. By speaking directly to people experiencing homelessness, and collecting their health and life histories, researchers and city officials can make strides towards understanding the forces that push people onto the streets.

As is often the case with data collection, not all of this data will be consistent with our preconceptions. It is not supposed to be: if data and public perception were perfectly aligned, there would be little reason to collect data in the first place. Data offers a cold counterpoint to opinion, and that is precisely why it is valuable.

In the old fable of the frog in the kettle, if the frog had paid closer attention to the thermometer instead of its own perception, it wouldn’t have been boiled alive. Homelessness in Los Angeles is no fairytale. But a recent report from the LA Times — which used a methodology meant to coerce data until it matched public perception — should have us wondering how hot the water will have to get before we start paying attention to the thermometer.

If the Shoe Doesn’t Fit, Get a Bigger Shoehorn

The Times report relies on surveys collected by the Los Angeles Homeless Services Authority (LAHSA) in 2019. These surveys are long questionnaires, designed according to federal guidelines, and intended to identify several important factors in the health and housing history of the people answering them.

When LAHSA analyzed these surveys, it found a sizeable gap between popular perception and the data. In particular, LAHSA's data show that, contrary to received wisdom, only 29% of people experiencing homelessness in LA in 2019 had a severe mental illness or a drug addiction. In other words, about 71% had neither. These findings came as a surprise to many, including, apparently, editors at the Times. But taken in the broader context of the LAHSA findings, this result should not necessarily be surprising. According to LAHSA figures, the demographics of people living on the street are changing very quickly: more than half of the people on the streets in 2019 had never been homeless before. That figure is shocking, and it points to the growing degree to which purely economic forces are putting people on the street.

But the Times, noting that these data are inconsistent with public opinion, set out to identify — and then ‘correct’ — the discrepancy. To do so, they re-analyzed the LAHSA questionnaires and drew a new and very different set of conclusions. Most strikingly, they reported that 67% of people experiencing homelessness ‘reported, or were observed to have, a mental illness and/or substance abuse disorder’. That is, their reanalysis reported a percentage more than twice as high as that reported by the LAHSA. How could they possibly have arrived at such a different figure?

It’s helpful to understand how the data was collected in the first place. In the LAHSA study, survey collectors asked homeless participants to self-report any existing diagnoses of mental illness or substance abuse they had received from licensed clinicians or other healthcare professionals. As part of the broader survey effort, the team also documented past diagnoses that were no longer current, and were at liberty to jot down editorializing notes, recording their perceptions of their subjects.

In the report LAHSA then published from these surveys, rates of mental illness and substance abuse among the present LA homeless population are (appropriately) derived from participants’ self-report of their current clinical diagnoses.

Past diagnoses are not germane to this calculation, which makes good sense: The vast majority of mental illnesses recognized by the DSM, psychiatry’s diagnostic bible, are not considered permanent lifelong conditions. Just as importantly, what counts towards a diagnosis has shifted significantly over time: In 1973, homosexuality was still on the books as a classifiable mental disorder. Today, no working clinician would make such a diagnosis if they wanted to keep their license.

Another key feature of the LAHSA report is that it does not factor in the perceptions of the survey workers themselves. Without proper training, those workers were almost certainly subject to 'perception bias', a cognitive bias that arises from stereotypes about how other groups of people should act or behave. While documenting worker perceptions may have provided a useful log for targeted outreach and follow-up, LAHSA knew better than to count those impressions as objective data. There is no scientific journal in which such records would pass muster for publication.

So how did the Times arrive at a percentage twice as high as LAHSA's own analysts'? It's quite simple, really: they went back through the data, dug out all the past (but not current) diagnoses, factored in the non-expert judgments, and added everything back in. In short, they did it by breaking with sound scientific practice.
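To see how much this kind of criterion-widening can move a prevalence estimate, consider a toy sketch. The records below are entirely hypothetical, and the field names are illustrative, not those of the actual LAHSA survey; the point is only that counting past diagnoses and observer impressions alongside current diagnoses can multiply the resulting rate.

```python
# Hypothetical survey records, illustrating criterion-widening only.
# Field meanings (all invented for this sketch):
#   current  - a self-reported current clinical diagnosis
#   past     - a diagnosis reported as no longer current
#   observed - a survey worker's subjective impression
surveys = [
    {"current": False, "past": False, "observed": False},
    {"current": True,  "past": False, "observed": False},
    {"current": False, "past": True,  "observed": False},
    {"current": False, "past": False, "observed": True},
    {"current": False, "past": True,  "observed": True},
    {"current": False, "past": False, "observed": False},
]

def rate(records, predicate):
    """Share of records satisfying the inclusion criterion."""
    return sum(predicate(r) for r in records) / len(records)

# LAHSA-style criterion: current clinical diagnoses only.
strict = rate(surveys, lambda r: r["current"])

# Reanalysis-style criterion: current OR past diagnoses OR observer notes.
inclusive = rate(surveys, lambda r: r["current"] or r["past"] or r["observed"])

print(f"strict: {strict:.0%}, inclusive: {inclusive:.0%}")
# prints "strict: 17%, inclusive: 67%"
```

Same six people, same answers; only the inclusion rule changed, and the headline rate quadrupled. The real surveys are far richer than this, but the arithmetic works the same way.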

In sum, the Times chose to pick through the survey questions, re-litigating who should be counted as what. Their analysis did not cite any overarching strategy besides a desire to produce data that ‘match public perception’. But data is collected to inform public perception; public perception is not measured to inform how we massage data.

A Lesson in the Tactics of Scientific Persuasion

Had the authors of the Times study left it there, their piece would have had significant problems but would not necessarily have been disqualifying. The federal guidelines for how to count survey answers are by no means perfect, after all. And some of the choices discussed above were erroneous without seeming intentionally misleading.

But this piece really takes a turn for the worse when it employs a common technique in scientific persuasion: including corroborating evidence. While corroboration is usually thought of in terms of its positive effects (i.e., strengthening one’s case), it is just as powerful — and much sneakier — in its negative effects. By citing a study that corroborates your own, you are also implicitly rejecting studies that don’t look like yours.

The Times includes a corroborating report from the California Policy Lab at UCLA. That study shows numbers even higher than those reported by the Times. Much higher, in fact: it reports a 78% rate of mental illness among LA's unsheltered population. This has two distinct effects: it appears to refute the LAHSA analysis and, by overshooting the Times analysis, it paints the Times figure as intermediate. Not an outlier, just a happy and hard-to-dispute moderate.

That sounds reasonable, doesn't it? It would certainly be reasonable if the UCLA report had studied homelessness in LA. (It didn't.) Or if, given the rapidly shifting demographics described above, the UCLA report had provided a 2019 measurement. (It didn't do that either.) In fact, the UCLA report measured homelessness across 15 states, from 2015 through 2017. While this is mentioned in passing in the Times piece, the UCLA data are nonetheless presented in graphics labeled to indicate that they are a study of homelessness in Los Angeles. And throughout, the UCLA data are presented as corroborating the reasonableness of the Times analysis and, by implication, the unreasonableness of the LAHSA report.

When Data Disagrees with Public Perception

Homelessness in Los Angeles is nothing short of a humanitarian disaster. And it is in the interests of all Angelenos to address it; this would be the case regardless of the particulars.

Mental illness is not magical, or a sign of some darker evil, but is a physical illness of the nervous system. And a person with such an illness is no less worthy of care than someone with any other kind of illness. Even having to explain this — which many advocates have found themselves doing recently — is a sign of the unhealthy state of this discussion.

In reality, homelessness is a problem that runs much deeper and wider than mental illness and substance abuse. But the desire to force the problem back under the umbrella of mental illness — and to align it with public perception — is a sign of a collective desire to stigmatize those experiencing homelessness, such that we carry no guilt for ignoring a magnitude of suffering that would be more at home in a war zone than in a rich city in a rich country.

Although the Times study was probably not meant cynically, this is how it has shifted the argument. Because the piece is highly visible and aligns with our own perception, it has acted as a loud voice in the echo chamber.

It would behoove us not to be distracted by this: the best data we have are telling us something that should scare us. The face of homelessness is changing, and for many Angelenos, under rent strain, under threat of eviction, squeezed by stagnant wages and medical bills, that face is too familiar for comfort. Smashing our collective mirror may buy us a bit of time, but it will also leave neighbors, friends, and colleagues wondering when we stopped seeing their faces in our own.

The Startup

Medium's largest active publication, followed by +525K people.

Ryan P. Dalton

Written by

Neuroscientist, Data Scientist, Writer, Educator, & frequent contributor to Scientific American’s Mind Matters.
