Well, Tracking College Student Migration is Surprisingly Easy

Earlier this month, I wrote a post about how college students are treated in major migration data sources. Basically, I said tracking college-student migration was really hard, indeed next to impossible.

Specifically, I said:

All of this adds up to a fairly simple problem: ACS data may have very large errors if you want to estimate “college student” characteristics because the sampling frame is inconsistent, student residence can be ambiguous for respondents, and response rates are very low.
This doesn’t mean ACS is useless! It’s just that the ACS is not designed to track college students. It’s designed to track GQ- and non-GQ populations. College students straddle this line, and have unique characteristics that make them hard to study. This isn’t the fault of the ACS, it’s the fault of data users ignoring the source’s limits.

So the problem was that ACS isn’t designed to track college students.

I didn’t write this at the time, but what I thought to myself was, “To track this issue, you’d need some kind of university-level survey of students.” I thought this didn’t exist.

Turns out, it does. The New York Times has produced a neat graphic showing major flows of students across state borders to attend out-of-state public universities:

I asked them where the data came from, and they helpfully replied:

Great! I’ve never used this tool before, and I’m always excited to learn about new data sources. It sort of gives me an adrenaline rush, like, “Oh gosh, there’s a whole new world to explore out there!”

So what is IPEDS? Well, it’s a mandatory survey sent to virtually every higher-ed institution in the U.S. asking them a whole barrage of questions. One of those questions asks about previous residence of students. How cool is that? I had no idea this source existed, but it turns out the Department of Education directly tracks student migration via a rigorous, mandatory, annual census of the entire population of universities! This is awesome! Thanks NYT data team for highlighting this data source!

There are, of course, a few caveats.

  1. This data tracks only first-time freshmen. Students who transfer, or those taking a second go at college, are excluded. This could in fact be a very substantial number of student migrants. So we need to keep in mind that DOE is tracking just freshman migration.
  2. Data is based on residency at admission. It’s possible that “residency at admission” may not directly correspond to where someone actually lives. This could be true if they come from a divided household, or if their family has recently migrated. Indeed, families that move to take advantage of in-state tuition may face a time-delay, and could be treated as out-of-state even if they’re actually living in-state, but that would be, at most, a 1–3 year time-lag on migration.
  3. Residency may change between admission and attendance. A student admitted in December could move to a different state in June, then move from that state to college. So, for example, a person graduating in Richmond, VA could be admitted to UVA in February, move with their family to Florida in June, then attend UVA, and be counted, for survey purposes, as a non-migrant. It’s not clear how ACS would count this student.
  4. It’s not clear how schools determine the residency of their students at time of admission. Whatever methodological biases may exist in this residency-determination process would persist to the higher-level data.

These caveats, however, are very minor compared to the caveats for the ACS in assessing student migration.

So the New York Times Did a Good Job?

Well, yeah. They did. And I’m rarely very enthusiastic about migration-reporting. But this was a neat piece of research.

Their work, of course, was restricted to public universities only. Their topic isn’t actually student migration, but state university funding cuts, so they look at just public-university migration. Truth be told, I don’t think their linking of the migration data to the policy story is very good (public funding cuts should induce a shift to private schools as well; you need to compare across years; and even then, a reduction in public funding should lead to increased recruitment of out-of-staters, so it should boost inflows and outflows!), but their coverage of the data itself was perfectly reasonable, and their highlighting of this unique source is a real service. Maybe others were aware of this tool, but I was not.

Now, where the NYT really did us a favor was in aggregating. The IPEDS data is intended to be used to compare across institutions. But I don’t care about institutions; I care about states, or cities, etc. Grabbing geographically aggregated data from the IPEDS database takes a lot of legwork, and I have to thank Nick Strayer, the NYT intern who made the visualization, for answering some of my questions about IPEDS.

What Can We Do With IPEDS Data?

We Can Poke Illinois With a Stick!

There’s another way we can use IPEDS too, of course. We can look at all students, and we can do it on a long time-horizon. Let’s look at Illinois to see how we can use this data. It’s a good example, because the NYT piece singles it out for its budget cuts.

First of all, let’s make a time series of student inflows and outflows for Illinois.

First things first: I apologize for not using my usual visualization tool, Datawrapper. It was really struggling with representing inconsistently reported data in multiple bars. Excel it is.

Second: what the heck is going on with the even/odd year stuff? Well, it turns out that IPEDS reporting is only mandatory for all schools in even-numbered years. In odd-numbered years, many schools, private schools especially, do not report. You’ll notice a few data quirks. First, inflows are really wonky around 2006–2008. There appears to have been some kind of reporting delay or year-mismatch there; I’m not totally sure. Second, for inflows after 2007, and outflows in all periods, the voluntary-reporting odd-numbered years have lower numbers. That’s to be expected.

The result of these data discrepancies is the following net migration chart:

Note: all these charts are to the same scale for easy visual comparison.

As you can see, Illinois is a net freshman loser no matter which year you look at. Except for the wonky 2006–2009 period, the more complete mandatory-reporting years show even more severe net outflows.

So Illinois loses something like 15,000 would-be college students a year. Fire up the panic engines! This here is what we call a good ol’ fashioned brain-drain crisis! Well, actually, no, not really. Such concern might be ill-advised.

See, we really ought to adjust these figures for, I dunno, some kind of population baseline. There are two baselines we could use. We could go with the total number of IL-originating college students, so outflows + stayers, or we could go with the total number of college students in IL, so inflows + stayers. Let’s assess both.

By now, you’ll recognize those sharp annual changes as the result of the even/odd year reporting cycle. But helpfully, our two methods of assessing the net migration rate for IL college students yield pretty well the same results.
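The two baselines are easy to state precisely as code. Here’s a minimal sketch in Python; the input numbers are made-up placeholders for illustration, not actual IPEDS figures:

```python
# Two ways to normalize net freshman migration for a state.
# All input numbers below are hypothetical, not actual IPEDS figures.

def net_rate_origin(inflows, outflows, stayers):
    """Net migration rate over the origin baseline: outflows + stayers
    (i.e., all state-originating college students)."""
    return (inflows - outflows) / (outflows + stayers)

def net_rate_destination(inflows, outflows, stayers):
    """Net migration rate over the destination baseline: inflows + stayers
    (i.e., all college students attending school in the state)."""
    return (inflows - outflows) / (inflows + stayers)

# Illustrative year: 14,000 in, 29,000 out, 120,000 staying in-state.
inflows, outflows, stayers = 14_000, 29_000, 120_000
print(f"origin baseline:      {net_rate_origin(inflows, outflows, stayers):.1%}")
print(f"destination baseline: {net_rate_destination(inflows, outflows, stayers):.1%}")
```

As long as net flows are small relative to stayers, the two denominators are close, which is why the two methods track each other so well in the chart.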

Here’s a “stylized” version of the above chart, where I’ve smoothed out lots of the volatility and tried to strip it down to some kind of recognizable trend:

We see relatively severe negative net migration of college students in the 1980s, then it gets a bit better and stays about steady until the mid-2000s, then it begins to decline pretty steadily until the present day. To be clear, this decline is in fact more-or-less due to rising outflows, rather than falling inflows.

Illinois’ recruitment rate, that is, the percentage of each new freshman class coming from out of state, has actually doubled since 1986, from 6% to 12% or more: IL universities are becoming more competitive in national recruitment, not less so. Meanwhile, the state’s departure rate (outflows / (outflows + stayers)) has chugged steadily upwards too, rising from 18% to 28%. That 10-percentage-point rise in departures outweighs the 6-point gain in recruitment.
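To make the two rates concrete, here’s a small sketch; the inputs are hypothetical, chosen only to roughly reproduce the recent rates quoted above (about 12% recruitment and 28% departure), not drawn from IPEDS:

```python
# Recruitment and departure rates as defined in the text.
# Inputs are hypothetical, chosen to roughly land near 12% and 28%.

def recruitment_rate(inflows, stayers):
    # Share of the incoming freshman class that came from out of state.
    return inflows / (inflows + stayers)

def departure_rate(outflows, stayers):
    # Share of state-originating freshmen who left for other states.
    return outflows / (outflows + stayers)

inflows, outflows, stayers = 14_000, 40_000, 103_000
print(f"recruitment: {recruitment_rate(inflows, stayers):.0%}")
print(f"departure:   {departure_rate(outflows, stayers):.0%}")
```

Note the two rates use different denominators: recruitment is measured against the class attending in-state, departure against the class originating in-state.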

If we do some fun fill-in-the-blanks games for the empty years in the data, and some ballpark corrections for the odd-year data, we can get a very rough idea for the total number of college students who’ve moved in/out of Illinois. It turns out, since 1986, Illinois has recruited about 300,000 students from other states, while it has seen about 760,000 students depart for the rest of the nation.
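One simple way to play that fill-in-the-blanks game, and the kind of ballpark correction I have in mind, is to linearly interpolate the under-reported odd years from the adjacent mandatory even years. A sketch, with a made-up series:

```python
# Fill voluntary-reporting (odd) years by linear interpolation between
# the adjacent mandatory (even) years. The series below is made up
# for illustration; it is not actual Illinois data.

even_year_flows = {2010: 30_000, 2012: 32_000, 2014: 34_000}

def interpolate_gap_years(reported):
    """Return a series with missing years filled by linear interpolation."""
    filled = dict(reported)
    years = sorted(reported)
    for y0, y1 in zip(years, years[1:]):
        for y in range(y0 + 1, y1):
            weight = (y - y0) / (y1 - y0)
            filled[y] = round(reported[y0] * (1 - weight) + reported[y1] * weight)
    return dict(sorted(filled.items()))

print(interpolate_gap_years(even_year_flows))
# 2011 → 31,000; 2013 → 33,000
```

This is obviously crude: it assumes flows change smoothly between even years, which the wonky 2006–2008 period suggests isn’t always true.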

Is Brain Drain Actually Bad?


We are conditioned to think of negative numbers as bad. Lost people means a place is not very good, right?

Well, in some cases, yes. People tend to leave not-very-good places. But college is a different beast.

Consider a town with really good public schools, like, really top-flight. Are those kids more or less likely to remain close to home for college? Well, obviously, they’re less likely to: they will receive offers from many schools, get scholarships, have options, and, on the margin, a larger number will leave. Student departures can be indicative of undesirable local higher education or strong performance by local secondary schools. Or both! Now, off the top of my head, I don’t know which factor is dominant in the case of Illinois. But we should not be too hasty to make value judgments here.

This “brain drain” may be a symptom of a bad, expensive higher education system in Illinois. Maybe. Or it may be a symptom of relatively good secondary education. Or it may be a symptom of Illinoisans simply feeling less attached to their state for whatever reason. Or it may be something else altogether. Given the state of the data, I cannot readily assess how many of the Illinois student diaspora perhaps eventually return to Illinois. It’s possible this could be a substantial number.

But I’ll leave you with this. We can use IPEDS data to calibrate ACS data. First of all, we can just subtract out all IPEDS migrants from ACS migrants. For 2014, for example, we have 14,000 freshmen moving into Illinois, versus 209,000 total ACS inflows. That means college freshmen were about 6.7% of total inflows into Illinois: a substantial population.

But that assumes the ACS actually caught all the college freshmen. We can take a crack at that. The IPEDS data also allow us to track only freshmen who just finished high school, i.e. most likely 18-year-olds. We can compare that to 18–19-year-old migration in the ACS. Illinois received about 14,000 18–19-year-old inflows. College freshmen who graduated high school in the last 12 months make up about 11,000 migrants.

So it’s possible that they’re all being captured by the ACS! That’d be swell! Then again, with just 3,000 non-college migrants, 18–19-year-olds would go from being a very high-migration population to a very low-migration one; indeed, one of the lowest migration rates of any age group. To get their migration rates back up to parity, you’d need to assume the ACS missed between 2,000 and 6,000 migrants, which is, to be clear, a pretty substantial error rate, and beyond the ACS’ stated margin of error.
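The calibration arithmetic above, spelled out (the figures are the rounded ones quoted in this post):

```python
# Calibrating ACS inflows against IPEDS freshman counts, Illinois 2014.
# All figures are the rounded numbers quoted in the text above.

acs_total_inflows = 209_000      # total ACS inflows to Illinois
ipeds_freshman_inflows = 14_000  # IPEDS first-time freshmen from out of state

freshman_share = ipeds_freshman_inflows / acs_total_inflows
print(f"freshmen as share of total ACS inflows: {freshman_share:.1%}")

acs_18_19_inflows = 14_000  # ACS inflows aged 18-19
recent_hs_grads = 11_000    # IPEDS freshmen straight out of high school

# If the ACS caught every one of those freshmen, only this many
# 18-19-year-old migrants were non-students:
non_college_migrants = acs_18_19_inflows - recent_hs_grads
print(f"implied non-college 18-19 migrants: {non_college_migrants:,}")
```

The smaller that residual, the more implausibly low the implied migration rate for non-student 18–19-year-olds, which is the crux of the undercount argument.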


A huge thanks to the hard work put in by the team at NYT to produce a fun and thought-provoking visualization. Maybe others were aware of the IPEDS migration data, but I certainly wasn’t. IPEDS data is a very useful supplement to other migration data sources for the specific population it covers. That’s especially valuable since college students are a major component of migration, but very hard to measure. Using IPEDS data, we can have an additional check on ACS data, to see if it seems within the bounds of plausibility.

Check out my Podcast about the history of American migration.

If you like this post and want to see more research like it, I’d love for you to share it on Twitter or Facebook. Or, just as valuable for me, you can click the recommend button at the bottom of the page. Thanks!

Follow me on Twitter to keep up with what I’m writing and reading. Follow my Medium Collection at In a State of Migration if you want updates when I write new posts. And if you’re writing about migration too, feel free to submit a post to the collection!

I’m a graduate of the George Washington University’s Elliott School with an MA in International Trade and Investment Policy, and an economist at USDA’s Foreign Agricultural Service. I like to learn about migration, the cotton industry, airplanes, trade policy, space, Africa, and faith. I’m married to a kickass Kentucky woman named Ruth.

My posts are not endorsed by and do not in any way represent the opinions of the United States government or any branch, department, agency, or division of it. My writing represents exclusively my own opinions. I did not receive any financial support or remuneration from any party for this research. More’s the pity.