By Michael Surran (Students taking a computerized exam) [CC BY-SA 2.0 (http://creativecommons.org/licenses/by-sa/2.0)], via Wikimedia Commons

Student Privacy and Social Infrastructure

Does the public have a right to use personal data?

Current debates over student privacy should remind us that infrastructure, both policy and technological, leaves legacies for decades. Earlier this summer, the New York Times published a column by Sue Dynarski about one proposal to change student privacy laws, a proposal that would make it almost impossible to use administrative records for educational research. Dynarski identifies the problematic proposal as S. 1341, a bill sponsored by Louisiana Senator David Vitter. Vitter’s bill is one of several in Congress addressing student privacy issues and currently has no co-sponsors–in other words, at the moment it is highly unlikely to become law. The provision that worries Dynarski is the requirement for active parental consent for the transfer of any educational records to a third party, with a separate consent process for each such transfer.

At its core, Dynarski’s argument is that educational research on the effectiveness of programs is too important to wipe away with concerns about privacy. On some issues she is wrong. Dynarski wrote that she knows of no “Target-like” breaches of educational data. She works at the University of Michigan; I am guessing that its servers are under regular attack by hackers, as are the servers of many other colleges and universities, and there were several publicly known breaches last year. I could also quibble with her claims about the “original purpose” of student data records. However, both issues are tangential to the central question of whether it is possible to protect student privacy today and still allow all of the types of educational research Dynarski and I value.

One problem is that the federal legal structure protecting student privacy was inherited from an era that differed from ours in three fundamental ways: the original Family Educational Rights and Privacy Act (FERPA) language in 1974 was motivated by a different problem, was written long before easy data transfers, and was passed when most educational records were held directly by school districts, colleges, and universities. We can easily see today that technology creates new avenues for threats to data privacy. It is harder to see the other two differences.

The history of the original language makes clear that a primary concern motivating FERPA was not privacy in an Internet era: it was the then-routine practice of schools of hiding educational records from students and their parents. From the Congressional Record of December 13, 1974:

The purpose of the Act is two-fold–to assure parents of students, and students themselves if they are over the age of 18 or attending an institution of postsecondary education, access to their education records and to protect such individuals’ rights to privacy by limiting the transferability of their records without their consent…. Under the Family Educational Rights and Privacy Act, a parent is given the right to challenge the contents of his child’s records to insure that they are not inaccurate, misleading, or otherwise in violation of the student’s privacy or other rights. (vol. 120, p. 39862)

FERPA is the federal law that established the fundamental right to see your own or your child’s educational records and, if they are in error, to challenge their accuracy. Congress enacted the language in an era when schools regularly classified students as mentally retarded without seeking the permission of families to assess children and without having to show parents the relevant data. FERPA has also prevented schools from casually sharing educational records without the consent of students or their parents. As is the case today, the early 1970s were a time when Congress scaled back parts of the national-security surveillance state, and FERPA’s creation was partly in that spirit, but not entirely. FERPA did not eliminate the ability of law enforcement to request information from schools; on the other hand, it did prohibit sharing information such as discipline and transcript records with potential employers and many others without student or parent consent.

After FERPA’s passage, school systems raised questions suggesting they had a casual approach to sharing information about students: the Senate approved amendments to FERPA on December 13, 1974 (the date of the passage quoted above), and it is clear in context that school officials had asked for clarification because many of them had never considered protecting student privacy to be something that required a broad and carefully considered approach. They scrambled in response to the law because they panicked, and they panicked because they had never prioritized privacy.

After that initial scrambling, schools created procedures to protect student records. Those procedures are the legacy of the law, a legal and policy infrastructure. Like roads, procedures are paths that schools and their employees can follow because it is easier and safer to do so than to drive in the territory of ad hoc decisions, where all sorts of accidents happen. Smart teachers and administrators like procedures because they want to avoid legal accidents in the same way that good drivers avoid accidents on the road.

Here is the key point: All of the violations of good sense prevented by FERPA are actions related to records created and held by schools. That was fine in an era when records were non-digital or held only by schools and colleges. That era is over today, not only because of technology but because of the relationships that technology makes possible. Today, schools are not the only entities that create educational records: if your school or college contracts with a private vendor for some educational service, the private business is often creating educational records tied to individual students. If an individual teacher or faculty member uses a service such as Voicethread, Edmodo, or any of dozens of others, they are often creating records held on the servers of those private businesses. If an individual teacher or faculty member keeps records on their personal computer, and that computer is backed up in the cloud, some private party has that data. In some cases, but not all, school systems, colleges, and universities require their vendors to meet standards consistent with FERPA. In almost no case that I am aware of do individual teachers and faculty read the Terms of Service for private services they take the initiative to use. The procedural infrastructure of FERPA was built for the era of paper records held by schools and colleges. It is not just technology that has ended that era. It is how technology has enabled different relationships between teachers, schools, and private businesses.

But education is not the only area in which privacy is threatened by how technology makes oversharing so easy. Ask everyone whose IRS or commercial records have been hacked. Nor is “technology for sharing” limited to computers: ask the family of Henrietta Lacks, whose cervical cancer cells became the first immortal human cell line, broadly used without the consent of either Lacks or her family. So addressing student privacy is a tiny piece of the puzzle. Instead of focusing just on student privacy, we need to address privacy more broadly, with a general principle that can then be fleshed out by specifics for each area, whether it be medical records, financial records, or educational records.

One place to go might be the concept of multiple layers of rights to one’s data, where the layers can be tuned to specific uses. In 2009, MIT researcher Alex Pentland proposed what he called the New Deal on Data:

The first step toward open information markets is to give people ownership of their data. The simplest approach to defining what it means to “own your own data” is to go back to Old English Common Law for the three basic tenets of ownership, which are the rights of possession, use, and disposal:
1. You have a right to possess your data. Companies should adopt the role of a Swiss bank account for your data. You open an account (anonymously, if possible), and you can remove your data whenever you’d like.
2. You, the data owner, must have full control over the use of your data. If you’re not happy with the way a company uses your data, you can remove it. All of it. Everything must be opt-in, and not only clearly explained in plain language, but with regular reminders that you have the option to opt out.
3. You have a right to dispose of or distribute your data. If you want to destroy it or remove it and redeploy it elsewhere, it is your call. Ownership seems to be the minimal guideline for the “new deal on data.”
There needs to be one more principle, however — which is to adopt policies that encourage the combination of massive amounts of anonymous data to promote the Common Good. Aggregate and anonymous location data can dramatically improve society. Patterns of how people move around can be used for early identification of infectious disease outbreaks, protection of the environment, and public safety. It can also help us measure the effectiveness of various government programs, and improve the transparency and accountability of government and nonprofit organizations.

Pentland is vague about the specific mechanisms that would allow data sharing and use in ways consistent with the principles of ownership. What is the limit of disposal/distribution rights? I think I know where Europe is headed: people will have a right to dispose of their data, including telling Google to erase their data from its web index. That would probably eliminate the viability of research using administrative data. Pentland’s specific proposal is unworkable. Yet the idea of different layers of rights is a useful one. If the details can be worked out, it would establish a different kind of social infrastructure, something more flexible, less tied to a specific era’s way of organizing information.
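To make the layered-rights idea concrete, here is a toy sketch in Python. It is my own illustration, not anything Pentland specified: each record carries per-purpose, opt-in consent “layers,” the holder plays the “Swiss bank account” role from tenet 1, and disposal deletes the data outright. All class and purpose names are hypothetical.

```python
# Toy model of layered data rights: possession, per-purpose use, disposal.
from dataclasses import dataclass, field

@dataclass
class DataRecord:
    owner: str
    payload: dict
    # Opt-in consent tracked per purpose ("layer"); nothing is usable
    # unless the owner has explicitly enabled that layer.
    consents: set = field(default_factory=set)

    def grant(self, purpose: str):
        self.consents.add(purpose)

    def revoke(self, purpose: str):
        self.consents.discard(purpose)

class DataBank:
    """Holds records like a 'Swiss bank account' for data: the owner
    can deposit, withdraw (dispose), and control each use."""
    def __init__(self):
        self._records = {}

    def deposit(self, record: DataRecord):
        self._records[record.owner] = record

    def withdraw(self, owner: str):
        # Right of disposal: withdrawing removes the data from the bank.
        return self._records.pop(owner, None)

    def use(self, owner: str, purpose: str):
        # Right of use: a purpose succeeds only if that layer is opted in.
        rec = self._records.get(owner)
        if rec is None or purpose not in rec.consents:
            return None
        return rec.payload

bank = DataBank()
rec = DataRecord(owner="student1", payload={"grade": "A"})
rec.grant("program-evaluation")  # opt in to exactly one layer
bank.deposit(rec)

print(bank.use("student1", "program-evaluation"))  # {'grade': 'A'}
print(bank.use("student1", "marketing"))           # None: never opted in
bank.withdraw("student1")                          # disposal
print(bank.use("student1", "program-evaluation"))  # None: data is gone
```

The point of the sketch is only that “tuning layers to specific uses” is mechanically straightforward; the hard problems Pentland leaves open are political and legal, not computational.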

I have a suggested name for the bill that could carry this forward–not David Vitter’s bill by a long shot, but something that would (among other things) balance the rights of students and families against the fact that public investment in education exists because it is a public good. The name for that bill: The Henrietta Lacks Privacy Act.