Big Data Quality: Waiting a Decade to Correct Listing on Terrorism Watch List

Why is this data quality story appearing in the Big Data story stream?

Because Big Data’s persistence, in the absence of archiving, means that errors made during data acquisition outlive the processes and people responsible for the collection. When the data is critical to an individual, this data quality problem looms larger than it does in smaller-scale data scenarios.

As Raymond Bonner @RayBonner1 wrote in his story for Pacific Standard, the case studied by ProPublica shows just how persistent and pervasive such lapses can be:

Rahinah Ibrahim, a Malaysian architect with a doctorate from Stanford University, knows from personal experience that they have a compelling point. Ibrahim is the only person since 9/11 to file a court challenge that ultimately removed her name from the watch lists. It took her almost a decade to prevail in court, and even that victory has proved Pyrrhic. While a federal judge agreed that her inclusion on the no-fly list was groundless, she remains unable to obtain a visa that would allow her to visit the United States even to attend academic conferences. A close look at her case by ProPublica provides dramatic evidence of what was argued this month in Washington: It is indeed remarkably easy to get on the list and nearly impossible to get off.

While most would agree that the problem is not new (garbage in, garbage out), Big Data amplifies the impact and can make remedies more difficult, especially as the time from collection to use increases.

Image Credit: Wikipedia. Stories are collected for the Big Data Standards public working group collection.

Originally published at Big Data Standards Feed.