A simplified political history of Big Data
In the Cultural Encyclopedia of the Body, Victoria Pitts-Taylor traces the racial classification and subsequent quantification of human beings to the Enlightenment. She points out, however, that the period leading up to and continuing well beyond the Enlightenment was not filled with explorations, data collection and quantification alone. These human taxonomies and classifications were part of a process of colonial conquest, pillage and subjugation. The data collected to “prove” the supposed inferiority or non-humanity of certain groups was used to justify the conquest of indigenous peoples in the Americas and transatlantic chattel slavery. The Black body, subjected to statistical analysis, calculation and skin tone measurement, was deemed non-human as a result of these taxonomies, which were in turn built from data on the physical and/or behavioral characteristics of the groups in question. Along the same lines, in the Americas the Spanish crown installed the encomienda, a system of forced labor under which a certain percentage of the indigenous population was forcibly assigned to work in agriculture or mining under the directive of conquistadores who, in turn, paid taxes to the Crown for this service. As early as 1495, Columbus conducted a census of the Taíno population in present-day Dominican Republic and Haiti in order to quantify the number of natives to be allocated to the Spanish Crown for labor and tribute.
Human taxonomies rooted in European colonial expansion reached a peak in the mid-18th century, when the Swedish scientist Carl Linnaeus classified humans according to racial markers and temperaments. His Systema Naturae divided humanity into distinct varieties, with the European variety placed at the top of the taxonomical scale. His work was foundational for the field that became known as scientific racism, and its repercussions, felt for centuries afterward, echo to this day.
In 1840, the United States government conducted a census that purported to show that Northern, free Black people suffered mental illness at higher rates than their Southern, enslaved counterparts. This data, later contested as flawed and the product of manipulation, was used not only to justify slavery but also to silence dissenters and abolitionists.
In the 19th century, Cesare Lombroso, to this day considered the founder of modern criminology, gained scientific notoriety for his theories of criminality based on the physical characteristics of certain groups. He drew inspiration from Linnaeus’s taxonomies and sought to expand them by methodically measuring skull sizes, nose inclinations, eye distances and ear placements to determine the propensity of certain groups to commit crimes. His systematic data collection, grounded in the pseudoscientific method known as phrenology, led him to conclude that “only we white people have reached the ultimate symmetry of bodily form.” Lombroso based his theories about the criminal propensities of non-white people on “biological theories of deviance” that tied physical appearance to specific behaviors. These theories, developed through careful data collection and mapping of the human body, were used by police forces across Europe and North America to estimate the likelihood that suspects would commit crimes. They were also used by the criminal justice system to impose sentences on those believed most likely to re-offend based on their race or physical appearance.
In the early 20th century, immigrants entering the United States were subjected to IQ tests before being allowed to disembark. The results were used, in turn, to shape policy and social programs that benefited those who were considered white at the time. The widespread belief was that only white people (Anglo-Saxon and Northern European) would benefit from these programs because of their supposedly superior intelligence. In 1912, the Psychological Bulletin (a scientific journal still in print) published Columbia psychology graduate Frank Bruner’s review of the available “scientific data” on “the mental qualities of the Negro”. Based on his analysis of this “data”, Bruner claimed that Black people “lacked in filial affection, had little sense of veneration, integrity or honor”, along with a long list of other deficiencies inferred from this supposed “scientific study and statistical analysis”. As recently as the 1920s, the US still had programs in place that allowed for the sterilization of individuals based on IQ test results. Simultaneously, in Europe, “scientific studies” extolled the “superiority of the Nordic races” on the basis of intelligence measurements and socioeconomic data sets.
The algorithm, data sets and the myth of objectivity
Yesterday The Guardian published an analysis of how Google’s search algorithm spreads false information with a rightwing bias. One of the most widespread myths around technology holds that algorithms and data are “neutral” fields, both spawned from some supposed objectivity that transcends personal bias: “beyond the subjective and into the machine”, as if there were no human intervention mediating the creation of these algorithms and data sets, or their subsequent use. In the coming weeks, I intend to expand on these ideas with a series of posts that will continue to examine data (and, more importantly, the privacy protections afforded to data) not only as a field imbued with subjectivity but as a tool of political intervention against marginalized groups. Stay tuned, because the best and least objective is still to come.
Read Part 2 in this series: Private Internet and Public Streets
Read Part 3 in this series: Surveys, vigilance and the myth of neutral data
I am an independent writer with no affiliations. If you find value in the type of work I do, please consider making a donation. Any funds, no matter how small, will allow me to continue this ongoing research and analysis. Follow me on Twitter for daily updates.