The Truth of Critical Race Theory is in the Data

Campbell Law Innovation Institute · Published in Assembling · Sep 7, 2021 · 9 min read

By Kevin P. Lee, Professor of Law and Director of the Campbell Law Innovation Institute. The opinions expressed are mine and not those of Campbell University or any of its agents, subsidiaries, or associates.

An early version of this essay referred to the Anti-CRT Bill that was vetoed by Gov. Cooper.

Jurisprudence and Politics

I have been teaching jurisprudence for nearly twenty years, and it has never attracted so much attention as it has this past year. Sadly, most of that attention has only revealed how little the field is understood and how little it is respected in American thought. Jurisprudence is not immediately political; it is concerned with esoteric topics like ontology, epistemology, ethics, democratic theory, and the philosophy of language. But, the political concerns that have brought Critical Race Theory (CRT) to the fore this year have little interest in abstraction or in nuanced questions about how knowledge is achieved or how the law reflects moral meaning. The concerns that have thrust CRT into political awareness are concerns about the implications of its conclusions for power relations. It is the implications of the theory, not the correctness of the theory, that concern politics. The politicization of theories, particularly social theories, is fairly routine. Social theories have political consequences, and thus they are never free from political concern. The progenitors of anti-CRT rhetoric argue that it is an “evil” theory, that it is socialist, and that it distorts the record by arguing that systemic racism exists as a fact of American history and culture.

This is a sad misstatement of CRT, and one that shows multiple misunderstandings of the theory and of the nature and purpose of jurisprudence. CRT is, in fact, a view that is consistent with contemporary empirical social science. I truly believe that this debacle should inspire a renewed interest in teaching jurisprudence, because the issues today are not what they were 50 years ago. And, there is an urgent need for a better understanding of how law functions in the American Republic. Contemporary computer science investigations of the data of law, sometimes called computational law, are now an essential part of the field.

[One of the best aspects of the Campbell Law School program, where I teach, has been its steadfast commitment to teaching jurisprudence as a required class for all Campbell students. One hopes that, given the many successes of Campbell, other law schools might follow suit.]

Law and Information Science

At the heart of the CRT debate is a concern about the objectivity of law that characterized much of twentieth-century jurisprudence. To understand the contemporary debate about CRT clearly, it is useful to understand some of that earlier debate about the objectivity of law, and particularly how it intersects with the development of information technology, which is changing our understanding of objective approaches to social analysis.

For legal positivists, this debate focused on distinguishing law from moral rules. H.L.A. Hart, the most renowned positivist legal theorist of the period, was influenced by the philosophy of the early twentieth century, particularly the so-called “linguistic turn” initiated by Ludwig Wittgenstein. For Wittgenstein, there is no one-to-one correspondence between word and object. In fact, words can still be functional even when their meaning is sharply disputed. To illustrate this, he considered rule-following. Must one understand a rule to follow it? Wittgenstein concluded that one need not. Rule-following is a matter of behaving in a way that is expected by people within a common practice. It does not require an understanding of the rule, at least not in the sense normally ascribed to the word “understanding.”

A similar development occurred in the field of electrical engineering in the 1940s, although there is no evidence that Hart was aware of it. The separation of a communications signal from the meaning it conveyed was a breakthrough achieved by a Bell Labs engineer named Claude Shannon. Shannon was working on the problem of signal clarity in the long telephone lines that were being laid across the country. He realized that it was useful to consider the quality of the signal apart from the meaning of the content it carried. The signal does not have to be understood by a human ear. It could be a series of clicks, like the ones that telegraphers tapped out on their keys. The clicks have no meaning until a human receiver assigns meaning to them.
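
To make Shannon’s point concrete, here is a minimal sketch in Python (my own illustration; the essay describes no implementation). Shannon’s entropy is computed entirely from the statistics of the symbols in a signal, so two signals with the same symbol frequencies measure identically, whatever they happen to mean.

```python
import math
from collections import Counter

def shannon_entropy(signal: str) -> float:
    """Bits per symbol, computed purely from symbol frequencies.

    Nothing here asks what the symbols mean -- only how often each
    one occurs, echoing Shannon's separation of signal from meaning.
    """
    counts = Counter(signal)
    total = len(signal)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Two "messages" with identical statistics but different content
print(shannon_entropy(".-.-.-"))  # telegraph clicks: 1.0 bit per symbol
print(shannon_entropy("ababab"))  # same entropy, different symbols
```

The function assigns the same number to telegraph clicks, coin flips, or poetry with the same frequencies; meaning is supplied later, by the receiver.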

The similarity between Shannon and Wittgenstein is significant. Both suggest that there is nothing distinctly human about some phenomena that are typically considered to be human. Rule-following or using information may even be present in many natural phenomena. This was the conclusion of Alan Turing, the founder of modern computer science. Turing was investigating a mathematical problem that asked whether one could know that a problem is solvable before solving it. Stated differently, he wanted to know whether there are observable features of solvable problems that distinguish them from unsolvable ones. Turing’s solution was similar to Wittgenstein’s rule-following and Shannon’s information theory. He imagined breaking problems into a long series of steps that could be mechanically followed without regard to the meaning of each step. This is what a computer does. We call the series of steps an algorithm. A computer does not need to understand the problem or the steps to solve it. It merely executes them and thereby follows a rule by doing what is expected of it, without understanding the steps that it follows.
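
As a hedged illustration, here is a toy machine of my own construction, in the spirit of Turing’s model rather than his exact formalism. The executor applies a lookup table of rules blindly, one step at a time; at no point does it represent what the tape “means.”

```python
# A toy Turing-style machine: (state, symbol) -> (write, move, next_state).
# The executor applies the table mechanically; understanding never enters.
RULES = {
    ("scan", "0"): ("1", +1, "scan"),  # flip 0 to 1, move right
    ("scan", "1"): ("0", +1, "scan"),  # flip 1 to 0, move right
    ("scan", "_"): ("_", 0, "halt"),   # blank marks the end: stop
}

def run(tape, state="scan", head=0):
    while state != "halt":
        write, move, state = RULES[(state, tape[head])]
        tape[head] = write
        head += move
    return "".join(tape)

print(run(list("0110_")))  # -> "1001_": every bit inverted, nothing "understood"
```

The machine follows the rule in exactly Wittgenstein’s sense: it does what the practice (here, the table) expects of it, and nothing more.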

Of course, there are many differences too, but the basic theory of computer science bears a useful resemblance to Wittgenstein’s rule-following. We have no doubt today that inanimate objects follow rules. For example, machine learning systems are used in autonomous vehicles precisely to ensure that the rules of the road are followed. An autonomous drone or car does not understand the rules it follows. It merely acts in an expected way within the context of the practices in which it is engaged.

There is an important lesson here about the objectivity of law. As legal functions, including the dispute-resolution function, become automated, there is a serious question about the computational limits of legal reasoning. What, if anything, do judges do that cannot be automated? If judging involves blindly and mechanically applying rules, then can the judge be replaced with an autonomous system that blindly follows rules in the sense of doing what is predictable and determinate? Prediction, applying powerful statistical analysis to the data of millions of judicial decisions, is growing in significance as a means of obeying rules without the need to understand them. In the information age, what distinguishes the rule of law from the rule of an autonomous robot judge?
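
To make the worry tangible, here is a deliberately hypothetical sketch of outcome prediction. The feature names, the data, and the choice of logistic regression are all my inventions for illustration; real systems are more elaborate, but the structure is the same: the model reproduces statistical regularities in past rulings without any grasp of legal reasons.

```python
# Hypothetical sketch only: invented features, synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Pretend docket: [prior_convictions, log_claim_amount, has_counsel]
X = rng.normal(size=(1000, 3))
# Synthetic outcomes correlated with the features -- a stand-in for
# the regularities buried in millions of real decisions.
y = (X @ np.array([1.0, -0.5, 0.8]) + rng.normal(scale=0.5, size=1000)) > 0

model = LogisticRegression().fit(X, y)

new_case = np.array([[0.2, -1.0, 1.5]])
print(model.predict_proba(new_case))  # a "ruling" with no understanding
```

The model obeys the pattern of past decisions exactly as the toy Turing machine obeys its table; whether that counts as the rule of law is the question.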

This question strikes at the heart of the anti-CRT argument. What is the rule of law if it is not the outcome of power relationships? Is it the mechanical application of rules to all who come before it? Or is there something in judicial reasoning that is not computable in this sense? And, if there is, what is that intangible element of rule-following? And, why is it good? There is much at stake in these questions, because respect for the rule of law is a foundational pillar of our system of government. Its automation is a concern to the extent that it challenges the function that law has had in the democratic plan.

Emergence and Systemic Injustice

Another feature of a computational approach to jurisprudence follows from the first. Not all actors in society, including in the legal system, are individual persons. This is a fact of everyday observation. Collective groups of persons are not simple aggregations of individual actors, although they are sometimes treated as though they were in the eyes of the law.

What I mean here is that when individuals are collected into groups, they have behaviors that are not centrally controlled, and these behaviors matter. Social scientists today call this decentralized behavior of the collective “emergent behavior,” because it emerges from individual, uncoordinated choices. (For more on emergence, see this previous post.) A simple example of emergent behavior is seen in a flock of birds. There is no central command-and-control system for the flock; there is no hierarchy among the birds. And, their collective behavior is not that of a super-bird. It is something different entirely. Emergent behavior of this sort could not be modeled until recently. Vast data sets and computational power, coupled with new techniques in machine learning, have enabled the modeling of virtual societies where emergent behavior can be studied and explored.

What can be seen in these studies of complex systems is that the collective behavior of a system is not reducible to the intentional actions of individuals, just as a flock of birds is not reducible to the actions of the individual birds. Patterns emerge from the group that are not reducible to the actions of individual agents.
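
A classic demonstration is Thomas Schelling’s segregation model, which I sketch below in simplified form (the parameters and code are my own; the model itself is standard in the agent-based-modeling literature). Every agent follows one modest local rule, being content so long as at least 30% of its neighbors are like it, yet the grid as a whole drifts into stark clusters that no agent chose.

```python
import random

SIZE = 20          # 20x20 grid
EMPTY = 0.1        # fraction of vacant cells
TOLERANCE = 0.3    # an agent wants at least 30% like neighbors

def make_grid():
    cells, weights = ["A", "B", None], [(1 - EMPTY) / 2, (1 - EMPTY) / 2, EMPTY]
    return [[random.choices(cells, weights)[0] for _ in range(SIZE)]
            for _ in range(SIZE)]

def unhappy(grid, r, c):
    """True if too few of the agent's neighbors share its type."""
    me = grid[r][c]
    if me is None:
        return False
    same = total = 0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            neighbor = grid[(r + dr) % SIZE][(c + dc) % SIZE]
            if neighbor is not None:
                total += 1
                same += neighbor == me
    return total > 0 and same / total < TOLERANCE

def step(grid):
    """Move every unhappy agent to a random vacant cell."""
    movers = [(r, c) for r in range(SIZE) for c in range(SIZE) if unhappy(grid, r, c)]
    empties = [(r, c) for r in range(SIZE) for c in range(SIZE) if grid[r][c] is None]
    random.shuffle(movers)
    for r, c in movers:
        if not empties:
            break
        er, ec = empties.pop(random.randrange(len(empties)))
        grid[er][ec], grid[r][c] = grid[r][c], None
        empties.append((r, c))

grid = make_grid()
for _ in range(50):
    step(grid)
for row in grid:                           # large single-type blocks appear,
    print("".join(c or "." for c in row))  # though no agent sought segregation
```

No individual in the model is intolerant in any strong sense; the segregated pattern is a property of the system, not of any agent’s intention.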

This is a significant development for thinking about race in American history. Even without anyone understanding or intending it, racial competition occurs in hidden patterns that are not immediately apparent. It happens through human actors who might have subtle biases, and it happens in collectives like corporations, governments, and mediating institutions. It is all around us, in data and patterns that can be easily overlooked or ignored. But, these are the objective facts of contemporary social thought.

The implications for CRT are clear. The data suggests that the system is indeed full of racial competition, power struggles among the races, and systemic disadvantages that are built-in features of how we do business today. Some notable examples have been found in areas like resume screening, consumer credit, bail bonding, sentencing, and countless other tasks that are now automated. It is present in the data of history too, from overt acts of flagrant racism that went unnoticed to subtle histories of neglect, bad policies, abusive practices, and unfair laws. It shows in outcomes, from education to health to wealth to longevity. It is the inevitable consequence of the legacy of naive social thought that systematically enslaved Africans, committed genocide against Native Peoples, and later pitted classes of people against each other in the mad belief of the eugenics movement that the morality of society could be improved by treating people like cattle.
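
As a purely synthetic illustration of how an automated screen inherits such patterns (invented data and feature names; this is not a finding from any real system), consider a model trained on historical decisions that were themselves skewed:

```python
# Synthetic illustration only: invented data, invented features.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000

qualification = rng.normal(size=n)       # one merit-like score
group = rng.integers(0, 2, size=n)       # a protected attribute, 0 or 1

# Historical decisions: driven by qualification, but with a built-in
# penalty against group 1 -- the bias buried in the training data.
hired = (qualification - 0.8 * group + rng.normal(scale=0.5, size=n)) > 0

X = np.column_stack([qualification, group])
model = LogisticRegression().fit(X, hired)

# Two identical candidates who differ only in group membership:
print(model.predict_proba([[0.0, 0]])[0, 1])  # markedly higher chance
print(model.predict_proba([[0.0, 1]])[0, 1])  # markedly lower chance
```

Real systems rarely see the protected attribute directly; they usually learn it through proxies, which makes the pattern harder to spot, not weaker.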

The Danger of Obfuscation

Note the danger that the obfuscation of the anti-CRT argument poses to the rule of law in the information age. If the judge is not to be replaced by an autonomous system, it is because the judge possesses personal reasoning that is morally good and beneficial to society. This type of personal reasoning is intuitive, though, and it is subject to bias. A smart judge might be able to use some technology to reason better, to detect unintended bias, and to root it out of judicial decision-making. But, for this to happen, the subtle complexity of the role that law plays in a democratic society must be at the forefront of the act of judging. To do this, judges will need to be trained in a jurisprudence that embraces the empirical findings of contemporary social science. Above all, they need open minds and hearts. The anti-CRT argument misses the opportunity that engagement with CRT offers to open minds and hearts.

In past ages, people were convinced that law should reflect the goodness of nature, that it should be in harmony with it, and that it should be directed toward the creation and maintenance of a flourishing society. Modern thought sought to turn law into technique. The idea that a judge should mechanically apply rules without regard to their broader social meaning and consequences is a legacy of early modern thought that, in the age of virtual social science, is difficult to maintain. The demands of justice are never easy to contain. They take into account areas beyond law and spill out consequences that run far beyond the litigants in any lawsuit. It takes nuanced judgment to apply laws justly. It takes personal insight and a good heart; perhaps most importantly, a good heart.

In his classic essay “The Abolition of Man,” the Christian author C.S. Lewis described the modern preference for technique over engagement with the full richness of the moral imagination as the creation of “men without chests.” By this he meant people who have no heartfelt desire for the Good that is revealed in nature. Lewis wrote of modernity:

“We make men without chests and expect of them virtue and enterprise. We laugh at honour and are shocked to find traitors in our midst.”

These words resound with insight and truth. Judges today, particularly ones who profess to be Christian, should take Lewis’ warning to heart. We need judges and lawmakers with chests.

The conclusion here is twofold: the objective reality of racial bias calls on lawmakers and judges to recognize the moral meaning of historical facts and present realities about race in America. If the law is to be applied blindly and mechanically, without regard to this emotionally charged history, a computer might do a better job of judging than a flesh-and-blood judge, who will undoubtedly bring human frailties into the margins of interpretation. The alternative is to have judges who apply the law with the judgment that was once called wisdom. But, this requires the courage to have hearts and minds open to the moral meaning and possibilities that are enshrined in the highest ideals of the law and the legal profession.
