Against Fragmentation: The Case for Intellectual Wandering

Published in Carnegie Reporter · July 6, 2017

by Vartan Gregorian

In our daily pursuit of “facts,” are we — scientifically — obscuring the greater picture from our vision?

Gutenberg’s Triumph — Comprising nearly 1,300 pages in two volumes, the famous Bible of Johann Gutenberg (ca. 1390s–1468) is the first substantial book printed from moveable metal type in the West. Probably completed between March 1455 and November of that year, Gutenberg’s Bible is arguably the greatest achievement of the second millennium. In this mural by the American artist Edward Laning (1906–1981), Gutenberg shows a proof to the Elector of Mainz. The Gutenberg panel is part of Laning’s mural cycle depicting “The Story of the Recorded Word,” painted under the auspices of the Works Progress Administration (WPA) for The New York Public Library’s landmark building at Fifth Avenue and 42nd Street. (Photo: © Steven Brooke Studios)

We live in exciting, exacting, dizzying times. Science has expanded the horizons of our knowledge of nature, while the limits of our physical world are no longer the boundaries of our earth. Outer space is merely one of many frontiers that beckon the human imagination. We have sped from the Industrial Revolution of the 19th century straight on to the much-vaunted Information Age, only to plunge headlong into the ocean of social media and the Internet of Things.

What’s in store next? Could it be the “quantum leap” in computing, whose applications are set to revolutionize cryptography, chemistry, biology, pharmacology, nanoscience, artificial intelligence, material science, and more? The great physicist and Nobel laureate Niels Bohr cautioned: “Those who are not shocked when they first come across quantum theory cannot possibly have understood it.” Should we be alarmed at the pace of change, at the inexorable, massive growth of knowledge and facts, at the speed with which our tools to manage and process all of that become obsolete?

Years ago, I read the text of Professor Wayne Booth’s Ryerson Lecture, given at the University of Chicago in 1987, in which he explored the fragmentation of knowledge. His words left a great impression on me, and since then I have followed the debate surrounding the information explosion. Indeed, many great minds — poets, scientists, social scientists, humanists, and more — have confronted this dilemma of our age: how to contend with the fragmentation of knowledge while ensuring that the elements of our culture remain intact.

Today, we face a fundamental challenge: how can we resolve complex problems that cut across the artificial barriers between the disciplines without opportunities for creative discourse among educated men and women, without the broad understanding of the premises and assumptions of various academic disciplines?

Albert Einstein, in his inimitable fashion, went right to the heart of the matter, asserting that materialists try to explain all phenomena by cause and effect. But, he added, “This way of looking at things always answers only the question ‘Why?’ but never the question, ‘To what end?’” In our daily pursuit of “facts,” are we — scientifically — obscuring the greater picture from our vision? I would argue that the deep-seated yearning for knowledge and understanding is endemic to human beings and that it cannot be fulfilled by science alone. To search for even a glimpse of the answers to the timeless philosophical conundrums one needs more than Mr. Gradgrind’s “nothing but Facts!” Of course one must also learn how to think for oneself, for each of us has it within us — potentially — to be our own Library of Alexandria. But the seeming insignificance of individual human lives can terrify (a terror that Pascal, the philosopher and scientist, knew and described in his theological work Pensées). Indeed, each of us is but a small corner of the universe. To approach an answer to Einstein’s question — “To what end?” — or to confront Pascal’s fright at the infinities that surround all of us, one must turn to the humanities. To philosophy, religion, poetry, even to mythology.

Hungarian legislator Ágnes Kunhalmi displays an EU flag at a window in the parliament building in Budapest as protesters supporting the English-language Central European University (CEU) gathered below, on April 4, 2017. Set up in 1991 after the fall of communism, CEU has long been seen as a hostile bastion of liberalism by the government of Prime Minister Viktor Orbán. (Photo: Attila Kisbenedek/AFP/Getty Images)

The truth is that at present there are too many facts, theories, subjects, and specializations to permit the arrangement of all knowledge into an acceptable hierarchy. Can there be too much knowledge? It has been claimed, for example, that more information was created in the last 30 years than in the previous 5,000. And that claim was made in 1984. What might those figures be today? Given this amount of ever-accumulating information, Sir Martin Rees, the distinguished astronomer and past president of the Royal Society, observed, “It’s embarrassing that 90 percent of the universe is unaccounted for.”

Yet T. S. Eliot’s challenge remains: “Where is the wisdom we have lost in knowledge? Where is the knowledge we have lost in information?”

So, what can be done? What must be done?

Especially today, in the age of information, when we are bombarded from all sides, every minute, every hour of the day and night, it can seem that we are living in the least analytical, the least insightful of times. How to transform raw information into useful, structured knowledge remains a great challenge.

And it is important to note that this “infoglut” is not a new phenomenon, as the media theorist and cultural critic Neil Postman reminded us in “Informing Ourselves to Death” (1990). Postman explained that the Information Age began when the goldsmith Johann Gutenberg created a printing machine from an old wine press, setting off a veritable information explosion. Fifty years after the machine’s invention, more than eight million books had been printed, spreading information that had previously been unavailable to most of society.

So, the explosion of information begets specialization, which in turn leads to fragmentation. However, our age of specialization and the attendant (often excessive) fragmentation of knowledge does not require us to abandon specializations or even subspecializations. As José Ortega y Gasset put it, complexity by its very nature requires specialization. In his seminal 1930 study, The Revolt of the Masses, he observed:

In order to progress, science demanded specialisation, not in herself, but in men of science. Science is not specialist. If it were, it would ipso facto cease to be true. Not even empirical science, taken in its integrity, can be true if separated from mathematics, from logic, from philosophy. But scientific work does, necessarily, require to be specialised.

Each generation has its spokesman who grapples with these tensions. Thomas S. Kuhn, the philosopher of science and author of the now classic The Structure of Scientific Revolutions (1962), once raised an interesting and important point:

Though the increasing specialization that often accompanies professionalization may be deplorable, I take it to be as unavoidable in cognitive development as in biological evolution. Both necessarily produce branching trees, and intercourse between branches can at best be partial. In this respect professionalism in science and in the humanities are alike.

We cannot and must not castigate those humanists and social scientists who avail themselves of scientific methods and attempt to provide rigorous analysis of literary texts, social trends, and historical facts. To condemn science as purely quantitative, while reserving for the humanities the sole jurisdiction of qualitativeness, is to indulge in unwarranted snobbishness. To scorn sociology for its jargon, while exonerating philosophy, philology, aesthetics, and literary criticism from that “sin,” is equally unwarranted. The scientific passion for verifiability, the habit of testing and correcting the concept by its consequences in experience, is just as firmly rooted in the humanities and social sciences as it is in the sciences.

Nor should we castigate some knowledge and some scholarly disciplines as useless.

In this connection, one must underscore the importance of academic freedom in the cultural life of America, providing as it does a venue for students, scholars, and researchers of all stripes to be wildly creative in their intellectual journeys, to investigate anything that interests them without being constrained by marketplace pressures. This is essential. After all, developing theory is as important as developing practical knowledge. And big ideas generally evolve from small ideas, and small ideas, from smaller ones still. So there really is no such thing as useless knowledge, as the legendary educator Abraham Flexner argued in his timeless essay “The Usefulness of Useless Knowledge” (1939). He also noted the paradox that we must live with: namely, that human curiosity — and not societal need — has been the driving force behind most scientific progress. The truly great and ultimately beneficial discoveries of science, said Flexner, were those made by scientists “who were driven not by the desire to be useful, but merely by the desire to satisfy their curiosity.” In our time the theoretical physicist Freeman Dyson has echoed this by writing that “unfashionable people and unfashionable ideas” have often been of decisive importance to the progress of science.

The challenge facing all of us — from Virginia Woolf’s “common reader” to the futurists of Silicon Valley, from pre-K teachers to the scholars and researchers of higher education who explore all aspects of human knowledge — is the difficulty of achieving the unity of knowledge and of reconciling the universal validity of reason with our understanding of the diversity of social and cultural experiences. The challenge calls for integrating and resynthesizing the compartmentalized knowledge of disparate fields: the ability to make connections among seemingly different disciplines, discoveries, events, and trends and to integrate them in ways that benefit the commonwealth of learning.

A little more than a year after September 11, 2001, in reply to an interviewer’s question, I noted how critical it is that Americans become more knowledgeable about the complex world beyond our borders, and how we must strive to acquire a better understanding of how our national interests fit, or don’t fit, with the national interests of others. The importance of these issues, though certainly deepened and given greater urgency by the events of 9/11, has always been central to much of the work of Carnegie Corporation of New York.

It is apparent that, more than ever, in this age of globalization, Americans need to know more about diversity than uniformity; more about centrifugal forces than centrality; and more about other people’s aspirations, ideals, and anxieties in order to understand the rest of the world. True globalization must involve the universalization of particulars and not just the particularization of universals.

No single culture is universal — but cultures open up universes.

These issues have preoccupied me for decades. More than twenty years ago I gave a lecture to the American Philosophical Society in Philadelphia, titled “Education and Our Divided Knowledge.” In it, I observed that, while it is true that attention to detail is the hallmark of professional excellence, it is also true that an overload of undigested facts is a recipe for mental gridlock. Not only do undigested facts not constitute structured knowledge, but, unfortunately, the explosion of information is also accompanied by its corollary pitfalls, such as inflation, obsolescence, and counterfeit information.

“Counterfeit information” (aka, “alternative facts”) indeed! Truly, as Ecclesiastes has it, “There is nothing new under the sun.” Or, to quote the more elegant King James version of the Bible: “The thing that hath been, it is that which shall be; and that which is done is that which shall be done: and there is no new thing under the sun.”

In 2004, these concerns led me to approach a colleague of mine from Brown University, Stephen Graubard, the former editor of Daedalus, the official publication of the American Academy of Arts and Sciences. I asked him to write an essay as an accompaniment to Carnegie Corporation of New York’s Carnegie Scholars program. Entitled Public Scholarship: A New Perspective for the 21st Century, the piece conjures up the “ghost” of Alfred North Whitehead, paying tribute to the philosopher’s Science and the Modern World:

That impressive tract included the important idea that “Modern science has imposed on humanity the necessity for wandering. . . . The very benefit of wandering is that it is dangerous, and needs skills to avert evils.” Arguing that the “spirit of change” was as necessary to intellectual inquiry as the “spirit of conservation,” Whitehead suggested that “mere change without conservation is a passage from nothing to nothing.” . . . [T]he imperative to be critical of today’s interpretations may be Whitehead’s most important intellectual legacy to a world as different from his own as his was from that of the 19th century.

Can we today — not only scholars, but all of us — show, in Graubard’s words, “greater tolerance for the values and concerns of other societies, emphasizing those elements that make societies distinctive”? That is, can we reject the 19th century’s “Culture of Uniformity”?

Professor Graubard does not stop there: “We live in a new world, made so not by Islamic terrorists, but by the incomparable scientific and technological knowledge created in the last century. It behooves us to understand that world in all its diversity, seen as something other than a new political and economic creation that has eradicated all previous historical roots.”

As Graubard notes, thanks to technology, never has the world been more accessible to scholars — to all of us — willing to master the skills and languages that allow for inquiry into societies only superficially resembling our own culture (American, or whatever it may be). Access to knowledge is no longer an obstacle. If we hunger for it, we can find it.

Paradoxically, the same information technologies that have been the driving force behind the explosion of information, the growth of knowledge, and its fragmentation also present us with the best opportunity and tools for meeting the challenge of that fragmentation. If the new information technologies themselves seem fragmenting, they are also profoundly integrative. Social media is not the enemy — ignorance is.

In 1915, J. Alfred Prufrock asked, “Do I dare to eat a peach?”

More than a century later, I ask: Do we dare to wander?

__________

Vartan Gregorian is president of Carnegie Corporation of New York.
