by Emilio DeGrazia

In his novel Light in August, that profound exploration of how one man, Joe Christmas, is violently victimized by his lack of knowledge and his murderer’s perverted knowledge, William Faulkner begins Christmas’ narrative with these enigmatic words: “Memory believes before knowing remembers, longer than knowing even wonders. Knows remembers believes a corridor in a big long garbled building of dark red brick.” These suggestive words take us into the psychic corridor of the school where the conspiracy of circumstances that dooms Christmas as an adult is set in motion. The passage at once distinguishes the mental powers––belief, knowledge, memory, and wonder––we so ineptly confuse as we try to find our way in our own schools and life journeys. More importantly, Faulkner prioritizes the mental powers and suggests how they work in combination as driving forces. “Memory believes before knowing remembers. Believes longer than recollects, longer than knowing even wonders.” What we call the human mind works in complex but predictable ways. Belief, for better or worse, is the main force driving us, and memory (or recollection) follows belief and is prejudiced by it. Knowing comes in last.
If what Faulkner implies about human nature is true––that our hybrid psychic processes, and our actual lives, are driven most forcefully by belief––it would seem wise and practical, in institutions of higher learning like ours, to explore with deep seriousness the relationship between knowledge and belief. I’ll try to be intellectually honest with you. I’m not equal to the task and suspect that my conclusions would be driven by my beliefs, colored by my memories, and clouded by my wondering.
What I offer here is much less than a new profound theory about how we can balance conflicting claims and arrive at truth: I present instead a few impressions, something of a sermon actually, about “intellectual honesty,” a phrase I first heard in college many years ago.
Frankly, I haven’t heard those two words very often recently. I seldom used the phrase in my own classrooms, assuming perhaps that it was unnecessary in a climate of opinion where the concept was supposed to be well understood, just everywhere out there in that intellectual classroom air. But the words were not in that rare air, and rarely used by faculty or administration colleagues. So I’ve begun (what Faulkner perhaps would call) wondering: Has my memory been failing me? Does the concept still have currency, perhaps in other terms? If so, what real viability does it have? What do I believe and know about intellectual honesty? What is intellectual honesty? Is it something I just made up? Is the concept, and the reality it is supposed to represent, terminally ill, maybe dead?
If so, I’ll venture a revival by defining it into being, first by negation. In my mind, intellectual dishonesty should not be confused with sins such as stealing money, shoplifting, chasing after your neighbor’s husband or wife, and plagiarism. These sins strike me as simple-minded manifestations of small-minded moral inadequacy. These ways of missing the mark are what the Roman Catholic Church might call “venial,” and for offenders of this type Dante, if they went unredeemed, would reserve a few unpleasant gnarled circles of his Hell, or maybe merely a time-out in Purgatory. In Dante’s scheme of things the deepest and worst circles of Hell are not for those who commit banal misdeeds or passionately hot sins of the flesh. The circles for the worst sinners are dark and cold. These are reserved for intellectual criminals, those whose violations of moral codes are coldly calculated and result in destructive consequences for great numbers of bystanders. Offenders in Dante’s ninth circle of Hell are inflated with pride as they sit high in positions of power and influence, and their decisions do harm far beyond the individual offending self. I see intellectual dishonesty in this light. It is, if you’ll pardon the religious language, a “Mortal Sin,” a malignant cancer that destroys the very soul of human integrity. Its tentacles are particularly malicious when they get a grip on powerful individuals and institutions and are then secreted into the body politic.
To commit intellectual dishonesty is to violate, through calculation and/or cowardice, the dictates of the best available knowledge, either by omitting, ignoring, or distorting that knowledge. Individuals often do this consciously, but when the practice is popular it is easier for cowardice, unchecked by conscience, consciousness or professional standards, to have its way with us. If, as Socrates alleged, “The unexamined life is not worth living [I would say less worth living],” and if the special calling of a university is the pursuit of knowledge, then we, as professors of knowledge, have a moral obligation to base our conclusions on the best available scientific and rational knowledge derived from ongoing investigation, research, and criticism. An intellectually honest person is one who believes in honoring the dictates of reason, logic, and evidence. Such a person understands the limitations of a field of study, factors in uncomfortable evidence, and confronts misconceptions and deeply rooted prejudices by insisting that available knowledge provides a better claim to credibility.
This, of course, is very easy to do when we are in the know about the prejudices of others, but quite dicey when we’re not in the know about our own. When “Memory believes before knowing remembers, longer than knowing even wonders,” it is very difficult to put honest checks on our own beliefs.
The climate of opinion in which we operate––the cultural soup in which we swim and into which we perhaps are sinking––does not make our pilgrimages toward knowledge easier. Anti-intellectualism as a force in American history is well documented, and despite our technical and scientific know-how––perhaps in part because of it––we have a wide variety of cults and sects, religious and secular, that make intense emotion the authenticator of experience. It seems almost beyond belief that the very premises of the Enlightenment on which the modern university is grounded––the idea that scientific knowledge and rational criticism are the pillars of humane self-government––should be under deliberate and well-funded attack not only by foreigners but by powerful self-interest groups within our society that themselves benefit from the modern secular university. What many once took for granted as an inviolable way of life––the life of the mind as celebrated by institutions of higher learning––is now directly challenged by cadres of confident true believers openly dedicated to undermining those aspects of rational and scientific learning that do not square with their beliefs. The persistent public squabbles over the scientific validity of the Theory of Evolution and global warming are obvious examples of belief’s refusal to take a back seat to the best available knowledge. The attacks on the thousands of scientists painstakingly trying to gather and piece together the data to support plausible and probable assertions about these two subjects are waged with passionate convictions that obviate the need to play by the ordinary rules of evidence. Credibility is insisted on not as a matter of fact but of belief.
This problem is one of the subjects of Daniel Goleman’s intriguing book, Vital Lies, Simple Truths: The Psychology of Self-Deception (Simon and Schuster, 1985). We stick our heads in the sand, Goleman explains, by keeping our mental gates locked against information that causes social strain and psychological pain. Studies of memory suggest that we are better at recalling information that magnifies our strengths and minimizes our weaknesses, and that depressed people do the opposite. “Groupthink” occurs when we ignore contradictory evidence and fail to exercise independent judgment out of fear of going against the tide. We also have our ways of creating frames of reference that screen certain opinions and actions into or out of awareness. Out of convenience we forget, and forget we have forgotten. White and other lies at first get dismissed out of politeness, then from habit, and studies also suggest that as we age we get worse at detecting lies and better at subtle and often quiet ways of “turning up the noise” that keeps us distracted from painful facts. Our history books often reflect our various forms of amnesia, embodying the implicitly understood and accepted rules about what questions can and cannot be asked. The results are blind spots, individual and collective, projected from immediate self-protective instincts, that may do us great harm later on.
It is grimly ironic that the habit of intellectual dishonesty has been empowered by those brought to positions of influence by democratic Enlightenment traditions. What the [Founding Fathers] did not anticipate, writes Robert Parry of Consortium News, “was how fragile truth could become in a modern age of excessive government secrecy, hired-gun public relations and big-money media: [that] sophisticated manipulation of media is what would do the Republic in.” How the Orwellian slogan that Ignorance is Strength works in practice is detailed in a new book called Failure of Intelligence: The Decline and Fall of the CIA by former CIA analyst Melvin A. Goodman. Goodman relates how CIA analysts in the 1970s were encouraged to deliberately minimize reports of Soviet stagnation and to grossly inflate Soviet military expenditures, so that U.S. military spending could be dramatically increased to compete with what many senior CIA analysts considered phony Soviet threats.
The deputy director who made a career of changing CIA culture in the eighties so that its analysts set aside intellectually honest assessments of data in favor of skewed reports that followed political agendas was Robert Gates, our current Secretary of Defense. For the past twenty-five years, concludes Goodman, the CIA’s moral compass has failed. The result has been “an unending cycle of failure to tell truth to power.” Is it any wonder, then, that military spending under President Reagan more than doubled as the Soviet Union disintegrated and the Berlin Wall fell, or that compliant CIA operatives could be used by President George W. Bush to corroborate his false claim that there were weapons of mass destruction in Iraq?
Were we surprised to learn from David Barstow of the New York Times (April 20, 2008) that the Pentagon had a special program of hiring high-ranking retired military officers to be the “hidden hand” delivering the Bush administration’s party line on the war? Were we morally outraged, merely depressed, or looking the other way when we learned that public funds were used to create a propaganda machine aimed at the public, that most of the so-called analysts had lucrative financial ties to military contractors, and that the TV networks ignored the business and political connections of these hired hands?
And doesn’t it seem like business as usual when Juliet Eilperin of the Washington Post (June 3, 2008) tells us: “An investigation by the NASA inspector general found that political appointees in the space agency’s public affairs office worked to control and distort public accounts of its researchers’ findings about climate change for at least two years.”
And are we surprised to learn from Discover magazine (July 2008) that researchers for drug companies have routinely designed and interpreted their own studies “in ways that make even ineffective drugs seem like life savers,” because industry sponsorship of studies is “likely to yield pro-industry results”?
I could go on and on with examples of this kind.
We have, I think, good reason to be alarmed and a long way to go to put our house in order. That decadence and corruption exist at the highest political and professional levels suggests not merely that they will trickle down by example to the rest of us, but that they have trickled up from us, from what we as a democratic people have let ourselves become.
What should especially trouble us about these institutional distortions of truth is the quietly insidious way both the perpetrators and the audience for the deceptions accept them as a way of life. We now think of “spin”––the twisting and turning of information so that it serves special interest purposes––as what makes the world go around, a skill now routinely taught to and learned by those selling products, ideas, public policies, and salvation schemes. I’d like to suggest that spin only seems natural and right if, to use Huck Finn’s words, we are “brung up that way.” As our impulse toward honesty has gone silent or into hiding, our leaders––political and professional––reflect in large measure who we have allowed ourselves to become. When knowledge is politicized, when science is driven by market forces and government contracts, and when a profession is conceived as a club or self-interest group rather than as a cooperative association of passionate practitioners defined by the codes appropriate to a field’s standards of inquiry and conduct, how do we tell the truth about a product or manufacturing process our supervisor does not want to hear? How do we take time to explore a glitch when further study might delay a production plan? How difficult is it for a newspaper editor to craft an honest, well-considered opinion piece when a corporation that doesn’t like the opinion has just taken out a full-page ad in the editor’s newspaper? How can we be plain spoken about the professional inadequacy of a friend up for tenure or promotion when we bowl on Tuesday nights with that friend? Do we say, “It is impossible to praise this person highly enough”?
In a culture in which spin is the norm, how can spin not enter the classroom too?
Intellectual honesty is, I believe, under siege in part because of unprecedented technological and cultural circumstances not given sufficient critical scrutiny. Let me mention just three: First, the sheer quantity of information now available via the internet is mind-boggling and so often free of responsible oversight that it is tempting to become cynical about the possibility of establishing credible knowledge. On a whim I Googled “Capital Punishment” and was greeted with 4,590,000 possible sources of information, opinion, knowledge or wisdom concerning this subject. I then narrowed my search to “Capital Punishment Deterrent” (437,000 sources) and “Capital Punishment Does it Deter” (352,000 sources). If I were a student this overwhelming quantity of possible information would not inspire me to sort, distill and evaluate: It probably would send me quietly screaming away, or looking for an easy way out. What we used to call “news” is another example. Robert Darnton, discussing “The Library of the New Age” (NYRB, June 12, 2008) says, “News in the information age has broken loose from its conventional moorings, creating possibilities for misinformation on a global scale. We live in a time of unprecedented accessibility to information that is increasingly unreliable.” Though news has always reflected a point of view, the accepted professional standards of news reporting are now easily ignored by anyone with a web page or blog. We have allowed our democracy and technology to take us into an age when everyone has a right to an opinion, and all opinions, currently available by the billions to millions with a mouse, seem equal.
Secondly, our video technologies have made it possible for whole generations of youth and adults to accept consumerism’s self-promotional devices as a way of life that glamorizes deception. Most of us spend countless hours in front of various screens, with literally years of our lifetimes spent glued to commercials whose irresistibility depends on deception, distortion, and distraction, the clever manipulation of suggestions and symbols calculated to persuade us to buy things for mainly irrelevant and irrational reasons we are not given enough time to consciously think through. It is inevitable that we become in part what we consume––compliantly amused and conditioned by all those entertaining commercials and their alluring spin on things. How can we expect all this spin not to dizzy us, make the grounds of our knowledge uncertain, and the roads to narcissism and relativism wider?
Thirdly, we seem to be normalizing a culture in which the conflicting claims of religious belief and science are not being adjusted or reconciled. Our fragmentation of education is in part responsible for this failure, especially when tied to strictly vocational outcomes. The humanities too often turn a blind eye to the sciences and the sciences look down on the humanities, and the different modes of discourse they use to investigate and interpret experience are often––I think unnecessarily––seen as mutually exclusive. This bifurcation has allowed the gulf between popular science and popular religion to persist, indeed widen in recent years. While religious true believers see science and reason as threats, many in the science and technology fields live double lives: In their work they abide by the strict grammars of science, engineering, and mathematics while simultaneously believing, rather literally, in Noah’s ark, Jesus walking on water, and the Four Horsemen of the Apocalypse. If knowing follows belief, as Faulkner claims, and if we, as educators, have failed to resolve their conflicting claims, is it any wonder that religion and politics often trump science? When intellect, emotion and imagination fail to integrate their differences and that failure becomes a cultural norm, is it any wonder that intellectual honesty falls outside the norm?
We should not be surprised to see universities and the knowledge they profess lose their credibility and see their authority undermined. Several belief systems offer seekers after truth the opportunity to enjoy emotional expression, a genuine sense of community, and the secure sense of order provided by stories that imply simple, and in the worst cases, final solutions to all problems. As anxieties resulting from global pollution, plague, proliferation, and overpopulation increase we can expect the attraction of belief systems to intensify. This spells trouble for weak-kneed knowledge. “Memory believes before knowing remembers, longer than knowing even wonders.”
Belief is, I believe, necessary to our lives, and if, as the biblical wise man tells us, knowledge is sorrow, how can we justify what we do as professors of knowledge? How, in short, can knowledge have a say that might help balance a world trying to make sense of belief? We might begin by renewing our vows––make intellectual honesty the central article of faith in a deeply felt belief system we understand, share, and are willing to explain to people outside our comfort zones. We need no theology to do this, and it does not preclude merrily serving our private gods, or none at all. A recommitment to honoring and insisting on the rules of evidence and honest inquiry is long overdue. This creed would express itself in ritual practices, and above all it would require us to engage in an ongoing and open discussion of our moral choices and ethical standards. If we also deeply believe what we profess, and communicate to our students that we are passionate in our belief, then memory (call it our sense of history), knowledge (our arts, sciences, and criticism), and wonder (our hungering curiosity to know), may achieve new respect in our communities.
What are some basic ritual practices that might help generate this new respect?
First, professors of knowledge need to conduct ongoing discussions of professional standards. A profession is made credible by the quality of the knowledge its professors profess, not by the salaries they command. Nor is a profession made honorable when its members close ranks or slip into cowardly silence when the need to speak out is clear. Professionalism requires the articulation of the standards by which we empower ourselves to arrive at informed judgments in our disciplines.
It should go without saying that we need to maintain courteous, civil and open classrooms and lecture halls, in which individuals are invited to express contrary views. We especially need to invite into open discussion the most challenging contrary views, and to present and address those views in their strongest terms. It is important that we approach conflicting claims from the inside, empathetically, so that we see them in their best light. The hopeful promise on which intellectual integrity is based is that students will gravitate toward the more convincing claims, and that they will be stronger for having seen the best claims against a position they take. And it is important to take these habits of mind on the road, into communities threatened by or unfamiliar with serious intellectual pursuits. Like a good basketball team, we need to gain respect away from home, especially when the referees of discourse are unfairly calling fouls on us.
We need to teach intellectual honesty as a concept both by example and by direct application to specific problems. A warning in a syllabus about plagiarism will not do. And we need new pedagogies to address the problems intellectual honesty poses. These pedagogies must go beyond insisting on and testing for “right answers.” They need to emphasize the processes of thought and investigation by which we arrive at answers. In particular they should confront the fact that many students link learning with entertainment. How can honest learning be taught in a way that brings deeply felt pleasure? We might begin by emphasizing that learning is a quest rather than a product to be bought and sold, and that as such it offers the allure of the unknown and pleasures of the quest.
We especially need to be critical of our language and able to explain in clear terms the theories of knowledge that are the bases of our fields of study. Though they belong to different orders of mind, words like “fact,” “statement of fact,” “knowledge,” “opinion,” and “belief” are often used interchangeably, and a scientific theory is often equated with a so-called “theory” based on mythology, uninformed opinion, or personal whim. And let’s be frank: We need to call silly ideas and beliefs “superstitious” when that’s exactly what they are. When we lose control of our language, fail to make it precise and meaningful, we can expect it to control us. Words like “maybe,” “let’s see,” and “we don’t know” should be regular parts of our vocabulary. We should be proud professionals wary of that other kind of pride, hubris, which likes to visit insecure people who prefer to think highly of themselves. Let’s be honest about what we profess: It’s limited, cumulative, revisable, and subject to the blind spots inherent in our minds and instruments of measurement. The knowledges we offer to the world are merely likely or probable, and only sometimes probably or certainly the last word. Though the scientific conclusions about Evolution and greenhouse warming are highly probable, the studies are ongoing. In a good university the book is never closed on any topic. Secure within the belief structure of our knowledge cult is a special agnosticism working to energize rather than depress our curiosity about life. It is this that makes us merely and wonderfully a community of learners working to improve our world.
I find useful the distinctions made by James Carse (longtime director of religious studies at NYU) in his new book, The Religious Case Against Belief. There Carse distinguishes “ordinary ignorance” (“I don’t know the dimensions of a standard canoe, but maybe I can figure them out”), from “willful ignorance” (“Don’t tell me what my wife did with him in that canoe, because I don’t want to know”). But then there is what Carse calls “higher ignorance” (“We may never know what gravitational effect one molecule of a canoe might have on the most distant star in our galaxy, but if we knew how to figure that out we would”). Higher ignorance is earned through careful study and through an exhaustive search of the sources of evidence. It is all the blank walls we hit after trying our best to see our way through them. This kind of searching is profoundly complicated––and perhaps made exciting and pleasurable––by the self-examination of our own private beliefs. Perhaps our exams should also test the quality of what we do not know. Higher ignorance is the degree every university graduate should attain.
Faulkner begins Joe Christmas’ life journey in the corridor of a garbled school when Joe is a five-year-old, and Joe dies thirty-three years later crouched behind a kitchen table, never having set foot in a university and never having had that possibility enter his mind. Faulkner makes it poignantly clear that what killed him was a man, yes, an individual named Percy Grimm, but also a character who personifies the pattern of belief systems that link sex, race, and religion with violence. It should be obvious to all of us that many Percy Grimms are still with us. The suddenly small global world has many passionate believers not above violently trying to impose their views about sex, race, and religion. And we at home also have our own profit-driven business and scientific interests eager to magnify the destructiveness of war machines. We have a lot of important work to do, and need to do a much better job of giving our educators and our own Percy Grimms something better to believe.