Is it appropriate for professors—in the dark corners of university basements where curricula are determined—to label their students as naïve for believing in intelligent design?
Sadly, whether tacitly or explicitly, this is a common occurrence among college faculty.
Imagine sitting in a room with professors from several departments discussing the complexities of teaching and learning in order to improve learning goals. One professor, a geology expert, starts to smugly chortle (partly in erudite pomposity, partly in teacherly frustration), and begins to tell a personal anecdote: “I had a student the other day tell me that she can’t comprehend the age of rocks because she believes in intelligent design! [Professor sighs, shaking her head]. We give them all the scientific evidence and they just can’t accept it!”
The room lets out a collective laugh and then an English professor pipes in: “And some will just say that they feel intelligent design to be right!” Another collective uproar and a condescending sigh and shaking of heads from the group.
I’ve been in that meeting. But as an actively religious college professor (a seemingly rare pairing of characteristics these days), I was the one who didn’t laugh. Rather, I was puzzled and disturbed. Since when did accepting intelligent design become so easily equated with naiveté? And, more importantly, since when did it become acceptable to make fun of students—even if behind their backs—for their religious upbringing?
Oh, sure, in “informed” circles, it’s well known that Darwin “killed God” somewhere around the same time as the American Civil War. And academics and scientists have been riding that train ever since. And it is certainly no secret that religiosity is waning worldwide, a result of the spread of so-called reason.
But this conversation happened in a room with a dozen or so faculty who barely knew each other. Wasn’t it possible that someone, anyone (like myself) might actually have a predisposition for intelligent design? Wouldn’t there be fear of political incorrectness, in offending a possible member of the group? And wouldn’t there be some hesitation to make fun of students—on a college campus nonetheless—because of their socio-cultural roots?
Clearly not. Clearly, it seemed inconceivable that any of us “educated” folk would be so intellectually immature as to think otherwise; and, clearly, calling students out on such simple and wrong-headed thinking would only be natural and logical.
In numerous meetings throughout my career, I’ve heard similar open mockeries of the idea of religion. You’ve probably heard it, too: religion is for the weak, the uneducated, and even the bigoted. It isn’t for sophisticated folk.
Sure, I understand how people can, with such a contemporary cultural proclivity toward empirical evidence, “educate” their way out of religion. It isn’t exactly easy to believe in and follow an unseen God, especially if you weren’t raised to do so from childhood. But I’m baffled at how many are so willing to accept science as the position of proof, and then to consider dumb those who don’t follow suit.
Why is it that science has such a stranglehold on what we perceive to be real? Isn’t it obvious that science is an opinion, an ever-changing collection of human ideas, built out of theories, bureaucracies, and budgets? Science, albeit wonderful, is no more valid than any other form of knowledge, religious or otherwise.
But for an academic like myself to make such a claim seems borderline heretical. Or, at the very least, blasphemous. Isn’t science the physical proof of human progression? And isn’t it the byproduct of strict, rigorous method? Isn’t science what gives us accurate information about nature, about photosynthesis and cancer and solar energy? Isn’t science what’s real?
Yes. Certainly. But science also taught us that Pluto was a planet. Until it wasn’t. And did you know that, until the mid-1980s, scientists actually thought that you couldn’t treat ulcers with antibiotics? A rogue scientist had to drink a culture of bacteria just to prove to his scientific community that ulcers were caused by the bacterium H. pylori and weren’t stress-induced. And scientists still took another decade to get on board with the idea. This is a common occurrence. Science is in flux, changing through a process of consensus, morphing from truth to error to a new truth.
Of course, I point to science’s flaws with a pure love and exhilaration for scientific discovery. I don’t hate science. I just don’t know why we have to feel like it is the end-all be-all.
In fact, it’s fairly simple-minded to assume that scientific evidence is the core and only valid form of knowledge. Most, for example, would consider historical and forensic evidence as valid, too. But human knowledge doesn’t stop there. What about our emotions? I certainly don’t need scientific evidence to tell me when I’m sad or scared or jealous. The proof is inside me. But is that knowledge any less real than something discovered in lab coats and algorithms?
And we’ve all heard of people being inspired through dreams or having epiphanies in the shower. Are those not valid forms of increased understanding? Or what about cultural knowledge, like the fact that we Westerners view black as a color of death and sadness and that we physiologically react to it? In India, white has that same effect. Brain researcher John Medina (Brain Rules) has even noted that much of what we see in the world around us is affected by our cultural and familial upbringing. Evidence, in other words, is idiosyncratic.
In fact, just recently, nursing homes have discovered that patients who suffer from dementia and Alzheimer’s are coming alive again through personalized music collections on iPods. The effect only works if the music is tailored specifically to the individual’s life history and personal proclivities. Such personal enlightenment is beyond the bounds of science, but so very real to these people.
Yet, for a reason rooted in history, we Western societies overwhelmingly want to believe the numbers that the scientists produce. No matter what. And we tend to believe that those numbers somehow trump, no matter what, our life experiences, our internal knowledge, and our cultural, spiritual, and otherwise “intangible” understandings.
But science, just like internalized knowledge, is nothing more than an educated guess. Much the same way that individuals come to know themselves, their emotions, and their cultures, scientific knowledge emerges through experience, through trial and error, and through collective reconciliation.
Assuming that intelligent design isn’t valid because its proofs aren’t as visible as evolution’s is shortsighted; such an assumption doesn’t recognize that the scientific explanation of the world is just as incomprehensible to those of us who adhere to an understanding of creation as intelligent design is to those who believe in science.
If I dumped a pile of bricks, two-by-fours, and mortar in front of a scientist and told him that in 6 billion years it’d morph—through coincidence and evolutionary process—into a four-bedroom home, you can imagine the reaction I’d get. But that’s essentially what he is telling me with the Big Bang (although I think we’d all quickly admit that human life is far, far more complex than a brick-and-mortar building, let alone everything else that surrounds us). It’s not that the Big Bang is silly; it’s just that it is a collective theoretical perspective, grounded in one form of knowledge.
The Big Bang is a socially constructed viewpoint built over centuries of scientific evidence and consensus. And I completely understand why some would choose that as their method of understanding the universe. But I have a hard time comprehending why, so recently in the human timeframe, it receives any more validity than thousands of years of historical and spiritual knowledge that has long defined intelligent design.
I once interviewed a computer scientist who studied the effects of virtual reality on test performance. Her work was part psychology, part human-computer interaction. She allowed me to sit in on a doctoral presentation in her academic department where the speaker addressed the way the human brain responded to sensors. The material was, of course, way over my head. But what I found fascinating was the dialogue that happened after the presentation. Everyone in the room (all well-recognized scientists with PhDs and other noteworthy accomplishments) seemed to collectively agree that the research was amazing and effective. But everyone also seemed to have a different opinion about which algorithm, which theory, and which previous research to rely on moving forward.
When I asked this scientist about my observations, she confidently told me that that was a very common part of the scientific process: bickering over theories, choosing and redefining algorithms, reworking projects until the results emerged, and choosing appropriate literature to support the findings. But when I mentioned that the process seemed much like what I did (I was studying to get a PhD in rhetoric at the time), she about flipped.
“We don’t persuade!” she insisted. “We let the facts emerge and we share the truth. The results don’t lie.”
In all fairness, I never said that the facts lied. But they emerged through a process of deliberation, of trial and error, of community recognition and acceptance. And this is how all science works, as a process “in action,” as scholar Bruno Latour once said. Science is, in other words, a continuum of ideas and it is no more definitive than a collection of stories told over centuries.
Much of our belief in science is rooted in centuries-old thinking that knowledge is derived from experimentation, observation, and repeated results. Scientific method can be traced to early empiricists like John Locke and David Hume and Francis Bacon. These thinkers were insistent that there was no room for language and persuasion in science. And through the creation of an elitist group, the Royal Society of London, such thinkers ingrained in the minds of governments and laypersons alike that science and humanities are separate, that they don’t belong in the same conversation. Believe it or not, until that point (around the 17th century), spiritual, tactical, emotional, sensual, and experiential knowledge was considered a very valid part of knowing.
Imagine that. For thousands of years, human beings accepted inspired and divine knowledge as a part of truth. In the last 300–400 years, however, acceptance of “non-scientific” knowledge has more or less atrophied. And in the last few decades it has all but become a joke in many scientific and academic communities.
Since the inception of America, religious freedom has served as a key component of our culture. But while scientists and religionists haven’t always seen eye to eye, there has been at least a passive acceptance of both forms of understanding. Scientists did their work to improve daily life through medicine and technology; spiritual leaders did their work to promote the big picture, to give science its meaning. At the very least, each left it to individuals to make their own decisions.
But we are arriving at a sad and difficult state where the leaders in our educational institutions have all but ruled out the possibility of intelligent design. Scientific knowledge has become not just the impetus behind technology and medicine, but the sole and unified perspective on human understanding. And resistance to it is stifling not only religious tolerance, but religious freedom and the human spirit of learning and knowing.
It’s one thing to disagree or have a different educational perspective. But it is something entirely different to consider someone dumb and naïve for not accepting a particular way of knowing. The truth is, religious knowledge is just as valid as any other form of knowledge. And a truly holistic knowledge embraces simultaneously the many forms of understanding, beyond just science and religion.
As I sit through more and more situations where belittlement of religious knowledge becomes commonplace, I can’t help but wonder: What are our future students coming to expect when they enter college? Is it fair to say that their religious voice is being silenced through a fear of scientific ridicule? Are our institutions of greater learning actually restricting the breadth of knowledge by subtly (yet forcefully) insisting that religion is nothing more than a childish fantasy?
If so, it seems that we’re headed towards a fairly dreary era in human existence.