Imagining Unbelief




My grandmother was a sturdy soul.  Her life consisted of taking care of her demanding German husband, incessant cleaning of a spotless house, speculating about the conjunction of rain clouds and her arthritis, and calling the church rectory for updates on mass times and confession.  She came from a large, loud, tuneful Irish family, pronounced film as “filum” and laughed at jokes three minutes ahead of the punchline.  “Hey Nonnie,” I would say, “Did you hear the one about the priest and the chiropractor?”  The laughing would start ere the words were out of my mouth.

She was patient, gullible, superstitious, carping and kind.  She didn’t like dogs or most of her neighbors, squinted at dust, sermons about Mary, and occasionally at me.  If she had secrets or dark corners to her existence they were buried with her and will remain forever unknown.

She now exists in photographs–often with the image of my grandfather standing in the background with a slight frown–not wishing to be in the picture but unwilling to move entirely out of range.

The photographs are important because when they were taken–mainly in the 1960s–pictures were a bit of trouble: camera models, film, exposures (as in number of), light and focus were part of the vocabulary. No snapping your cellphone at any stationary or moving object that caught your fancy and then uploading images of you and your best friends by the dozen for the delectation of complete strangers.

I have a theory that the less complicated picture-taking and image-making have become, the less sophisticated our memories and imaginations have become–a complaint some social theorists have leveled at “comprehensive” museums and zoos.  Imagination is not stretched.  Memory is not exercised.  Connecting impressionistic dots, sometimes captured years apart, is not required.  We live in the eternal present of the utterly familiar and the easily available Now.  History is not needed to explain the familiar.  We know all about it.  Thus history is a primary casualty of the widespread feeling that the unfamiliar–especially the past–is alien to the Now.

The tandem growth of religious illiteracy and EZ atheism emerges from the same matrix, one where what is “new” is regarded as good and what is old, or requires time, patience and interpretation, is regarded as irrelevant.  As the cultural gospel of America has always cherished this principle anyway (“A country without history for a people without memory”) the imagination crisis is especially prevalent in the USA.  Nowhere is religious crudity cruder; nowhere does it saturate politics more thoroughly or with greater dull predictability.  Discount atheism, especially of the new and in-your-face variety, is nowhere more disagreeable or less philosophical.

Henry Ford: "History is bunk."

It is enough for the American Catholic to know when the pancake breakfast begins (“after the 9 o’clock”), never mind the aesthetic torpor that his church offers as a sedative for his under-active conscience or the essentials of the faith he never bothered to learn.  It is enough for the liberal protestant to know that a collection is being taken up for Tsunami victims and for the conservative Christian to live in the cozy knowledge of Jesus’ saving grace–which entails the belief that abortion means killing babies and that Democrats want to demolish churches and put up mosques. It is enough for the atheist to see the deformed opinions of the religious majority as proof positive that he is right: God doesn’t exist and religion is for imbeciles.

The fact is, all four of the above have developed their beliefs through packthink.  Stem cell research does not entail killing babies.  America is not a Christian country.  Believing in God is not the same as belief in elves, fairies, and the Loch Ness Monster.  To be fair, the Catholic did not arrive at her position by reading Aquinas or the Protestant by reading Jonathan Edwards or the Muslim fanatic by reading Ibn Rushd or the atheist by reading Julian Huxley (an atheist supporter of Teilhard de Chardin, a Jesuit who had read Aquinas).  They got there by reading pamphlets and the back end of cars.

Julian Huxley

What each group seems to be happy with is the discounted version of the “faith” they have chosen to embrace.  Coming “out” atheist, a mildly cool social stance similar to coming out gay in the nineties, requires the same level of intellectual commitment as coming out Christian, a mildly cool stance of the 1970’s when unseen forces (in Washington) convinced the believing masses they were in for a new persecution by neo-pagans, secular humanists and freedom-hating liberals.

Our presentism, symbolized in the free flow of limitless images and text messages, no longer needs ideas to survive.  That is why bumper stickers have replaced chapters in books as the all-you-need-to-know summation of belief and unbelief: “My Boss is A Jewish Carpenter,” “I Support a Baby’s Right to Choose,” “‘Worship Me or I will Torture You Forever’–God,” “Organized Religion: The World’s Biggest Pyramid Scheme.”  The hostility among groups and even within groups is not about ideas but about what one side is prepared to believe about the other: fakery not fact, histories robbed of historical location and philosophical positions devoid of premises and analysis.  It is a contest for followers lifted out of the Forum and plonked down into the Colosseum–where both sides will eventually lose.

Which brings me back to the lessons we can learn from photographs.  It isn’t the case that religion has not evolved.  But it is the case that religion has been, in evolutionary terms, unsuccessful in explaining itself to the twenty-first century–and to much of the twentieth. The increasing drowsiness of the flock when it comes to core doctrines may be a blessing for beleaguered theologians who otherwise would have to go on defending what the faithful have ceased to care about.  “Average” believers have defaulted to ground where they are more comfortable–to social issues and sexual ethics, buoyed by a thin belief in scriptural authority and a woeful lack of information about the warrants and religious justification for their commitments.  As religion can only thrive when its explanatory mechanisms are coping with change, its explanatory failure will ultimately prove to be catastrophic, and no new theological idiom will arise to save it.  In my opinion, this has already happened, and not only in liberal and radical circles.

This should serve to make atheism triumphant, but it doesn’t.  If theology has lost its voice and credibility, atheism has lost its imagination and coherence. It has done this by offering, instead of a vision of the godless future, the absurdities and atrocities of religion as the sum total of its own rectitude.  There is nothing wrong with itemizing the failures and hypocrisies of religion; but it does get repetitious after a while, and then the question becomes the Alfie question: What’s it all about?

And there is this detail: The errors-of-religion motif does not originate with atheists but with religion.  It goes back to the reform movements of the late Middle Ages, and to the Reformation itself, unique among the chapters of western civilization in its brutal treatment of popes, doctrines and sacraments.

Reformation cartoon of the Pope as Antichrist

Religion has traditionally been the best guarantor of reform within religion, controlling the excesses and extremities of the religious appetite for a thousand years.  It succeeded in keeping the beast from devouring its own tail by offering better ideas, different “truths,” a simplified diet and an accommodating attitude towards movements that would finally grow up, leave home, and not write back–secularism and humanism to name two.  What it never did, or was never prepared to do, was to offer no religion in lieu of bad religion.  It has survived into an era where many opponents have joined the chorus that all religion is bad religion.

Yet for atheists to assume that their rejection of God is anything more than an opinion based on snapshots of what they know about Catholics, Jews, Muslims and Protestants is a misshapen view of their accomplishment.

The aggregate outrages of religion do not constitute a proof of God’s non-existence, nor establish a moral case for atheism.  The accumulation and “sharing” of snapshots of things that are plainly ridiculous about religion does not enhance the claim that unbelievers are smarter than believers.  The documentation of error is not the same as the discovery of the truth.  Ridiculing the beliefs of our distant faith-obsessed ancestors or the profanity of violence that seems to soak the pages of the Hebrew Bible, and more recently the Qur’an, belongs to other centuries: it’s been done.  It’s good for a laugh, or a gasp, not for a lesson.

And a final thing. If the contemporary atheist is really interested in the harmful effects of religion, he is up against two truisms that run counter to evolutionary wisdom: the adaptability and survival of religion, despite texts and practices assumed to be harmful to human society, and the fact that atheism has so far struggled unsuccessfully to replace religion with a new diagram of human values.  Unlike Alvin Plantinga, I don’t regard these phenomena as real facts, as “evidential” of the truth of religion, or as reliable justifications of religion based on common sense.  This is because I haven’t the foggiest idea of what it means for religion to be “true” in the sense analytic philosophy comprehends the term.

But I do have an idea of what values religion expresses idiomatically and crudely in ways that have occasionally challenged the human imagination.  If religion has a survivability quotient that can be expressed in evolutionary terms, it is a human quotient.  In their independent ways, the atheist Julian Huxley and the believer Teilhard got that much right.

Blessed are the peacemakers...

I personally believe that the survival of religion can be explained in purely rational ways, and with no guarantee of lifespan.  I also happen to believe that atheism, if it is an informed and historically critical atheism–aware of its own past as well as of the religious past from which it artificially emerges–can develop new templates for human value that test the imagination in the same way that the interpretation of images and artefacts from the human past test, and are resolved in, the imagination through religion.

The elevation of atheism from opinion to something of much greater consequence begins when we see that belief and unbelief are aspects of the same reality.   Looked at in the starkest light, belief is only the other side of unbelief.  It is not a distinction that has the valence of right and wrong. It is pretty clear which came first, what images became dominant, which ones were lost in wars, through subjugation, and by assimilation.  Just like your family album when images were scarce, real and not easily improvable, the total picture of religion that the atheist is called upon to interpret is complex and requires a thoughtful charting of the distance between the rarefied image and the inquirer, a conversation between past and present which is more than an indictment of crimes.  It requires, as Gauguin said about imagination, “shutting your eyes in order to see.”

Dimming the Brights? The Debris of the Dawkins Revolution

There used to be two kinds of atheists: those who lost their faith and those who never found it. The kind who never found it–people like Isaac Asimov and Richard Feynman–had fathers who actually never encouraged their kids to think there was anything to find.

Those who had it and lost it–people like Steve Allen, Julia Sweeney, Seth MacFarlane and George Carlin–seem to have been equipped by their church for a life of infidelity and enough material to last a lifetime.

There are atheists who came from the fields of course: the Worldwide Church of God seems to be doing its share to produce them, and the nuttiest of the nutty brood will probably spin off dozens more by natural selection. Fundamentalism has been helpful in producing outrageous opinions and claims that have sent rational minds screaming from the congregation, and it deserves some credit for this.

The lesson in this highly informal typology is that “strong” religion seems to produce more unbelievers than mainline “soft” religion, for the same reason that oysters produce pearls. It’s the “grate factor.” I hope I haven’t offended too many Episcopalians by saying that they are not doing a good job in this respect: the fact is, they are out in front on a number of social issues that wouldn’t be substantially improved by their becoming atheists. “God” is a small (very small in some cases) price to pay for social progressivism.

There is however a new wave of atheism, neither alienated Jew, Catholic, fundamentalist nor profoundly secular from birth. It worries me just a little–though it–the wave–is young, pretty smart, highly sociable and will probably vote for Democrats. That is reason enough in my book to go easy on it. After all, there are enough yahoos out there in Wonderland to worry about without offending our friends. For that reason, it doesn’t worry me very much.

New wave atheism follows in the wake of the Dawkins Revolution and book tours that featured the so-called New Atheists–but especially Dawkins himself. I don’t think for a moment that other new atheists aren’t charismatic, but of the lot, Dawkins and Christopher Hitchens, who is the Hume and Dr Johnson of our time rolled into one, take the prize for saying the kinds of things, in the right accent, that sound authoritative because they’re said in the tropes of Oxford, whence cometh our hope.

The bad reviews of The God Delusion and God is Not Great stressed that theologians had been having the conversation about classical arguments for the existence of God for a hundred years and had, basically, laid them to one side. They stressed that liberal and radical theology had long since moved beyond the ossified categories of Christian thinking: that no smart person took the Bible literally anymore. Aquinas? Who needs him? Ontology? Eleventh century stuff. Hadn’t theologians, the critics raged, especially in America and England, been using the term “post-Christian” for a generation? Perhaps, but almost no one had paid attention because no one reads theology except divinity school students and other theologians.

No one in any position to cause sales to jump was reading the “professional” books where radical theology had given up on God. And even if ordinary readers had read it, there was the honest sense that if you are at the point of saying that your theology is post-Christian, that Jesus is not the son of God, that miracles are hooey, and that the Bible contains ideas that have been retardant in our culture–you really ought to pack your bags and go away.

There was something elementally refreshing about seasoned scholars and journalists taking on the absurdity of some of the classical argumentation as though they had just discovered it, which for the most part they had. The criticism–which I made on this site as well–that journalists and scientists may not–odd to say–be especially well-qualified to talk about religion seemed petulant and jealous, which of course it was. Who wouldn’t rather have written The God Delusion than Defeasible Assumptions in Plantinga’s Epistemological Reliabilism Argument? I know I would.

So this is not really about getting to be an atheist by shortcutting: not all of us can have a radical Jewish father who wants to keep us away from Torah, or a run-in with Sister Mary Margaret (when there were Sister Mary Margarets) over the plausibility of the Assumption of Mary into heaven. (I was expelled for rolling my eyes).

It is about a rapid relocation of attitudes: people who have made a fairly quick progression from some belief (or not much of anything) to atheism without having at least some of the same background as the New Atheists themselves. It is about the danger of any kind of hero worship and fan-clubbism substituting for a critical assessment of sources. It is, frankly, about idolatry.

The conversation reminds me most of feminism, or rather the divide between first generation feminism and where we are now. The survivors of the sixties and seventies who broke down walls, challenged a sexist system, broke through ceilings and populated professional schools and academic departments with members of their own sex are now confronted with women who either don’t know the story or only know it as yawnable history. The world they have come to inhabit is not the world their grandmothers (yes, grandmothers) fought for. Judging from the number of African American Republicans maybe the same is true of that community: memory is short.

But the dues-paying comparison doesn’t work perfectly.  There are doubtless atheists out there who feel they earned their right to disbelief.  But a strong tranche of movement-atheists would argue that it doesn’t matter how you get there, just so long as you get there. There are no dues to pay. Atheism is not built on the abuse, bones and ashes of courageous predecessors, as was the case with the women’s movement or civil rights. If you get there from reading mass-market paperbacks or children’s stories by Philip Pullman (who is a friend, by the way) or a couple of titles by Dawkins, so be it. It will do.

What matters to movement-atheists are the numbers: getting that meager 5% or 6% of professed unbelievers up to 10% or 15%–where the movement can claim some political advantage and not be relegated to the irrelevance that has always been the lot of American atheism. As a movement, the American idolatry of the British atheist “style” has helped–so much so that bus campaigns and bumper stickers are now studiously modeled after the campaigns of the British Humanist Association, which itself promotes and benefits from the work of Dawkins and his comrades.

I feel terrible quibbling about this because soon enough it sounds like a quibble about being a good Catholic or a bad Catholic. Do you go to Mass Sundays? Great. Wednesdays and Fridays too? Even better. Hate abortion to the point you’ll march and picket? Best. It ought to be a cardinal tenet of the tenetless philosophy called atheism that no such gradients should arise within the movement. As in Islam, you really only have to believe one thing–or rather, disbelieve it.  In that sense, atheism is or ought to be a settled or definitive position, without qualification–like being pregnant, not like being a Presbyterian.  Atheists often write to tell me that I confuse their exquisitely simple position about God with more comprehensive philosophies like humanism, where gradients are possible.  Yet exquisitely simple atheism has long been the sine qua non of movement humanism, especially in England.

But my quibble is not with cynical efforts to jack up the numbers or the promotion of heroes as magnets to the cause. That’s the way movements work.  It’s the way religious denominations work as well, and they haven’t had a hero for a very long time.

My concern is over the fact that many of the idolaters are now not reading the sources of their distress, not really aware of any but the most contemporary reference points in their estimate of a fundamental religious question.  It is a destination without a journey behind it.

The Bible is considered toxic, in toto; religion, a long history of superstition, distress, and violence–even some of the art, music and literature of the western tradition, expendable expressions of priestcraft and supernaturalism.  In the most extreme cases, the present is regarded as having a juridical role to play toward the past, when people believed silly things.  History becomes a series of mistakes with respect to scientific outcomes and has nothing to teach us but the error of our ways.  What has been tainted by religion is not worth our time, not worth investigating because our vantage point makes it ridiculous. When this attitude takes hold, it is not just God who is disbelieved in: it is culture.

At this point, the debris of the Dawkins revolution becomes problematical on two counts. On the one hand, it permits the new wave atheist to reduce everything to a single proposition–God does not exist–and then to evaluate the entire history of western civilization according to an opinion that has been reinforced by similar opinions but never really tested against the sources. The opinion that God does not exist is an important one. It deserves scrutiny. But it does not deserve doctrinal security as though infallibly propounded by a secular pope.

We cannot cast off the literary and artistic history of our civilization, from Plato to NATO and Bible to Blues, without knowing at least a little something about the creators.

In 2002, a number of students enrolled in my course in Civilization Studies at the American University of Beirut walked out of the classroom, in a staged protest, as we began to examine the book of Genesis. It was a book that had been excluded for a dozen years from the syllabus because it raised the temperature during the long Lebanese Civil War. I had made it plain that the story was a story; that some people thought it was historical, but that scholarship had shown it was a typical Near Eastern creation myth with a half dozen well preserved cousins from earlier in the millennium. But my careful historical framing was of no consequence. The students who protested were not Muslims; they were Lebanese Christians who regarded the Old Testament (which of course is in their Bible too) as “Israeli” propaganda.

The point is, of course, that an educated and informed atheism is a very desirable perspective. But an atheism that depends on the authority of others is no better than the political opinion that excuses Arab Christians from knowing something about the ancient history of the part of the planet they occupy.  Unfortunately for the new wave,  atheism has a long history–one that goes back far before 2005.


Matthew Arnold used the term Philistine to describe a set of values prominent among people who despised or undervalued art, beauty, and intellectual content. Despite his problematical approach to the Bible, which was neither credulous nor entirely respectful, he retained it as a key text in his educational canon.

The worst trait of the Philistine as Arnold painted him was his materialism, the preference for quick and easy fixes, a mass produced painting instead of a developed aesthetic sense.

Quick-fix atheism is that kind of atheism. I think it needs to be worried about, if only ever so little.

Skeptifying Belief, by Van Harvey

Van A. Harvey is an emeritus professor of religious studies at Stanford University. Twice a John Simon Guggenheim Fellow, he is the author of A Handbook of Theological Terms, The Historian and the Believer, and the award-winning Feuerbach and the Interpretation of Religion, as well as many scholarly articles and reviews. This paper originated at the conference “Scripture and Skepticism” (2007) at the University of California, Davis, under the auspices of the Committee for the Scientific Examination of Religion and the UCD Department of Religious Studies.

The Historian and the Dog

The two great intellectual revolutions in modern Western culture were the Enlightenment in the eighteenth century and the awakening of the historical consciousness in the nineteenth century. The themes of the first are familiar to us all: the notion of natural rights, the emphasis on reason rather than faith, freedom of the press, and the separation of church and state. The themes of the second, however, are not as easy to specify, though no less revolutionary:

  • Humankind is immersed in history like a fish in water.
  • Thoughtforms of ancient cultures were radically different from our own.
  • Most important, the recovery of the past requires the work of disciplined, critical, historical reasoning.

The awakening of the historical consciousness gave rise in the nineteenth century to a new discipline that soon took institutional form in the university: departments of history. The study of history became a profession with its own learned societies, journals, and organs of expression, together with prizes and hierarchies of prestige. As an intellectual discipline, history had its own subject matter, categories, and procedures for the identification and adjudication of issues.

Driving the practitioners of this new intellectual discipline was an almost Promethean “will to truth.” The aim of the new historian was, as August Wilhelm Schlegel once wrote in his review of the Brothers Grimm’s Old German Meister Songs, to find out “whether or not something actually happened; whether it happened in the way it is told or in some other way.” This formulation has been criticized by postmodernists, but it should not be forgotten how revolutionary it was. Only when this “will to truth” was consistently and radically followed were we able to separate myth, legend, and actual occurrence, and to realize how so much of what we had previously accepted as fact was, in truth, fiction. We discovered that so many long-trusted witnesses were actually credulous spinners of tales.

It was inevitable that the methods of critical historical inquiry would be applied to the Jewish and Christian Scriptures and that there should emerge what came to be called, in shorthand, “the historical-critical method.” This was not so much a single method as a series of questions that could only be answered by using critical historical thinking.

  • When, by whom, and for what purposes were the texts written?
  • What sources did the authors use? What do the texts tell us about the self-understanding of the community that preserved them?
  • To what extent are the historical narratives in the texts reliable, and to what extent do they constitute historical knowledge?

Just raising these questions threatened, naturally, those Jews and Christians who believed the Bible to be divinely inspired and, therefore, historically inerrant. And, since the answers to those questions contradicted traditional answers, the fundamentalists in these religions attacked what they called “the higher criticism.” The Roman Catholic Church established a Biblical Commission to assure that no Roman Catholic scholar would advance any historical conclusion incompatible with church doctrine. But it was not long before liberal Protestant and even some Roman Catholic scholars saw that it was futile to resist the new biblical scholarship, and so they appropriated it, with some even arguing that it placed genuine Christian faith on a sounder historical footing. Lay conservative Christians clung to the traditional view of inspired Scriptures, but historical-critical studies of the New Testament became standard components in the curriculum of the most prestigious theological seminaries and university-based departments of religion. This more or less remained the situation until the past half-century. But there has suddenly emerged a new set of challenges to the critical historical inquiry of religious texts.

Strauss: Myth is the essence of the New Testament

These challenges come not from fundamentalists and evangelicals but from academics and intellectuals of various sorts. Partly under the influence of new philosophical and hermeneutic theories loosely grouped under the unimaginative rubric of “postmodernism,” there has been a backlash against the historical-critical method.

It is not easy to generalize about this brand of postmodernism, because it is woven from many intellectual strands that are not always compatible—some are philosophically sophisticated and some are not. Among the sophisticated is the very influential, since modified, interpretation of science by Thomas Kuhn. He argued that science does not deal with facts “out there” to be interpreted, but that facts are only identified within some conceptual framework, some paradigm.

There were other philosophers of science who argued that there can be no representation of facts without some observation language, and no observation language is theory-free. There is, so to speak, no “given” that can be described neutrally and objectively. Along with these philosophical movements have emerged new hermeneutical theories. These theories tend to argue that there is no one “best” interpretation of a text and, consequently, any reading of a religious text depends on the standpoint of the interpreter. The framework of assumptions and conceptions employed by a given interpreter is referred to as a “hermeneutics.” There is, it is claimed, a difference between a “hermeneutics of recollection” and a “hermeneutics of suspicion”: a difference between a sympathetic interpretation that seeks to retrieve religious meaning and a hostile interpretation that aims to debunk it.

This, in turn, has sometimes been formulated as follows: the interpretation of someone who believes in the truth of a given religious text will be different from someone who is a skeptic, and, since there are no objective grounds for preferring one interpretation over the other, a hermeneutics of belief is as legitimate as one of unbelief.

These various philosophies and hermeneutical theories have now emboldened religious conservatives and apologists to claim that their interpretations are as intellectually legitimate as those of the historical critic. Everyone has his or her own interpretations, the argument goes. The critical historian presupposes that the supernatural does not intervene in history and that miracle is impossible, whereas the religious believer not only believes this intervention is possible but that it happens in given cases. The conflict between them, then, is not so much a confrontation between naïve religious belief and objective scholarship; rather, it is a hermeneutical conflict. The historian approaches his or her subject matter with the presuppositions of a nonbeliever; the religious person reads it through the “eyes of faith.”

Hegel: "spiritual" father to left-wing biblical criticism

This point of view seems plausible to many laypeople, and, since few of them read biblical scholarship or grasp the structure of historical inquiry, they become hostile toward biblical criticism. Perhaps it was once possible to dismiss this public ignorance of critical historical inquiry, but the events of recent times show that this is no longer the case.

Public ignorance of critical scholarship and the rejection of critical historical inquiry in so many circles now profoundly affects our culture and politics. Many argue that the refusal to submit the Qur’an to critical historical inquiry has been disastrous for Islam. But one might also argue that it is equally catastrophic that the West, which invented historical criticism and employed it for a century, is now confronted by a widespread ignorance and rejection of one of its most impressive intellectual accomplishments.

In what follows, I will not take up all the various versions of postmodernism, or what I label the “everyone has their presuppositions” gambit. Ultimately, the answer to all of these arguments lies in a proper understanding of the nature of critical historical reasoning. But since I could scarcely hope to accomplish that task in this brief essay, I shall concentrate on advancing two related arguments. First, the widely invoked distinction between a hermeneutics of recollection and a hermeneutics of suspicion is irrelevant to the practice of critical historical inquiry and cannot be used to justify what is called a “hermeneutics of belief.” Second, although the historical-critical method does practice methodological skepticism, this skepticism is not necessarily rooted in hostility to religion but is inherent in the logic of critical historical inquiry itself.

The distinction between two types of hermeneutics—one friendly toward religion, the other hostile—was first made by the philosopher Paul Ricoeur in his 1970 book Freud and Philosophy: An Essay on Interpretation. He called the first “the hermeneutics of recollection” and the second “the hermeneutics of suspicion.” The hermeneutics of recollection names a type of interpretation that is basically sympathetic to religion because it assumes that the religious consciousness is in touch with something real. The best contemporary practitioners of this type of interpretation, Ricoeur thinks, are the phenomenologists of religion, who claim that it is only possible to understand religion if one attempts to “get inside” the religious consciousness and apprehend what it apprehends, albeit, he writes, “in a neutralized mode.” The phenomenologist argues that interpreters of religion must take the religious consciousness and its object—the sacred—with the utmost seriousness; indeed, they must be willing to accept the possibility not only that there is a message embedded in the symbolic utterances of religion but that this message might even have relevance for the interpreters themselves.

To use the language of Protestant theology, religious interpreters must be capable of living in the expectancy of a new “word” and thus achieving a type of faith, one that has passed through the fires of criticism—a second naïveté, to use Ricoeur’s language.

Practitioners of the hermeneutics of suspicion, on the other hand, regard religion as illusion. Their skepticism about religion is grounded in a theory regarding human nature and behavior that they think explains religion’s origins and persistence. They regard the religious consciousness as a false consciousness; the object of interpretation, then, is to expose this falsity. In order to do this, they rely on some underlying psychological or sociological theory that they think not only explains the origin of the religious illusion but provides the key for decoding the symbolism contained within it. Thus Freud’s theory of childhood dependence on parental figures, or Durkheim’s theory of the collective consciousness, guides their interpretative work and explains how the manifest meaning of a religion is really a function of some latent meaning. The aim of these interpreters is not to understand religious expressions but to demystify them.

It is worth analyzing more carefully this distinction between two types of hermeneutics. When we do, I think it will become clear why it cannot legitimately be used by religious apologists to claim that their faith-based interpretation of scriptures is simply an instance of the hermeneutics of recollection, while the historical critic’s methodological doubt is a manifestation of unbelief and suspicion.

But before considering those issues, it is important to note that Ricoeur’s distinction hardly covers the range of religious studies. Many types of religious inquiries do not fit into either of his categories. They spring neither from an a priori sympathy nor from hostility toward religion, and they are concerned with subject matter other than what Ricoeur calls “the religious consciousness.” Religious scholars might want to know how a given doctrine or belief developed over time. Or they might want to know the status of women in Gnostic communities. Or they may be interested in the concept of heresy. In inquiries such as these, the nature of the religious consciousness and its object might never arise.

But, even if we accepted Ricoeur’s dichotomy as exhaustive, we would have to insist that being sympathetic to the religious consciousness is not the same thing as believing in the religious object of that consciousness. We can see this at once if we consider the method of the phenomenologist of religion, which Ricoeur thinks best embodies the hermeneutics of recollection.

Methodologically, phenomenologists “bracket” or suspend their own beliefs and presuppositions in order to “get inside” the believer’s consciousness. Even if it were the case, as Ricoeur claims, that phenomenologists listen to what the religious believer says in the hope of hearing an existentially relevant “word,” listening involves a twofold possibility: that there is something to be heard, but also that there may only be silence. Listening implies openness, which is to say, listening is not yet hearing.

Religious belief, however, is neither listening nor openness. Belief is just the word we use for having already reached closure, for having already heard and accepted. Moreover, religious belief has content. It does not consist of believing in a word in general, but of believing in particular words. The Muslim hears the words associated with the Qur’an, while the Christian’s words have to do with Jesus of Nazareth and his resurrection. Insofar as they are believers, neither can be said to be mere listeners—or phenomenologists in Ricoeur’s sense of the term. A hermeneutics of recollection is not a hermeneutics of belief.

When we understand that religious interpreters’ beliefs are quite specific, we can also understand why a certain type of religious believer is not only hostile toward biblical criticism but makes critical historical reasoning impossible. This is quite clear in the case of the fundamentalist but also, as we shall see, in the case of the more sophisticated believer who takes certain narratives to be true on faith. Christian fundamentalists make critical historical inquiry impossible, because they claim to know in advance what any such historical inquiry will yield. They foreclose all the questions for which critical historians seek an answer: What are the various strands of authorship in the books traditionally associated with Moses? How many of the Epistles attributed to Paul were actually written by him? Was there an oral tradition underlying the Gospels attributed to Matthew, Mark, and Luke? Did Jesus claim to be the Messiah? Did the earliest belief in the resurrection of Jesus make any reference to an empty tomb?

Of course most Christians are not fundamentalists, but there is, nevertheless, a very large percentage of them whose faith is bound up with the confidence that most of the narratives about Jesus are true, excluding perhaps some of the less-believable miracle stories. They believe that he claimed to be divine, that his preaching is found in the Sermon on the Mount, that he was crucified, raised again, and ascended into heaven. Insofar as these beliefs are held on faith, they are subject to the same criticism one might make of fundamentalism—namely, that they foreclose historical inquiry.

We tend to think that this confidence in the truth of the narratives of the New Testament apart from any historical inquiry is naive, but Alvin Plantinga, one of the most sophisticated analytic philosophers in the United States, who is known for his work on warrants and the logic of possible worlds, has argued in his lengthy book Warranted Christian Belief that it is not. Plantinga first maintains that epistemological foundationalism—that is, the position that all that we can properly call “knowledge” ultimately rests on certain noninferential truths—is incoherent. (This rejection of foundationalism is, incidentally, widely shared in contemporary philosophy.) He then argues that it is just as rational to believe in a sensus divinitatis that provides an immediate and noninferential awareness of the truth of theism. Moreover, just as the sensus divinitatis guarantees the knowledge of God, so the Holy Spirit testifies to the Christian that the great things of the Gospel found in the Scriptures are true.

The extraordinary implication of this view, as Patrick J. Roche, one of Plantinga’s critics, has pointed out, is that the ordinary Christian who has no knowledge of biblical languages, textual criticism, theology, or even of history can nevertheless come to know by appealing to the Holy Spirit that these Gospel events are indeed true; furthermore, his or her knowledge need not trace back (by way of testimony, for example) to knowledge on the part of someone who does have this specialized training.

There is, however, a more widespread challenge to historical criticism inspired by postmodernism that does not depend on an appeal to the Holy Spirit. It is what I have called the “presuppositions” gambit, which goes something like this: every historian has a standpoint that rests on certain presuppositions. Christian historians simply have different presuppositions than those of unbelievers. They believe in the possibility of supernatural intervention, whereas the historical-critical method rests on the presupposition of doubt regarding this possibility. This doubt is equivalent to unbelief and suspicion. It determines who the historian will accept as a credible witness and what the historian will count as evidence.

The claim that “every historian has his or her presuppositions” seems initially plausible to educated laypersons, but, when we unpack that generalization, it becomes clear that it cannot be used to legitimize what the religious apologist claims it does. When historical relativists make this claim, they usually mean that every historian’s judgments reflect certain basic general assumptions, say, about human nature or factors shaping historical causation.

But, because of the generality of these assumptions, they are reflected in the historian’s work whether they are writing about the American Revolution or the rise of the papacy. The historian’s assumptions apply across the board of history, so to speak. A Marxist’s interpretation of the Protestant Reformation, for example, will differ from that of, say, a Freudian.

But, when Christian apologists eager to defend the uniqueness of the Christian perspective use the term presupposition, they generally refer to something specific, like supernatural intervention in history. Used in this way, the word presupposition carries quite a different force than it does for the historical relativist. The Christian is not using the term to refer to a set of general assumptions that apply across the board of history but, rather, to exempt one narrow stretch of history from all those general assumptions that historians of various stripes use in their inquiries. The religious apologist is using the term to justify the suspension of all the assumptions we normally employ when interpreting our own experience and that of others. The point is that the alleged sacred events are so unique that no normal presuppositions apply.

It is important to distinguish between these two uses of presupposition, because only one of them permits rational assessment of historical claims. Marxists and Freudians may disagree in their interpretation of the causes of the Protestant Reformation, but they can still rationally discuss whether Luther did, in fact, draw up the Ninety-five Theses, or whether he had the childhood experiences that Erik Erikson claims he had. They will argue over the relevant evidence, but, if one of them claims that an angel dictated the Ninety-five Theses, the argument will come to a standstill. In fact, even if two historians have the same religious presupposition—say, that divine intervention in history is a possibility—they will still have to pore laboriously over the evidence in order to decide whether any given event did, in fact, happen. In most cases, the presuppositions of historians are broad enough not to tip the scales in favor of any particular factual argument. But, if we identify presuppositions with certain specific beliefs arrived at on faith about particular events, then there are no general principles to which one can appeal when differences of opinion arise.

But any sophisticated answer to the presuppositions gambit must stem, I believe, from an understanding of the aim and methods of critical historical inquiry itself. As the philosopher A.O. Lovejoy wrote:

Though the inquiries of the historiographer, especially if they relate to events remote in time, are often more difficult, and sometimes at a lower level of probability, than the inquiries of courts, they have the same implicit logical structure, which is simply the structure of all inquiry about the not-now-presented; and if they are historical inquiries, and not criticism or evaluation, their objective is the same—to know whether, by the canons of empirical probability, certain events or sequences of events, happened at certain past times, and what, within the existential limits of those times, the characters of those events were.

The Christian apologist, of course, will argue that it is just this appeal to “empirical probability” that is at issue, because the events in the New Testament are admittedly empirically improbable. They are, in the nature of the case, unique and, therefore, require faith.

The critical historian could probably surrender the language of “empirical probability” if that seems overly restrictive, but what the critical historian cannot surrender is the notion that the inquiries of historians, although they relate to events more remote in time, have the same logical structure as all inquiries regarding the not-now-presented—which is to say they resemble the same sort of inquiries that take place in our law courts, newspapers, and investigative panels of all sorts. We think historically when as parents we try to ascertain who scribbled all over the bedroom walls, or when as journalists we try to ascertain the origins of the decision to attack Iraq, or when as detectives we attempt to solve a crime. Historical thinking is an ingredient in all of our thinking.

To acknowledge this, however, is to acknowledge that the judgments we make and the arguments we use to support them solicit the assent of minds like our own that share our same general understanding of reality. For the most part, we make no use of technical terms except those that have become a part of our knowledge. Our causal explanations are pragmatic and usually unscientific. Our arguments and the warrants we use in coming to our conclusions are grounded in the best present knowledge we possess—a knowledge informed by the intellectual disciplines of our educational system. As many modern philosophers have shown, we get most of our beliefs and what we call “knowledge” from our culture, and it is this fiduciary framework that influences our concepts of necessity, possibility, and improbability.

The sciences are a part, but not the only part, of this cultural heritage and background. We presuppose the physics of ballistics when we engage in arguments about the velocity and range of rifles used in the Battle of Gettysburg. We presuppose biology when, as jurors in a rape trial, we decide that the DNA of the defendant is incompatible with the evidence brought forward by the prosecution.

We presuppose astronomy when we evaluate a passage in the Hebrew Bible reporting that the sun stood still. And we presuppose physiology when we assess a medieval narrative about a saint who picked up his head after his execution and marched into a cathedral singing the Te Deum. It is against this background of present knowledge that we reject stories of snakes talking, the claim that the world is only six thousand years old, and the notion that Muhammad’s camel leapt from Jerusalem to Mecca in four giant steps.

What seems to be ignored by the religious apologists who use the presuppositions gambit, especially those in the scholarly world, is that these apologists employ ordinary reasoning grounded in present knowledge when they are serving on juries, reading newspapers, writing histories, and, especially, assessing the scriptures of other religions as the critical historian does. The reasons for this are clear—in all these areas, they are soliciting the assent of minds like their own that share the same general understanding of reality. It is only when interpreting their own scriptures that they suspend those criteria that they use in their ordinary thinking and reasoning. But, we must ask, on what grounds is this suspension consistent and justified?

It is this same present knowledge that justifies our methodological doubt in relation to both witnesses and narratives. As the great historian Marc Bloch once pointed out in The Historian’s Craft, it was not long ago that three-fourths of all reports by alleged eyewitnesses were accepted as fact. If someone said that an animal spoke or that blood rained from heaven, the question was not whether it happened but what significance it had. Not even the steadiest minds of our predecessors, Bloch argues, escaped this credulity.

If Montaigne reads in his beloved ancients this or that nonsense about a land whose people were born without heads or about the miraculous strength of the little fish known as the remora, he set them down among his serious arguments without raising an eyebrow. For all his ingenuity in dismantling the machinery of a false rumor, he was far more suspicious of prevailing ideas than of so-called attested facts. In this way . . . old man Hearsay ruled over the physical as well as the human world. Perhaps even more over the physical world than the human.

In short, methodological doubt is not some a priori presupposition but, as Bloch puts it, a practice that has been arrived at “by the patient labor of an experiment performed upon man [with] himself as a witness. . . . We have acquired the right of disbelief, because we understand, better than in the past, when and why we ought to disbelieve.”

Indeed, it is just this right to disbelieve that R.G. Collingwood marks as the Copernican revolution in historiography. Previously, it was assumed that the historian had the responsibility to compile and synthesize the testimony of witnesses. The historian was regarded as a believer and the person believed was the authority or witness. But this was “scissors and paste,” not critical history. “In so far as an historian accepts the testimony of an authority and treats it as historical truth,” Collingwood wrote in The Idea of History (1946), “he obviously forfeits the name of historian; but we have no other name by which to call him.”

It is just because the critical historian makes his judgments against the background of present knowledge that the concept of miracle has all but vanished from the work of professional historians. The reason does not lie in some philosophical presupposition that miracles are impossible; rather, it lies in the nature of historical argument and the grounding of most of our warrants in present knowledge. Critical historians confronted with an alleged miracle as an explanation for an event or even as a description of an event have, first of all, no way of deciding whether the event is a miracle or not. They have no way of judging whether some alleged supernatural reality—a jinni, angel, or deity—is the cause of the event. They have no way of judging what would constitute evidence for attributing an event to this or that supernatural cause, and evidence is crucial for the critical historian. It is evidence that bears on whether such an event can be said to have occurred, and it is evidence that bears on what causes, if any, explain that event. At best, all historians can say is that such an event was anomalous. Reflective historians say this not because they are unbelievers, but because they are critical historians. They would hold with Collingwood that “History has this in common with every other science: that the historian is not allowed to claim any single piece of knowledge, except where he can justify his claim by exhibiting to himself in the first place, and secondly to any one else who is both able and willing to follow his demonstration, the grounds upon which it is based.”

Critical historical thinking in general, and its application to religious scriptures in particular, is one of the great intellectual achievements of Western civilization. It has its heroes, stretching from Benedict de Spinoza through Julius Wellhausen and Albert Schweitzer to Rudolf Bultmann and Gerd Lüdemann. It is not a hermeneutics of suspicion rooted in hostility to religion. Indeed, it takes as its motto the scriptural injunction that “ye shall know the truth, and the truth shall make you free.”

But coming to know the truth is no easy matter, especially when the objects of one’s inquiry are treasured religious beliefs. As Friedrich Nietzsche, the anti-Christian thinker who nevertheless acknowledged his debt to Christianity, observed in The Antichrist: “At every step one has to wrestle for truth; one has to surrender for it almost everything to which the heart, to which our love, our trust in life, cling otherwise. That requires greatness of soul: the service of truth is the hardest service. What does it mean, after all, to have integrity in matters of the spirit? That one is severe against one’s heart . . . that one makes of every Yes and No a matter of conscience.”

Further Reading

Bloch, Marc. The Historian’s Craft. Trans. by Peter Putnam. Manchester: Manchester University Press, 1954.
Collingwood, R.G. The Idea of History. Oxford: Oxford University Press, 1946.
Harvey, Van. The Historian and the Believer: The Morality of Historical Knowledge and Christian Belief. Urbana and Chicago: University of Illinois Press, 1996. See especially chapter 7.
Lovejoy, Arthur O. “Present Standpoints and Past History.” In The Philosophy of History in Our Time, edited by Hans Meyerhoff. Garden City, N.Y.: Doubleday Anchor Books, 1959.
Plantinga, Alvin. Warranted Christian Belief. New York: Oxford University Press, 2000.
Ricoeur, Paul. Freud and Philosophy: An Essay on Interpretation. Trans. by Denis Savage. New Haven: Yale University Press, 1970.

Is Islam Secularizable?

The following is a reprint of an article by Sadik J. al-Azm from the Journal for the Critical Study of Religion, Volume 2, Number 2 (Fall/Winter 1997), published by Prometheus Books. The article is especially pertinent in light of recent events in the Middle East, Egypt, and North Africa, and it sheds a fascinating pre-9/11 light on events that have transpired in the Islamic world in the last decade.



Sadik J. Al-Azm, emeritus professor of modern European philosophy at the University of Damascus, is visiting lecturer at Princeton University. Al-Azm’s research specialty is the Islamic world and its relationship to the West, and he is known as a human rights advocate and a champion of intellectual freedom.


The question of whether Islam can be secularized has been on the agenda of modern Arab and Muslim thought and history since Bonaparte’s occupation of Egypt in 1798.

Arabs have been attempting to settle the issue since at least the last quarter of the nineteenth century; i.e., since what we Arabs often refer to in our recent past as the Arab Renaissance, the Arab Awakening, the Islamic Reformation, or what the late expert on the period, Albert Hourani, aptly called the “Liberal Age” of Arab thought.

Response to Change

In my attempt to formulate a realistic answer to the question Is Islam secularizable?, I shall start by raising another question: was the simple, egalitarian, and unadorned Islam of Mecca and Medina (Yathrib) at the time of the Prophet and the first four Rightly-Guided Caliphs (chosen by the then-emerging Muslim community as his successors) compatible with the dynasties of such complex empires as Byzantium and Sassanid Persia at the time of their Arab-Muslim conquest?

The accurate answer is No and Yes. Yes, the two became very compatible in an incredibly short period of time. But the early Muslim purists were absolutely right at the time of the first Arab conquests to insist that nothing in the Muslim orthodoxy of the day could make the Islam of Medina, Mecca, and the four Rightly-Guided Caliphs compatible with hereditary monarchy.

Similarly, in Christianity the movement of Monsignor Marcel Lefebvre and his followers in Europe and the United States was an excellent example of purist resistance to the Church’s evolution toward accommodation with secular humanism, religious pluralism, mutual tolerance, freedom of conscience, a scientifically based culture, and so on. The Second Vatican Council, convened by Pope John XXIII, is an equally excellent example of that historical evolution’s triumph over classical dogmatism.

By the same token, I would argue that the accurate answer to our primary question, Is Islam secularizable?, is also twofold: dogmatically, No; historically, Yes. I would contend that without a good grasp of the ups and downs of the secularization process of contemporary Islam, no explanation of the ferociousness of the current fundamentalist reaction can be adequate.

Islam, as a coherent static ideal of eternal and permanently valid principles, is of course compatible with nothing other than itself. As such, it is the business of Islam to reject and combat secularism and secularization to the very end. But Islam is a dynamic faith and has responded to widely differing environments and rapidly shifting historical circumstances, proving itself highly compatible with all the major types of polities and varied forms of social and economic organization that human history has produced.

Similarly, Islam as a world-historical religion stretching over 15 centuries has unquestionably succeeded in implanting itself in a variety of societies and cultures, from the tribal-nomadic to the centralized bureaucratic to the feudal-agrarian to the mercantile-financial to the capitalist-industrial.

Doubters that Islam can be secularized should consider the evidence coming from the most unlikely quarter: the Islamic Revolution in Iran. The Iranian Ayatollahs, in their moment of victory, did not proceed to restore the Islamic Caliphate (and there was a Shi’i Caliphate in Muslim history), nor did they erect an Imamate or vice-Imamate; instead, they established a republic for the first time in Iran’s long history. The republic had popular elections, a constituent assembly, a parliament (where real debates take place), a president, a council of ministers, political factions, a constitution (a clone of the 1958 French Constitution), a kind of supreme court, and so on, all of which have absolutely nothing to do with Islam as history, orthodoxy, and dogma, but everything to do with the practices and institutions of modern Europe. What makes this phenomenon doubly important is the fact that the Iranian clerics and guardians of Shi’i orthodoxy have always been ferocious opponents of republics, denouncing them as absolutely un-Islamic. They had successfully frustrated all previous attempts by earlier reforming rulers to declare Iran a republic.

Note also that, in spite of the Islamic idiom, the politico-ideological discourses of the Iranian clerics and guardians of correct belief are substantively dictated by the historical “Yes” of the present socio-economic-political conjuncture rather than the exigencies of the dogmatic “No” of orthodoxy. This is why we find the public discourses of Iran’s ruling mullas dealing not so much with theology, dogma, and the Caliphate and/or Imamate as with economic planning, social reform, and redistribution of wealth, without forgetting such issues as identity and modernization. Consider the following words of admonition addressed by a Third World leader to his country’s religious schools:

If you pay no attention to the politics of the imperialists and consider religion to be simply the few topics you are always studying and never go beyond them, then the imperialists will leave you alone. Pray as much as you like: it is your oil they are after; why should they worry about your prayers? They are after our minerals, and want to turn our country into a market for their goods. That is the reason why the puppet governments they have installed prevent us from industrializing, and instead establish only assembly plants and industry that is dependent on the outside world.

These could easily have been the words of such secular leaders of the sixties as President Nasser of Egypt, President Sukarno of Indonesia, and/or the very early Fidel Castro of Cuba, but they are in fact the words of Ayatollah Khomeini himself.

The clash between traditional dogmatism and new ideas tends to work itself out in human affairs and societies quite violently, with all the attendant destruction, dislocation, and innovative outcomes. This is attested historically by the ever-recurring inter-Islamic civil wars and insurrections, and at present by the violence of fundamentalist Islam.

To be noted in this connection is the fact that in such key countries as Egypt, Iraq, Syria, Algeria, and Turkey, hardly anything is run anymore according to Islamic precepts, administered along the lines of sharia law, or conducted in conformity with theological doctrines and teachings. Outside the realm of individual belief, the role of Islam has unquestionably receded to the periphery of public life. In other words, inspect, in any one of those states, the factory, the bank, the marketplace, the officer corps, the political party, the state apparatuses, the school, the university, the laboratory, the courthouse, arts organizations, and the media, and you will quickly realize that there is very little religion left in them.

Split Personality

Even in a state like Saudi Arabia, where the ruling tribal elite wraps itself so conspicuously in the mantle of strict Muslim orthodoxy, moral purity, bedouin austerity, and social uprightness, the contradiction between outward official pretense on the one hand and real life on the other has become so wide, sharp, and explosive that those still taking religious pretenses seriously staged an armed insurrection at a Meccan holy shrine in 1979, shaking the kingdom to its foundations in the process. The declared goal was no more than rectifying the schizophrenic condition, i.e., putting an end to that ludicrous discrepancy between official ideology and reality by bringing the substance of Saudi life again into strict conformity with religious orthodoxy.

In the above-mentioned countries, the modern secular-nationalist calendar, with its new holidays, symbols, heroes, and ceremonies, has come to fill the public square, relegating the old religious calendar and its landmarks to the margins of public life. This is why the truly radical Muslim fundamentalists complain not so much about the unsecularizability of Islam as about the absence of Islam from all realms of human activity, because it has been reduced to mere prayer, the fast, the pilgrimage, and almsgiving; about how “Islam faces today the worst ordeal in its existence as a result of materialism, individualism and nationalism”; about how “school and university curricula, though not openly critical of religion, effectively subvert the Islamic world-picture and its attendant practices”; about how “the history of Islam and the Arabs is written, taught and explained without reference to divine intervention causal or otherwise”; and about how “modern and nominally Muslim nation-states, though they never declare a separation of State and Mosque, they, nonetheless, subvert Islam as a way of life, as an all-encompassing spiritual and moral order, and as a normative integrative force by practicing a more sinister de facto form of functional separation of state and religion.” Obviously these radical fundamentalists have a better appreciation of the nature of the modern forces and processes gnawing at the traditional fabric of Islam than do the social scientists and mainstream mullas who keep repeating the formula: “Islam is unsecularizable.”

Consequently, these radical insurrectionary Islamists keenly resent the fact that contemporary Islam has allowed its basic tenets to turn into optional beliefs and rituals. To reverse this seemingly irreversible trend they literally go to war in order to achieve what they call the re-Islamization of currently nominally Muslim societies.

They also resent the extent to which traditional gender hierarchies continue to be altered in contemporary Muslim societies: the slow erosion of the traditional power of males over females that accompanies such major social shifts as urbanization, the switch to the nuclear family, and the wider education, training, and gainful employment of women; the steady growth of opportunities drawing women away from strictly traditional roles; the tendency towards egalitarian gender relations in marriage and in life generally; and the reproduction of society, through the socialization of children, according to norms that the fundamentalists regard as totally un-Islamic. Hence the militant demands for such measures as the re-imposition on women and children of the norms of traditional respect, obedience, gender segregation, and undivided loyalty to the male head of the household.

Naguib Mahfouz’s trilogy of novels dates the collapse of the male-dominated and dictatorially run traditional Muslim household in Cairo at exactly the moment of Egypt’s revolution against British colonial rule in 1919. The Muslim Brothers—the mother of all Islamic fundamentalisms in the Arab world—were founded a few years later as a reaction to the secularizing forces and processes unleashed by that revolution.

An excerpt from one of Naguib Mahfouz’s articles describes the murky and confused condition of a typical Cairene Muslim struggling with the paradoxes, generated daily by a long-term historical secularization process, glimpsed by most only intermittently and through a glass darkly:

He leads a contemporary [i.e., “modern”] life. He obeys civil and penal laws of Western origin and is involved in a complex tangle of social and economic transactions and is never certain to what extent these agree with or contradict his Islamic creed. Life carries him along in its current and he forgets his misgivings for a time until one Friday he hears the imam or reads the religious page in one of the papers, and the old misgivings come back with a certain fear. He realizes that in this new society he has been afflicted with a split personality: half of him believes, prays, fasts and makes the pilgrimage. The other half renders his values void in banks and courts and in the streets, even in the cinemas and theaters, perhaps even at home among his family before the television set.

This account feels so genuine and true to the actually lived experience of Muslims everywhere that no a priori unsecularizability formula should ever be allowed to obscure it.

The Political Plunge

One source of confusion concerning this question of unsecularizability lies, it seems to me, in the fact that Arab societies never witnessed a high dramatic moment in which the state was declared, from the top, secular and officially separate from religion, as happened with the emergence of modern Turkey from the ashes of the First World War. That process reached its climax in Mustafa Kemal’s (Atatürk’s) famous abolition of the Caliphate in 1924.

Now, to sensitize Western readers to the enormity of Mustafa Kemal’s act and the great dismay and shock it spread throughout the Muslim world at the time, all that is needed is a moment’s reflection on what would have happened had the triumphant Italian nationalists in 1871 proceeded to abolish the papacy, after annexing the papal domains to the Italian kingdom, instead of recognizing the pope’s sovereignty over the Vatican City and his spiritual leadership of all Roman Catholics everywhere. We know, of course, that in 1922 Atatürk did toy with the idea of an “Italian” solution to the problem of the Caliphate, but he ended up rejecting all such compromises in order to cut off at the root all future legitimist claims and restorationist movements.

In contrast to the Turkish example, the secularization process in key Arab societies has been slow and hesitant. The same sort of climactic point could have come to pass at the hands of President Nasser of Egypt soon after the nationalization of the Suez Canal in 1956 (a heroic and immensely popular act all over the Arab world). But Nasser never took that step and the real high drama arrived with Islamic fundamentalism and armed insurrection.

The subscribers to the unsecularizability of Islam thesis, both East and West, should have received a rude shock from the way in which the Soviet Union collapsed. Some were expecting the break-up of the “Evil Empire” to come at the hands of its Muslim peoples and components, on the assumption that Homo Islamicus will always revert to type under all circumstances, regardless of the nature and depth of the historical changes he may suffer or undergo.

In fact, the components of the union that brought it down were Christian and located in the European part of the empire. The Muslim republics inclined until the last minute in the direction of saving the communist union. Even after its collapse they did their best to attach themselves to its remnants, in spite of the neighboring models of revolutionary Islam in Iran and of armed insurrectionary Islam in Afghanistan.


(c) 2011 The Institute for Science and Human Values

The Council for Critical Studies in Religion is a Research Project of ISHV

A Secular Ethics?

Radical secularism calls for radically secular moral alternatives to religious ethics.  No one has been more vigorous in his defense of this project than Paul Kurtz.

I have claimed frequently on this site that if skepticism at a minimum, and unbelief at the extreme, are a kind of prerequisite to such a project, it’s not because either position is self-affirming.  It is because whether God does or does not exist, the secularist believes that human values are made by humans and do not originate on mountaintops.  Even if one believed in a God who demanded obedience to such laws, it would be the duty of the secularist to defy him.

Religious doctrine calls itself into question because it has lingered into an age where religious explanations of the world and human choice are no longer persuasive.  In the long run, it is the failure of the Church, the mosque, and the synagogue to explain and to persuade that leads to skepticism and atheism, the loss of faith, and the erosion of ethical absolutism.  It is the death of belief in a god whose laws rule both the universe and human choice,  as Sartre said, that invites human beings to construct a system of values that deals with a world shot through with doubt about the old explanations and mythologies.

Hammurabi receives his law code from the god, Shamash

Some people continue to maintain that there is a law of God, that this law is sovereign over conscience and that all other law is subordinate to it.  It is probably true that these people have a very imperfect understanding of science, history and the development of ideas.  In general, a secular humanist would consider this view malignant in the sense that it is not harmless: that it has both moral and political consequences, and that when it is enforced or advocated in educational or democratic contexts it is toxic and has to be defeated.

For that reason, secularism, and secular ethics can never be quiet about religion.  It must place the burden of proof squarely on the shoulders of people who believe unsupportable truth claims based on the authority of faith.  These people may belong to any religious group, and they exist in every corner of the cornerless world.  What they have in common is the fantasy that rules and laws crafted in the first millennium before the common era have not merely historical interest but eternal force.  That is the position that secularism opposes.  There is a “secular moral imperative” to resist this kind of thinking in the same way that there is a duty to call attention to error in other factual domains–especially the sciences.

There are others who believe that God exists, that not much can be known about the subject, and that there is no special connection between the life we lead, or the moral choices we make, and this belief.  This position might seem to make the existence of God superfluous, irrelevant or a matter of diffidence–the sum of the difference between two equal improbabilities.

Secularism, it seems to me, has no reason to quarrel with people who believe in what Kurtz has called the “common moral decencies,” and lead a life committed to the discovery of virtues and moral excellence without the dictates of revelation and divine law. Just as we use metaphors of love, hope, and compassion to describe states that are essentially emotional, there is no additional privilege to be gained by insisting on the rejection of all conceptions of God.  Yet the more personal and “described” this being is, the greater the risk of identifying it with the gods of mythology–the gods whose rules are seldom relevant to the planet we occupy.  For that reason, a secularist may insist that any idea of god is an idea too far.  It’s at the point of this insistence that secularism and unbelief converge.

As in all ethical matters, the primary nostrum for secularists is “to do good and to do no harm” (Hippocrates).  Like other ideological systems based entirely on human wit and imagination, religious beliefs are accountable to the ancient formula. A secular ethic will always require that this interrogation take place–that religion enjoys no privileged status based on assertions of authority that are widely regarded as untrue.

Laïcité: The Radical Secular Imperative

You need to join us. Now. You need to take a stand against the deadening of the American brain. You need to do this whether you think America is already brain dead, or if you are an American worrying about just how much life is left in you.

The Europeans have long had a word for what radical secularity is, at its heart: it is based on challenging the prerogatives of religion in society–something Americans have long thought their First Amendment made it unnecessary for them to do. It is called laïcité in France, and sometimes gets translated into English as laicity: the rise of the common woman and man (the laity), who were neither in clerical orders nor members of an aristocracy in cahoots with the Church. It goes back to the time of the Revolution (theirs, not ours), when the Catholic Church was greatly diminished in power and prestige among members of the third estate–ordinary people.

I’m happy to call it secularism, as long as we understand it in the most radical sense of that word. The term laïcité has the advantage of naming the thing after what it is: people. And when you get down to it, it is ordinary people (not bishops and theologians) who have suffered most at the hands of religion–and still do. It has the disadvantage of being French in a country where some states still serve Freedom Fries, though they have forgotten why.

It is amazing to me that the Catholic Church is still standing. We now know that the Church of Rome has used its prestige and its illegitimate claim to be the protector of conscience to tamp down the fires of outrage over the rape of children. Children were raped in Boston. In New York. In Brussels. In Dublin. In Frankfurt. In Philadelphia. In Sydney and Toronto. We are just beginning [see note below] to get a sense of the scale, but on the basis of what we know–the number of priests and children involved and the inaction of the Church to stop the abuse–the crimes can only be compared to multiple serial killers being permitted to go about their routine with the police watching and winking.

It is amazing to me that Islam has not petitioned the World Court in the Hague for forgiveness from the international community. There is no central authority to lodge such a petition, of course, and no desire to lodge one–which is part of the problem: The murder in Pakistan last week, carried out by assassins who became national heroes overnight, was conducted with the بركة (blessing) of a dozen radical clerics, each claiming legitimate authority to issue licenses to kill in the name of God. I am not very interested in social explanations of why such killing occurs. I want to know why a liberal West is so willing to accept the rationale that it occurs because the liberal West created radical Islam. Or why the United Nations can pass a resolution declaring that the “defamation of religion” is a violation of international human rights, a premise eerily like the Blasphemy laws that led to the murders of Shahbaz Bhatti and Salman Taseer. I am saddened that innocent soldiers have to die to make a point about living without fear or reprisal and in the hope of freedom, sadder still that the atrocity of religious violence usually ends up not merely short of its objective but in the rubble of another Muslim household.

I am outraged at the religious sources of ignorance. Gallup 2010 says that only 39% of Americans “believe” in evolution while a further 36% have “no opinion,” a conclusion almost as stupefying as the first. And while the religion marketplace is competitive, and while church attendance is slightly down, Pew Research suggests that between 80 and 85% of Americans are either “religious” or “very religious.”

They are also anti-science and pro-ignorance: Abortion is not a science question, but a healthy 52% (Gallup) oppose it, exceeded by the 57% (Rasmussen, 2010) who oppose embryonic stem cell research because opponents think it involves killing babies for their brains.

I am angry at the teaching of absolute falsehood and mythology as truth, whether it is put across as history or geology or geography. The entropic principle in American democracy has always been the insistence that there are two sides to every story, and then applying this notion to facts.

There are not two sides to facts. It is self-evidently a crime against reason to tell “learners,” as we like to call the innocent these days, that a fact has the same epistemological value as an opinion or a perspective, thereby encouraging them to think that things that really are just opinions, like religious doctrines, have higher status than facts.

Scientists know this about facts or they could not do their work. You cannot treat cancer like a cold. There is nothing to be said for the idea you can get to the moon in a cardboard box. But there are still people in positions of authority over mind and heart, some of them passing laws on our behalf, who believe the world was created in six days and that Jesus walked on water and ascended into heaven. There is no doubt that this did not happen: there are not two sides to it.

Neither is there any merit in the idea that God created marriage for the procreation of the human race. The human race was doing very nicely without the god of the Hebrew tribes before the story was invented, and the Church cared almost nothing about the religious value of marriage until the 12th century. Procreation is a fact. Interpretations of its sanctity or exclusivity are opinions.

This list could be extended, should be extended. What these cases have in common is not only that they offend against our intelligence and perhaps basic sense of decency–a phrase that needs to be revived–but that religion is implicated in all of them. There is no secular child abuse scandal. There are very few secular suicide bombers. Among seculars, facts are, in the main, valued and Darwin is permitted to speak. This doesn’t mean that secular women and men have not done evil things, but they have done them through malice, not in the name of secularity. In cases where the State simply replaced God, as in Soviet Russia, the motivation was essentially religious.

I am not happy to say Leave the dims to their dimness and let’s get on with converting the world to atheism. For one thing, that is not going to work. For another, we see what happens when the religiously craven are left to their own devices. It is a question of how long before they come knocking at your door and require you to have a Bible or a Quran in your house—just like pistol packers who want you to pack a pistol, too.

And I am also not prepared to say, “We need to start talking to each other, find out where the other side is coming from.” I have limited faith in the powers of this conversation. There comes a point, and we have reached it, when to indulge religious illiteracy is the same as saying there are two sides to every fact. But we can bring with us people with sincere, peaceable religious commitments who are nonetheless equally committed to secularity. That is not dialogue; it is common cause. It can be carried on with kindred spirits still living and long dead.

It may be true that atheism, agnosticism, interfaith understanding, and various interest domains share with Laïcité an interest in opposing and—to be perfectly militant—defeating the repugnant positions I have mentioned here. But the battle line has to be made up of people who see the world in a particular fashion and who do not think that the truth that constitutes knowledge of the world is negotiable. That is what Laïcité is all about. That is what a radically secular worldview requires.

All of the people who do these things, who believe these things, who teach these things are terrorists, not only the ones who throw bombs. The Catholic Church has committed acts of terror against children. Ultra-conservative protestants continue to promote intellectual feebleness among millions of people worldwide. Significant numbers of Muslims have adopted an anti-rational posture toward their domestic critics and towards all outsiders, especially in the west. That is the world we live in.

Slogans about there being No God (Live with it), about “Being good” without God–or about it being possible to be loving, gentle, and kind without God, besides being laughably obtuse, are almost hopelessly irrelevant to the problems we face. They shift the emphasis from causes to the moral rectitude of unbelief, a different matter, a game being played on a different field. Atheism and Goodness without God may be perfectly worthy subjects of discussion over coffee, among friends. But they are not relevant to this discussion, which is how very badly a great many people who believe in God are behaving. The problem requires a great many more than the 16% of Americans who aren’t especially religious to solve, since the religious ennui the statistic may betoken is not the same as laïcité–a radical secularity.

I hope that those of you interested in joining a cause, an organization, and a movement that is both targeted and appropriate to what’s happening in real time on the world stage will join the Institute for Science and Human Values. We affirm that there are non-religious solutions to the problems we face. We affirm that human beings shape the future by shaping appropriate values in the present.

Join us in promoting the cause of a radically secular future—one where there are not two sides to every fact.


Note on Roman Catholic Abuse Scandal:
The 2004 John Jay Report commissioned by the U.S. Conference of Catholic Bishops (USCCB) was based on surveys completed by the Roman Catholic dioceses in the United States. The surveys provided information from diocesan files on each priest accused of sexual abuse and on each of the priest’s victims to the research team, in a format which did not disclose the names of the accused priests or the dioceses where they worked. The dioceses were encouraged to issue reports of their own based on the surveys that they had completed.

The team reported that 10,667 people in the US had made allegations of child sexual abuse between 1950 and 2002 against 4,392 priests (about 4% of all 109,694 priests who served during the time period covered by the study). One-third of the accusations were made in the years 2002 and 2003, and another third between 1993 and 2001. “Thus, prior to 1993, only one-third of cases were known to church officials,” says the report.

Around 81% of the victims were male; 22.6% were age 10 or younger, 51% between the ages of 11 and 14, and 27% between the ages of 15 and 17.

Beyond the Secular City

It has been forty-six years since Harvey Cox was made famous by a book called The Secular City.

I’m sure people read it–they certainly bought it–but apparently very few people took it to heart. It was famous for being famous, had an untidy thesis, and, worst of all, did not prominently take on the topic its title promised: the secularization of American life. It was dazzling, intellectually promiscuous, and energetic, much like its author, a “village Baptist” come to Harvard.

And it was an extended broadside against the death of God theologians who then dominated the covers of Time and Newsweek and whose shelf-life, after the initial shock of the new, did not amount to a decade.

No one could quite make out what they wanted God to be, so the thought that he was dead turned out to be something of a consolation. “Now,” I remember thinking one day after reading a certain book by Thomas Altizer, “if only the theologians would stop writing obituaries.”

It is a shame that The Secular City got so much press because when it was written secularization was a real phenomenon. God was not only in retreat at Harvard, Yale, Chicago, and even Emory, but the great social programs of the era seemed to suggest that people were looking for this-worldly solutions to urban blight, poverty, domestic illiteracy, racism, war and a dozen other issues that competed for attention. The jury is still out on all of those issues, from blight to birthers.

In an odd way, Cox’s book could have been written by Joseph Ratzinger, who is constantly invoking “authentic Christianity” in “secular Europe.” In fact, Cox was fresh back from a German stint when he wrote it and decided that the cure for many of the ills of American society was a new spirit of “authentic Christianity,” the first symptom of infection with the virus existentialus immoderatus. Cox did not mean revival in the Billy Graham style. That was an option throughout the twentieth century and, remarkably, affected politics from Truman to Obama. Like every freshly minted theologian, Cox believed that the cure for nihilism (which was the jumping-off-a-cliff option of the era) was not just any faith but (again) authentic faith. The kind of faith that found affirmation in negation. That sort of garbage.

In a 1990 article in Christian Century, Cox said he had written the book to stress that neither religious revival nor secularism is an unmixed blessing, and that the thesis of The Secular City was “that God is first the Lord of history and only then the Head of the Church.”

This means that God can be just as present in the secular as in the religious realms of life, and we unduly cramp the divine presence by confining it to some specially delineated spiritual or ecclesial sector. This idea has two implications. First, it suggests that people of faith need not flee from the allegedly godless contemporary world. God came into this world, and that is where we belong as well. But second, it also means that not all that is ‘spiritual’ is good for the spirit.

Written to be quoted by liberal pastors, the passage sounds today like a vintage sixties tract, which in many ways it was. It is the language of someone who has drunk too deeply from the theology of Karl Barth (a real hazard of American theology in that era) and whose main talent was not serious theology but impersonation. Even the suggestion that “people of faith need not flee from the godless contemporary world” rings empty: who was chasing them? What answers were they afraid to hear?

The Secular City makes for depressing reading for another reason: because we are now twenty years beyond the twenty-five-year retrospective of its appearance, and we are not saved. There is plenty of religious revival. There is an awakened interest in atheism that seems neither informed nor profound. But neither phenomenon is the point, any more than the shock value of the Death of God “movement” was the point in the swinging sixties.

The point is, we need to be talking about secularism. Of course, that includes a discussion of issues, and the Constitution, and the right of gays to marry, and a dozen cognate matters that respond well to secular approaches. But simple talk about those issues–and I will add various Pride Movements to the list–threatens to drown out the voice of what my former colleague, Austin Dacey, has called “The Secular Conscience.” That is what matters, and that is what we should be talking about. I have no doubt that people who are afflicted by various forms of discrimination have found a better friend in secularism than in the church, mosque and synagogue. That is why it is time to give our friend the time it deserves.

We do not need to be religious to realize that Father Richard John Neuhaus (The Naked Public Square, the book Cox might have wanted to write) was right on the money when he said that the world is dying of metaphysical boredom. Neither fervently religious people nor ardently non-religious people, it seems to me, have the tonic for this peculiarly modern disease.

Be secular.

In the midst of the most degrading sexual scandal of modern history, the Catholic church still cleaves to the banner of moral authority in the name of this Lord of History and Head of the Church, while preaching a “gospel of life.” Our political world is dominated by office seekers who, to get elected, must swear fealty to religious principles they have never examined. Our teachers still find Darwin suspicious reading (or suspicious on hearsay) and evolution “just a theory.” Science illiteracy and religious illiteracy–always the Bobbsey twins of ignorance–are arguably worse in 2011 than they were in 1965 when Cox sounded his muddled alarm.

Something else was going on in the sixties, however, of far greater consequence and, this being America, of lesser note at the time. Prometheus Books was founded by Paul Kurtz—a voice for humanism, secularism, and free inquiry in an age hounded by the reactionary religious (aka “Moral”) majority of the era. Kurtz went on to found the Council for Secular Humanism, to advocate for non-religious morality and decision making; the Committee for Skeptical Inquiry (CSICOP), to push for critical thinking in matters of science; and later the flagship organization, the Center for Inquiry.

The mission and objectives of these organizations were crystal clear. They were dedicated to the advancement of science and reason. To make them clearer still, Kurtz founded two magazines that are still going strong and are unique in their support of evangelical common sense: Free Inquiry and The Skeptical Inquirer. In 1984, in response to explicit threats to the First Amendment and to encourage the free and open discussion of religion in the public square, he organized the Committee for the Scientific Examination of Religion.

Over the years, these organizations have grown against the odds and moved against the tide flung against progress by the Lord of History.

In 2010, after a humiliating setback at the Center for Inquiry, which led finally to his resignation, an undaunted Kurtz founded an organization whose name expresses better than any previous one what the unfaithed and unchurched and humanistic minority of this country need to support their habit of secular thought: The Institute for Science and Human Values.

The Institute will be an engine for a process that Kurtz and others put into place forty years ago. It is unequivocal in lobbying for a secular and humanistic worldview, grounded in science, supported by inquiry, and skeptical of the claim of any movement or group to possess the whole truth and nothing but the truth.

I am proud to associate myself with the Institute and its programs, its new publication (The Human Prospect) and its Forum. I think that every person who regards herself or himself as secular will want to support it too.

The new Secularism and the City Forum invites you to share your story, your commitments, and your thoughts. You may be an atheist, a faitheist, a skeptic, or a Freethinking None. But we hope to see you on the forum to register your thoughts.

The transition between the Death of God and the Secular Era, despite a few setbacks, begins now.