Vox Populi: A Theology of Messy Democracy

The Right to Vote

The elections are over. The election is upon us. Long live the Democratic Process! And a tip of the hat to the founding fathers, who in their prescience must have known that the fundamental metaphor for twenty-first century politics would be an endless and pointless NASCAR race.

Now we sigh deeply, wipe away a wanton tear, and try to adjust to the fact that barely two years after the election of Barack Obama (Hope, Change, Fired Up, Ready to Go) America has lost its energy, its nerve, and possibly its mind, and decided it wants to sit on the stoop and watch the civilized world (which it has just voted to quit) pass by for a spell.

Meantime, we will half-hear as the political assessors talk their heads off about what went wrong and whether Obama is listening, whether he gets it, whether the sting he was stung stung enough to hurt, whether he is paying attention or is just out of touch with the American people, and why someone with such a hoity-toity education is tone deaf, can’t communicate, and acts sooo professorial. Just who does he think he is?

The assumption on almost everyone’s part is that a (virtual) vote of no confidence conveys a kind of popular wisdom because it is an expression of the collective will of the people and in this Man Up Democracy, vox populi vox dei, People Rule. A little attention to the full quotation from Alcuin to Charlemagne in the eighth century yields a slightly different flavour, however: “Nec audiendi qui solent dicere, Vox populi, vox Dei, quum tumultuositas vulgi semper insaniae proxima sit”: “And those people should not be listened to who keep saying the voice of the people is the voice of God, since the riotousness of the crowd is always very close to madness.” Leave it to an ingenious country like America to prove Alcuin spot on.

Alcuin, proved right in 2010

I am not a political scientist, not a “political analyst” (read: sports announcer in ill-fitting gray jacket), not even much of an activist, though I do have longish hair and wear turtlenecks. Ideologically, I am a proservative, a progressive who is afraid of the consequences of progressive ideas. I am not even sure I care very much about politics unless it has the capacity to catch my attention, as it did a couple of years ago when Obama struck me as a rare bird in a nasty profession, and may still prove too rare to escape extinction in 2012.

But after last Tuesday I’m fairly certain I will not be paying attention again for a long time to come. Maybe not again in my lifetime. I have talked to many people who feel the same way–even worse, because my cynicism is greater than theirs, and my immunity to bitterness and disappointment slightly more developed. I once stretched my student budget to the limit to attend a Van Cliburn concert, and was virtually giddy the evening of the performance. Even by my pathetic expectations, he was not up to his standard, pleaded the flu before he sat down to play, and cut the program short by thirty minutes. It’s a bad analogy, I know, but I think that is vaguely similar to the performance-reality gap America is dealing with right now. The question really is, whose fault is it?

"Not mine."

I do not think politics matters very much because I do not think it has the power to change things. War and science, and occasionally poignant ideologies, perhaps the odd book, have the power to change things (usually because they lead to war or new technologies), but because people do not change very much, the collective voice of the people is only ever going to be an expression of their state of mind and emotional condition at a certain moment. Modern American elections are fought with only emotion in view–not government, leadership, not the social welfare of the people, and certainly not ideas. The ideas of what is “good for me” and what is “best for the country,” for example, are not complementary: Obama worked for the latter and ran afoul of the former. There were no ideas in this election, if you except (as I think you have to) the idea that taking your country back is an idea.

Besides being terribly depressing for smart people, the election was terrifying because it demonstrated, for the first time, that the American Constitution is not well adapted for the new millennium. The tears and cracks become more obvious with every passing election season and every Supreme Court decision. But the Constitution, which is political sacred writ in the United States, especially among those who have never read it, is an eighteenth century playbook for eighteenth century ideologies about limited government, seldom amended, and largely unable to serve as a proof-text for social reform. Only its plagiarized Lockean preamble (the only bit ever quoted extensively) has lofty rhetoric. The remainder reads like a tax form, like most constitutions throughout history.

But when you think about what it–the Constitution–put into place–the “system” of checks and balances, the bicameral legislature, the separation of powers, the electoral college, the cumbrous protocol for amending the sacred text, and the oligarchical method of interpretation by a panel of men and women who, for all practical purposes are political appointees with private agendas–you have to lose a little sleep. What it also put into place is the scourge of elections to the “lower house” every two years–a practice based on the need to “refer” to the mood of the people frequently in matters directly affecting them, but totally unsuited to an attention-deficient population who are accustomed to doing their Christmas shopping in September. It is true that the closest ancestor of our representative system, the British Parliament, also has provisions for “bringing down a government,” but in the best of times, and as an encouragement for the people to take government seriously and weigh their reserve power carefully, the normal (legal) stretch between elections is five years.

To put this a little more cogently, if this were England, and the “executive” was simply the leader of the party in power, Obama would be out the door. But, as it is, he survives to limp along until 2012 at the mercy of his persecutors. This is democracy, American-style, in action. This is what America wants for the rest of the world.

There is a new apocryphon in the press, so popular that it has a life beyond facts. It is this: Aristotle said democracy “is the worst form of government except for all the rest.” Aristotle, who was not known for his humor, never said any such thing, but it is instructive to look at what he did say in the Politics:

Book III
“But the citizen whom we are seeking to define is a citizen in the strictest sense, against whom no such exception can be taken, and his special characteristic is that he shares in the administration of justice, and in offices. He who has the power to take part in the deliberative or judicial administration of any state is said by us to be a citizen of that state; and, speaking generally, a state is a body of citizens sufficing for the purposes of life.

For tyranny is a kind of monarchy which has in view the interest of the monarch only; oligarchy has in view the interest of the wealthy; democracy, of the needy: none of them the common good of all. Tyranny, as I was saying, is monarchy exercising the rule of a master over the political society; oligarchy is when men of property have the government in their hands; democracy, the opposite, when the indigent, and not the men of property, are the rulers.”

Book VII
“The citizens must not lead the life of mechanics or tradesmen, for such a life is ignoble, and inimical to virtue. Neither must they be farmers, since leisure is necessary both for the development of virtue and the performance of political duties.”

That kind of language will strike every Tea Party operative as elitist because it shifts the blame for the wretchedness of a political outcome such as the recent American election away from a “tone-deaf” ruler to a dumb and blind electorate who vote their gut, not their head, and call it conscience. Equally, it will strike liberals as offensive, not because it emphasizes “smart politics” (which liberals profess to like) but because it sees the citizen-voter as a subset of the whole population and not the whole population. Both liberals and conservatives appeal to the archetype of the Working Man, not the educated “man of leisure” who is simply ridiculous and probably unemployed in our system. (Additionally, the Republican Working Man works in a bank or on Wall Street.) Regardless, both groups depend on the myth of the popular will, as opposed to the idea of informed citizen choice; neither group can afford to stray very far from the modern concept of “constituency” because constituencies vote. In the era of special-interest voting, scientific polling and frontier politics, Aristotle’s ideas about democracy being inherently defective don’t wash well with either political party. Democracy, George Bush famously said, on being told the death toll in Iraq had reached 4427 in 2003, is “messy.” A grateful nation returned him to power in 2004.

Aristotle was both an embarrassment and a challenge for the founders, who weren’t certain whether “mechanics and tradesmen” in addition to men of property and leisure (who had time to read Aristotle) should be factored into the process. Slaves and women were another matter. As every schoolchild used to know, that did not really happen until the nineteenth century for black Americans, and for women not until the twentieth. Enfranchisement on the strict basis of “legal” citizenship (or rights) as opposed to philosophical formation was considered an end in itself. But what was achieved by virtue of stressing the value of participation and inclusion was highly problematical, and the founders weren’t around to fix it. The rights of citizens had been a slogan since the time of our own and the French Revolution. What happens when Leviathan grows so many legs he can no longer walk? Government by whim and need, faction and passion–but worst of all ignorance.

Which brings me to the theology of the whole sordid affair that has emplaced in the chambers of the most powerful legislative assembly in the world a clutch of Know-nothings unlike anything this Needy and often Know-nothing Democracy has ever seen. I am talking, of course, about biblical Israel.

The Old Testament is more relevant to the current crisis than our Constitution because the suspicion of monarchical government originates there and not in Aristotle. The founders had monarchy on their mind, and they had concluded with the philosopher that monarchy unchecked was tyranny, a system that operated only in the interest of the ruler. (They were wrong of course: the English had fought their own civil war and had debated monarchy much more thoroughly than the colonists ever had by the time the Declaration was issued in 1776.) But as men of literary accomplishment, they also knew that monarchy was regarded by the ancient Hebrews, and even the early Christians, as the source of calamity and political distress. Polemicists like Paine referred to George III as a “Herod of uncommon malice” who could rightfully be deposed because “God’s favor has parted from him.”

George III: "Temperance"

It’s amazing, in reading through the historical books of the Bible, from 1 Samuel onward, how king after king is a disappointment, a disgrace, a mistake in God’s eyes. Kings are given to men as a punishment (Saul) and even when very famous (David, 2 Samuel 11.4) are not very nice. British monarchical history seems to follow the biblical pattern (perhaps this is why “Zadok the Priest” is still sung at Coronations?); the American presidency, while young compared to English history, seems doomed to follow suit, though no Shakespeare will arise to sing the praises or recite the flaws of an Eisenhower or a Coolidge.

If there is one thing worse than bad kings, however, it’s people. People, according to ancient Hebrew calculus, are rotten, passional, fickle. They are incapable of paying attention, following the right path, or doing the right thing, or keeping the faith, or enduring hardship, or working together, or solving problems. In metaphor, they “chase after false gods,” and always come back depressed, defeated, and empty-handed. It’s not a track record that would necessarily lead to the vox-populi philosophy.

In the biblical scheme of things, the God of Israel is “constant.” His constancy is not “personal,” however; it’s embodied in his law and justice, a theme that actually undergirds the judicial philosophy of most modern constitutional democracies. The justice and goodness represented in the Hebrew idea of God through myth remains, primarily, a concept or abstraction in Greek thought. Because certain questions, Euthyphro-style (Which god likes what?) don’t arise in the monotheistic context, the Hebrew vision is crystal clear: People are ingenerately unable to keep to his standards of justice and righteousness. Coaxing, threats, punishment, don’t seem to do the trick (and the Bible is not famous for subtle approaches like irony and appeals to self esteem). So the burden falls roundly on the people–who would change gods if need be–to figure out what kind of system would work. They choose kings.

The writer of 1 Samuel imagines the following scene: The Judge Samuel has experienced a succession crisis. In old age, he appoints his sons as “judges” (tribal chiefs, fair-minded warlords) to succeed him. They turn out, as sons often turn out, to be bunglers and scoundrels who “took bribes and perverted justice.” In despair, Samuel agrees to the demands of the elders for a monarchy, “a king over Israel.” The people have “voted”–for their own subjugation. They want to be like their more prosperous and successful neighbors. Monarchy is all the rage. Samuel confers with God, and God instructs him to warn the people what they have in store for them when the newfangled system is in place. It is worth quoting:

“Samuel told all the words of the LORD to the people who were asking him for a king. He said, “This is what the king who will reign over you will claim as his rights: He will take your sons and make them serve with his chariots and horses, and they will run in front of his chariots. Some he will assign to be commanders of thousands and commanders of fifties, and others to plow his ground and reap his harvest, and still others to make weapons of war and equipment for his chariots. He will take your daughters to be perfumers and cooks and bakers. He will take the best of your fields and vineyards and olive groves and give them to his attendants. He will take a tenth of your grain and of your vintage and give it to his officials and attendants. Your male and female servants and the best of your cattle and donkeys he will take for his own use. He will take a tenth of your flocks, and you yourselves will become his slaves. When that day comes, you will cry out for relief from the king you have chosen, but the LORD will not answer you in that day.” But the people refused to listen to Samuel. “No!” they said. “We want a king over us.”

And so it began. A history of tyrannical, faithless, lustful, war-hungry, greedy, and immoral men, punctuated (but not in time to have done Israel or the hybrid kingdom of Judaea any good) by a few good rulers. Passion gives you the form of government you want until you don’t want it anymore.

Is there a convergence between Greek and Hebrew political thought, these widely divergent cultures from the first millennium BCE? Of course. Both show the common ancient opinion about the “will of the people.” The people can’t be bothered with the consequences of any political decision, whether it’s shouted or registered on a touch screen. They vote their passion.

The Voice of the People

That is why Aristotle cautions against “need” and ignorance in the choice of political operations. People will choose tyrants who promise them bread, and execute the tyrant when the bread doesn’t appear on the table or costs too much. On the biblical side, they will choose kings who lead them to victory, then rue the day when their sons die in battle. No wonder the two streams of thought have had inordinate influence on the way we think about politics and government in the West.

Democracy was not an option for the Hebrews, and not what we mean by democracy for the Greeks. Given the amount of money the plutocrats inject into political campaigns in the United States in order to keep their hands on the wealth, it is arguable that American democracy isn’t what Americans mean by democracy either–but that’s a different point. In a naive and unexamined way, Americans think that certain phrases like “majority rule,” “the will of the people,” and “representative government” are self-authenticating, even though they smack of power rather than statecraft. Loftier ideas like “good government,” “sound counsel,” and “wise leadership,” even “justice for all” betray their biblical origins: there is not enough time to cultivate ideals like that when the complete political reality of our time, the definitive feature of messy democracy is change on demand. From where we sit, democracy means sending the menu item you thought you’d like, but didn’t, back to the kitchen.

The recent election has proved two things to me. First, we can never count on the American people to do the right thing, whether they choose kings over republics or republics over kings. The political history of the world, as every historian knows and every political “analyst” conveniently forgets at election time, is a history of disappointment, punctuated by remorse, followed by revolutions and wars.

That is the religious and political history of Europe. It is also the history of America in its revolution, its Civil War, and its most recent political spasm, the triumph of the Tea Party para-revolutionaries. When the frighteningly ignorant and undereducated Christian fundamentalist, Sharron Angle of Nevada, announced that Americans were ready for “Second Amendment remedies” to the current “regime” she was using language (probably scripted) in a deliberately provocative way. Alas, however, she may have been right. But I did not hear a single “analyst” with the historical presence of mind to suggest that both John Wilkes Booth and Lee Harvey Oswald (to name only two successful assassins) used these remedies. The phrase “We’ve come to take our government back” may sound more like a football cheer than a threat, but the underlying idea that a particular government is “owned” by a class of people and has been unlawfully seized by the unrighteous is not democratic rhetoric: it is populism gone berserk, Israel shouting for its king. This time, however, the king is not a man: it is their enthroned Echo.

America has fought only two continental wars, one against its colonial masters, the other against itself. Lincoln’s exegesis of Gettysburg–that it was a battleground to test whether the idea of equality and union could survive in a nation without much history (a scant eighty-seven years at the time) to guide it–has not been settled. Lincoln was depicted in the lore of his generation as a Hebrew patriarch: “We are coming, Father Abraham, three hundred thousand more” was one of the most popular songs of the Civil War era.

But he was hated by at least as many thousands. John Wilkes Booth’s shout as he leapt onto the stage of Ford’s Theatre on the evening of April 14, 1865 summed up the feelings of the Tea Partiers of his day: “Sic semper tyrannis” (“Thus always to tyrants”). He served exactly four years, one month, and twelve days as President.

What is it about the Lincolns, the Kennedys and so far, thankfully, nonviolently, the Obamas of this land that awakens the crouching demons of American democracy, the shouters, the haters and the merely suasible, and entitles them to bring their swords?

Some fairly impressive scholars think that the Civil War was merely the first outburst of regionally and socially stratified tensions that are even worse in the twenty-first century than in the nineteenth and twentieth. America, lacking a common enemy–the British, the Nazis, or the Communists–turns predator on itself and sees in the faces of Others traits it has managed to overlook. Until now. Some of us think that people are no smarter and may–if these absurd and destructive elections are any barometer–be getting less smart all the time. They are to enlightened government what obesity is to nutrition. And some of us think that the United States Constitution is simply inadequate (not imperfect, inadequate) to cope with the growing realities of this system of government.

Contrary to what the “winners” of this election say publicly: there is no divine mandate here. There is no country to be “won back,” no regime in place. There is no guarantee that America will survive the savagery of the masses and massively under-informed. The Constitution is not a magical formula, just a rather dull diagram for a political order that seems hopelessly out of step with the times.

As to the victors, the “voice of the people,” may God give them the king they desire, one who looks, feels, speaks, and thinks just like them.


Five Good Things about Atheism

It seems I cannot win.


When I chart the vague, occasional and ambiguous virtues of religion (mainly historical) I am accused of being intellectually soft. When I tell atheists they run the risk of turning their social solidarity into tent revivals or support groups I risk expulsion from the ranks of the Unbaptized and Wannabe Unbaptized.

It is a terrible position to be in, I can tell you, and I have no one to blame but myself.

To make amends and win back my disillusioned readers I am devoting this blog to the good things about atheism.

As far as I can tell, there are five:

1. Atheism is probably right: there is almost certainly no God. At least not the kind of pluriform god described by the world’s religions. If there were, we would know it in the way we know other things, like potholes and rainbows, and we would know it not because of syllogisms that begin “All things that exist were created,” or through the contradictory revelations of competing sects.

We would know it because we are hardwired to know.

The weakest argument of all, of course, is existence, since existence raises the question of God; it does not answer it. A god who is hidden (invisible), or does not wish to be known (elusive), or cannot be demonstrated rationally is no different from a God who may as well not exist. Not to assign homework, but have a look at John Wisdom’s famous parable, recited in Antony Flew’s essay “Theology and Falsification” (1968).

2. Atheism is courageous. Not valorous perhaps, not deserving of medals. But it takes a certain amount of courage not to believe what a vast majority of other people believe to be true. You learned that much as a kid, when a teacher said to you, after some minor tragedy in the playground, “Just because your best friend decides to jump over a fence onto a busy road doesn’t mean you need to do it too.”

The pressure to believe in God is enormous in twenty-first century society, and all but irresistible in certain sectors of America–the fundamental international baseline for irrationality. Having to be religious or needing not to seem irreligious is the greatest tragedy of American public life and a sure recipe for the nation’s future mediocrity. It dominates political campaigns and the way kids learn history in Texas.

Texas edits textbooks

Theological differences aside, what Muslims and Christians and other godfearers have in common is an illusion that they are willing to defend aggressively–in certain cases murderously.

Even when it does not reach that level of viciousness, it can make the life of the uncommitted, unfaithed and unchurched miserable. Atheists deserve credit for having to put up with this stupidity. That is bravery, defined as forbearance.

Many atheists realize that the fervour displayed by religious extremists has deep psychological roots–that history has witnessed its bloodiest moments when causes were already lost. The legalization of Christianity (312?) came within three years of the final assault against Christians by the last “pagan” emperor. The greater number of the wars of religion (1562-1592) occurred after the Council of Trent (adjourned 1563) had made Catholic doctrine unassailable–written in stone–for Catholics and completely unacceptable for Protestants. The Holocaust happened largely because Rassenhass flowed naturally from two done deals: worldwide economic collapse and Germany’s humiliation in the Great War of 1914-1918. The Klan became most violent when its utility as an instrument of southern “justice” was finished.

Most of the available signs suggest that religion will not succumb to creeping irrelevance in the next six months. Religions become violent and aggressive as they struggle for breath. The substitution of emotion and blind, often illiterate, faith in support of threadbare dogmatic assertions is part of this struggle. So is an unwillingness to accept any alternative consensus to replace the old religious one.

Atheism symbolizes not just unbelief in God but the nature of that alternative consensus. That is why atheism is especially opprobrious to believers in an era when most questions are settled by science and investigation.

Yet even without the security of dogma, religions usually provide for the emotional needs of their adherents in ways that science does not. They have had centuries, for example, to convince people that the miseries endured in this life are simply a preparation for a better one to come. A purposeless world acquires meaning as a “testing ground” for initiation into future glory. There is no art of consolation for the atheist, just the world as it is. Granny may have lost the power of speech after her third stroke, but she knows there is a wolf behind the door: religion knows this instinctively.

Being an atheist may be a bit lonely, but better “Socrates dissatisfied than a pig satisfied.” (And Socrates was courageous, too.)

3. Atheists are more imaginative than most people. Religious people obviously have imagination too, but so much of their imaginative world is provided for them in myth, art, ritual and architectural space. Atheists know that the world we live in is dominated by religion: spires, minarets, ceremonial prayers, political rhetoric and posturing, ethical discussion. I am not convinced (alas) that atheists are “brighter” than anyone else, but they have to imagine ungiven alternatives and worlds of thought that have not been handed to them by tradition and custom.

Imagination, however, is a two-way street between vision and delusion. The given myths and symbols of a culture are imposed, not arrived at or deduced, and if not imposed then “imparted” by traditions. Jung was wrong.

Collective Unconscious?

Skeptics and unbelievers from Shelley and Erasmus Darwin (Charles’s grandfather) to Richard Feynman, John Ellis, Ljon Tichy and Einstein in the sciences, Sir Michael Tippett, Bartok, Rimsky-Korsakov and Shostakovich in music, Bukowski, Camus, Somerset Maugham, Joyce Carol Oates, Vonnegut in literature, have been imaginers, iconoclasts, rule-breakers, mental adventurers.

Far too often, unfortunately, atheists are the worst advocates for imagination.

They rather nervously limit their interest to the scientific imagination. They don’t see a connection between Monod and Camus. They consider their unbelief a “scientific” and “rational” position, not an imaginative one. When confronted with photographs of the Taj Mahal or recordings of Bach’s B-minor Mass, they point to shots from the Hubble telescope or (my personal favorite) soundtracks of earth auroral kilometric radiation.

Instead of owning the arts, they play the part of intellectual bullies who think poetry is for mental sissies.

Joyce Carol Oates

I have come to the conclusion that this is because they equate the imagination with the imaginary and the imaginary with the supernatural. The imagination produced religion, of course, hence the gods, but that does not mean that it is governed by religion, because if it were we never would have got round to science. The poet Charles Bukowski summed it up nicely in a 1988 interview: “For those of us who can’t readily accept the God formula, the big answers don’t remain stone-written. We adjust to new conditions and discoveries. We are pliable. Love need not be a command or faith a dictum. I am my own god. We are here to unlearn the teachings of the church, state and our education system. We are here to drink beer. We are here to kill war. We are here to laugh at the odds and live our lives so well that Death will tremble to take us.”

4. Atheism is an ethical position. That does not make being an atheist a “moral” stance, but it does raise a question about whether it is possible to be good with God. Only an individual free from the commandments of religion and the threat of heaven and hell deserves credit (or blame) for his decisions, actions, and omissions. Atheists are required to assume that responsibility fully. Religious people are not.

This is why anyone who teaches his children that the story of Adam and Eve in the Old Testament is a “moral fable” is just as bad as the fundamentalist who teaches it as history. What would you say about a brutish dog-owner who told his naturally stupid dog to piss anywhere but in the flower garden, then hauled him off to a shelter the minute he did what he couldn’t help doing to begin with? That is the story of Adam, without the benefit of two millennia of theology to disguise its simplest elements.

Bad Dog

Modern Christian theology has attempted to emphasize the love, mercy and compassion of this God: he is a God of second chances–redemption–after all.

But mainly the Christian message is little more than an attempt to rehabilitate God under the guise of teaching that it’s the humans who needed rehabilitating. They had to be given one more chance at the flowers in order to show that God, after his initial temper tantrum, is really full of kindness and patience. That’s basically what the “New” Testament tries to do, after all, though in a highly problematical way.

At a basic level, an atheist is likely to detect that there is no ethical content to the stories of religion. The prototypes are Adam, the disobedient, Job, the sufferer, Noah, the obedient, and Abraham, the faithful.

But these figures are not ethical paragons. They are examples of the types of behavior religion requires. Religion evokes “good” in the “good dog” sense of the word–as a characteristic of obedience, not as an outcome of choice. That is not the kind of good any rational being would aspire to–and one of the reasons certain interpreters, like Augustine, thought that what was squandered in Eden was reason. But ethics is about reflection, discrimination, freedom, and decision. Religion, strictly and fairly speaking, does not provide for that; only unbelief does. If Augustine had understood things properly, he would have spit in God’s eye and said that Adam’s only rational choice was to do what he did, affirm who and what he was, and get on with his life without Yahweh. Instead, he creeps out of the garden, takes his punishment like a beaten spaniel, and lives in the hope that his master will throw him the occasional bone.

The expulsion from Eden

To the extent that modern liberal theologies try to say that religions have endorsed a policy of choice and reflection all along, the rebuttal is history.

5. Atheists are socially tolerant. By this, I mean that they do not have a history of violence against beliefs and practices they may privately abhor. They do not burn down churches, black or white. No matter how ardent their unbelief, they do not bomb mosques or blow themselves up at Sunday Mass to reduce the number of Catholics in the world. They are not responsible for the Arab-Israeli border wars. They have not created tens of thousands of displaced people in resettlement camps in Lebanon or torn whole African nations apart. In general, they do not mistake adventurism for preemptive wars.

They may support separation of church and state in sometimes strident ways, but not violent ways: you will not see gangs of secularists tearing down nativity scenes at Christmas or storming historic court houses to get icons of the ten commandments removed from public view–even if they think these public displays of devotion are inappropriate and teach people bad habits.

All of these things are pretty obvious, even to believers whose gurus talk incessantly about the secular humanist and atheist “threat” without ever being able (successfully) to put a face on it. But they need to be recorded because religious people often assume that tolerance can only be practised within a religious or inter-religious context, Catholic to Baptist, Christian to Jew and Muslim. But atheism stands outside this circle.

Atheism, as atheism, stands as the rejection of all religious beliefs: it is befuddling to believers why such a position deserves toleration at all. If there has to be an enemy–something a majority can identify as uniformly despicable–atheism has to be it. That is why hoi polloi in the darkest days of the communist threat, especially those who had no idea what the social and economic program of the Soviet Union was, considered the worst sin of the “Reds” in Russia, China, and Europe to be their disbelief in God.

As with goodness, tolerance needs to be exhibited non-coercively. Not because Jesus said “Love your enemies,” or because Muhammad preached sparing unbelievers, provided they capitulated to Islam. Not even because John Paul II apologized to Galileo in absentia. What supports the suggestion that atheists are tolerant (and need to continue to be seen as being tolerant) is that the virtue of tolerance emerges naturally from the rational premises of unbelief. What atheism says is that intellectual assent is won by argument and evidence, not by force of arms or the power of priests and mullahs.

While atheists will never experience mass conversions to their cause “like a mighty wind” after a speech by a pentecostal preacher, the individual changes of mind from belief to skepticism will depend as much on the tone as on the substance of their message. By the same token, what atheist would trust the unbelieving equivalent of a spiritual awakening? It doesn’t happen that way. It happens one by one. Slowly. Just ask an atheist about how he “became” an unbeliever, and I wager that you will hear a life story, or something about how things just didn’t add up–a process, not a sudden emotional shudder but often a painful change of heart and (especially) mind.

Should Atheism be Studied?

“It appeareth in nothing more, that atheism is rather in the lip, than in the heart of man, than by this; that atheists will ever be talking of that their opinion, as if they fainted in it, within themselves, and would be glad to be strengthened, by the consent of others….” (Francis Bacon, 1561-1626)

That is not a trick question. Atheism, to have any intellectual standing in the world, must be studied, like any other subject.

The stumbling block for doing this, most atheists will allege, is those pesky Christians whose pet cause is getting religion (or approved religion-substitutes, like “moments of quiet reflection” or post-school-day Bible study) back into the schools.

But that’s half the picture. The biggest obstacle to atheism being taught is that atheists have not claimed their subject matter, defined it adequately, or put it forward as anything other than being “not religion.”

It is difficult to teach “not”-subjects. Not-physics could be English, rollerblading or chiropractic services. It could be anything, as long as it’s not Physics. Defining a thing by its not-ness is not very helpful.

That is why the tired taunt against the unbeliever has been and still is, “So, what do you believe in then?” Stammer, cough.

Part of the issue is that atheists are too much foxhound and too little fox. They know when religious folk are trying to sneak religion into a conversation or a curriculum, under the guise of creation science or moral and spiritual development. But their lawsuits, protests, and cries of foul play and Unconstitutionality (whatever that hackneyed phrase may yet mean in this wretched age) seem as hollow as St. Peter’s dome. I mean the basilica.

But at least the Righteous majority, in America anyway, know exactly what they would like to see: stories about prophets and patriarchs, miracles and manna in the desert, Jesus speaking parables to the multitudes, and just a tiny, condescending nod to the millions of people who aren’t Christian yet but who have some interesting if basically wrong ideas–and (perhaps too much to hope) a nice nonsectarian prayer that ends, “In Jesus’ Name we pray, Amen.”

There is content there, even if the Constitution forbids its propagation as “learning.” And there is history. The religious rightists can also point to an imaginary golden age when Protestant America had no notions or plans to change its essentially doctrinal view of abortion, homosexuality, gender roles and the virtue of private wealth. So what if Johnny couldn’t read? At least he could pray and knew how to wash behind his unpierced heterosexual ears.

Nothing is more clear to the straight-thinking religious majority than that the obstruction of religion by people who don’t read the Bible leads to confusion, and confusion leads to–well, Barack Obama and terrorism.

It is true, of course, that the infinite jest of the religious right is enough to keep any self-respecting unbeliever busy with taunts, jabs, and protests.

In my view, that’s about all atheists have managed to do in the last hundred years.

That is because atheists have grown intellectually fat and lazy, enamored of the quaintness and minority rectitude of their opinion, careless about their targets and goals, gibberishical about their “values” and ideas, many of which are indistinguishable from anybody else’s liberal ideas. Except, perhaps the God part–the not-part.

In fact, the whole faith-versus-unbelief debate is askew.

The righteous and the right-minded have chosen to draw their battle-line on the map of myth. Yet both sides know that the trigger-question is not whether Genesis is “true” but whether the possibility of a being like God is true. The believer, if he is a profound Christian, says simply yes, because the story is true, it being validated by the power and authority whose story it is. This is not the time to drag out a logic primer or a copy of The God Delusion. Quantum physics? Forget about it.

It is time to be foxier than that. If the answer is yes, because the story says so, then the job of education (something atheists claim to care about) is to examine stories about gods. Not just the one in Genesis–all the stories.

And the job of education, and the goal of knowledge, is to find a real method–historical, scientific, critical, the same kind we use in other subjects–for sorting out true stories and false stories. In other words, Genesis can only be “true” to the extent it is certifiably different from, say, this:

Upon that desire arose in the beginning. This was the first discharge of thought. Sages discovered this link of the existent to the nonexistent, having searched in the heart with wisdom.

Their line [of vision] was extended across; what was below, what was above? There were impregnators, there were powers: inherent power below, impulses above.

Who knows truly? Who here will declare whence it arose, whence this creation? The gods are subsequent to the creation of this. Who, then, knows whence it has come into being? (Rig Veda, ca. 2100 BCE)

And since difference, on its own, is no hallmark of truth (think of a Rembrandt oil and a copy of a Rembrandt oil), there must be other methods for finding out what the real story is, and which story, if either, has a foundation in reality–reality as non-delusional people understand the term.

The story of God in Genesis is no more a proof of the existence of God than the existence of Dumbledore in the Harry Potter saga is proof of the existence of a master-wizard headmaster.

That is what people who study religion learn to do in classes in anthropology, history, linguistics, archaeology. They look at stories, and rocks, and language trees and other stuff; they sort things out. They know that the Rig Veda is older than the oldest bits of the Hebrew Bible.

They know that written Hebrew wasn’t around in the second millennium BCE, though its ancestor-languages, like Canaanite dialects, and ancestor gods to YHWH (the one who set himself apart from his brother gods by making the cosmos in six days) were.

Early God: Yahweh on his chariot

So if we ignore the method-issue by continuing to debate questions of no real importance as though there were no real answers, or none the Constitution will permit us to pursue, we are enduring the ignorance not just of the kids in the classroom but of the teachers, the parents, and school-boards like Dover.

We are enshrining mystery when there is no mystery. We are saying “Who could possibly know something like that?” when there are plenty of people who know precisely what’s what.

We are endorsing the opinion that a lot of learning is a dangerous thing. Americans, among the tribes of the earth, excel in that view, and atheists should be doing what they can to combat it.

Atheists should not be patting themselves on the back for discovering that creation science isn’t real science. That’s a bit like discovering the two men inside the horse-costume. They should be ashamed for not insisting that there are better ways of approaching questions they consider critical.

Creation Science


If it is part of atheist wisdom that God does not exist, then this wisdom has to be included–reflected–in the school curriculum in specific ways, not subordinated to a subset of mainly trivial issues–and by the way, in a way that also trivializes imagination and its offspring, mythology and art.

If atheists are going to help to fight this battle, they need to acquire what Matthew Arnold described as “culture” themselves. I travel in tiny circles, but many of the atheists I encounter have no chat when it comes to many of the things that count for culture–art, music, history–alas, even ideas other than new techniques for life-prolongation. They are simply boring. They are one-string harps.

If the pious know what they want–school prayer for instance–what should an atheist want that can be taught?

For one thing, atheists should insist on courses in moral development. In the UK, where the idea of church-state separation isn’t quite as sharp-edged as in the Great Republic, classes in “spiritual and physical development” are usual, though the phrase really just means “moral” and physical education–important add-ons to intellectual formation through the standard lens of liberal learning.

Atheists should insist on ethics- or values-education. They should be fighting battles for good textbooks on the subject, texts that do more than offer an unsuspecting sixth-grader the most uninspiring precis of lives lived and thoughts thought–“Plato was an Athenian philosopher of the fifth century BCE who is famous for his idea of the ‘forms’. He was also the teacher of the fourth-century thinker Aristotle, who was famous for something else….”

Atheists (I stress) need to be interested in the history and development of culture, not just the assumed predominance of science. Culture and science are not the same thing, but they share a story.

But we live in an era and, in the United States especially, a society that encourages disjunction and dumbness. We have one standard of knowledge for the schools, another for our universities. And unlike Plato, we do not expect the higher pattern to be reflected in the lower.

How odd. We don’t learn to play violin or piano by teaching one set of scales and fingering techniques to seven year olds and a different set to students at seventeen. We insist on parallelism–the analogy–between one experience and the other because we know that real progress is only possible because the course (“Curriculum” in Latin) is also a path from the relatively simple to the relatively complex.

Only in American education can the schools get by with the enormous disconnect between the way in which knowledge is encountered and distributed in the schools and the way it is disseminated in even a mediocre university. And unfortunately, it is because of America’s generally low esteem for the humanities that this ignorance of method can thrive.

And where are the atheists? Fighting yesterday’s wars. Ranged against the Lord God of Hosts on the fields of Canaan. Doing everything possible to make their contribution unacceptable and suspect.

Atheists need to get behind an effort to get Wrong out of the schools–not just God and the Bible. If they claim knowledge is on their side, they need to be more actively involved in the way the knowledge business is run.

Unbelief as unbelief has no more business being taught than Unphysics.

But the body of accumulated wisdom–in ethics, the arts, the sciences and literature–is enormous, and much of it is by skeptics, humanists (in the post-renaissance sense) and atheists. Another lot is by “questioners” like Lorenzo Valla and Erasmus, without whose inquiring intellects the Enlightenment could not have happened.

But where are the bibliographies, the suggestions, the lists, the lobbyists who are willing to challenge the Christocentric and still dominant view that culture’s greatest achievements were carved out in stone and marble and glass?

The distinctive thing about atheism is that it is intellectual architecture, the life of the mind in crisis and question. Not some self-satisfied conclusion growing warts over time. Cathedrals are no proof that their builders were right, and atheists have never built cathedrals.

Its themes can be traced as well, and they are there from the time of the Rig Veda, through the time of “Job,” through the time of William Langland, Bacon (“a little philosophy inclineth man’s mind to atheism”), and the first stirrings against church doctrine, superstition and clerical abuse in the Reformation. Please: spare me the totally ignorant point that Luther and Spinoza were not “atheists.”

The atheist role is to insist that knowledge is not a grand and beautiful tapestry but the story of doubt and the role of doubt in the wider story of human achievement. Can we not teach that? Should we not teach that?

The question isn’t whether atheism “can” be studied, but when atheists are going to come down from the rooftops and begin making telescopes for the rest of us. That is hard work. That is the real challenge.

Deficiently Humanistic?

This from Ed Jones, concerning the recent post on Religion. He cites Schubert Ogden, once one of my intellectual heroes, from The Reality of God, 1967: 40-41:

The characteristic deficiency of all nontheistic moral theories is that they leave the final depth of morality itself utterly unilluminated. Although they may well focus our moral action and the immanent standards by which it is governed, they fail to render at all intelligible the underlying confidence and its transcendent ground in which our moral activity, as our life generally, actually has its roots.

Often enough, this failure is not lacking in a certain irony. Proponents of nontheistic moral theories typically pride themselves on their right to give a fully rational account of man’s moral experience. Nothing in this experience, they contend, is to be left merely at the level of unexamined belief or tradition, but must be raised to the level of complete self-consciousness. Ironically, however, this demand for rationality is not extended to the basic confidence that all our moral experience necessarily presupposes. Hence, for all their vaunted “Humanism,” such theories are, in truth, deficiently humanistic. While they may cast a bright light on the foreground of morality, they leave what Whitehead calls its “background” wholly obscure. They allow the original faith in which all our action is finally based to remain a merely incomplete, quasi-animal kind of faith.

The basic point Ogden makes here, it seems to me, is unarguable. The demand for a totally rational morality must either be grounded in some theory of the human person–which takes us into the vaporous realm of metaphysics–or in some pragmatic view of consequences for the person and society in the absence of moral conditions.

If for example we are speaking of “law” in a secular and civil context, it is pretty easy to conclude that it is grounded in the latter of these conditions (“If men were angels,” Madison famously said, “no government would be necessary.”) The coercive and restraining power of law is therefore based on consequences imagined to arise if law did not exist. But this makes it virtually clear that law does not arise from a view of human action as innately (if that word means anything any longer) virtuous or placid. It arises from the idea that human action is brutish and mean. But hearken: Law has a problematic relationship to morality, and most theologians and philosophers have thought that its role is not to make a man moral but to make him pay his taxes or get him out of the ditch.

But by the same token, religion has never regarded humanity as innately virtuous either. Quite the reverse. A virtuous creature does not need saving from original sin, does not need the counsel and prods of the church, does not need commandments or pastoral care, does not need the promise of heaven or the threat of hell.

Ogden does not of course take such symbols literally: his God is much too “real” (meaning much too misunderstood) for that. But it has to be acknowledged that religion–in the broadest sense–but the book faiths in particular–virtually invented the language of legalistic morality and penal atonement. Its main difference from more mundane law is that the laws of religion are forecast in relation to a personified divine being, a sovereign king and judge, who can be personally offended by the violation of his rules and who has established specific ways of coping with transgressions. In theology, mankind is caught between heaven and earth; the best he can hope for is to be free from sin. In secular law, he is caught between the state and his own instincts; the most he can hope for is to stay out of trouble. There is no virtue and no morality in either scenario, though in traditional Christianity, the rewards for being good are infinitely greater.

Thus when Ogden says a secular morality “fails to render at all intelligible the underlying confidence and its transcendent ground in which our moral activity, as our life generally, actually has its roots,” he is trading in obscurity. It is the denuded theological doublespeak of an era that rewarded vacuity. Especially since this transcendent ground appears to be a not terribly clever circumlocution for God. Moreover, why should this transcendent ground be given any consideration in moral decision making if it is in no sense personal, cannot be offended (or pleased, or pacified), has no stake in the outcome of our decisions and actions, and could do nothing about it if it did?

Secular morality–Ogden is right–is greatly deficient because its instruments are not mathematically precise, its premises are negotiable and its outcomes approximate. Given its evolution as a rebellion against theological certainty, it could be nothing else. It is true that the absolute “standard”–or ground if you prefer–has been sacrificed to modern consciousness of real rather than transcendental ends and means.

But secular morality is not humanistically deficient, any more than a religious morality is theologically perfect. It’s merely human. And its theological deficiency is nothing to apologize for.

Cheap Grace… Plus Postscript


Having been accused of “faitheism” by more than one reader of this blog, let me offer the following:

I have been a fairly vigorous opponent of the new atheism, manifesto-atheism, organized secular humanism (if that is not an oxymoron) and the quaintness of the term “freethought.” (Send it to the attic; it doesn’t apply to anything on the contemporary scene.)

But you need to know why I am critical, and to understand that, you need to understand a bit of history–especially the history of men like Dietrich Bonhoeffer who was a victim of German-style National Socialism.

To my twenty-something readers who have just come out as atheist, or gay, or something, at Oberlin, or somewhere. Good for you: if you mean it. But please mean it. Because if this is just to irritate your parents, it’s hardly worth the trouble. It’s true that gays and blacks and resolute women have been a persecuted and marginalized class in American society.

But two things are not true: (a) That atheism is the last buttress against the know-nothings of American democracy (“A Mighty Fortress is No God”?) and (b) that there has been a consistent “persecution” of atheists in American history. Not getting elected to office because you do not believe in God is not, I am sad to report, persecution.


The fact is, atheists have seldom taken a moral stance about anything. Their core position–that religion is immoral and that they are therefore opposed to its influence and its effects–is not a moral position but a dog satisfied to have caught its own tail.

Perhaps that’s why years ago at Harvard I spent my spare time reading Dietrich Bonhoeffer. No atheist, clearly, but an ardent believer in the improvability of the human race, a race that for all intents and purposes God had deserted. Naturally critical, he floated between theological positions and even spent a year at Union Theological Seminary in 1930.

After studying with the best we had to offer–Reinhold Niebuhr–he concluded, “There is no theology in America.” He meant, of course, that there was no rigorous inquiry into the sources of belief nor any critical examination of Christian theology in general, the sort of thing the German faculties had developed as Wissenschaft–serious scholarship. In fairness to the softness of the American cultural landscape, however, we also had no Hitler.

For Bonhoeffer, “serious” theology had consequences, and these led him through an almost unimaginable circuit of events to being arrested, condemned and executed for his involvement in the Abwehr conspiracy.

Bonhoeffer was hanged at dawn on April 9, 1945, just three weeks before the Soviet capture of Berlin and a month before the capitulation of Nazi Germany. By decree of the SS and with Hitler’s explicit instructions, the execution was particularly brutal. He was stripped of his clothing and led naked into the execution yard, where he was hanged with piano wire. An odd fate for an academic, a poet, a pastor and someone who saw the Church’s mission as entirely compatible with humanist ends.

I am beginning to dislike atheism. I dislike it because it is historically illiterate, and because it sees its crusade against the “powers of darkness” as a crusade against a record that all the blasphemy and all the parody in the world cannot change. I mean those moments of sanctity, light and grace where for reasons beyond the normal course of political events men like Bonhoeffer stood down the real powers of darkness.

For reasons different from the philosophical messiness of religion, atheism is a mess.

In making religion its sworn enemy, atheism–organized atheism and secularism especially–ignores the religionless elements that suffused both the Nazi and Soviet movements. When will atheism have the will and the confidence to admit that a world without God is no better than a world with God? If the twentieth century proved anything, it is that.

Bonhoeffer used the phrase “cheap grace” in his most eloquent meditation, The Cost of Discipleship, to describe the Christianity of his day–an idea he derived from Kierkegaard. In contrast to the energy and vision that had inspired the early Christians as a religious minority, European Christianity had become fat, lazy, and politically malleable. It required neither risk nor affirmation: to be German and Christian was equivalent to what it once was to be Roman and pagan. (The Jews got the short end of the equation in both cases).

His premise was simple: any intellectual position comes at some expense. At one extreme, it is worth lying for, conspiring for, and if all else fails, dying for. “Cheap grace means grace sold on the market like cheapjacks’ wares. The sacraments, the forgiveness of sin, and the consolations of religion are thrown away at cut prices. Grace is represented as the Church’s inexhaustible treasury, from which she showers blessings with generous hands, without asking questions or fixing limits. Grace without price; grace without cost! The essence of grace, we suppose, is that the account has been paid in advance; and, because it has been paid, everything can be had for nothing….”

Hitler’s enemies were not atheists. They were his co-religionists, Catholic priests and confessing protestants like Martin Niemoeller. They were his religious Others–the Jews, and had Europe then had a substantial Muslim population (I am sorry to disappoint my pro-Teutonic Muslim friends with this information) they would have joined the inmates at Buchenwald and Auschwitz as outsiders as well. The early anti-secular noises made by the Nazi party to pacify the churchly despisers of Adolph Hoffmann, whose picture appears in my family album, were decisively exposed as political by Hitler’s closest mentor, Martin Bormann, in 1941:

When we [National Socialists] speak of belief in God, we do not mean, like the naive Christians and their spiritual exploiters, a man-like being sitting around somewhere in the universe. The force governed by natural law by which all these countless planets move in the universe, we call omnipotence or God. The assertion that this universal force can trouble itself about the destiny of each individual being, every smallest earthly bacillus, can be influenced by so-called prayers or other surprising things, depends upon a requisite dose of naivety or else upon shameless professional self-interest.

Bormann followed this with a 1942 memo to Gauleiters stating that the Christian Churches “must absolutely and finally be broken,” as their views were fundamentally opposed to the total world view of National Socialism.

Bonhoeffer’s reaction was not against proposals that (among others) would have banned the teaching of theology in the universities or removed the Old Testament from the Bible, or eliminated subsidies for churches and religious schools, or forbidden school prayer. The total menu of punitive actions against religion was much larger than this–and similar proposals have been the staple of democratic socialism in both Europe and America for more than a century.

Bonhoeffer’s nausea was evoked by the quasi-religious and spiritual trends of the Nazi inner circle: Germanic pagan imagery mixed with ancient Roman symbolism and emotion in propaganda for the German public, the naive acceptance of social Darwinism, a strong belief in the providential role of science, as Science, and a commitment to the idea of German intellectual supremacy. He saw forming behind the scenes a new myth, fashioned to replace the old one by summoning the tribalism of an ancient imperial past, and a Church so naive that it believed it could accommodate the “new ideas.”

Bonhoeffer died as a Christian, as someone opposed to the symbols and reality of the state-produced Man. If you want to see the most effective and still chilling visualization of this, watch the first fifteen minutes of Leni Riefenstahl’s 1935 film, Triumph of the Will–Hitler descending like Woden or Jesus (either is correct) as the expectant people (sitting in darkness, awaiting the light) clamour for the landing of his aircraft.

So the question arises: why, in a world so allegedly hostile to their ideas, have atheists never been held to account? Why are there no illustrious atheist martyrs, no equivalents to Socrates and Jesus–and Bonhoeffer? Given the insistence of the atheist and secular humanist movement that their position is heroic simply because it is (as yet) unusual in the world–perhaps especially in salvation-starved America–what approaching army advances? What hideous penalties do they threaten? Do any involve being strung up at dawn by piano wire? And who will be the first to lay his life on the line for the glory of Unbelief?

In fact, modern atheism is the moral equivalent of what Bonhoeffer called “cheap grace.” Just as the comfortable Christian could count on the fact that the price of his sins had been paid for in advance by a God who operates as an endless source of moral credit, atheists know that the cost of their rage is slight. They count on the fact that the free speech they savor has been underwritten in constitutions and codes dating back two centuries–just as the Protestants of Bonhoeffer’s Germany counted on the fact that their greed had been atoned for in advance. They follow a narrow orthodoxy that punishes nuanced, critical and accommodationist views–just as the Churches of Bonhoeffer’s day embraced a gospel that perfectly reflected their social values and political lassitude.

Kishinev Pogrom

In other words, the cost of being an atheist is simply to proclaim being an atheist, with a wink to the atheist at your side. What, no applause? No police force, no secret agents are going to round you up for that. For that to happen, there would have to be something more to atheism than the purely negative impact of not believing in God or believing that religion is evil.

It would have to develop real ideas, agendas, and principles–preferably different from the ones that emanated from the first great organized wave of atheist ideology, Soviet communism.

And since atheists often adopt a Missouri posture in such matters: Show me your martyrs. Show me the principles for which they died. Show me the agenda that naturally flows from unbelief, and the positive consequences of taking that position. Show me the future of the world you believe in when the world no longer believes in God.

Otherwise, atheism is simply the additive inverse of cheap grace.


It’s hard to imagine that I managed to get through this whole piece without using the word “complacency” even once. One reader alerted me to the fact when he asked whether Jesus and Socrates had died for their religious opinions or were victims of political circumstance. The flippant response is that most people who die for religious reasons were victims of circumstance, including the heretics. Atheism as we use the term today is really an intellectual fashion of the seventeenth century, when the Church in the west no longer had the power to roast people for their apostasy. Around 1650 an anonymous manuscript appeared (probably in France) entitled Theophrastus redivivus, which appears to be the oldest extant atheistic document. But, of course, there was classical precedent for denial of the gods, as well as satire of their behavior and trivialization of their role.

The atheist “heresy” is in creating an apostolic succession of unbelievers (Socrates and Galileo are, somewhat ludicrously, often numbered among them) that never existed, but put forth on the premise that very bright people must (at least privately) have been unbelievers. The religious heresy is the complacent belief that unbelievers are beyond the help of the church and thus, as Anselm regarded atheism, a form of insanity or “foolishness” (Psalm 14.1).

But my real quibble with redivivus atheism is that it has taken a sideshow approach to a subject that ought to be viewed and debated seriously. Atheism, as such, is an intellectual position, not a moral philosophy. But sideshow atheism is neither. Blasphemy Days, sloganeering, bus campaigns, unbaptisms, video challenges, cartoon contests–whatever motivates this activity (bonding, boredom, or the lust to be noticed?), it is not of a kind or quality that does atheists any good. If, instead of arguing their case, the atheist strategy for growth was to build the world’s most repulsive bogeyman, they have done a good job.

I am not even certain why atheists feel they have the right to feel more agitated and annoyed by the noise of the religious right, which after all is simply a bigger and more influential sideshow, than do the liberally religious, the studiously ethical, or the indifferent–among whom, I think, the real and growing numbers of “converts” are to be found. Most absurd of all is the persistent effort of younger new atheists, the Dawkinsians and Flying Spaghetti Monstratarians, to see their “cause” as equivalent to the civil and sexual rights movements of the twentieth century.

For the sort of serious approach to the subject that American atheists (chiefly) might want to know about and would surely benefit from reading, Cambridge University’s “Investigating Atheism Project” will repay the effort of a little historical homework a thousand times over.

Atheist Denominationalism

Most atheists have never read H. Richard Niebuhr. That’s too bad. Because now that unbelievers are fighting with each other about how much of God not to believe in, they have a lot to learn from the battles fought among God’s people for primacy of position.

Niebuhr was primarily an ethicist and while influenced by philosophers and theologians as far apart as Barth, Troeltsch and Tillich, he was solidly grounded in the reality of social change. He knew that since the Protestant Reformation Christianity had become restless and incoherent. When monolithic belief in God’s holy church and her sacraments was demolished by the phenomenon of “fissiparation” (churches quarreling over picayune differences about inconspicuous doctrines and forming into ever more minor sects), the stage was set for a religion that could hardly claim to be what Christ had in mind when he expressed the wish that ‘all may be one’. Not of course that Jesus was speaking, if he was speaking, of the church when he said that.

Countries around the world experienced the Protestant Reformation in different ways: Europe at a theological level, and then in skirmishes that grew into full-fledged wars. No longer able to contain the confusion by executing the odd heretic or sending the forces over the hill to rout the Huguenots, Europe settled finally into a state of religious détente that grew eventually to boredom and finally to a comparative loss of interest in religion and an acceptance of secular values. One of the reasons the “priest abuse scandal” has been so shocking to Europeans is that this generation of Irish, Italians, Germans and French have a hard enough time remembering the autocratic church of their grandparents’ day, when papal and episcopal fiat were good enough for a relatively docile laity. It is the idea that society—the secular—stands against and above the church in all legal and judicial respects that makes the crisis almost unfathomable in modern terms.

A Protestant Scene

America experienced the Reformation as an export, a receiver nation. Whatever you might have learned about America being solidly “Christian” at its foundation is not only not true, but not true because the seventeenth century was the era when Christianity itself was being redefined. The puritans of New England did not share the religious interests of the commercial men of Massachusetts Bay, a “factorie,” and the relatively softer Baptists followed on their heels within a generation. Harvard had fallen from Calvinist grace by 1701, when Yale was founded to preserve its true religion (the mottoes are revealing: Harvard, Veritas; Yale, Lux et Veritas). By then, Jews were aboard, or off ship, in Rhode Island and the first waves of Catholics were about to arrive in Lord Baltimore’s Maryland. Go a bit further south and boatloads of low church Anglicans had disembarked in Virginia decades before, and Presbyterians would squeeze into the gaps in the Carolinas, named for Charles II. Georgia (the name first suggested to a delighted George II in 1724) would be transformed by Wesley’s followers into a colony for Methodism. Go a little deeper and change colonial masters: waves of Catholics driven from New France by the pursuing forces of the British General Wolfe would arrive in the bayous of the Mississippi Gulf region and learn to call it home.

A 'Cajun' (Acadian) Scene, 1898

The grab bag of religious immigrants that came together at the end of the eighteenth century was not an especially remarkable mix. It was a powder keg of competing denominations with explosive potential. In their wisdom, as Americans like to say of the founding fathers, the authors of the Constitution were savvy enough to make sure that religion and government should stay apart: that’s what the first amendment was devised to do. But they were equally savvy about the instincts of these displaced and largely yokel Europeans. Whether it was debt, famine, crime, adventurism, a loveless marriage, a lost fortune or religious persecution that had brought them to the New World, it was entirely likely that their faith came with them. So, what the founders gave with their right hand to government they took away again with their left by delivering to these competing sects the “free exercise” of their faith. Congress would never pass a law constraining the free exercise of religion. And in saying that they passed a law concerning the free exercise of religion. America became the most religious nation on earth and the most fertile field for growing new religions.

Mormon Trek

Niebuhr of course knew all of this, the son of a distinguished immigrant German theological family himself. He knew that the ragbag culture of American religion would always be a supermarket of choices–and not only that. There was something in the nature of Protestantism that was friendly to competition (as Weber had argued at the beginning of the twentieth century), from strong belief to weak belief, from Born Againism to Ask me Later. If ever Feuerbach needed confirmation of his idea that religion makes God in man’s image, the proof could be found in the American Experience.


* * *

Modern Atheism is a continuation of the pattern of denominationalism and derives specifically from it. It is the fatal last step in the journey from strong to weak belief. Just as secularism emanates from the religious acceptance of tolerance and pluralism, necessities imposed by competing sects living in close cultural proximity over a long period of time, atheism is that point on the belief scale where God becomes not optional but impossible.

By saying that I don’t mean to suggest that atheism is religion. That is a limp, tiresome, historically uninformed debate. But atheists would be very foolish not to understand themselves connected through history and process to the developments that help us to understand the phenomenon of denominationalism. If a hard core atheist cannot believe in creation ex nihilo, it would be pretty silly for him to believe that any social or intellectual position can be equivalently wrought.

It also seems clear to me that atheists, in accepting that they have their origins not in Zeus’s bonnet but in a social process, should also accept that atheism will experience its own denominationalism, its own sectarian divisions. This process has been under way for a long time. We are seeing its latest eruption in the “debate” between old and new atheists, as in the twentieth century in the differences between religious humanists and secular humanists. Even the terminology used to express the differences (as Niebuhr pointed out) becomes crucially significant: labeling is one of the properties of the protestant spirit. Just as it isn’t enough to say Baptist without specifying Southern, American, Freewill, Particular or Seventh Day, the day may come when one atheist will demand of the atheist sitting next to him at a bar: “Old,” “New,” “Bright,” “Strict,” “Friendly” or “Prickly?”

What denominationalism teaches is that human beings despise norms. I suspect there will never be a more impressive “norm” than the rules and doctrines and liturgies of the Catholic church of the sixteenth century. If you want to know what God looked like at the peak of his game, it was then. But “then” is when the Reformation happened.

I suspect that atheism had something like that heyday in the 1940s when it became, normatively speaking, a sexy bad boy philosophy associated with the likes of Julian Huxley and Bertrand Russell, and slightly later with (certain) existentialists, especially Camus. It had a prior history of course, and a later one. But I tend to think the potential for variegation in atheism goes back far into history. Maybe it goes back to Hobbes, maybe to Lucretius or Epicurus. But wherever it goes it is always in juxtaposition with religious values, and often enough (especially with the French) with particular religious doctrines. Read the foreword to Marx’s doctoral thesis to see what I mean.

Atheism follows the religious pattern of denominationalism not only because it behaves religiously but because its central question is a religious question—or more precisely a question about religion. It should surprise no one that what we are seeing now are permissive, soft, hard, pluralistic, total rejectionist, possibilist, impossibilist, and accommodationist responses to the question of God’s existence and the “meaning” of religious experience. Why would we expect anything else?

What we can hope is that the process doesn’t take atheists too far down the denominational road as they jockey for position as the True Unreligion: Once-born and twice-born atheist is a distinction we can live without.

“Who was You?” On Hiding from What You Are

The Boston Lowells knew who they were. From their perch on Beacon Hill they enjoyed a perspective that encouraged them to believe in the Unitarian credo: the fatherhood of God, the brotherhood of man, the neighborhood of Boston. When Edward Filene opened a discount store in the basement of his father’s store to sell overstock and closeout merchandise through his “automatic bargain basement” (off the rack, serve yourself), Beacon Hill was a swarm of indignation. The son of a (Jewish!) peddler would throw Boston society into disarray. Cheap clothes that looked like finery? Now even Irish women who worked as chambermaids could look respectable. That is, if you didn’t look too closely.

Never to be persuaded without a firsthand look, Anna Parker Lowell walked into Filene’s downtown store near Washington Street, coiffed and umbrellaed, sought directions “to the so-called Basement” and took the steps with the polish of someone who was used to grand staircases. Once at the bottom she saw women flipping through racks of dresses like playing cards–choosing, refusing, playing tug-of-war, even threatening bodily harm if a latecomer tried to prise her find away from someone with a prior claim. “Disgusting,” Mrs Lowell tsked to herself. “Just look at them.”

Just when she had satisfied herself that Edward Filene’s brainstorm would mean the end of high society in Boston her eyes lit on a beautiful taffeta gown that looked just the thing for the spring ball at Harvard. She moved closer for a better look. As she reached to collect her prize, a woman of questionable pedigree snapped it from the rack and headed for the till. “Not so fast, my dear,” said Mrs Lowell. “I was about to have that dress.” “You was,” said the woman without slowing. “I don’t think you understand. I had chosen that dress. I was just about to collect it.” “You was,” said the woman, unable to evade Mrs Lowell’s pursuit because of a crowded aisle. “Look here, madam. I didn’t want to tell you who I was, but I will if you persist.” The woman stopped, turned, looked Mrs Lowell in the eye, and said, “Ok, dearie: Who was you?”

I have always wondered what people mean when they say “That’s who I am,” but usually they mean something silly and parochial: I’m a Catholic, a democrat, a creationist, a car dealer, an ex-con, a neo-con. It’s the substitution of code for argument, a conversation stopper rather than an invitation to discuss a position or idea. Clearly identity matters, but the twentieth century was distinctive in breaking down the sorts of identities that isolated people from majority communities and power structures.

There are big identities and small identities, weak and strong. Part of this has to do with the nature of language and part with the nature of things. Being a democrat or a used car salesman is a weak identity: you can change those things tomorrow if you change your mind or lose your job. Being an African-American or a male, despite the fact that we know a lot more about race and sexuality now than we did fifty years ago, still has a lot to do with properties and is much more difficult to change. To say, “I’m gay,” is not just to say “I’m not straight” but to challenge the idea of straight as normative and authoritative. That’s different from saying, “I’m Catholic,” if by that you mean you’re on your way to heaven and the guy you’re talking to is going the opposite way. Beware of anyone who says “That’s who/what I am” with a smile on his face.

Identities can be a great source of fun, as when Ambrose Bierce (The Devil’s Dictionary, 1911) defines a bride as “a woman with a fine prospect of happiness behind her,” a “brute” as a husband, and a “minister” as “an agent of a higher power with a lower responsibility.” Too bad that in Bierce’s day the Vegan craze wasn’t what it is in the twenty-first century, but he did have this to say about clairvoyants: “A person, commonly a woman, who has the power of seeing that which is invisible to her patron, namely, that he is a blockhead.”

The weakest identities of all are the ones that have to do with what we believe to be the meaning of life. I can remember in college three distinct phases of change: being a socialist at seventeen, a half-hearted anarchist at twenty, and an existentialist at twenty-one. I recovered from these infatuations by not permitting myself to stop reading and by never reading Camus after thirty. With confusion intact, I went to Divinity School and emerged as confused and doubtful as ever. Voltaire said it was only his skepticism that prevented him from being an atheist. That was me, too.

I can’t doubt that there are “meaning-of-life” identities that one holds passionately and that therefore appear to qualify for the “That’s who I am” category of identification. I have known people whose non-belief is as fervent as the belief of a twice-born Baptist or Mormon elder, people who say “I am an atheist” as proudly as an evangelical says “I’m born again.” It’s tempting to say, isn’t it, that the difference between these two statements is that the atheist is smart and the born-again needs his intelligence quotient checked. But we all know that identity statements are code for a whole range of ideas that need to be unpacked and call for explanation. An atheist who felt his non-belief in God entitled him to murder children because of the absence of divine commands to the contrary would be no better than a cult member who believed that disobedient sons can be stoned because it says they can in the Bible.

I feel my Atheist Reader squirming, because while you liked the Bright-Dim difference, you don’t like equivalences. When Katharine Hepburn turns out to be an atheist people say, “I just knew it. Such a strong woman.” When Pol Pot says God is bunk, we think “Well, that’s different, isn’t it—and so far away?”

Personally, I don’t like people who say “That’s who I am,” or “That’s what we are,” or “We need to be honest about who we are.” At a crude level I want to say WTF? It’s eerily metaphysical when atheists do it—not only because it’s the language God uses when he introduces himself to Moses on Sinai. You remember, right?: Moses hasn’t been properly introduced and God says, “That’s who I am,” and when pressed after Moses accuses God of being slippery says “I am what I am.”

I reckon what he really means is, “You know—God—the one who does firmament, landscaping, Leviathan, floods, human beings God.” In fairness, however, the Hebrew Bible insisted that God was not just a proposition but an actor on the human stage. I don’t believe that God did any of the things ascribed to him in the Bible, but to believe in a doer and deeds is a perfectly legitimate way to establish an identity—even if it’s a fictional identity. That’s why Jewish atheists begin by denying the deeds and then the doer. None of this silly ontological stuff: too Christian, too mental.

But I find it a lot harder to know who I am or what we are on the basis of not believing something.

“We need to be honest about who we are” coming from an atheist doesn’t translate easily into the propertied descriptions of being black, gay, female or physically challenged–things over which people have no choice and no control.

It’s tempting, I know, to think the things we believe or don’t believe have the same status as the things that constitute us as persons or collectives of persons. But you would laugh at a used car salesman saying at dinner, “Dammit, Mother, I’m tired of hiding from who I am. Tomorrow I’m going right into the boss’s office and say to him, ‘Mr Jones: I am Bill Smith and I’m an atheist.’” You would not laugh at someone who said, “Mr Jones: I haven’t had a raise in two years. Is it because I’m black?”

Atheists often complain when religious groups claim special treatment on the pretext that any speech against religion is defamatory while claiming equivalent protection for their own beliefs. But atheists need to be very careful about traveling the road of victimization and minority rights, or simply adopting the legal definitions supplied under non-discrimination laws–especially since racial, sexual orientation and gender provisions do not apply to atheism, and the protection accorded to religious beliefs, if embraced by atheists, creates a stew of issues, not the least of which is that there is no settled definition of atheism, and if there were, a true freethinker would reject it.

Difference is deceptive, especially when it comes to self-definition. Is coming out atheist like coming out gay, an act of courage? On what basis–the fact that terms like “minority,” “unpopular” and “misunderstood” can be applied to both categories? But simply to embrace a minority position toward a “divine being” based on denying a premise is not an act of bravery. It doesn’t make you who you are or what you are. It’s neither race, profession nor party platform—not even a philosophical position or scientific theory. It’s not something to be ashamed of or proud of. It’s just about an idea—even if it’s a really Big idea.

New Ethics and Atheist Newbies

The Necessity of Atheism?

Pardon my cough when I see titles like Good without God being hailed as “trendsetting.” Not only is the title overworked and the subject matter stale, but the author manages to get through the entire discussion without so much as tipping his hat to the theologian who pioneered the debate almost a generation ago, Cambridge University’s Don Cupitt.

To be fair, it is possible the author never read Cupitt. American learning is almost as parochial and inward-looking as it was in Emerson’s day when the sage, in his exceedingly dull 1837 Phi Beta Kappa address, tried to argue that American scholarship (still the object of ridicule in Europe) would be concerned mainly with “Nature.” So, we in this nation, especially perhaps scholars, are part of a proud tradition of not paying attention to foreign scholarship and are more prone than Europeans to claim squatter’s rights to ideas developed by others, elsewhere, often long ago.

Whatever the case, to write a book about ethics without God and not to cite Don Cupitt’s The New Christian Ethics strikes me as plainly negligent, to the point of being out of touch with the topic. A bit like writing a book on the history of the Statue of Liberty without mentioning Frédéric Bartholdi.

This out-of-touchness is something I have been battling for years. The problem with Atheist Newbies (as good a beginning of a carping sentence as you could want) is that they are too little aware that the battle they think they are fighting was fought over a century ago, fought by theologians in liberal trenches (not atheists in foxholes) and for better or worse won by the forces of reason—if not exactly the battalions of unbelief.

I suspect that is why they spend so much time battling old believers–ranging from DMS’s (Dead Medieval Scholastics) to MILFs (Multiple Illiterate Leadheaded Fundies)–because for the most part their work shows no currency with the serious strands of contemporary theology, social ethics, or even of philosophical dialogue with theology. This isolation from theology also nurtures a strong tendency among the Newbies to assume that they were at the station ahead of theologians who had actually caught the train days before them.

Of course there is no need to keep current if you have determined to win against the religious losers and claim that there are no other intellectual positions worth fighting against. It happens to be true that a great deal of modern theology is not worth bothering with. But that is doubtless true of books in general. Not to know the history of theological Destruktion since Kant, Coleridge and Schleiermacher ruled the waves is simply to claim poverty as privilege.

Which brings me to Don Cupitt. Cupitt was the unwitting source of my greatest disappointment many years ago when I was offered a place to read theology at Caius College, Cambridge, and decided to go to the City of Dreaming Spires instead.

Gonville and Caius College, Cambridge

To be blunt, there was no one quite like Cupitt at Oxford, though a few came close. In 1988 he published a modest volume called The New Christian Ethics. The book was a follow-up to his highly controversial, absolutely marvelous little book Taking Leave of God (1980).

Taking Leave of God had taken what theologians sometimes call a non-realist position: God is a conglomerate of expressions about god, but not the same as any individual expression nor any total of these expressions. In this sense, God is not “real,” and so any idea that this God has given moral commandments to the human race is untrue.

Like a lot of non-realists, I prefer to say that God is not real instead of saying “I do not believe in God,” or more confidently, “There is no God.” I have no idea whether there is any god that equates to any idea or expression of god. How could I? When I say “God is not real,” I simply mean that there is not now nor has there ever been a being equivalent to the descriptions of the divine being in sacred scripture and Christian (or any) theology. I am not saying that Christian theology is deficient and some other person’s theology is “right.” I am saying that while I cannot rule out the possibility of God, I can rule out the historical descriptions of him and the rules of conduct thought by some religious people to emanate from him. It’s odd how close this is to atheism, but the atheists I know are the last to admit it.

The idea of the unreality of God gets us beyond the existence question in a healthy linguistic way, because it means that there is no way to experience the reality of God in the way we experience the reality of the world. We know that the historical, traditional descriptions of God are man made. We know this as fact.

The commandments of the Bible and Quran are man-made as well. They are ideas that were used in antiquity to flesh out the idea of God as lawgiver and sovereign over the customs and conduct of human beings. Almost certainly, they are the work of a professional class–priests, prophets, royal sycophants and bureaucrats.

With the collapse of the biblical-realist idea of God, which happened in theology beginning in the nineteenth century, the idea of “divine command” ethics was washed away as well. For many contemporary theologians it does not matter that a great many errant and usually unrefined voices still defend the “reality” of God, the basic soundness of the biblical view of God, or the general “wisdom” (if not the details) of divine command theory. It should matter, however, that these voices are evidently the only ones of any interest to Atheist Newbies, and matter as well that the most vocal critics of religion don’t really seem to care about making the careful distinctions that would, if ignored, sink them as experts in any other field–especially the sciences. The moral is: it is easy to be a critic in a field in which you’re an amateur.

However, the most important thing about Cupitt’s ethics is that he regards the end of realism (the end of the belief in the reality of the God of the Bible) as a turning point in human history. Rather than setting up a straw-man opposition between the “truth” of science (and any ethics emanating from “scientific reason,” whatever that is) and the falsity of religion (with its God-driven, rule-based, non-negotiable edicts), Cupitt sees the end of God as a challenge that confronts everyone: the atheist may consider herself free of it, but her obsession with continuing to play with tin soldiers contradicts her freedom. The Christian, Jew, or Muslim, on the other hand, must begin by acknowledging that the challenge has not been met, and that they may still be infatuated with ideas they have never taken the time to question or examine:

“The end of the old realistic conception of God as an all-powerful and objective spiritual Being independent of us and sovereign over us makes it now possible and even necessary for us to create a new Christian ethics. It is we ourselves who alone make truth, make value, and so have formed the reality that now encompasses us.”

Cupitt’s position is far more radical than it seems—radical precisely because he is not saying what I take the Atheist Newbies to be saying–that is, if they are arguing a kind of ethical détente between believers and nonbelievers consolidated in the paralytic slogan, “It is possible to be good without God.”

Cupitt is saying that it is not only (or primarily) the atheist who must learn to do without God-based ethics. Believers do not have the option to choose a reality of godly proportions and christen its commandments as the divine will, a cover and support for their morality. He is saying that everyone, including believers, must learn to be “good” without a God who is not real in the first place, who has never spoken—and not just not to atheists—and certainly not to the modern mind.

This is optionless ethics, where an atheist will find no opportunity to exchange the fixed certainties of religion for the discovered truths of science as an alternate source of ethical reassurance.

“There is no bedrock and nothing is fixed, not my identity nor my sexuality nor my categories of thought, nothing… There is no external measure or value or disvalue– and therefore our life is exactly as precious or as insignificant as we ourselves make it out to be.”

In his work, Cupitt has always been clear that there is a strong religious argument against religious ethics and against the objective existence of God. Religious argument against God? Yes, certainly. It shows through vividly in those faiths that profess an absolute loyalty to an absolute ruler who reigns from the heavens. In Christianity and Islam, the idea that God exists primarily to tell us what to do, knows what we do, and reacts by punishing and rewarding what we do, is prominent if not primary. It is not only repressive; it so limits the idea of the freedom of human beings that this sort of God cannot really desire choice as part of his plan for salvation: salvation would necessarily (and actually does) mean salvation from the structures he imposes on his own creatures.

Cupitt dismisses with a stroke of the quill the turbid debates of two millennia concerning freedom and bondage of the will and says that they are a conceptual overwrite of a scriptural tradition that precludes them—inveigled in from philosophy, planted in Eden, but with no convincing root system. “An objective God cannot save anyone. …The more God is absolutised, the more we are presented with the possibility of living under the dominion of a cosmic tyrant who will allow nothing, and least of all religion, to change and develop.”

The unreal God of the Christian tradition is nothing more than humanity setting limits on its own self-understanding by projecting such a tyrant and his rules as restrictions on human freedom. Nowadays, Cupitt argues, “the nature of language dictates what can and cannot meaningfully be said of anything, God included.”

As to the thesis that it is possible to be “good” without God: The more radical proposition is that a morality based on choice and freedom is only possible once the reality of God has been sacrificed to a deeper understanding of our own humanity.

How Christianity is the Perfect Religion

Love Incarnate?

I confess to having a seasonal defective disorder about this—Christmas I mean.

I am frankly tired of news about religious extremists plotting world takeover from septic tunnels, watching deals between “good” Taliban and “pro-western” Pakistanis brokered and shredded within months by toothy politicians, depressed from smiling over my gin when MSNBC reports that a pilotless drone (no, a different entity from the United States Senate) has killed a “top level Al-Qaida leader.” (No, not bin Laden. Certainly not—but someone who knows someone who met him once. Maybe at a barber shop.)

Bored enough even to yawn at the last report of a horrific car, market, bus, mosque or school bombing somewhere in Iraq, Pakistan, or Afghanistan. Weary to the point of dizziness at the latest decision to send in another doomed-from-the-get-go cadre of troops to “finish what we started” [sic] in Afghanistan. Innocence betrayed by the allure of travel to distant lands?

At a lower level of cynicism, I am lulled to despair with the conflict over whether Jews in Santa Cruz should or should not have a right to display a fifteen foot high menorah in the “downtown area.” It’s a cluster of candles for God’s sake, but more to the point: don’t you have a back yard?

I am sick of the Vatican being forced into the position, yet again, of apologizing for randy priests and abusive, sexually repressed nuns who couldn’t keep their paws off innocent children in their care. It is disgusting. It is so disgusting that we need to consider seriously if any other social community, unprotected by the fiction that religion operates for the good, is even capable of doing the things that religion does—and does by pointing to a Higher Authority whose function it is (apparently) either to forgive it or condemn it but does nothing to prevent it by putting its holy temple in moral order.

Magdalene Asylum

The commonplace concept of God in all three religions is so miserably and wretchedly puerile that it sends me searching for my dog-eared copy of The Future of an Illusion on an annual basis. May the Kingdom come (and go) soon.

So I ask myself, what went wrong, or what’s gone missing? All of these religions had mystery once upon a time. And without overstating the terrors that take shape when religion is taken literally rather than mystically, religion unclothed is a dangerous thing. The poet Matthew Arnold warned a century and a half ago of the danger of taking myths, mixing them briskly with the hazards of unformed religious passion and ignorance of literature, and turning them into dogma. For Arnold, the great devil of nineteenth-century religion in the English tradition was making postulates out of poems.


Who could have foretold that the literalism and plain talk we expect in twenty-first century discourse would constrain religion to take its own propositions seriously, and worse, act to defend them in absurd and violent ways? But that, I submit, is what has happened.

Maimonides, Avicenna, Meister Eckhart, Jalāl ad-Dīn Muḥammad Balkhī Rumi, and, closer to our own day, Muhammad Iqbal (d. 1938) and Thomas Merton, alas, are not the future of religion.

I have always found it odd, in one sense, that many of the great philosophical mystics were also great intellectuals: especially, it seems, logicians and mathematicians. Origen and Ibn Rushd, in their respective traditions, saw theology as closely aligned to true wisdom, in that higher sense the neo-Platonists were so fond of talking (and talking) about.

So let me talk about it.

I have said a sufficient number of times (so that anything beyond this time will be mere repetition) that the “cure” for all the bad religion we see around us is not “good” religion or the “right sort of” religion or (above all) declamations that what we’re witnessing “isn’t really religion” but some sort of satanic parody of religion. All such talk is an invitation for conflict under the banner of dialogue.

Religion is not purified by scraping away the mould to see if any edible bread is left. A cure—and yes, that is the word I want—depends on seeing the violence inherent in religious literalism and heeding the call to myth, mystery, and poetry.

When it comes to religion, words speak louder than actions. All forms of biblical and Quranic literalism are invitations to moral terror, not because the precept you happen to be reading at the moment is “wrong” but because the one you read next might violate both conscience and common sense. Violent, because you cannot know which verses stir the mind and heart of your friendly local mullah, priest or rabbi. Picking and choosing what the experts believe the laity need to hear—the way most preachers have practiced their faith in public over the millennia—may be a tribute to the power of discernment, but it teaches the congregation—the occasional Catholic, the wavering Muslim—some very bad habits.

It can lead to a constricting of moral vision, the abuse of little children, butchering or disfiguring wives and daughters, the killing of the tribe of Abraham by the children of Abraham. Words do this because they have the power to be misunderstood. And because taken as a bundle, the texts of the sacred traditions are a muddle of contradictory and sometimes terrifying ideas that commend everything from peace on earth to extermination of the unbeliever in their several parts.

It is the kind of tangle that attracts knot-tiers and exploiters and anyone who needs the money of the poor to be rich. Most of the methods developed in the last two centuries to study the narratives of the world’s religions “scientifically” have helped to provide contexts for texts and have shone light on the communities within which texts developed—ranging from Syria to Medina—reminding us above all that the ancient words are no different in provenance from modern words: that is, they are human words and need human interpretation. The words are not above us; they should not be considered immune from our assessment and judgment. Any doctrine of inspiration that teaches otherwise is potentially, if not actually, malignant and insidious.

I could quote Rumi, or Ibn Rushd, or a poem by Allama Iqbal to make my point. They were all great hearts, deeply committed to their visions of religious truth. Taken in another direction, they might have been vicious—because mysticism has often led to esotericism and fanaticism. (Religious language is funny that way.) Origen and Peter Abelard lost their testicles, and hundreds of Anabaptists in Münster in 1535 their lives, not because they lacked imagination but because they had special visions of how to take the kingdom by storm.

So let me take refuge instead in the myth we find embedded in the story Christians like to read at this time of year.

The Christian myth is that love was born into the world in human form, divine nonetheless and (as the story winds on, without prejudice to the order of composition of the gospel elements) capable of suffering, and destined (as in the ascension myth in Luke) to regain his heavenly estate. True love, recall, does not undergo change, does not “alter when it alteration finds.”

Love came down at Christmas,
Love all lovely, love divine;
Love was born at Christmas,
Star and angels gave the sign.
(Christina Rossetti, 1885)

People who hate the gory images of the crucifixion and the metaphysically blinding element of the resurrection narrative tend to like Christmas anyway. They like it even though they may very well reject every other part of the Jesus tradition. What they like “about” it may not be Christian at all, and may well be more ancient than the ancient ideas that quietly undergird Luke’s and Matthew’s poetic fables.

Socrates, it’s easy to forget, was no fan of “poetical myths”: “Those which Hesiod and Homer tell us, and the other poets, for they composed false fables to mankind and told them” [Republic, 377d]. These are “not to be mentioned in our city” [Republic, 378b]. It is easy to forget this because Plato himself was unable to exile Homer completely from his city. What he worries about is the propensity of “myth” (poetical or philosophical) to be misunderstood, and the natural tendency of the uneducated, the young and the intellectually dull to get the myths wrong—to miss the point.

Fragment, The Republic

In the Ion [533c], Socrates explains that some people are closer to wisdom and interpretation than others. Call it knowledge—as later Platonists and their sympathizers did. There is a power, Socrates teaches, which descends from the gods to certain men, and to others who, like Ion, use the works of the inspired. It is, he says, like a series of iron rings, the first of which is attached to a magnet, so that the power of the magnet passes on to all in the series. Think God, think angel choirs, think wise men, think shepherds. “Those beautiful poems are not human, nor the compositions of men; but divine, and the work of the gods: and that poets are only the interpreters of the gods, inspired and possessed, each of them by a peculiar deity who corresponds to the nature of the poet.” But it stops with the interpreters, the users. The force is not with everyone.

Christianizing Plato is a perilous business, but that did not stop the church fathers and later writers from trying, and from getting it poetically wrong in their determination to be theologically right. The life of Jesus was, for many of the interpreters, simply an allegory of divine love: the way in which love (truth) became incarnate, the way love “came down”—“in the beginning” for John, “at Christmas” for Rossetti. Certain writers saw this, to be fair, more philosophically than others. The Gnostics did not need a manger or a virgin mother. The most arrogant of the mystics sided with the ancients in thinking that this love was simply a gift of inspiration given to men of learning and ability. Love, philia, is the general term that Plato uses when he wants to convey attraction. It is usually a one-way street: the image of iron rings and magnets drawing the things of this world to the things of an unseen realm by a mysterious power that is divine—god-originated.

Perilous though it is, I think that Christianity was unique in democratizing love, in making love available even to the lowliest, the most ignorant, the slaves and sinners. Even the pagan haters of Christianity hated it most for its non-exclusivity, its lack of a membership code. Plato would have hated it too, and would have insisted that Christians, had there been any, be barred from his city. Later philosophical Platonism had next to no social dimension. Christianity did.

Christian mythology took the principle of attraction and the connection between God, conceived as love, and forgiveness, considered intrinsic to goodness, and extended it to a human race that had lost its compass and its ladder. Everyone could be perfect because everyone could be attracted.

Do I believe this is literally the state of humanity? Do I think that we should tell our children these things irrespective of SAT scores? Do I agree with Plato that amateurs need not apply and that the secrets of the myths should be “locked in concealment”—the path taken by most of the Platonically based mysteries and, for a while, by certain Christian groups?

What I believe is this: there are no mysteries in mangers.

Measuring the Truth of the Book


All religions make truth claims. These may be specific, as in the form of particular doctrines—heaven, hell, the trinity, the virginity of Mary—or more general: the finality of the Prophet, the exclusive role of the Church as a means of grace and salvation, the belief in the divine election of the Jews.

What is not so widely acknowledged is that these claims of truth are supported by a set of rationales, or to use Van Harvey’s famous term, “warrants” that provide security and confidence to adherents of the religious tradition.

The warrants are seldom explicit in the sacred writings and doctrines, but they are often observable in teaching, interpretation and conduct. The three book religions, which have often been referred to as “Abrahamic,” actually have quite different warrants for their truth claims.

Warrants in religion are a kind of pseudo-empiricism—a quantification of truth value. Like empirical tests, warrants are susceptible to disconfirmation—to being proved false—at least in theory. A warrant is not a doctrine, but a justification for religions to “do as they do”; they empower belief and practice by creating benchmarks for the success, prestige or dominance of a religious tradition—often through comparison to rival traditions.

For example, in some forms of millenarian religion, predictions of the end-time have been recorded with remarkable precision. The habit goes back at least to the time of Rabbi Joseph the Galilean, a contemporary of Hyrcanus and Azariah, who thought the Messiah would come within three generations (60 years) of the destruction of the Temple in 70 CE. The messiah failed to arrive, however, and the nominee for the position, Shimeon bar Kochba, died a humiliating death at Roman hands in 135. End-time prophecies continued with the Christian Hippolytus’ calculation that 5,500 years separated Adam and Christ and that the life of the world was “6,000, six full ‘days’ of years until the seventh, the day of rest.” His calculations in 234 indicated there were still two centuries left. Two millennia of apocalyptic forecasting lay in store. The “prophet” Moses David of The Children of God faith group predicted that the Battle of Armageddon would take place in 1986, when Russia would defeat Israel and the United States; a worldwide Communist dictatorship would be established, and in 1993 Christ would return to earth.

Apocalypticism is conspicuously subject to disconfirmation, and its calculations have—quite obviously—never been accurate, as Simon Pearson has documented in his popular survey, A Brief History of the End of the World (2006). Just as surprising, though, is the amazing ability of apocalyptic movements to regenerate themselves: this or that cult or movement may die away through embarrassment and loss of faith and members, but the phenomenon itself is tied to a (more or less) naturalistic belief in the beginning and end of things, and to theological constructions of that belief that include ideas of judgment, reward and punishment.

All three of the book religions, at bottom, believe in these ideas—the end of the world and the judgment, reward and punishment of humankind. The mechanism and details differ slightly, with Christianity and Islam being historically more tied to eschatology (the belief in the final destiny and dispensation of the human race by God). In fact, it would be more accurate to call the three “Abrahamic” faiths the eschatological traditions because of their common belief that the relationship between God and the human race is personal and moral rather than abstract. The belief in judgment is most vivid in Islam, less so in Christianity, and highly controversial in Judaism—where, nevertheless, it has featured significantly since Hellenistic times.

If eschatology is a core belief of the three book religions, it is fair to ask what mechanisms (warrants) have been used to secure the success of these traditions in the face of disconfirmation.

Just as any case of eschatological “disconfirmation” (a failed apocalyptic event) weakens the overall strength of a warrant, so too the collapse of a warrant will lead to general doubts about the truth claims of the religion. This religious domino effect is most clear when the eschatology is strong.

For example, messianic Judaism of the period after the Babylonian captivity (6th century BCE) is relatively well attested. Most Jewish apocalyptic literature was not written until after the death of Alexander in 323 BCE (much of it even later) and the disintegration of the Hellenistic world he created. From the time of Persian hegemony over Palestine right through to the period of Roman domination, the apocalyptic spirit—an acute sense that the times are out of joint, that God is at his wits’ end waiting for things to right themselves, and that divine intervention is imminent—is at a high pitch. But while the spirit may have been feverish, solutions did not arrive on schedule, and when they did, they were not the solutions the Jews had been expecting.

Apocalypticism ends with a massive crash: the Roman assault of 66–70 CE—the burning and looting of the temple, the destruction of Jerusalem—then a century of uneasy détente followed by a second blow, an edict that Jerusalem was henceforth off limits to Jews and that a pagan shrine would be built on the temple site. This is, not coincidentally, the period when messianism, originally a political movement and later a more spiritual one, was most in evidence. But the hope for a messiah was repeatedly disconfirmed by circumstance, loss, and disappointment. The “truth” of Judaism, and the beliefs subordinate to its eschatology, had to be sacrificed at an empirical level for more secular goals and a this-worldly focus on ethics. In strictly historical terms, the truth claims of Judaism were untruthed. All else is adaptation and interpretation.

The Jewish situation cannot be understood properly without looking at its foster child, Christianity. Whatever else may be claimed about this religion, it is undeniably Jewish, eschatological, and messianic in its origins. It belongs specifically to the time when Judaism was most fraught with expectation, and some of its apocalyptic books and passages from the gospels (such as Mark 13) are literally taken wholesale from Jewish writings such as IV Esdras and I Enoch.

Christianity survived for just under a century under what scholars used to call the cloud of “imminent eschatology,” and what one scholar has called “prolonged disappointment.” By looking backward and forward, it appropriated and reinterpreted passages from the Hebrew prophets to apply to its messianic hero. This point of conjunction is often overlooked in favour of the belief that Christianity somehow forged quickly ahead of Judaism and looked back only occasionally and when necessary. In fact, as the second-century Marcionite crisis showed, Christianity could not go it alone. It needed the “witness” of scripture—the Hebrew Bible—and the promises of the prophets to make sense of its emerging belief system. It required Jewish atonement theology to explain the significance of the crucifixion. It claimed not a new finality but the completion of a process. It did not (except very rarely) challenge the wording of the Hebrew Bible or rewrite the prophecies or produce targums of Jesus setting it all straight. It became skilled at allegorical interpretation, in its own theological service, but also made reference to the rabbis. Christianity was not the shock of the new but the old repackaged for sale to gentiles.

Above all, beginning with Paul, it was messianic. And its first crisis, as we gather from passages such as 1 Thessalonians 5.2 and 2 Peter 3.4-6, concerned the delay in the return of the messiah. When that event—the second coming that would vindicate the unexpected failure of the first—did not happen, Christians were confronted with a crisis that could only be rationalized organically.

Two things distinguish the Christian reaction to eschatological failure from the Jewish response, however. First, Christianity was much more concerned with the belief in resurrection than with the belief in messiahship. Its more or less accidental withdrawal from the Jewish world at the end of the first century immunized it to a certain extent from the effects of disconfirmation—or at least bought it some time. Truth was focused on the larger event, which (though tied to eschatology) was not seen as identical to it in the gentile world, where Christianity gained the most ground. And in the gentile world, at least, even the “judgment aspects” of resurrection were deemphasized in favour of its promise of immortality—a theme long revered by the Greeks and Romans. Later, under the onslaught of death, plague and war, the emphasis on judgment and the cruder aspects of the afterlife would reemerge in the middle ages. But during the period when Christianity was most at risk of being just another disconfirmed Jewish messianic movement, it survived by changing the subject. Indeed, it may have been Paul who changed it—as early as the 50s of the first century.

As the resurrection faith, a religion of expectation, Christianity survived through a proclamation of a risen lord “who will come again.” Its truth claims were protected through procrastination—not that any individual Christian or church or hierarchy was aware of the strategy. No “groupthink” was involved and no council could have been called to resolve the issue. The response seems to have been organic and somewhat reflexive—but crucially it meant that Christianity could not be untruthed until such time as Jesus did or did not come, and no one knew precisely when that time was: the psychology of prolonged expectation prevailed over the psychology of prolonged disappointment. In a word, “faith.”

Islam is related to its cousin traditions in a contorted way. Like Christianity, it claimed to be a common heir of the Abrahamic traditions. Unlike Judaism, it taught that much of that tradition had been corrupted by false prophets and evildoers. Like Christianity, it claimed a continuum with the prophets of old; unlike Christianity it made little use of any specific passages of the Hebrew bible, did not incorporate it into its own sacred library, and did not regard the finality of Muhammad’s prophethood to be based on any adumbration in the books of the Jews or Christians.

This was important, because the legitimacy of Christianity was theoretically dependent on the sheer fact of the Old Testament (rightly interpreted) and its soteriological system being applied to the death of Jesus—the atoning sacrifice for sins. Islam, like Christianity, understood itself as somehow connected to the past, but disconnected from most of its theology and in large part from its literary tradition. In particular, it was disconnected from Jewish and Christian soteriology: the God of the Prophet does not suffer the sin of the people but rather judges them according to his fiat, the Qur’an. The connecting fiber that joined Christianity to Judaism was decisively cut by the Islamic rejection of the ancient idea of atonement.

The extent to which the earliest teachers of Islam felt able to appropriate the Judeo-Christian sources ex post facto is a subject of some discussion, but whatever the reasons for the disuse of the prior claimants to the Abrahamic faith, Islam alone found error not merely in interpretation but in the sources themselves. The idea of error was both tied to and a consequence of the doctrine of finality: Muhammad is the prophet of God in a conclusive and indubitable sense. What is contained in the book revealed to him is true beyond question.

The messianism of the two older traditions depended in different ways on verification. Even the New Testament, whose messianic claims are undone by historical outcomes, asks believers to look to the skies, but the portents and signs can only be understood by looking backward (Mark 13.14-16).

Judaism and Christianity saw the events of the end-time as suprahistorical happenings whose occurrence could only be understood prophetically. By sacrificing the “backward look” to the idea of finality Islam created a new understanding of prophecy, whereby ‘non-prophets’ could be adopted simply because they were believed to have lived in an age of witnesses—as “Muslims before their time.” This theme was not unknown in Christianity; it is voiced by church fathers like Justin and Clement in relation to Old Testament heroes and a few classical worthies who “taught truth” before its time had fully arrived in the person of Jesus Christ.

The last day, or yawm al-din, underscores the idea of finality, which also shapes the view of prophecy and scripture: God’s judgment demands the observance of Islam to such an extent that in Islam eschatology replaces theology. This also accounts for the largely allusive style of the Qur’an in relation to the other book traditions; individual stories matter less than establishing the historical pattern of “warning” and the Prophet’s pedigree: Adam, Abraham, Jonah, Noah and Moses form a kind of chorus of worthies, an honor guard, whose role is to provide a line of succession to the prophet of God. They are not so much “adopted” or interpreted, as in Christianity, as expropriated.

So too the Islamic use of the messianic idea. It is not clear that the first Muslims grasped the idea of the messiah or “mahdi” except in relation to the belief in judgment. Ibn Khaldun, the 14th century historian famous for his pioneering work in philosophy of history, writes in his Muqaddima:

“It has been (accepted) by all the Muslims in every epoch, that at the end of time a man from the family (of the Prophet) will, without fail, make his appearance, one who will strengthen Islam and make justice triumph. Muslims will follow him, and he will gain domination over the Muslim realm. He will be called the Mahdi.”

The Mahdi’s bona fides are well established from early on: he will be an Arab, from the tribe of Banû Hãshim, and of the Prophet’s line through Fatima (i.e., a member of the Prophet’s family). Critically, he will not be a Jew or a Christian—Islam’s declaration that the final judgment of God will be according to the rules of Islam. The Mahdi will be “assisted” by Jesus, who is relegated to the role of helper on the day of judgment; he “will fulfill a role behind the Mahdi.” The true Christians “will follow Jesus in accepting Imam al-Mahdi as the leader at the time and become Muslims.” In short, the messianic expectation is that all those who are to be saved will follow Jesus in subordinating themselves to the true messiah.

The measurement of any truth claim in Islam, therefore, is subject to the prior assumption—or “strong belief”—in the finality of the Islamic position towards its predecessors. This claim, despite certain superficial or family resemblances, is a belief in unqualified rejection. The claims of Christianity and Judaism are selectively falsified in the doctrine of the corruptibility of sources, the partiality of God’s revelation to previous warners, the rejection of the idea of atonement, and its replacement with a strong and exclusivist eschatological scenario in which followers of Jesus will be judged on the basis of their acceptance of Islam.

More directly relevant to measuring truth claims, however, is their effect. Never a large religion, and today consisting of only about 14 million adherents worldwide, Judaism has historically been an exclusivist religion. Its salvation theology emerges from its historical situation—one surprisingly similar to its current political situation—as a fairly cohesive religio-cultural community surrounded by adversaries. The viability of the faith depends first of all on the existence of the faith community, and throughout its later history this has been Judaism’s primary concern. In such constricted circumstances its theology was necessarily more about salvation, messiahship, and rescue than about conversion and growth. Its truth claims were tied to that survival more directly than to other possible warrants, such as military achievement or imperial expansion.

Christianity traded exclusivism for expansion after the second century of its existence. It did so by lowering the religious bar on radical monotheism, relaxing some of the more stringent safeguards of Judaism in terms of diet and religious observance, the use of images and rituals, and substituting for this a church-based system of authority and a sacramental system that created a sharp class distinction between laity and hierarchy. “Faith” (de fide) in this sense was not an act of the will but a body of doctrine passed down as a sacred deposit of truth interpreted and taught by the Church: the laity had no active role other than to accept the church’s teaching and conduct their lives accordingly.

To the extent this system was successful, as it was until the sixteenth century and in modified form even until the twentieth, Roman Christianity and its protestant spawn successfully substituted reliance on belief for the more ancient belief in the coming of Christ (even though the latter has been given honorary status among the discarded beliefs of the ancient period). The warrant of the truth claims of modern Christianity, for all the available versions and the possibility of continued fissiparation, is simply the quantum of what the church or churches teach and what Christians find agreeable to faith. Protestantism shifted the focus from the nominative sense of faith as a body of orthodox teaching to the verbal sense: faith as assent in conscience to biblical revelation. But in either case the lex fidei, the law of faith, was the exclusive warrant for Christians of the medieval and Renaissance periods.

Islam offered no such options. The doctrine of finality has not budged much since the early middle ages among serious adherents of the faith. When Islam is seen as regressive or repressive in terms of social doctrine or custom, it is usually because its core structure has remained remarkably intact, like a well-built house that defies the weather.

The doctrine of the Mahdi, for instance, has never had to be rationalized, defended or abandoned, because it did not suffer the historical disconfirmation that both Judaism and Christianity experienced. Islam’s eschatology is alive, robust and looks to the future. It is fundamentally different from an eschatology undone by history (Judaism), or dislodged by qualifying doctrines (Christianity). While the authority of approved teachers, imams and ayatollahs is a significant feature of the religion, there is no central authority and no mechanism for consensus of all individual authorities. In fact, the debate in much of contemporary Islam is not whether the fundamentals of faith are sound but whose Islam is the most Islamic—the “truest” example of the faith.

Superficially this would seem to suggest chaos, but instead it points to the fact that there is enormous room for disagreement among Muslims, within limits. The limits concern subordinate or derivative doctrines: when violence is justified; whether women should wear hijab; to what extent it is permissible to sort out true and false traditions relating to the early community or the hadith; and the applicability of sharia to the regulation of the conduct of believers.


In addition to the apparent impermeability of its core doctrine to disconfirmation, Islam has developed a sixth pillar which, it seems to me, is beginning to serve as a warrant for its truth claims. Unlike Judaism, and increasingly unlike the phenomenon of a deflating world Christianity, Islam is growing. Its success is in numbers—conversions, expansion, the building of mosques and madrasas. From Malawi to Toronto and London, the signs of Islam’s health and success at a demographic level are visible, impressive, and unmistakable.

In 2008 the estimated world Muslim population was close to 2 billion and rapidly increasing. Estimates of both growth and actual numbers vary widely among researchers, but the U.S. Center for World Mission estimated in 1997 that Christianity’s total number of adherents was growing at about 2.3% annually (approximately the growth rate of the world’s population). Islam is growing faster: about 2.9% annually. At those rates, Islam would surpass Christianity as the world’s most populous religion by about 2023.
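The projection in the preceding paragraph is ordinary compound-growth arithmetic. As a minimal sketch (the 2008 base populations below are my assumptions, chosen to be consistent with the rough figures cited above, not authoritative demographic data), one can compute the crossover year implied by the two growth rates:

```python
# Sketch of the compound-growth projection described above.
# The 2008 base populations are assumptions consistent with the essay's
# rough figures, not authoritative data.
christian = 2.2e9   # assumed Christian population, 2008
muslim = 2.0e9      # the essay's "close to 2 billion" Muslim population, 2008
year = 2008

# Grow both populations at the cited annual rates until Islam
# overtakes Christianity.
while muslim < christian:
    christian *= 1.023   # Christianity: ~2.3% per year
    muslim *= 1.029      # Islam: ~2.9% per year
    year += 1

print(year)   # crossover lands in the mid-2020s with these inputs
```

The crossover date is quite sensitive to the assumed starting populations: shifting either base figure by a hundred million moves the result by several years, which is one reason published estimates of the crossover vary so widely.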

Samuel Huntington famously saw these numbers as portending a clash of civilizations. Whatever the merits of his argument, the more significant issue is how numbers are interpreted by the adherents of a belief system and, just as vitally, how adherents “behave” toward numbers. If numbers serve as a warrant of truth, adherents will have an enormous interest in sustaining and expanding the numbers through whatever means possible. As a matter of history, unlike the messianism of the Jews and the parousia-theology of the early Christians, Islam—uniquely—has not been eschatologically disconfirmed. In fact, its warrant provides a kind of empirical test that Judaism and Christianity have already failed. Given the warrant that Islam uses for the truth value of its beliefs, it passes the test.

Early Judaism dreamed of a day when Abraham’s descendants would be as numberless as the stars in the heavens. That remained an ideal; the day never came. As a warrant of truth claims, Judaism would have very little to gain from playing a numbers game. The more modest, and warranted, Jewish position is that Judaism is true as long as it survives.

But the same is true of Christianity, largely because it is no longer one thing but many things—not Christianity but Christianities, as the Oxford scholar Peggy Morgan likes to point out. In significant ways, Christianity has been unharmonious and inhomogeneous since the Middle Ages. It has had to measure its truth with different spoons, using different systems, for the better part of five centuries, and it is still large enough that certain segments of the Christian religion hardly know that other sectors exist or what doctrines they profess. Evangelical Christians may dream of bringing a singular gospel to the far-flung regions of the world, but a healthy majority of other Christians oppose the entire missionary philosophy as a form of religious colonialism. In addition, an unknown but sizable percentage of the world’s Christians are largely secular, agnostic, or “lapsed” members of the tradition; they identify with it in name only. Rarely in the twenty-first century will someone be denied the status of “believer” in any denomination through violence or persecution simply because his beliefs are askew. And even in traditions with ancient legal systems, such as Roman Catholicism, the rules are unenforceable at a penal level.

Thus the Christian warrant for its truth claims, “faith” (whose faith?), is a wobbly instrument of measurement in the modern situation, and a number of factors weigh against the ability of Christians to use geographical reach and population as indicators of truth. Christianity possesses no single vision, doctrine, or praxis. With the death of “Christendom” in the sixteenth century, Christians also sacrificed geography and population as a warrant for the claims advanced by the faith. The export by missionaries during the colonial period of a variegated Christianity, preached in different ways to different colonial populations, only accelerated the process of international fissiparation—which we still see in the massive success of “conversions” in Central and South America from Roman Catholicism to Evangelical Protestantism, and in the supermarket Christianity of the developed world. With the acceptance of modernity, Christianity was obliged to accept the relativity of its belief system to other ways to the truth, including, in principle, the idea that its faith was unwarranted. Christianity’s survival seems latched to the acceptance of the final triumph of secularism and its correlate: believing less and less.

For Islam however, from an early date, the increase of the faith is a living proof of its finality. Numbers are paid attention to. Territory once submitted to God must always be submitted to God—one of the reasons the question of Jerusalem remains one of the irreconcilables of the Arab-Israeli conflict. Dominant stories, dates, and myths are significant: The triumph over the Meccans, the submission of Constantinople, the conversion of the Mongols, the winning back of Jerusalem by Saladin, the capture of al-Andalus. “Jihad” has been the key word to describe this warrant, but rather than thinking of it as war or violence, it must be seen as the execution of a principle, without which Islam might go the way of the other book traditions.

Sheer increase has become the defining warrant for the truth of Islam. Consequently, those who pursue the interests of the dar al-Islam (the territory submitted to God) most vigorously—the Taliban, for example, or others whom western observers are likely to label “religious extremists”—are acting on a proven principle. We end where we began: “A warrant is not a doctrine, but a justification for religions to ‘do as they do’; they empower belief and practice by creating benchmarks for the success, prestige or dominance of a religious tradition—often through comparison to rival traditions.” By that definition, Islam’s success seems assured, whether by comparison to its rivals in the Abrahamic tradition or by dint of the prestige it enjoys as the world’s fastest growing religion.