What Is Fundamentalism?
2008 Fr. John vonHolzhausen Lecture by Brett Grainger on Saturday, May 10, 2008
Father Antony, members of St. Mary's, welcome guests,
I'd like to thank you all for coming this evening. I'd also like to thank Father Antony and the Fellowship of St. John the Divine for inviting me to deliver this year's vonHolzhausen lecture. As a member of St. Mary's, it is an honor to be connected with a name so beloved by the St. Mary's community, and to follow in the footsteps of previous speakers such as Nicholas Constas, now Father Maximos of Mount Athos.
The title of my talk this evening is "What Is Fundamentalism?" It's hard to think of another word that conjures more immediately visceral reactions. Fundamentalists, we are told by the media, are militants and extremists; they are "agents of intolerance," bellicose and inflexible, scornful of compromise and conciliation. They are also ubiquitous. In the 21st century, we are blessed with a bewildering proliferation of fundamentalisms, a veritable Baskin-Robbins of varieties: in addition to the classic vanilla of Protestant fundamentalism, we now have Mormon fundamentalism, Catholic and Orthodox fundamentalisms, Islamic and Jewish fundamentalisms, Hindu and Buddhist fundamentalisms, even the so-called "atheist fundamentalism" of figures such as Richard Dawkins, Sam Harris, and Christopher Hitchens. Indeed, it's become increasingly difficult to imagine a movement or position that might be immune from being "hijacked" by fundamentalists (this is something else fundamentalists are known for-they're always hijacking things). It is easy to find popular references to "secular fundamentalism," "market fundamentalism," and "Darwinian fundamentalism."
What gets lost in this more expansive definition of fundamentalism is a sense of its particular historic roots. All too often, the word is thrown around carelessly as an epithet, much like "Nazi," whenever we want to accuse someone of being excessively stern, rigid, extreme, or dogmatic-or just for disagreeing with our political point of view. It's common for "fundamentalist" to be used interchangeably with "conservative," "right-wing," "medieval," "traditionalist," even small-o "orthodox." But fundamentalism is not synonymous with any of these. As I hope to make clear, fundamentalism has a very specific set of meanings and a very particular historical trajectory. After spending the last few years acquainting myself with some of this history, I came away with the conclusion that our cherished platitudes about fundamentalism say little about the movement itself. For example, fundamentalists are not a throwback to a "medieval" age. They are thoroughly modern creatures. Indeed, fundamentalism is impossible without modernity. Similarly, fundamentalists are not always "right-wing": there are fundamentalists in my own family who are staunch pacifists. One of the chapters in my book profiles a biblical literalist who farms organic wheat and rages against the consumerism of Western culture and the corruption of the banking industry-hardly the kind of sentiments one encounters on The O'Reilly Factor.
Tonight, I want to offer a subtler take on fundamentalism than the one we've become accustomed to hearing. It requires us to engage in an exercise of critical sympathy. Protestant fundamentalism, I would submit, is not a monolith, but like any large religious group-Catholicism, Judaism, mainline Protestantism-is marked by a rich internal diversity. Though its theological imagination is in many ways tragically constrained, it has also proven to be far more flexible than its critics contend, incorporating influences from every corner of American culture, while presenting itself as the pure, distilled essence of the gospel. Fundamentalism is many things-dynamic, innovative, pluralistic, adaptable: "old-time religion" it is not.
When I told my mother I would be writing a book about Christian fundamentalism, the first thing she said was, "Oh, that's nice. And what is a fundamentalist?"
I couldn't believe my ears. "Mom," I said, "our whole family are fundamentalists." My parents had both grown up in the Plymouth Brethren, a small, militant sect of believers who, during the second half of the 19th century, had helped to erect a number of the intellectual pillars of fundamentalism. During my childhood in the 1970s, I had been consistently taught that the Bible was the literal word of God, inerrant in every detail, and to regard movies, pop music, and other forms of popular entertainment as "worldly"-a word that suggested a state of compromise and contamination. How could my mother not know what a fundamentalist was?
But the more I thought about it, the more her question made a certain kind of sense. I couldn't remember a time when anyone in the family had described himself as a fundamentalist. The Brethren forsook labels, preferring generic titles such as "Christian" or "believer." Only in college did I begin to associate the religious culture of my childhood with something called fundamentalism. I learned to regard fundamentalists as stubborn relics of a pre-enlightened age. Against modern liberal values of freedom and equality, they sowed seeds of intolerance and superstition, a band of hardened zealots waging a rearguard action against Western progress.
Though my sympathies had drifted to liberalism, something about this characterization rang hollow. The Brethren I knew were not extremists, but everyday men and women who lived lives of quiet piety. Their religious beliefs were no more irrational than those professed by Christians for millennia. Their cultural attitudes were cloistered but not xenophobic; the work of evangelism constantly brought them into contact with other races and cultures. Believers were suspicious of higher education, and yet they placed an inordinate stress on literacy and daily study of the Scriptures. And though they were out of step with mainstream attitudes on issues such as abortion, they had no desire to infiltrate the government and impose their religious vision on the world. The Brethren were "separatists," which meant that they avoided all political activity, even voting, as dirty. In day-to-day life, it would have been difficult to distinguish a believer from a nonbeliever. The Brethren were entirely at home in the middle-class world of shopping malls, supermarkets, and subdivisions. They dressed the same as nonbelievers, sent their children to the same schools, toiled in the same positions. But deep in their hearts, the Brethren saw themselves as outsiders, a holy remnant that stood apart from the world.
My mother's question-"What is a fundamentalist?"-would have been much more easily answered in the first half of the twentieth century. A fundamentalist was someone who affirmed the historic "five fundamentals of faith," as first set down during the Niagara Bible Conference movement of 1878-87 and formalized in 1910 by the General Assembly of the Presbyterian Church: the inerrancy of the Bible, the virgin birth of Jesus, the doctrine of substitutionary atonement (the idea that "Jesus died for our sins"), Jesus' bodily resurrection, and the doctrine of miracles (or alternately, the imminent Second Coming of Jesus). The name was coined in 1920 by Curtis Lee Laws, the editor of a Baptist newspaper, who wrote that a fundamentalist was someone willing to do "battle royal" for the faith.
Fundamentalism hit its peak of popularity during the national anti-evolution campaigns of the early 1920s, which climaxed in 1925 with the so-called Scopes Monkey Trial. The Scopes in question was John Scopes, a public school biology teacher in Dayton, Tennessee, who was brought up on charges for teaching evolution to his students in a test case of the state's new anti-evolution legislation.
The trial quickly descended into a media circus, with the entire nation focusing on what reporters styled as a battle between two Americas: an urbanized North, representing reason, science, and social progress; and a rural South packed with yokels and superstitious "simians" whose benighted attachment to tradition threatened the upward march of American history. In the end, Scopes was convicted, but for the anti-evolution forces it was a Pyrrhic victory. The skeptical northeastern journalists sent to cover the trial depicted the crowds congregating in Dayton as a superstitious rabble of slack-jawed yokels. The media coverage helped turn public sentiment against fundamentalists. In popular memory, Scopes became a creation myth of modern America, the moment when the defenders of reason and cultural progress struck a deathblow against ignorance and demagoguery.
More than any other event, it was Scopes that turned "fundamentalist" from a compliment into a curse. Already by the 1950s, the pejorative associations had reached such a degree that a young fundamentalist named Billy Graham disavowed the F-word in favor of "new evangelical"-a term that retained the doctrinal emphases of fundamentalism without its reputation for divisiveness and belligerence. Graham's crusades avoided politics, focused on soul-winning, and succeeded in bringing "Bible-believing Christians" back into respectable conversation. Groups such as Campus Crusade for Christ made inroads at colleges and universities, places that fundamentalists had alternately abandoned and been forced out of in the early decades of the twentieth century.
In the 1970s, the F-word made a surprising comeback. Reacting to the perceived moral decline in America hastened by the antiwar movement and the sexual revolution, a new generation of flashy televangelists-preachers such as Jerry Falwell, Pat Robertson, and D. James Kennedy-blended the savvy of the neo-evangelicals with a renewed moral outrage. But their effort to reclaim the positive connotations of fundamentalism was hindered by political events on the far side of the world. In 1978, the Iranian religious leader Ayatollah Khomeini led a revolt against the Shah that resulted in the creation of an Islamic theocracy. In newspapers, it became common to read of "Islamic fundamentalism." The terrorist attacks of September 11, 2001, helped to further broaden the definition of a fundamentalist to include virtually any agent of religious violence and intolerance.
Today, the majority of those who in previous decades would proudly have called themselves fundamentalists have publicly shunned the name. "Evangelical" has become the one-size-fits-all descriptor for Protestants who, in general, subscribe to conservative moral values and a theology that centers on evangelism. The result, I would argue, has been a loss of clarity. While all Christian fundamentalists are evangelicals, not all evangelicals are the same. Fundamentalists, for example, differ from mainstream evangelicals, who do their best to "get along" with American culture, taking a conciliatory tone and downplaying areas of discord. Mainstream evangelicals put evangelism ahead of purity; fundamentalists reverse that equation. As much or more of their energy is devoted to pursuing holiness and resisting worldly influences. They spend a greater amount of time worrying over the decline of "moral values," defending the Bible from attack, and speculating about the end of the world.
Despite sharing a number of concerns, Christian fundamentalists are not easy to define as a movement. For one thing, they are highly decentralized, with no center outside that of the Bible itself. While often described as homogeneous, fundamentalism is no monolith but a sprawling collection of competing values and contradictory instincts. Believers may share a core set of beliefs, but they are deeply divided over how and to what extent Christians should participate in mainstream society. For many, America has a special destiny as a chosen people, the fabled "city on a hill." For others, it is Babylon, singled out for God's special wrath. Some believers serve in the military, while others are pacifists. Some condemn materialism while others see conspicuous consumption as evidence of God's grace. Some feel called to enter politics, while others refuse even to cast a vote, lest they become entangled in worldly concerns. To make things messier, many believe various combinations of all of the above.
So what is a fundamentalist? A good place to start is the definition offered by George Marsden, who described a fundamentalist as "an evangelical who is angry about something." Marsden's insight is that fundamentalism has less to do with a set body of doctrine than a particular attitude or tone, a way of being in the world. What fundamentalists are angry about is secular modernity, a period that began roughly in the second half of the nineteenth century, when the mingled streams of industrialism, the Enlightenment, Romanticism, secularization, and the Second Scientific Revolution wrought unprecedented change in virtually every area of American society. Many of the basic tenets of Christian orthodoxy-ancient notions of physical resurrection and the divinity of Jesus, among others-underwent dramatic redefinition as liberal ministers and denominations struggled to remain "relevant." Those who resisted these innovations took up an old banner, the call to purify the church of worldly influences, to be "in the world but not of it."
Many histories of Christian fundamentalism begin with the turbulent 1920s and the Scopes Monkey Trial, which put fundamentalists on the national stage. However, by then, the basic template of fundamentalism-millenarianism, biblical literalism, doctrinal purity, Christian Zionism-had been kicking around for nearly half a century. By paying particular attention to that neglected early history, I hope to show how fundamentalism was both an attempt to preserve earlier beliefs and practices and a radical break from what came before. Like the Counter-Reformation of the sixteenth and seventeenth centuries, which aimed to secure Catholic tradition against the innovations of Protestant theology and the liberalizing effects of the Renaissance, fundamentalism fell prey to the paradox of conservatism described by G. K. Chesterton. "All conservatism," Chesterton writes, "is based upon the idea that if you leave things alone you leave them as they are. But you do not.... If you leave a white post alone it will soon be a black post. If you particularly want it to be white you must be always painting it again; that is, you must be always having a revolution. Briefly, if you want the old white post you must have a new white post."
For more than a century, the defenders of old-time religion have engaged in a revolutionary reworking of the fundamentals of Christian faith. Their brand of orthodoxy is a simulacrum of modernity, a streamlined architecture of steel and glass, composed of rational lines, simplified forms, and a stark palette. Contrary to conventional wisdom, fundamentalists are not interested in returning to a pre-modern age. They are among the most adept pupils of modernity, copying and recasting its designs for their own purposes. What is remarkable about fundamentalists is not how they manage to resist the intellectual habits of modern liberal society-rationalism, individualism, capitalism, pluralism, alienation-but how well they exemplify the modern condition.
As I've already argued, there's a tendency to present Christian fundamentalists as a monolith, when in fact, they are highly diverse and often internally divided. This internal diversity is something I experienced growing up among the Brethren. If you lined up any member of my family with public faces of fundamentalism since the 1970s-the late Jerry Falwell, for instance, or Pat Robertson or John Hagee-I'm sure you could find a number of issues on which they'd agree. However, I'm also confident they would disagree about nearly as much. Unlike Falwell's Moral Majority or Robertson's Christian Coalition, the Brethren objected to all expressions of nationalism, militarism, or political activity, which diluted a believer's focus on the kingdom of God. For his part, Falwell would have considered my family guilty of shirking its responsibility to the "cultural mandate" delivered to Adam and Eve and recorded in Genesis 1:28, which asserts a Christian obligation to assume rule over every sphere of human activity. (For those of you not familiar with it, God instructs Adam and Eve to "fill the earth and subdue it. Rule over the fish of the sea and the birds of the air and over every living creature that moves on the ground.")
Despite such differences, it would be difficult to contest that either was a Christian fundamentalist in good standing. Diversity and pluralism are values not often ascribed to fundamentalism, but they have surely been one secret to its success. Though they appear to march in lockstep, believers often have difficulty reaching consensus once they move beyond a sacred core of "moral values"-gay marriage, abortion, pornography. Many doctrines presented as fundamental-belief in the Rapture or a Young Earth, for example-remain deeply contested. Such internal divisions are unlikely to disappear, as they come out of a culture that prizes the arts of persuasion and places a responsibility on each believer to discern the will of God through prayer and private study of the scriptures. Fundamentalism has no pope, no center of authority outside the Bible. Having no center, it can by definition have no periphery. Anyone who professes a personal relationship with Jesus is entitled to interpret God's Word and to try to win others to his views. Fundamentalist faith is exciting and dynamic in part because it is a free-for-all-an unregulated sacred sprawl of freewheeling pastors and interconnected churches and denominations.
Just as fundamentalist culture is no monolith, it is also more flexible and open to change than typically acknowledged. Like all living traditions, fundamentalism is in a continual state of flux, as it has been ever since its appearance in the later decades of the nineteenth century. Believers present their faith as an unwavering and uncompromising presentation of a "biblical worldview," but in truth their faith, while stiff and brittle on the outside, is highly malleable at its core. A thin, rocky outer crust protects and conceals its molten heart. One example is the contemporary creationist movement. In the early 20th century, it was not uncommon to walk into a fundamentalist church in this country and hear a minister declaim that dinosaur fossils had been buried in the ground by God to test the faith of believers. Today, by contrast, we have "creation science," a global multimillion-dollar enterprise that takes up the tools and methods of modern biology, geology, and genetics in a bid to demonstrate the factual basis of the Genesis account of creation. Answers in Genesis, the largest creationist ministry in the world, recently opened a $27-million Creation Museum in Petersburg, Kentucky.
Fundamentalist groups such as Answers in Genesis are constantly assimilating new influences, changing direction, adapting to new intellectual and social environments. One of their most successful recent innovations has been their adoption of certain aspects of postmodern philosophy, in particular the critique of Western notions of reason and objectivity, ideas on which modern philosophy and science are based. So at the same time that Answers in Genesis goes to great lengths to argue its view that "mainstream science confirms the Bible's history," they also question the trustworthiness of reason itself. As Ken Ham, the president of Answers in Genesis, told me, "Ultimately, creationism and evolution have nothing to do with the evidence. It comes down to your presuppositions."
My point here is that, to an unbelieving ear, fundamentalism can sound like a rigid recital of dogma. But in the hands of a skilled improviser, such as Ken Ham, it actually sounds more like jazz. The song remains the same, but the tune is restless, the tempo constantly changing.
Often, changes of faith can take place in less dramatic ways. My maternal grandfather is a case in point. For most of his life, my grandfather was a Young-Earth creationist, which means that he believed the world was created in six 24-hour days less than 10,000 years ago. In his seventies, he had a change of heart while reading a book about Niagara Falls. The author described how the falls had been produced by the collision of two ancient forces: the Niagara River and the Niagara Escarpment, a long ridge of limestone left over from a tropical sea that covered central North America millions of years ago.
Without any dark night of the soul, my grandfather became an Old-Earth creationist. So far as he was concerned, the authority and accuracy of scripture had not been compromised. It was a mistake, he told me, to confuse God's Truth, which was forever and unchanging, with our limited ability to grasp it.
After that, I began to see him differently. His faith had always seemed as steady and constant as granite. But like all things under the sun, it changed with time. After a lifetime, the waters of experience had rubbed many of his jagged edges smooth. His conviction that reason and revelation could harbor no permanent enmity, coupled with his inclination to take truth where he found it, allowed him to synchronize the disparate rhythms of ancient myth and modern science. His faith resembled a handmade clock pieced together from odd, mismatched parts: bits of a sundial and a pendulum, scraps of rope woven from papyrus, stone cogs meshing imperfectly with the teeth of stainless-steel gears. Somehow, it kept time.
Of course, fundamentalists are not the only ones who feel a pressing need to harmonize the demands of tradition and modern life. Every contemporary religious community is busy today building its own handmade clock, and each keeps time in its own way. The special genius of fundamentalism has been its ability to absorb so many basic modern assumptions about the world while baptizing them as "old-time religion."
If fundamentalists learned anything from the twentieth century, it was that the best way to resist the temptations and snares of modern life is by making their own "safer" versions of them. They are quick studies of fashions and fads, taking the wineskins of popular culture and filling them with grape juice. Fundamentalists produce knock-off versions of Hollywood films. They put their imprimatur on Christian rock bands and the science-fiction novels of the "Left Behind" series. They cheer on wrestlers with handles such as "the Sermonator" and "the Abrahammer." (I'm not joking.) They have their own newsgathering networks. Believers no longer have to choose between this world and the next. They have built a world within a world, a microcosm of American society that promises all the advantages of separatism without the staleness of quarantine, and all the benefits of modern life without the risk of contamination.
The historian George Marsden has written that fundamentalists are "preachers of paradox." In one and the same breath, America might be praised as Zion or condemned as Babylon. At one and the same time, the church can be a "faithful remnant," a persecuted and purified elect, and a "moral majority" wielding a powerful network of lobbyists and legislators on Capitol Hill. This willingness to embrace paradox has given fundamentalists an astonishing flexibility. With each new generation, believers rework the ancient clay to fit contemporary molds. Fundamentalists describe their faith as a steady rock in the swirling seas of modern life, but everything about their streamlined, no-nonsense Christianity-its practices, beliefs, and attitude to the world-is in a constant state of struggle and evolution.
For much of the 20th century, the stubborn persistence of fundamentalism was explained by social factors. Fundamentalists were the losers of history: the masses of socially marginalized and culturally benighted souls who had been left behind in the upward march of Western civilization. The most common version of this explanation, known as the "secularization thesis," held that conservative religious beliefs would wither with the spread of modernity. Yet the numbers tell a very different story. Between 1960 and 2000, the Southern Baptist Convention, one of the country's most influential fundamentalist denominations, grew from 10 million to 17 million members, while membership in the liberal Episcopal Church dropped from 3.5 million to 2 million during the same period. The phenomenal growth of Christian fundamentalism can't be comfortably explained away as the product of mass cultural neurosis. For many millions of Americans, militant faith provides a persuasive answer to the question of what it is to be modern. There are reasons, often quite good ones, why people become fundamentalists, and it makes sense for non-fundamentalists to listen to some of those reasons before passing judgment.
I'm sure that some of you may be curious to know how I made the transition from Protestant fundamentalism to Orthodox Christianity, and so, with the remaining time I have this evening, I'd like to offer a brief account of my own spiritual journey.
As I've already mentioned, I was raised in the Plymouth Brethren, a small fundamentalist sect founded in England in the 1830s. The Brethren are known as separatists, meaning that they stress separation from the world. Paul's injunction in II Corinthians, "Come out from among them and be ye separate," was a kind of manifesto for us. The Brethren cared about personal purity almost as much as they cared about personal salvation-and they cared a lot about personal salvation. By purity I mean something very different from Orthodox notions of deification or even conventional Protestant notions of sanctification. The Brethren believed neither deification nor total sanctification was possible for the believer in this world. The curse of Adam ran far too deep. The most one could reasonably hope for was to keep impurity at bay, as if one were fending off a large, angry dog with a stick. They were suburban ascetics. Rather than retreat into the desert like St. Antony, they made their way in the air-conditioned wilderness of the mall and beauty parlor. Like the spiritual athletes that preceded them, they did daily battle with demons. These demons wanted you to do all kinds of crazy things: if you were a man, they wanted you to grow long hair; if you were a woman, they wanted you to cut your hair short and wear makeup and provocative clothing. A Christian's purity was communicated through a series of symbolic renunciations: it wasn't what you did as a believer that mattered, so much as what you didn't do-no drinking, dancing, smoking, or swearing, and certainly no sex outside of marriage. These small sacrifices were what marked believers as being in the world but not of it. Outside in the street, it could be hard to tell the difference between a believer and an unbeliever. We weren't Mennonites, who wore strange clothes and lived in isolated communities. The wheat and the tares grew together.
Saved and unsaved sat beside one another in school, toiled on the same assembly lines, drank the same soft drinks. Because daily life made it impossible to distinguish the elect, these small markers of identity-what Freud called "the narcissism of minor difference"-acquired tremendous significance for us.
Reading the signs of election could be a challenging task. Though we knew we were not of the world, the borders of worldliness could be mysterious and ill defined, like a shadow on a cloudy day. Good and evil were absolute, but worldliness was a moving target. Sometimes you could even see the line being redrawn in the sand. Take televisions, for example. In my mother's day, only unbelievers had TVs. By the time I came along, the majority of Brethren-apart from a few old-school preachers like my grandfather-had a heavy, wood-paneled set in the living room, just like their neighbors. And so worldliness migrated to the big screen. At recess, I played Star Wars without ever having seen Han Solo or the Millennium Falcon. This never really bothered me until the day that my best friend asked me along to see Clash of the Titans at the drive-in. I was nine years old and I begged my mother to let me go, though I already knew it was out of the question. If by coincidence someone in the assembly had happened to find out that I had been to the movies, my family would have been humiliated. However, a few years later, VCRs came on the market. Given that many Brethren families already had televisions, the local elders found it difficult to take a clear moral position on the new devices. This meant that I could finally watch Clash of the Titans. In the living room, safe from any fear of ritual pollution, I saw the pagan hero Perseus decapitate Medusa and use her writhing head to turn the monster Kraken to stone.
Some of the older prohibitions were bizarre even to believers. Thirty years ago, many Brethren assemblies forbade the wearing of red on the Sabbath, out of respect for Jesus' suffering and death. Facial hair was also suspicious. In the 1980s, my Sunday school teacher got into trouble with an elder when his beard grew unkempt. The elder quoted I Corinthians 11: "Does not even nature itself teach you that if a man has long hair it is a disgrace to him?" My teacher brushed off the blow and returned a one-two jab of Leviticus 19, where God forbids the Israelites from trimming the corners of their beards, and II Samuel 10, in which David's enemies humiliate him by capturing his servants and shaving off their beards.
A third brother helpfully pointed out that the elder himself had facial hair. It wasn't much: a thin line of stubble shading his upper lip. But if God disapproved of beards, he reasoned, he likely didn't appreciate moustaches, either. The next Sunday morning, both men turned up bashful and clean-shaven, like newly sheared sheep.
As I became a teenager, all of these idiosyncrasies grew stranger to me. Yet there was no single moment when I ceased to be a fundamentalist, no stroke of disenchantment, no dramatic deconversion. Like a train leaving a station, my departure was so incremental it is hard to say precisely when it happened.
When it came down to it, I wanted more life. That's what Jesus promised his disciples, and it is what my grandparents found in the Brethren in the 1950s. Biblical faith had offered them an escape from the shallow streams of liberal optimism and secular materialism. The beauty of Scripture ennobled their lives and focused their devotion. Its wisdom corrected and consoled; its stories offered models and precursors with which to frame the events of daily life, making sense of tragedies and triumphs, and twining their private destinies with that of the universe.
However, for a teenager in the 1980s, life among the Brethren felt like a closed window on the world. I was unsettled by their itch for certainty, which curbed curiosity and cast doubt as an act of treason or ingratitude rather than as a dialogue partner, the iron that sharpens the blade. Jesus said that perfect love would cast out fear, but too often the Brethren seemed girded about with fears of impurity, change, disappointment. Because I respected my grandparents' faith and the sacrifices they had made for it, I knew I could not settle for a poor photocopy. Like them, I had to make my own way to Damascus.
So I fell away. As a child of mixed religious parentage-a mother who submitted to the Protestant sacraments of conversion and baptism, a father who did not-I always felt in a halfway covenant with the Brethren: in the saints but not of them. I expected to leave my split identity behind. But I felt no more at home in the world than I had among the Brethren. The diasporic sensibility that the Brethren borrowed from the Jews-the awareness of being a stranger in a strange land-still gripped me. I suffered from a spiritual arrhythmia that set a chamber of my heart beating out of time. Later, I saw how the modern condition was its own kind of diaspora-a wilderness of self-alienation, futility, and dis-ease.
In my early adult life, I struggled to find a way to re-enchant the world, to reconcile the claims of tradition and modern life. I experimented with evangelical and mainline liberal churches but was frustrated by the informal air of worship and by the sunny revisionism that elided anything of potential embarrassment to the modern mind (as John Updike said, "Too much light"). In the Brethren, I had bucked against the image of an angry and vengeful God. But the alternative, to cleanse God of all fierceness, was equally distasteful, a watery soup.
One day I wandered into an Anglican church, closing the circle on my grandfather's exodus more than 60 years before. It was not the feeling of having arrived, so much as having begun something without an end. In ritual and liturgy I found a bottomless beauty, which, though the polar opposite of the unscripted sermons of my childhood, strangely mirrored the patient and sober reverence of Brethren worship. I also began to uncover the riches of theology, the post-biblical traditions of the Church. Reading the ancient spiritual writers, I saw how earlier Christians had not always defined orthodoxy in terms of doctrinal purity but as a shifting point of balance between mutually undesirable extremes, a yoking of contraries, a third way.
My first exposure to Orthodoxy came while I was an undergraduate in Montreal. During an introductory course on the history of Christian thought, the instructor touched briefly on Eastern Orthodoxy on his way from Augustine to the medieval West. I remember getting a sense that Orthodoxy was a tradition concerned more with mysticism than theology, which I had been told was the special reserve of the West. A much more profound impression of Orthodoxy came from an art history course focusing on the art of Byzantium. The material culture of Orthodoxy-the icons and rich vestments, the church architecture that united spacious heavenly domes with squat, earth-like foundations-wordlessly communicated a theology of incarnation and deification. When we arrived at the Emperor Justinian's sixth-century masterpiece, Hagia Sophia, I swooned. I honestly developed something of an obsession with the building. When it came time for my first trip abroad, I passed on the usual backpacker destinations-Paris, London, Rome, Madrid-and made a beeline for Istanbul (or as my Greek Orthodox friends correct me, "Constantinople").
When Rachel and I came to Cambridge in 2002 to study at Harvard Divinity School, I used the opportunity to indulge my private obsession. I spent the first year of my master's program immersed in picayune architectural details of Hagia Sophia. In another class, Nicholas Constas' introduction to the Theology of the Icon, Prof. Constas suggested that Rachel and I visit St. Mary's. Our first service was Orthodoxy Sunday. I was undone by the beauty and reverence of the divine liturgy, by its enlistment of the entire human person in worship. My reaction was perhaps something like that of the emissaries of Prince Vladimir of Russia, the 10th-century ruler who brought his country into Orthodoxy. Like them, I did not know whether I was in heaven or on earth. "For on earth," they wrote, "there is no such splendor or such beauty, and we are at a loss how to describe it. We only know that God dwells there among men, and their service is fairer than the ceremonies of other nations, for we cannot forget that beauty."
Yet, even now as an Orthodox Christian, I find myself returning to draw water from the wells of my childhood. I come back to the Brethren for their affirmation of daily life, for their love of literacy and lay study of the scriptures. I come back to be reminded that truth is not a popularity contest, that standing up for what you believe can set you against the powerful current of modern consensus. More often than not, their willingness to do "battle royal" for the fundamentals of the faith was tempered by the recognition of a believer's frightening limitations in the eyes of heaven. The Word of God might be fixed for all time, but its interpretation was never-ending, an unfinished third testament to which each believer contributes a verse. Truth comes not cold and perfect into the world. Ever an alloy of humanity and divinity, it must be fired and formed in the heart's forge, and the crude shapes that result offered up with humility.
George Marsden, Fundamentalism and American Culture: The Shaping of Twentieth-Century Evangelicalism, 1870-1925 (Oxford: Oxford University Press, 2006), 235.
See Ernest Sandeen, The Roots of Fundamentalism: British and American Millenarianism, 1800-1930 (Chicago: University of Chicago Press, 1970).