Jason Byassee
Always has been, always will be.
Preaching is hard. Anyone who has ever sat with the Bible open, surrounded by commentaries, with a date circled on the calendar, knows what I mean. Words don’t jump to life by themselves. When it happens, it’s hard to explain how it happened. When it doesn’t, it’s just depressing.
This set of books about preaching tells us something about the difficulty of preaching. To read great preaching is to open oneself to homiletical despair. Sure, I can see that Will Willimon is hilarious, Barbara Brown Taylor gentle, Marilyn McCord Adams brilliant—but how does that make me any more likely to be hilarious, gentle, or brilliant on some upcoming Sunday morning at 11?
Adams will seem like the one who doesn’t belong among these homiletical greats. She’s better known as a philosophical theologian whose work on such difficult late-medieval thinkers as John Duns Scotus and William of Ockham has bailed out many a graduate student facing exams. In the best Anglican tradition she is also a priest, and has spent her teaching years at UCLA, Yale, and now Oxford, stepping in at local churches with charming names like St. Augustine’s-By-The-Sea.
For a philosopher with a penchant for nominalism, Adams is a surprisingly accessible preacher. A dandy sermon on Trinity Sunday starts with the throat-catching premise from Bernard of Clairvaux that his Cistercians shouldn’t preach that day. What words do we have adequate to the mystery of the triune life? Even the seraphim can only stammer, “holy, holy, holy.” But this is not holiness as exclusivity, the three persons as “gated community,” since we only know this God in personal, ecstatic form among us in Jesus and the Spirit. Adams concludes, “The funny thing is, Divine Being can’t be literally holy. God’s very nature explodes the meaning of that word. Who God is makes it impossible for God or any of us to be separate or isolated, ever. And that, my friends, is a good joke!” Parishioners poised for dry “philosophy” that morning were pleasantly disappointed!
Despite a light touch and often stirring biblical exegesis, Adams’ preaching elsewhere has a kind of monotony to it. She introduces her work as “sermons preached for those who find God’s Goodness problematic,” and piles up vignettes about those whose experience of abuse has left them with mistrust of God as father, or of the church as a place of healing. To her credit she does not simply replace one idol with another—God as mother comes in for scrutiny, and Adams loves traditional church teaching too much simply to throw it overboard. Yet she so regularly signals disappointment with the church’s refusal to sanction same-sex relationships that she comes off here as something of a single-issue preacher. By the time we reach sermons entitled “Queer Variety,” “Gay Pride, Humbled Church,” and “‘Coming Out’ in the Power of the Spirit,” things have become a bit predictable.
An alternative to the monotone of the single-preacher collection is the multi-preacher collection, here represented by two books of sermons celebrating the university. Yale mounted a preaching series in 2001 for its tercentennial, inviting some unevenness—exactly how many times do we need to hear Yale graduates invited back to preach at Yale patting Yale on the back for how religiously diverse Yale is? Especially when that diversity is limited to those inclined to praise diversity, and who have some means to pay Yale’s tuition bills. Yet some of Yale’s genuine theological greatness is also present in sermons by five former university chaplains, including the great William Sloane Coffin, Jr., still inspiring after all these years, insisting it is God who tells us who we are, and not money, power, America, or the Yale Corporation. One can see why he sent a horde of students into the streets to protest various outrages and thence to seminary.
The collection picks up steam later, especially as some of the heavy hitters—Taylor, Willimon, and Peter Gomes—seem to have gotten a little more time to preach than normal and to have brought their A game. Taylor, formerly an Episcopal parish priest in Georgia and now a teacher at Piedmont College there, experiments with the idea that the difficulty of Jesus’ claims on his disciples means that most of us aren’t actually his disciples. In fact, the church wouldn’t have survived if all of us church folk actually did what he said—hated our families, gave up all our things, took up the instrument of our execution. So the preacherly tendency to let hearers off the hook by softening Jesus’ demands may be, in fact, right: “along the way they found a third way to live with his high call to discipleship—neither turning away from it nor lowering it but allowing it to shimmer high over their heads—where it provoked them, disturbed them, inspired and strangely reassured them.”
What Taylor says here may recall a Catholic two-orders social vision (in which Jesus’ greatest demands are for monks, nuns, and priests only) or a Lutheran law/gospel dialectic (in which Jesus’ unrealistic words drive us to the need for grace), but in fact it’s something different. Taylor sounds like someone who really wants to follow Jesus, knows most days she doesn’t do a very good job of it, but holds out hope that God might get her yet. To all of us, Jesus says, “Return to your home, and declare how much God has done for you.” With Taylor’s wise graciousness suddenly evangelism doesn’t sound heroic, but like something I, and the ordinary people to whom I’ve preached, could do.
Gomes’ sermon made me wish I were present that day in Battell Chapel. Undoubtedly in his best Boston Brahmin accent, honed by decades as Harvard’s chaplain, he told of a friend in London who, “knowing what a terrible snob I am,” arranged for them to attend the Queen’s private chapel for Sunday worship. So in a royal lodge “filled with flowers and aged well-bred people and little dogs running around,” Gomes lied to the Queen Mother and told her he liked the sermon. She replied, wisely, “I do like a bit of good news on a Sunday, don’t you?” For Gomes that sweet but wise piety gets at the heart of Yale’s present and future, not just a relic of its past: that “the whole ideal of Christian learning and public service in the middle of a large, prosperous, and frequently hostile university is the struggle that remains ahead of you.”
Willimon draws on his long experience as Duke Chapel’s dean (before he became a Methodist bishop in Alabama) to describe the way the university rules out any sort of knowledge that cannot be measured, fixed, stapled down, and quantified—in short, any knowledge that involves God. Then he counters with stories of university people—students, professors, snobbish townies—who’ve been caught up by this God nonetheless. “Think of a lot of this—the buildings, the curriculum, many of the faculty, the beer—as an elaborate, subtle, really effective defense against the incursions of a living God.” What is it about the best of today’s university preaching that it refuses to bow to the idol of the university, and in fact names that institution’s competition with Christian discipleship, preferably with disarming humor?
It wasn’t always so. As Willimon’s collection of sermons from Duke Chapel shows, mainline preachers once delivered sonorous, learned discourses, punctuated by grand introductions (“The theme to which I wish to invite your attention is …”), quotations that nod to erudition (“let us remember the words of the poet …”), and a confidence that they held the ear of America. At mid-century one preacher could open with the premise that 96 percent of Americans claimed religious adherence, and then criticize the absence of works from that faith accordingly. By the 1970s, the preachers are fighting for a hearing with an audience not only less inclined to be impressed by learning and less biblically steeped, but also arrogantly condescending toward religion and almost everything else “institutional.” Yet, wonder of wonders, they still came to hear great preaching. This is the volume most inclined to gobble up your whole day if you’re not careful, for you can watch the history of American Christianity unfold before your eyes in these sermons. And the result is not all bad. As Willimon confides in his introduction, “I had assumed that preaching was in decline from its former eloquence and brilliance. But now … I think that many contemporary preachers are more biblical, more engaging and more theologically faithful than some of our predecessors.”
Some sermons here do seem included simply because the preacher was famous, as with a predictably Tillichian sermon by Paul Tillich. Some mid-century efforts at experimentation, such as a sermon-length conversation with Death of a Salesman, or a dialogue between two preachers, seem more quaint than inspiring now. It is also striking how often Willimon’s introductions to each sermon praise the celebrity of a preacher now rarely read or remembered—and, indeed, how lovely their preaching was. At other points the rage of a tumultuous era is stirring, as in the mid-1970s when Baptist prophet Will Campbell equates abortion, war, and capital punishment as moral evils before concluding, “Jesus came that we might have life, not that we might deceive ourselves into thinking that we can take it away.” Elsewhere Martin Niemöller, then a president of the World Council of Churches, must’ve floored his hearers when he recounted his time in the Nazi concentration camp at Dachau: “Every day … the idea arose: If these people will pull me out of my place here to that gallows, I shall shout to them, ‘You criminals, you murderers, wait and see—there is a God in heaven and he will show you!’ And then the torturing question: What would have happened if Jesus, when they nailed him to his gallows, to the cross, had spoken like this and cursed his enemies? Nothing would have happened, only there would be no gospel, no Christian Church, for there would be no message of great joy.” I’d almost flipped the page on Billy Graham’s closing altar call, until I saw him address the chapel of the still-Methodist university thus: “George Whitefield, one of the founders of Methodism, preached every night on the subject, ‘You must be born again.’ Some of the leaders of the church came to him and said, ‘Why don’t you change your text?’ He said, ‘I will when you become born again.'” Point taken: we are Methodists still. Strike up “Just As I Am” on the Flentrop organ!
The most bracing thing for me in reading these sermons by such luminaries as Jürgen Moltmann, Howard Thurman, Eberhard Bethge, and dozens more, is that I’ve preached in that pulpit. Willimon was a beloved mentor of mine at Duke, and a childhood friend (on whom I have dirt) is the current associate dean, so I was invited last summer. To touch that stone pulpit, and look out on those learned faces underneath James B. Duke’s “great towering church,” was, shall we say, a bit humbling, and no less thrilling. That’s an image for the preaching task as such—standing where ages of saints have stood before and doing as they did, not as a burden or imposition, but as a gift. Preaching is surely difficult; only a fool would doubt that. Yet there is grace equal to the difficulty, enabling us to stand where we do not belong on our own merit.
Of course, the humbling and thrilling company in which every preacher stands extends much farther back than the early 20th-century founding of Duke Chapel. Duke Divinity School’s Richard Lischer has collected wisdom on preaching from Augustine to the present in The Company of Preachers, which upon its publication immediately became the standard anthology in English for homiletics classes. Lischer’s most ancient selections are often the most precious, as when Alan of Lille says apropos of Jacob’s ladder that preachers “are the ‘angels,’ who ‘ascend’ when they preach about heavenly matters, and ‘descend’ when they bend themselves to earthly things in speaking of behavior.”
The ancients can also be the hardest on the poor preacher. Lischer observes in his introduction that the holiness of the preacher is a universal concern among ancient homileticians, one only now making a comeback after centuries of silence. So John Chrysostom, with the loveliest surname a preacher could ever want (“golden mouthed”!), counseled preachers to “despise praise,” for if a preacher is “impudent and boastful and vainglorious his superior may as well pray daily to die.” Spoken like a true bishop! And since, as John Cassian points out elsewhere in the volume, preachers are particularly given to being “puffed up with the love of vainglory,” they cause not a little woe in the church. Perhaps the difficulty of preaching can be salutary medicine for our all-too-common pulpit pride.
One strength of this volume is the diversity of voices among the modern selections. Jarena Lee, the pioneering African Methodist preacher, and Frank Bartleman of the Azusa Street revival describe dramatic empowerments in which the Spirit commanded them to preach and pray in tongues, whatever human strictures stood against it. We have the great Protestant German theologians of the early and middle 20th century, Dietrich Bonhoeffer, Rudolf Bultmann, and Karl Barth (the latter in a particularly non-Protestant mood, insisting on sacramental practice as regular as the church’s preaching), alongside great Catholics such as John Henry Newman, Nicholas Lash, and Oscar Romero—he in his last sermon ordering Catholics among El Salvador’s cutthroats to desist from oppression, in the name of Jesus. Now that’s a sermon! Two selections after the great liberal preacher Harry Emerson Fosdick describes preaching as pastoral counseling writ large, P. T. Forsyth takes issue: “preaching is not simply pastoral visitation on a large scale.” Rather, the evangelical Scot insists, “The preacher is not there to astonish people with the unheard of; he is there to revive in them what they have long heard. He discovers a mine on the estate.” That is to say, the church is the one true preacher, as it offers treasure from one time and place in its life to those elsewhere. Boom! Any one of these excerpts could become foundational for an entire preaching life, as Lash’s “Performing the Scriptures” and Richard Hays’ “Hermeneutic of Trust” have for me.
This anthology’s glory helps offset some of the disappointment of O.C. Edwards’ A History of Preaching. Edwards, emeritus at Seabury-Western Seminary, is something of a dean among historians of homiletics, such that this volume has no lack of praise on its jacket. It is astonishing that someone has read enough to write about the use of rhetoric throughout the entire history of the church, from the Scriptures themselves until now. The only problem is that the book is surprisingly dull. Rhetoric, classically that exercise in learning to “instruct, delight, and move” an audience, in the tradition of Cicero, is here described in such a way as to anesthetize the reader’s imagination. When we describe the rhetorical abilities of, say, Chrysostom, surely we have to capture something of the greatness that led someone to say he had gold in his mouth, or that led his congregation to applaud him after one sermon. He upbraided them, insisting that if they liked his words, they should show it by changed lives of greater holiness. They responded to this unexpected rhetorical encore by … applauding again. Rhetoric has its place, no doubt, though churchmen have tended to express their worry about “empty rhetoric” even as they employ it to the best of their ability while quoting masterful rhetoricians. It’s striking how hard it is to write about great rhetoric in a rhetorically appealing way.
A late medieval English preacher lamented, “Nowadays three things are loved by many people: short skirts, short masses, and short sermons. It is their shortness that is loved, rather than their nature.” According to Siegfried Wenzel of the University of Pennsylvania, English churchgoers between 1350 and 1450 got more than they asked for. Though a book entitled Latin Sermon Collections from Later Medieval England is not going to fly off the shelf at Barnes & Noble, I found it the most pleasant surprise of the bunch. For one could argue that there are profound similarities between medieval churchgoers and those in our day. Church leaders then lamented the woeful state of church catechesis, as our leaders would be right to lament the state of learning among most of our lay people and not a few of our ministers today.
Late medieval English bishops did something about it. They mandated that all preachers instruct their people annually with set homiletical pieces, including the Ten Commandments, the Creed, the Our Father, the seven deadly sins, the seven virtues, and so on. Such dreary lists have often led historians to say that preaching in that time was dull, but Wenzel insists to the contrary. Preaching in that century is marked by a freshness and originality not matched in the years before or after, to the point that Wenzel can even call it “creative” in its own way, not marked by the kind of “repetitive and undisciplined ramble that occasionally comes from modern pulpits.”
I’ll go further and say the examples he gives are frequently stunning. For example:
As long as the beginning and the end of a circle are not fitted together, the circle is imperfect. But God and man are the circle of the whole creation … through God’s power, in the union and conjunction of the two the circle was perfected, for through God’s power divinity and human nature are in one person, the beginning and the end of the circle so joined and united that they can never again be separated from one another.
With good historical work like Wenzel’s we’re approaching the day when the old canards about allegory being lifeless and arbitrary will be no more, and we might even be so bold as to learn from medieval exegetes. For the work of our ministers is more like than unlike theirs, if such evaluations as this from the 14th century are to be believed: “A priest’s office is to regulate the moral life of his subjects, to drive off their errors, to solve their doubts and answer their questions, and to preach elegantly well-constructed and moral sermons” (emphasis his).
Indeed, it still is. If only it weren’t so difficult.
Jason Byassee is an editor at The Christian Century.
Books discussed in this essay:
Marilyn McCord Adams, Wrestling for a Blessing (Church Publishing, 2005).
Frederick J. Streets, ed., Preaching in the New Millennium: Celebrating the Tercentennial of Yale University (Yale Univ. Press, 2005).
William H. Willimon, ed., Sermons From Duke Chapel: Voices from “A Great Towering Church” (Duke Univ. Press, 2005).
Richard Lischer, ed., The Company of Preachers: Wisdom on Preaching, Augustine to the Present (Eerdmans, 2002).
O.C. Edwards, Jr., A History of Preaching (Abingdon, 2004).
Siegfried Wenzel, Latin Sermon Collections from Later Medieval England: Orthodox Preaching in the Age of Wyclif (Oxford Univ. Press, 2005).
Interview by Alf K. Walgermo
Or do they? A conversation with Norwegian philosopher and Sunday school teacher Henrik Syse.
“He who is faithful in what is least is faithful also in much,” says Henrik Syse. Norway has hired Syse, a professional philosopher who has written books about the ethics of war as well as ethics in everyday life, to figure out how the Norwegian government’s Petroleum Fund can act as an investor in an ethically beneficial manner.
“My main job is to sit and think,” Syse jokes.
He is not alone in working toward an ethical management of the fund. The Finance Ministry’s “ethical guidelines” include a list of exclusions for the Petroleum Fund. The decision as to which companies actually to exclude is made by the Ministry itself, with advice from a separate Ethics Council. But after this gatekeeping, which is based on principles decided on by Parliament, there are still many ethically diverse companies left in the fund’s portfolio. (So far 17 companies have been excluded, while close to 3,500 are part of the portfolio.) And this is where the bank’s own ethicists and corporate-governance people, such as Syse, have to do their work.
Apart from being concerned with the ethics of large investments during work hours, Syse—the son of a former prime minister of Norway—teaches Sunday school at an Evangelical-Lutheran church in Oslo. We are sitting in his office in the Norwegian Central Bank, where Syse began working as an in-house ethicist last fall. Corporate governance, it’s called: the philosopher is in charge of how the fund uses its ownership rights.
How do you feel about being a “moral compass,” as The Wall Street Journal called you, for one of the richest countries in the world?
I don’t exactly see myself as a moral compass. There are so many people here in the bank with excellent moral compasses already. But whoever has power also needs direction, and hopefully I can help give direction to our work on corporate governance. As an ethicist in charge of this field in the Petroleum Fund (as of 2006, it’s officially called The Government Pension Fund—Global), it is my work to coordinate our long-term dealings with more than 3,000 companies from all around the world. We have to decide how we want to use our ownership rights, how we want to vote in annual general meetings, who we want to see on the boards of directors, and so on. That doesn’t necessarily make me a “moral compass,” but I participate in daily discussions and bring in the ethical perspective.
When we work on the governance of the companies we invest in, we have strategies based on ethical, social, and environmental concerns, as well as more traditional governance and financial concerns. Integrating ethics in that way is a good thing in itself, of course. But it is also profitable in the long run, which is why it is important for a serious investor to care about these things. Remember, we are a long-term investment fund. We are not primarily interested in what happens during the next two months, but in how things will look 20, 50, or 100 years from now. Norway’s oil income is a gift that should be shared jointly with future generations. It’s not a good investment for us to support companies that bury containers of poison that will leak in two years and destroy their local communities. And extensive corruption, which can provide short-term profits, could undermine trust in a company; such a company, therefore, is not a good place for long-term investments.
The Petroleum Fund is one of the world’s largest single-owned institutional funds, with approximately $210 billion in assets, of which 40 percent is invested in stocks. So even if we only own a small share in each company, we are a major actor, and we have to use that power in a wise and prudent way. Look what happened with Enron! Enron has taught us that we must be active and keep our eyes on the ball. We can’t afford bad ethics.
How can investment and moneymaking go hand in hand with ethics?
We create value by investing so that other people can start new projects, and that is a good thing. And we do it on behalf of future generations. You could of course ask if capitalism itself is ethical. I think it is a system with inherent temptations and possible cruelties that we have to be aware of, but it is the best system we have for spreading prosperity. So I am not troubled working inside the system; I’m glad to be working within the framework of the system, hopefully helping to improve it.
How would you characterize the ethics of the Norwegian Central Bank?
People here are well aware that they make huge decisions about large sums of money, decisions that are based on values such as moderation, decency, and honesty. We are dependent on the Norwegian people’s trust. This way of thinking pervades the organization. I can observe this somewhat from the outside, and fortunately it challenges my old prejudices about what it is like in a money palace. In my heart, I am still highly skeptical about the effect money can have on people, and I have a certain Platonic worry about being devoured by material concerns. Therefore, it is good to see that large sums of money can actually be managed ethically and effectively—and that there is a real willingness to ensure that high moral qualities and principles are actually upheld.
When it comes to where not to invest, ethically speaking, that is mainly a political issue, and we at the bank respect and abide by the decisions made. But I must add that it is impossible, and probably not even desirable, to avoid all companies that do bad things. It would not be wise to stand outside of the international economy. The solution is the middle road: Be proper owners and gradually improve the ethics and performance of the companies we’re invested in.
Is it ethical to invest in tobacco or weapons?
Good question. And I’m glad I am not the one responsible for answering it as long as I am in this position, since ethical screening is the politicians’ decision. But it is a legitimate debate. If we invest in tobacco, we must at least try to get the companies not to promote their products to youngsters. As of today, several kinds of weapons production have been excluded from our investment universe, based on the ethical guidelines, but tobacco has not. This is an ongoing debate among the politicians.
Which companies are simply unacceptable?
Companies with serious violations of human rights, exploitation of children, and production of illegal weapons. Companies that support immoral regimes and companies that spend money on large-scale corruption. In such cases we cannot use consequentialist ethics; we have to apply duty ethics: It’s wrong to invest in these companies, period. That is what our ethical guidelines say, and I think that is wise. But, again, that’s the politicians’ decision in the end. My end of the playing field is corporate governance and active ownership.
What was it like to be the son of a prime minister?
To be the son of my father was fun. I have many fond memories. He was a caring dad. I have an ambivalent feeling about being the son of a famous person, but for me it was mostly nice and relaxed. Still, I remember when he visited me in Boston, when my wife and I studied there, he came with bodyguards. It was a bit weird. I also remember that everywhere he went, he was met by kindness. Anywhere I go today, there are people who tell me they met my father 20 or 30 or 40 years ago. But I have never felt that I should try to live up to what he did. That was never an issue. When he became prime minister, I was an adult myself. Unfortunately he died early.
How did he affect your own career as a philosopher?
I think that he influenced my interest in political philosophy and ethics. Our family was always discussing social and political matters. And my father was also focused on the idea that he was taking care of something that belonged to somebody else. When he built a cottage in the mountains, he was thinking of the joy it would give us, not the material value. He wanted to spread joy. I have learned a lot from that.
Before your job as an in-house ethicist for the Norwegian Government Pension Fund, you were working as a senior researcher for the International Peace Research Institute (PRIO) in Oslo, where you still have a part-time position. You have written much on the idea of Just War. What is the parallel between the Norwegian Central Bank and a just war?
In both cases we need to restrain ourselves so that we don’t damage the world around us. A military defense can do much harm to the outside world, but it can also create safety and be used in morally right ways. Military forces can keep the peace.
In your Norwegian book from 2003, Just War? On Military Power, Ethics and Ideals, you were skeptical of the American-led war against Iraq. Why?
The Just War tradition says that a country must have strong reasons to go to war. Martin Luther makes this point emphatically: A war of necessity can be a just war, but a war of choice is the devil’s work. I take care not to be too absolute, and there could have been just reasons for war in Iraq. But the war would only have been just if based on broad alliances and preferably a mandate from the UN.
But isn’t it good that there is a country like the United States, which takes moral responsibility?
I have a lot of respect for how the United States took moral responsibility in the two world wars. But if you choose to act alone, you risk counteracting what is morally good. This is an old problem in war. In Europe we in many ways have huge moral expectations of the United States, not least because religious and ethical values supposedly play an important part in American politics. Therefore many were disappointed on this occasion, but we should not exaggerate the disappointment: I do not think our relationship has deteriorated beyond the point of no return.
How do you reflect on the strong interconnectedness between religion and politics in the United States?
In general I think it’s positive that Christianity can be a basis for politics. In Europe there has been much illiteracy about how important religion is to human beings—Americans understand that much better. But the Bible can’t be used for single political decisions; religion should be a motivation and a moral obligation, not a political program. Most fundamentally, it gives us a set of values. What the Religious Right did with Bill Clinton during the Lewinsky affair, for example, was in my view far from the ideals of the Founding Fathers, and not in tune with basic Christian values.
How does your job as an oil and war ethicist fit with your volunteer work as a Sunday school teacher in an Evangelical-Lutheran church in Oslo?
There are some similarities! I have sort of a missionary agenda. Not that I preach here at the Central Bank, but I’m a missionary on behalf of moral values.
What is your church like?
My wife and I came back from the United States, where we had been studying, some 15 years ago, and we sat down in the back row in our local church at Fagerborg in Oslo. It was a good and warm place to be, we thought, and when they announced that they needed more volunteers for children’s work, we accepted. We have stayed with that church ever since. It is a community that represents both human and theological breadth and seriousness. People and not topics are important there. A newspaper recently wrote that the only thing Christians are preoccupied with is sex. I don’t think that is true. I’ve been a member of a Bible study group for almost 20 years, and we have talked about a lot of other and more important issues than sex.
One thing I like about going to church is a plain and normal service with a plain and not necessarily perfect sermon. What makes it good is the people who come together to connect to the great eternity, and the humbleness and joy of the community. It is not—and should not be—a “happening.” For the Lord’s Supper, we are gathered around a table to be in fellowship with God. Jesus, despite the dramatic situation on Maundy Thursday, took the time to eat with his friends. Luke tells us that Jesus said: “I have eagerly desired to eat this Passover with you before I suffer.” Just think: They took time to enjoy a meal together. It is easy to forget that aspect of the story. This joy in community—between humans, and between humans and God—is what the church represents at its best.
Do you feel at home in the Norwegian state church?
Yes—and I am also very ecumenical, so it would be a defeat in a way if I should need to convert. I have respect for friends of mine who have converted to Catholicism because they seek tradition, roots, and unity. I have an aunt who is a Baptist, friends who are Methodists, and a close friend who is Orthodox. But I think that unity in Christianity, as a whole, is so great that I should not need to convert. Not that I agree with Luther on every point, but the Evangelical-Lutheran church represents the Christian faith and basic Christian values in a good way. And I hope my friends feel the same way. Something that pleases me enormously in our time is the increasing ecumenical approach between Christian communities.
What is most important to you when it comes to faith?
To belong, and to be forgiven. I belong to Him who has created this inconceivable and mysterious universe. And even if I do wrong against others and against God, I can start over again.
What is the most important thing you have learned from the Bible?
The answer is new every morning! But at the moment I am reading 1 John, so it is natural to say “love.” Love connects belonging and forgiveness, which are both based on the love between people and the love between people and God.
And finally: Do you have any characteristics or vices that you want to get rid of?
Yes, there are plenty, but I will not share them with our readers! Well, I can share one: I am terrible/great at postponing things. If it doesn’t need to be done today, it can wait until tomorrow. Suddenly I find a letter I should have sent four weeks ago.
In other words, you are really not good at day-trading?
No, I’m not. Definitely not! I’m a long-term guy.
What do you think about celebrating your 40th birthday later this year?
My wife is a bit older than I am, which helps. I am actually very relaxed about age. I only hope that I live long enough to provide everything I can for my children.
How old are they?
Twelve, nine, and twins at four. They’re all girls! It’s me and the girls, and I love it. Even though they put their heads together against me in a cute and very effective way: They know how to use their female grace and power.
Alf K. Walgermo has published two Norwegian books: one about the donkey in world literature, and one about the 100 individuals who met Jesus according to the gospels. He works for the Norwegian national daily newspaper Vårt Land. E-mail: alf@vl.no
Roger Lundin
Pragmatism, postmodernity, and the theology of experience.
When he resigned the pastorate of Boston’s Second Church in 1832, Ralph Waldo Emerson forsook his ecclesiastical office but not his ministerial duties. To be sure, he did enjoy a two-year hiatus, during which he undertook a tour of Europe and experienced a period of recuperative peace in relative solitude. But all was to change with the publication of his first major work, Nature, in 1836. The wide dissemination of his essays and his growing fame as a lecturer were to give Emerson a more expansive pastoral charge than he had ever held within the Church. The lectern was now his pulpit, the lecture his sermon, and the public his congregation. If the blind did not see and the lame did not walk as a result of Emerson’s secular care, many who labored and were heavy laden nevertheless found rest in his words.
One of the early, restless adherents to Emerson’s vision was Henry James, Sr., a man who had been made materially wealthy by birth and spiritually tormented by training. In early 1842, James heard Emerson lecture and was so taken with the message that he wrote the inscrutable man from Concord what R. W. B. Lewis calls “a spiritual love letter, expressing a desire to ‘talk familiarly with one who earnestly follows truth through whatever frowning ways she beckons him on.'” He implored Emerson to visit him at his home in lower Manhattan. Emerson obliged, and when he arrived, he was first ushered upstairs to the nursery “‘to admire and give his blessing’ (in the younger Henry’s words) to the two-month-old William.”
There is something sweetly incongruous about the picture of Emerson dispensing an “apostolic” blessing and the infant William James receiving it. Few things were more galling to Emerson than the claim that authority could be imparted from without rather than generated from within. Belief in the authority of office and the practice of rites of succession—these belonged to that “icehouse of externals” that Roman Catholicism was to Emerson. As James himself would assert six decades later at the Emerson centenary, “there is something in each and all of us, even the lowliest, that ought not to consent to borrowing traditions and living at second hand.” The disdain for tradition and derivative thinking was intense in Emerson, and “the hottest side of him is this non-conformist persuasion.” The living soul, the vital individual “is the aboriginal reality,” and as a result “the past man is irrelevant and obliterate for present issues.”
Despite Emerson’s and James’ aversions to the “traditions” of “the past man,” it seems clear that an apostolic mantle of sorts did pass between the man and the infant that night in 1842. We can speak with confidence of a tradition of enquiry that has wended its way through American intellectual life from the time of Emerson to today.
It is a tradition of disavowing tradition, and it worships whatever unknown gods adaptive individuals are able to fashion out of their experience alone. This tradition moves from the Concord sage through William James and John Dewey and finds its center at present in the work of Richard Rorty, Stanley Fish, and other key figures in what is frequently referred to as the “revival of pragmatism.”
The charter document of this movement was Emerson’s greatest essay, “Experience,” written only months after his introduction to the James family. In that aching work, composed in the wake of the death of his beloved five-year-old son Waldo, Emerson explained, “Grief too will make us idealists.” Having discovered through his sorrow that “bodies never come in contact,” Emerson could only conclude, “souls never touch their objects. An innavigable sea washes with silent waves between us and the things we aim at and converse with.”
With this most personal of essays, Emerson was signaling a major shift in modern conceptions of theological authority. For more than a century before him, the English-speaking world had leaned ever more heavily upon nature for ethical and spiritual support, as the traditional Christian sources of authority seemed increasingly shaky and unstable. Humean skepticism had called miracles, including the resurrection of Jesus, into question; the historical criticism of the Bible had cast doubt upon its historicity and authority; and Newton’s mechanistic science threatened to leave no room in the world’s workings for the energies and agencies of spirit.
As the traditional pillars of religious authority became badly eroded, the poets of England, along with the theologians and philosophers of Germany, had looked to the union of the potent human spirit and the fertile natural world for deliverance. At the dawn of the nineteenth century, William Wordsworth gave classic expression to the hopes invested in this marriage. About Eden, the Elysian Fields, and the lost Atlantis, he asked,
why should they be
A history only of departed things,
Or a mere fiction of what never was?
For the discerning intellect of Man,
When wedded to this goodly universe
In love and holy passion, shall find these
A simple produce of the common day.
Yet only decades later, for Emerson at mid-century, the differences between spirit and nature had come to seem all but irreconcilable. Darwin had not yet published his Origin of Species, so it was not a materialist vision that drove Emerson to despair of the possibilities of union. Instead, it was that Emerson’s idealism had triumphed so thoroughly that its mate, the material otherness of nature, appeared to fade from his view and draw back from his touch: “Dream delivers us to dream, and there is no end to illusion,” for “life is a train of moods like a string of beads, and, as we pass through them,” they prove to be lenses of many colors that “paint the world their own hue, and each shows only what lies in its focus.” “Experience” was Emerson’s discovery that nature and spirit had been put asunder, as well as his announcement that divorce papers had been filed, on the grounds of desertion.
William James was a product of this divorce, and he would labor to the end of his life to articulate a theology of experience that could replace the discredited theology of nature. “The axis of reality runs solely through the egoistic places,—they are strung upon it like so many beads,” he wrote, echoing Emerson. “The whole drift of my education goes to persuade me that the world of our present consciousness is only one of many worlds of consciousness that exist.” Those other worlds must contain larger, more copious “experiences which have a meaning for our life also.” They keep their distance from us yet become “continuous at certain points, and [then the] higher energies filter in.”
“I believe rather that we stand in much the same relation to the whole of the universe as our canine and feline pets do to the whole of human life,” James explains in Pragmatism. “They inhabit our drawing rooms and libraries. They take part in scenes of whose significance they have no inkling.” Content with barking, purring, and sniffing, these creatures “are merely tangent to curves of history the beginnings and ends and forms of which pass wholly beyond their ken. So are we tangent to the wider life of things.” But just as the cats and dogs have ideals that coincide with ours, and just as they “have daily living proof of the fact, so we may well believe, on the proofs that religious experience affords, that higher powers exist and are at work to save the world on ideal lines similar to our own.”
That is, we in the modern world now read a darkened nature through the lenses of our translucent but dazzling experiences. To us, those experiences appear as the center of life, but in reality, they are “tangent to the wider life of things.” Nevertheless, they are our experiences, and it is from them that we must build our theology in the hope that our canine howlings and feline meanderings will somehow point towards these “higher powers” that are determined to save the world.
James pressed on with his efforts to replenish the depleted stores of religion with the bountiful harvest of experience. Even in his final decade, he was still trying to construct a system roughly according to the plan Emerson had sketched more than half a century before. The keystone of the Jamesian spiritual arch was The Varieties of Religious Experience, which he first delivered as the Gifford Lectures in 1901–2. It is a measure of the significance of those lectures and that book that over the following century a number of Gifford lecturers would engage the Jamesian argument and examine the Jamesian scheme.
Both his critics and admirers have shown particular interest in this book’s efforts to articulate a natural theology with human experience as its revelatory core. In elaborating this theology, James was executing the terms of the lectureship as established in Adam Gifford’s will. The Scottish Lord had made the rational study of natural theology the raison d’être for the series: “I wish the lecturers to treat their subject as a strictly natural science…. I wish it to be considered just as astronomy or chemistry is.” Gifford’s assumption carried with it the conviction, Alasdair MacIntyre explains, that “one mark of a natural science [is] that its history is one of rational progress in enquiry.” For William James at the start of the twentieth century, the only open highway to that “natural science” and its “rational progress” led through experience, for the road that ran through nature had now been blocked for good.
In his own Gifford lectures, delivered a century later and published as With the Grain of the Universe, Stanley Hauerwas was to take issue with James’ experiential response to the 19th-century loss of nature. He reproves James, as well as the liberal theologians of the 19th century and their 20th-century pragmatic descendants, for giving away too much theologically and receiving too little in return. “Under Kant’s influence,” Hauerwas writes, “Christian theologians simply left the natural world to science and turned to the only place left in which language about God might make sense, that is, to the human” world of moral action. Because theology could no longer “pretend to tell us anything about the way things are, James attempted, without leaving the world of science, to show how religious experience might at least tell us something about ourselves.”
Richard Rorty has long praised the same Jamesian compromise that Hauerwas questions. Rorty is happy to have religion say nothing about nature or about God, for he and James worship a taciturn deity who issues no commands and makes no demands. “The quasi-Jamesian position I want to defend says: Do not worry too much about whether what you have is a belief, a desire, or a mood,” for “the tension between science and religion can be resolved merely by saying that the two serve different ends.” Rorty claims it is no more absurd to resolve the tension between science and religion in this manner than it was for Emerson and others in the mid-19th century to argue that Christianity had to be demythologized “to immunize religious belief from criticism.” Darwin was able to “trace the origin of human beings” and their minds “to the unplanned movements of elementary particles,” and if Christianity was to survive, it had to swear a vow of silence about matters science had decided for good. “Demythologizing amounts to saying that, whatever theism is good for, it is not a device for predicting or controlling our environment.”
In Rorty’s history of immunized Christianity, the 20th-century hero is Paul Tillich, who had possessed the theological good sense not to let our senseless babble about God drown out the revelatory murmurs of our experience. According to Rorty, a “pragmatist philosophy of religion must follow Tillich and others in distinguishing quite sharply between faith and belief.” In this curious use of terms, “faith” is our warranted talk about ourselves and the experiential god(s) in whom we have placed our fanciful trust; “Liberal Protestants” eagerly speak of their “faith in God” in this sense. “Belief,” on the other hand, involves our fruitless efforts to speak of a God who creates, reveals, and redeems; according to Rorty, it is “Fundamentalist Catholics” who belong in this camp, because they are “happy to enumerate their beliefs by reciting the Creed.” In contrast to these benighted souls, the “Tillichians” can “get along either without creeds, or with a blessedly vague symbolic interpretation of creedal statements.” They do not believe that religious faith requires a transformation of human character or offers insights into the will of God. For them, the simple goal of religion is to “make the sort of difference to a human life which is made by the presence or absence of love.”
With the “presence of love” the goal of human life, the actions or beliefs leading to that end may be as ethically diverse and theologically diffuse as the experiences of life itself. As Tillich wrote in his Systematic Theology, for many who work in the tradition of Schleiermacher, “experience is the medium through which the sources [of theology] ‘speak’ to us, through which we can receive them,” and no authority can predetermine what counts as a valid experience. In this theological tradition, “reality is identical with experience,” and “nothing can appear in the theological system which transcends the whole of experience.” According to Tillich, a pragmatic Christian must consider the possibility that Christianity has become exhausted and will become irrelevant; “being open for new experiences which might even pass beyond the confines of Christian experience is now the proper attitude of the theologian.”
Here again, Emerson had traveled this ground long before James, Tillich, and Rorty began crossing it. In 1838, less than a month before he was to deliver his “Divinity School Address” at Harvard, he complained of our tendency to value the miraculous over ordinary experience. Though each day is “full of facts,” we take them to be “heavy, prosaic, & desart,” until a creative intellect comes along and finds “that the day of facts is a rock of diamonds, that a fact is an Epiphany of God” upon which “he should rear a temple of wonder, joy, & praise.” Those who look to scripture, miracle, or mystery for God’s revelation will not discover it, for it is already in their midst and within their power. “They call it Christianity,” Emerson concluded, “I call it Consciousness.”
This precise conflation of Christianity with consciousness—of theology with experience—is to Hauerwas the problem at the heart of the Gifford Lectures project. If Christianity and consciousness are interchangeable, then Christian belief adds nothing to the deliverances of consciousness and has no means of speaking authoritatively of the world beyond the mind. “The social and intellectual habits” that shaped Lord Gifford’s understanding of natural theology and experience deprived Christian theologians of the “resources needed to demonstrate that theological claims are necessary for our knowledge of the way things are and for the kind of life we must live to acquire such knowledge.”
According to Hauerwas, this especially proved to be the case for William James and Reinhold Niebuhr, the two most famous Gifford Lecturers of the 20th century. Although “Niebuhr allegedly challenged the humanism James represented,” Hauerwas claims the neo-orthodox theologian’s “account of Christianity stands in continuity with James’s understanding of religion.” Richard Fox says Niebuhr’s reliance upon James was first evidenced in his 1914 Yale thesis, which claimed that religion no longer could depend upon “superhuman revelation” for its beliefs. Instead, “it had to be grounded in a philosophy of human needs and in the actual experience of belief.” Niebuhr’s God exercises his freedom only within the workings of the human personality; in dealing with nature, even God must obey the ironclad laws of matter in motion. To the end of his life, Fox says, Niebuhr remained “a skeptical relativist committed like William James to the life of passionate belief and moral struggle.”
The struggles of the finite self, rather than the sufferings of an incarnate God, thus became for Niebuhr, as they had been for James, our only viable revelatory source. According to Hauerwas, both James and Niebuhr believed “the truth of Christianity consisted in the confirmation of universal and timeless myths about the human condition that Christianity made available to anyone without witness.” These myths are accessible to all without condition, because they are diffused in the human consciousness, that Emersonian spring from whence the tributary of Christianity flows on its way to the boundless sea of religious experience.
In confronting James on religion, Hauerwas refers to George Santayana’s well-known gibe about his Harvard colleague. “There was,” Santayana wrote, “no sense of security, no joy, in James’s apology for personal religion. He did not really believe; he merely believed in the right of believing that you might be right if you believed.” Hauerwas criticizes Santayana for a lack of charity yet reaches a similar conclusion: “William James never entertained the presumption that the God of Israel might exist.” Being a theist did not entail for James making the “claim that something like God exists.” Instead, it meant one believed that no account of the world can be “adequate that denies the aspect of human existence that led us to believe in a god or the gods.” We cannot know that God exists but only that we have a compulsive urge to believe he does. In the end, Hauerwas believes, “all James sought was to show that religion is but another name for the hope necessary to sustain a modest humanism.”
Historian Ann Taves, who is as sympathetic to James’ religious project as Hauerwas is critical of it, agrees with the latter’s reading of James’ goals. According to her, James shifts our “attention from the study of religion per se to the processes by which religious” phenomena and other phenomena are “made and unmade.” The Jamesian approach leads us to “lose a sense of religion as a substantive thing” and to consider it instead as a diffuse force that infuses all human experience. To Taves this diffuseness is a good thing, for it makes us acknowledge there are no bounds to authentic religious experience. Even the narrowest of religious traditions must accommodate a wide-ranging array of beliefs and practices, and it is pointless to stake out creedal boundaries to corral them. With the field of experience so wide open and the horizon of its concerns so vast, “the study of religion opens out at this point into the study of everything.”
The “study of everything” in a religion of experience is the opposite of the focus on “something” in a theology of revelation. In the “Divinity School Address,” Emerson had complained about the narrowness of Christianity’s “exaggeration of the personal, the positive, the ritual. It has dwelt, it dwells, with noxious exaggeration about the person of Jesus.” In like manner, in A Pluralistic Universe, James claimed the “vaster vistas” of science and “the rising tide of social democratic ideals” had rendered the particular claims of Christianity—that “older monarchical theism”—”obsolete or obsolescent.” That “theological machinery that spoke so livingly to our ancestors,” with its creation ex nihilo, its astonishing eschatology, “its relish for rewards and punishment,” and its picture of God as an “‘intelligent and moral governor,’ sounds as odd to most of us as if it were some outlandish savage religion.” We may still confess our belief in “an external creator and his institutions” in our worship “at Church, in formulas that linger by their mere inertia, but the life is out of them, we avoid dwelling on them, the sincere heart of us is elsewhere.”
That “elsewhere” is the nowhere of the universal experience of religion. It comprises a field of study that has no object for its attention but endless subjects for its concern. Hauerwas believes James embraced the universal and shunned the particular, because his democratic concern for fairness was greater than his intellectual passion for truth. Near the close of The Varieties of Religious Experience, James offers a justification for his decision to reject all “particularized” theologies and their claims. When we wish to understand our “union” with the “more” that many have called God, we are tempted to seek some “definite description” for our ineffable experience. Yet we must resist all temptations to “place ourselves offhand at the position of a particular theology,” especially “the Christian theology,” which would have us “proceed immediately to define the ‘more’ as Jehovah, and the ‘union’ as his imputation to us of the righteousness of Christ.” If we were to give in to this temptation, “that would be unfair to other religions, and, from our present standpoint at least, would be an over-belief.”
Hauerwas counters James’ universal religious uncertainties with Karl Barth’s particular theological affirmations. In the Barthian vision, particularity is not an affront to our sense of fairness but the substance of our knowledge of God. In Dogmatics in Outline, Barth discusses the portion of the Apostles’ Creed that mentions Christ’s having “suffered under Pontius Pilate.” The inclusion of this historical reference affirms that the drama of God’s wrath and mercy—played out in the death and resurrection of Jesus Christ—did not “take place in heaven or … in some world of ideas; it took place in our time, in the centre of the world-history in which our human life is played out.” We must not seek to fly to some “spiritual Cloud-Cuckooland,” because “God has come into our life in its utter unloveliness and frightfulness.” According to Barth, the incarnation involves a concrete event in which a human name plays a crucial role in the cosmic drama of reconciliation:
There is nothing in the opinion of Lessing that God’s Word is an “eternal truth of reason,” and not an “accidental truth of history.” God’s history is indeed an accidental truth of history, like this petty commandant [Pilate]. God was not ashamed to exist in this accidental state. To the factors which determined our human time and human history belong, in virtue of the name Pontius Pilate, the life and Passion of Jesus as well. We are not left alone in this frightful world. Into this alien land God has come to us.
“God has come to us”—to contemporary ears that dissonant assertion does not harmonize with the naturalistic assumptions that inform Jamesian pragmatism. Barth’s rejection of those naturalistic premises makes it difficult for the “educated world,” as Hauerwas terms it, to comprehend the truth claims he makes. Those claims seem confusing and demand an explanation, while the assumptions of pragmatism appear to represent common sense itself. Thus, Barth needs to be translated into a language we moderns can understand, while in “the world as we know it, James and Niebuhr do not need to be ‘explained.'”
To understand them, “you do not need to have your conceptual machinery, to say nothing of your life, turned upside down.” The art of avoiding having either “your conceptual machinery” or “your life turned upside down” is, after all, central to the pragmatic tradition, which thinks in terms of adjusting human experience rather than of transforming it.
In good measure, pragmatism was a product of the American universities of the 19th and 20th centuries, and as such, it was defined by the practices of the seminar and sheltered under the umbrella of tenure. Not surprisingly, the experiences of the university have in turn supplied key metaphors for the pragmatic enterprise. For example, in Philosophy and the Mirror of Nature, Rorty promotes philosophy as a conversation without origin or end. The continuation of that conversation—subsidized by ample university salaries and foundation grants—becomes the point of thinking and talking alike. “The only point on which I would insist,” Rorty wrote in the final sentence of that book, “is that philosophers’ moral concern should be with continuing the conversation of the West, rather than with insisting upon a place for the traditional problems of modern philosophy within that conversation.”
For Rorty and other postmodern pragmatists, the conversational model has undeniable religious and ethical benefits. They discover a life-affirming power in the refusal to commit to belief, for over the centuries passionate commitment has brought so much grief to history and so many conversations to an end. As long as we are talking, we can’t be fighting, or so the theory goes.
Although he disagrees with the conclusion Rorty and the pragmatists reach on this point, Charles Taylor admits they raise a crucial question; these writers point to a general truth, “which is that the highest spiritual ideals and aspirations also threaten to lay the most crushing burdens on humankind.” There is some validity to the pragmatists’ claim that since “the highest ideals are the most potentially destructive,” the “prudent path” may prove to be the safest, and “a little judicious stifling [of those ideals] may be the part of wisdom.” Yet Taylor believes this strategy only makes sense on “the assumption that the dilemma is inescapable, that the highest spiritual aspirations must lead to mutilation or destruction.” He does not take this to be our “inevitable lot,” for while the “dilemma of mutilation is in a sense our greatest challenge,” it is not “an iron fate.” Instead, Taylor embraces what he considers the central promise of “Judeo-Christian theism,” which is “a divine affirmation of the human, more total than humans can ever attain unaided.”
For the proponents of postmodern pragmatism, however, “self-mutilation” remains the central fact of human history, and they argue it may prove to be our destiny, if we spurn the saving powers of conversation. “In the post-Cold War world,” there are many competing systems of belief, explains Louis Menand in the final paragraph of The Metaphysical Club, his highly lauded account of the origins of pragmatism. Given the pluralistic vitality of these competing visions, “skepticism about the finality of any particular set of beliefs has begun to seem to some people an important value again.” In like manner, “the political theory this skepticism helps to underwrite” has been resurgent; it is the “theory that democracy is the value that validates all other values.” For the pragmatist, Menand says, “democratic participation isn’t the means to an end… ; it is the end. The purpose of the experiment is to keep the experiment going.”
For contemporary pragmatism, then, it is a given that while many things may transpire in a conversation, none of us should seek to have it reach a conclusion. Conversations provide a means of bringing order to what James famously called the “great blooming, buzzing confusion” of experience; through them, we learn the words others have used to name the world, and we trace the forms they have fashioned to give it shape. We add a word or two here and tinker with a pattern there, but we have no illusions about the permanence or importance of our work. The purpose of the experiment is to endure the fact that the only point is the experiment itself, and the end of all our activity is to sustain an activity that has no end beyond itself, no point beyond its own pointlessness. On these terms, a healthy conversation will never lead to a repentant turning, a decisive metanoia, but only to ever more satisfying perspectival gazing, an endless round of theoria.
“We are all living out pragmatism,” writes Stanley Fish in There’s No Such Thing as Free Speech, and It’s a Good Thing, Too, “because we live in a world bereft of transcendent truths and leakproof logics (although some may exist in a realm veiled from us).” Lacking those “truths” and “proofs,” we must make do with a “ragtag bag” of linguistic tricks, cultural practices, and clichéd conventions “that keep the conversation going and bring it to temporary, and always revisable, conclusions.” Others may believe they have something to recommend that would improve the conversation or “make the game better.” Not Fish: “All I have to recommend is the game, which, since it doesn’t need my recommendations, will proceed on its way undeterred and unimproved by anything I have to say.”
In this passage packed with avowals of skepticism and denial, Fish slips in a parenthetical phrase—”although some may exist in a realm veiled from us”—as a gesture toward the possibility of transcendence. Many great works of modernist literature and postmodern theory make similar gestures toward some form of “the protean, and the unpredictable.” From Henry Adams’ “The Dynamo and the Virgin” to William Butler Yeats’ “The Second Coming,” to Jacques Derrida’s “Structure, Sign, and Play in the Discourse of the Human Sciences,” the poetry, fiction, and theory of the twentieth century are replete with premonitions of an otherworldly irruption into worldly affairs.
The transcendence of which these authors write and toward which Fish and James doff their rhetorical caps is, however, a power against whose intrusions the conversational model is meant to serve as a shield. Fish speaks ardently of the virtues of the conversational game, but the contest as he and the postmodern pragmatists envision it unfolds on a field protected by a retractable roof. If there are going to be storms or floods or bolts of lightning, we will hear reports of them no doubt, but they will never interrupt the flow of the game or make us call a halt to the proceedings. Our conversation may continue uninterrupted by calamities and unthreatened by those things our insurance policies still quaintly call “Acts of God.” Like the cats and dogs of James’ analogy, we may romp away at our doggy activities or refine our feline pursuits without fear of divine intervention but with the hope that we nevertheless may be taking “part in scenes of whose significance [we] have no inkling.” Coupling and uncoupling, pursuing and being pursued, playing and resting—these are the center of our lives, yet we still long to sense we are somehow “tangent to the wider life of things.” So, we keep open the possibility that “higher powers exist and are at work to save the world on ideal lines similar to our own.” “Yes, you are free to turn my life (and conceptual machinery) upside down,” Fish and James seem to say to the indifferent God hidden behind nature’s veil, “but not until I’m dead. In the meantime, please leave me alone, so that I may get on with the game.”
This is a conversational gambit with a distinguished history. In first-century Athens, the Apostle Paul covered similar ground with the Epicurean and Stoic philosophers. They too were masters of discourse in a world sealed off from divine intervention; they too lived in a “blooming, buzzing confusion” and worshiped at an “altar with the inscription, ‘To an unknown God.'” Paul’s response to what he discovered in skeptical Athens was simple and direct: “What therefore you worship as unknown, this I proclaim to you” (Acts 17:23). Having offered the briefest of summaries of Jewish history and early Christian theology, he concluded, “While God has overlooked the times of human ignorance, now he commands all people everywhere to repent” (metanoia)—that “world turned upside down” again. The world, Paul explains, will be “judged in righteousness by a man whom he has appointed, and of this he has given assurance to all by raising him from the dead” (Acts 17:30–31).
As lonely as these Stoics and Epicureans were in their cosmic isolation, “when they heard of the resurrection of the dead, some scoffed; but others said, ‘We will hear you again about this'” (Acts 17:32). And a number of those who did listen no doubt found the rules of the game, and the direction of their conversations, changed forever.
Roger Lundin is Blanchard Professor of English at Wheaton College. This essay is adapted from his book From Nature to Experience: The American Search for Cultural Authority, just published by Rowman & Littlefield. © 2005 by Rowman & Littlefield. Used by permission.
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.
Wilfred M. McClay
The challenge of Christopher Shannon.
The University of Scranton Press may be one of the least well-known academic publishers in the United States. But for my money, it has stepped right into the big leagues by deciding to reissue, in a revised edition, historian Christopher Shannon’s extraordinary 1996 book Conspicuous Criticism: Tradition, the Individual, and Culture in American Social Thought, from Veblen to Mills, originally published by the Johns Hopkins University Press. (The subtitle of the new edition is slightly altered.) Scranton is thereby performing a public service, and a courageous one, at a time when economic pressures are forcing university presses to become very nearly as bottom-line conscious as commercial houses. A book like Conspicuous Criticism will never be a bestseller. But one dares to hope that with this new edition, Conspicuous Criticism will, after a decade of languishing in the shadows, emerge from its status as a bit of an underground classic, with a following among young Christian intellectuals in particular, and at last begin to get the kind of respectful attention across the intellectual spectrum that it deserves.
Conspicuous Criticism: Tradition, the Individual, and Culture in American Social Thought, from Veblen to Mills (New Studies in American Intellectual and Cultural History)
Christopher Shannon (Author)
The Johns Hopkins University Press
224 pages
$33.99
Shannon’s book remains as fresh today as when it appeared, an unusually penetrating and challenging rebuke to the social-scientific outlook on human existence. The social sciences, he argued, have used the anthropological idea of “culture” to unsettle the very basis of everyday life, promoting “a destabilization of received social meanings.” Critical social-scientific analysis, which presents itself as the heroic antidote to the ravages of “the market,” is in fact in “the vanguard of extending the logic of commodification to the most intimate aspects of people’s lives.” If Shannon is right, the most celebrated critiques of modern American society, from Veblen to Mills, are pernicious failures, launched in the name of a thin and debased understanding of “culture,” and imposing an obsolete and misleading apparatus for thinking about society and culture. If Shannon is right, we will need to change, and change radically, the way we are doing things in the study of human life and thought, if we expect to proceed beyond the current impasse.
Small wonder, then, that the book was almost completely ignored by the relevant tribes of academics and social critics when it appeared. What else were they likely to do with a book that calls them on the carpet, along with everything they do? For all that we academics claim to relish provocative analyses and paradigm-shifting arguments, the truth of the matter is much less flattering to our amour propre. Such claims are often little more than a rhetorical flourish, or even a self-deluding fantasy. We pretend to love change, and may even believe that we do. But in practice, we have little patience for it, particularly if we are the ones required to do the changing.
As a matter of brute fact, there is no force in the institutional world of ideas more powerful than the inertia of business-as-usual, the familiar pattern of expectations revolving around the core activities of paper-giving, journal-editing, lecture-giving, conference-attending, monograph-publishing, and hiring and tenuring, all under the surprisingly powerful conforming influence of peer review. Even the advent of postmodernism and its avatars has for the most part taken place in a smooth, untroubled, institutionally conservative manner, changing very little about this core structure. Notwithstanding the roiling that always seems to be occurring on the academy’s surface, or the constant charges of political and cultural radicalism coming from the outside, or the faculty’s proud boasts of “transgressivity” and willingness to “think the unthinkable,” the truth of the matter is that the academy is one of the most procedurally conservative institutions in modern life. By challenging the professional canons, and the assumptions behind them, a book like Shannon’s took a position that is almost unassimilable, hence more easily ignored than engaged.
What, indeed, is our age likely to do with an author who put forward an argument for “the recovery of necessity,” at the very same time that the techno-utopian computer wizard Ray Kurzweil, perhaps reflecting more faithfully the regnant moral theology of high modernity, is assuring us that “The Singularity”—the moment when man escapes entirely from the yoke of biological necessity—”is near”? What can our age make of an author who thinks that the most pressing political question before us today is not the increase of political “participation” but the recovery of the meaning of politics itself, as an avenue for the expression of genuine human freedom, and an escape from the relentless “instrumentalization” of life? An author who argues (much like the philosopher Charles Taylor) for a renewal of our appreciation of ordinary life, but who remains snappishly suspicious of any attempts to over-theorize such a move, contending—astonishingly, to modern ears—that “acceptance of ordinary life requires an acceptance of waste” and requires resistance to the transformation of ordinary life into a “locus of meaning”? Who admonishes us that “all things do not exist to be read,” and that “experience does not have to be written to be valid”?
Let’s stop there for a moment. What, you may ask, could Shannon possibly mean in opposing the exaltation of “meaning”? How can one object to “meaning”? Isn’t this something approaching a modern sacrilege? Isn’t “meaning” precisely that thing for which we are told “modern man” is perpetually questing? Yes, precisely so. But I think it may help flesh out Shannon’s point to consider how it is embodied in a literary example drawn from Walker Percy’s novel The Moviegoer—itself a story of a questing modern man, the book’s narrator, whose aspirations are diluted and diverted into his tendency to exalt the “textualization” of experience in movies.
The narrator is addicted to the movies because it is only when he sees something in the movies that he can feel it to have been validated as “real.” When he and his girlfriend go to see Panic in the Streets, a 1950 movie filmed partly in the same New Orleans neighborhood where they are seeing the film, they emerge from the darkness of the theater with the certitude that the neighborhood is now “certified”:
Nowadays when a person lives somewhere, in a neighborhood, the place is not certified for him. More than likely he will live there sadly and the emptiness which is inside him will expand until it evacuates the entire neighborhood. But if he sees a movie which shows his very neighborhood, it becomes possible for him to live, for a time at least, as a person who is Somewhere and not Anywhere.
Or consider a passage earlier in the book, in which the narrator observes a honeymooning young couple wandering the French Quarter of New Orleans. They seem unhappy, anxious, aimless, sensing something wrong, something missing—until they spot the famous actor William Holden walking on the street. The young man is able to offer Holden a light for his smoke, and in this brief, impersonally friendly interaction with the hyper-real figure of Holden, a radiant source of “certification” itself, everything suddenly changes for the young man and his wife:
He has won title to his own existence, as plenary an existence now as Holden’s… . He is a citizen like Holden; two men of the world they are. All at once the world is open to him… . [His wife] feels the difference too. She had not known what was wrong nor how it was righted but she knows now that all is well.
Holden has turned down Toulouse shedding light as he goes. An aura of heightened reality moves with him and all who fall within it feel it. Now everyone is aware of him… .
I am attracted to movie stars but not for the usual reasons. I have no desire to speak to Holden or get his autograph. It is their peculiar reality which astounds me.
Shannon’s account of things speaks directly to the condition that Walker Percy has so penetratingly described, a mad compulsion to grasp hold of textual “meaning” as a shield against the “emptiness” of everyday life, a shield which is itself a chief cause of the very emptiness it would counteract, much like the compulsions of a man who takes drugs to alleviate the pains of his drug-taking. Shannon’s challenge to our ways of thinking about culture takes us much deeper, then, than a mere intellectual critique of social-scientific ideas and techniques. It is also an exploration of the ways in which those ideas and techniques have insinuated themselves into the most intimate crevices of our souls.
I would probably never have become aware of Shannon’s work myself, had I not been asked to review Conspicuous Criticism for the Annals of the American Academy of Political and Social Science. I had never so much as heard Shannon’s name before, nor had I heard of the book, then only recently published. But something intrigued me about the title, and so I accepted the invitation. The book itself proved to be an utterly fresh and compelling critique of the social sciences, by means of a close and searching reading of a variety of the most influential social-scientific writers of the early-to-mid-20th century, from Veblen to C. Wright Mills.
Such was my introduction to the work of one of the most original of the rising generation of U.S. cultural and intellectual historians. Although a historian by training, Shannon is very much at home in the precincts of social theory, and his work is profoundly informed by his Roman Catholic convictions and commitments. In addition, he has the kind of interdisciplinary versatility and range that were the glory of the American Studies movement at its best. These traits come together in a most unusual way in him. His perspective on the larger subject of modern American culture is difficult to describe adequately. I suppose it would come closest to the mark to say that he has been deeply influenced by the critiques of the Enlightenment launched by Alasdair MacIntyre and others in the same line, critiques that have been especially effective in opening up the problems of “community” and “tradition” in modern America, and in identifying the enterprise of social science, as now practiced, as the most dangerous foe of those things, and indeed, of the very insights it ostensibly seeks.
To put it more bluntly, Shannon thinks it entirely possible that the enterprise of social science is inherently self-defeating—useful in identifying the essential preconditions of social order, but profoundly unhelpful in the end, because it does so by means of a vocabulary that, ironically, makes it impossible to believe in the legitimacy of that social order. Such language may, in effect, rob us of the wherewithal to buy back what we never should have sold in the first place. In that sense, Shannon’s argument reminds me of the witticism attributed to Karl Kraus, to the effect that “Psychoanalysis is itself the disease of which it would be the cure.” For Shannon, the reification and subsequent problematizing of “culture” is itself the great iatrogenic disease of our times, the error at the heart of the social-scientific method, and the source of the very social woes that the social sciences have proved so incapable of curing.
Shannon is, of course, not content merely to critique social science. For him, the roots of the problem represented by social science stretch back much further, to the antinomies and emphases inherent in our modern, Protestant culture. Looking at the effects of the broadly liberal social-scientific outlook on American intellectuals, as seen and understood through their most influential texts, Shannon finds, again and again, the same basic premises: modernist, liberal, rationalist, individualist, “enlightened,” anti-traditionalist, anti-authoritarian, cosmopolitan, and culturally (if not theologically) Protestant.
As a committed Protestant myself, I found myself wanting to quarrel with him about his sweepingly negative view of Protestantism, which seemed to me in need of qualification, and which included many elements that are just as vividly present in the present-day condition of American Catholicism. But I could not help but be stimulated by the boldness of his argument, and by the number of times he hit the mark in ways that more seasoned scholars covering the same ground (myself included) had failed to do. When I began editing a book series in American intellectual history for Rowman and Littlefield, I naturally sought out Shannon to see if he had another project in the wings.
Indeed he did, and the result was his second book, A World Made Safe for Differences, which captured the interest not only of historians but also of a broad range of social scientists, such as the communitarian sociologists Robert Bellah and Amitai Etzioni. To oversimplify greatly, what Shannon did with this second book is demonstrate how the postwar social-scientific understanding of “culture,” while appearing to endorse cultural and individual diversity, in fact imposed an imperial standard of behavior and cultural organization that was even more rigid than the standards it replaced, and all the more pernicious for failing to acknowledge its imperial designs. Hence the book’s title is an ironic one, since it points to the ways that cultural or individual “difference” was reduced to a commodity that was, in fact, fully commensurable with all those things from which it “differed.”
The resulting book read like a cross between the cultural criticism of the Frankfurt School and the constructive impulses of communitarian Catholic social thought—and in fact, Shannon’s work helps one to see that these two stances may not be as far apart as appears at first glance. Indeed, it would not be too far from the mark to label Shannon as a Catholic variant upon the vision of the late Christopher Lasch, the eminent historian who was Shannon’s teacher at the University of Rochester, and from whose spirit of moral and intellectual critique of modernity he continues to derive inspiration.
As I have already intimated, Shannon’s argument is difficult to relate to existing ideological camps. It is radically conservative, not only in its high general regard for tradition but also in its unhesitating condemnation of the American abortion license and its skepticism about the jettisoning of traditional sexual morality. In other respects, however, it recalls the Frankfurt School (and poststructuralist) critique of universalism and liberal toleration, as controlling regimes that are all the more insidious for their refusal to declare themselves as such, and their self-serving charade of “value neutrality.” Its critique of Ruth Benedict’s anthropological view of culture is conjoined, brilliantly, to a critique not only of American modernizing arrogance in Vietnam but also of one of the most scathing American critics of the Vietnam intervention, Frances FitzGerald. There was a deep consensus, Shannon argues, undergirding the relationship between the liberal establishment and the radical counterculture, a consensus that consistently inhibited the emergence of genuine alternatives.
Both of Shannon’s books deserve to make a considerable mark, and could serve even to reorient some of our national discussion of the problem of “community” and the need to recover a sense of the authority of tradition. But Conspicuous Criticism remains especially worthy of reconsideration, precisely because its publication marked the emergence of a voice that has yet to be adequately heard and confronted.
Skeptics will say that Shannon is merely giving us, at bottom, yet another critique of liberal modernity. But I think this pigeonholing underestimates the uniqueness of his achievement. He is relentless in pointing to the ways in which other critics of modernity have merely reshuffled its premises without seriously challenging them, let alone departing from them. His own critique is both radical and conservative, combining a powerful attack on bourgeois liberalism and consumer capitalism with a ringing defense of the place of religion and tradition (and particularly traditional religion) in contemporary society. Writing with moral passion and critical verve, Shannon identifies the forces that isolate the individual in modern society and counters more than a century of efforts by “progressive” intellectuals to displace tradition in favor of a humanism that actually diminishes humanity in the name of freeing its potential.
In a sense, one could say that Conspicuous Criticism is a call to reinstate traditional relations to God, nature, tradition, and the common good. But it makes that call in a most untraditional way. It is neither a paean to nostalgia nor a brief for conservatism. Instead, it arises out of a keen sense of necessity, an awareness of the inadequacy of critical discourse, and the unsustainability of the unassisted modern project in all its triumphalist finery. It is a recognition that, when the road forward leads only to a dead end, or over the side of a cliff, the most urgent business at hand is to trace the way back.
Wilfred M. McClay teaches history and humanities at the University of Tennessee at Chattanooga, and is the editor of the forthcoming Figures in the Carpet: Finding the Human Person in the American Past (Eerdmans).
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.
Lauren F. Winner
Faith in the suburbs.
A few weeks ago, I visited a church in a locale I’ll call Levittown. The building was mid-century churchy: stained glass windows; deep, dark wooden pews; prominent pulpit and altar; upright piano on a dais. But about twenty minutes into the service, something decidedly contemporary caught my eye: a giant (should I say venti?) Starbucks cup sitting proudly on the piano. How’s that for contemporary iconography? I wonder if it was a paid product placement.
The Suburban Christian: Finding Spiritual Vitality in the Land of Plenty
Albert Y. Hsu (Author)
IVP
220 pages
$14.19
Starbucks is an icon of suburbia, of course, even if the great coffee institution did start in Seattle. And it is fashionable to decry suburban living. Indeed, one of the few things agrarians and urbanites share is their utter horror of the suburbs, whose gated communities and starter mansions are poison for the soul. Even suburbanites themselves often engage in anti-suburb diatribes, albeit a tad sheepishly.
Two new books propose to redirect the conversation. David Goetz, a former editor at Leadership Journal, and Albert Y. Hsu, an editor at InterVarsity Press, ask what a spirituality of suburbia, a spirituality for people who drive mini-vans and tend manicured lawns (or pay someone else to tend them), might look like.
Suburban life, if pursued unheedingly, “obscures the real Jesus,” writes Goetz in Death by Suburb. “Too much of the good life ends up being toxic, deforming us spiritually.” But if obscured, Jesus is there somewhere, and Goetz’s book aims to help suburbanites find him in the ocean of lattés, in the aisles of Pottery Barn, and in the bleachers at the soccer field: “You don’t have to hole up in a monastery to experience the fullness of God. Your cul-de-sac and subdivision are as good a place as any.”
Goetz identifies eight “environmental toxins” that plague suburbia and offers a spiritual practice to purge each toxin from your system and help you realize that “even in suburbia all moments are infused with the Sacred.” By packaging his insights in this self-helpy formula—7 habits, 8 practices, 40 days to a more authentic Christian life—Goetz obviously opens himself up to criticism: this blueprint recapitulates some of the very problems of the suburban mindset that he is trying to offset. But I suspect he knew what he was doing, and chose the idiom to convey a subversive message to his target audience.
Consider environmental toxin #8, for example: “I need to get more done in less time.” Do you constantly wish you had more time—more time to catch up on email, get to the grocery store, pay your bills, please your boss, maybe even take your wife out to dinner? Consider keeping the Sabbath, a discipline sure to reconfigure the understanding and inhabiting of time for all those who faithfully practice it. (Scripture offers us a similarly counterintuitive antidote for the related sin of credit card debt: if you want to get out of debt, start tithing. Giving money to the church won’t get our Visa bills paid, but there is no surer way to escape being owned by money than giving it away.)
Environmental toxin #6: “My church is the problem.” Goetz has no patience for Americans’ pernicious church-hopping: “Only in relationships that permit no bailing out can certain forms of spiritual development occur.” Rather than switch churches because your pastor said something you disliked or the new church plant down the street has a livelier youth group, practice the discipline of “staying put in your church.” This manifestly countercultural advice cuts to the very heart of America’s restless anomie.
Environmental toxin #3: “I want my neighbor’s life.” Has life in the suburbs turned your skin permanently green with envy and taught you to covet the Joneses’ cars, careers, and Ivy League-bound kids? Try developing “friendship with those who have no immortality symbols.” That is, stop hanging out with your rich neighbors, and instead find “ways to be with the poor, the mentally disabled, the old and alone … essentially, all those who don’t build up [your] ego through their presence.” When you hang out with less wealthy people, you “begin to compare [yourself] to a different kind of neighbor,” and then you experience not envy but gratitude.
The point here is well-taken, but it still finds us measuring our worth against other people. And the examples Goetz offers underscore how hard it is for middle-class Americans to practice downwardly mobile sociability. His model of social “kenosis” is the writer Barbara Ehrenreich, who emptied herself by focusing her gaze on maids and waitresses. But Ehrenreich gazed at maids and waitresses because—on assignment for Harper’s for articles that became the book Nickel and Dimed—she was working undercover as a maid and waitress herself. It is worrying indeed if investigative journalism is the principal channel through which suburbanites can “face the humanity of another kind of person.”
Albert Y. Hsu’s The Suburban Christian finds in suburban living a deep spiritual longing. People come to the suburbs, Hsu says, because they are looking for something, a job or affordable housing or good public schools (or, less charitably, mostly white public schools). Like Goetz, Hsu insists that you don’t need to live on a farm or in the inner city to live an authentically Christian life. Nevertheless, “the suburban Christian ought not uncritically absorb all the characteristics of the suburban world.”
One excellent chapter teases out what follows from suburban reliance on cars. (Did you know that the average commuter spends three weeks a year commuting?) As a consequence of our driving dependence, says Hsu, the elderly who can’t drive are marginalized. Policy makers don’t prioritize public transportation. Indeed, we often don’t build sidewalks; as Bill Bryson has observed, “In many places in America now, it is not actually possible to be a pedestrian, even if you want to be.”
Alongside Goetz’s suggestion that we stay put in our churches through thick and thin, Hsu urges us to recover the parish mindset—that is, to go to the church down the block and join in what God is doing there, rather than shopping for the perfect fit and winding up at a church two suburbs away.
Consumerism goes hand in hand with suburban living. How can we “consume more Christianly”? Shop in locally owned stores; create holiday rituals that don’t revolve around gift-giving; regularly fast, not just from food, but also from media, new technology, and new clothes; buy organic, fair-trade coffee produced by companies that don’t destroy rain forests. (And if you agree with the skeptics who find the “fair-trade” crowd self-deluded, there are plenty of other ways to become a more discriminating consumer.) A basic guideline for simple living, says Hsu, is “to live at a standard of living that is below others in your income bracket. If you can afford a $400,000 house, live in a $250,000 one instead. Or, if you can afford a $250,000 house, live in a $150,000 one.”
In recent years we’ve seen a flourishing of books that take a fresh look at what might be called our “living arrangements.” The works of Wendell Berry and Albert Borgmann; books such as David Matzko McCarthy’s The Good Life: Genuine Christianity for the Middle Class, Eric Jacobsen’s Sidewalks in the Kingdom: New Urbanism and the Christian Faith, and T. J. Gorringe’s A Theology of the Built Environment: Justice, Empowerment, Redemption—these and many others examine the built-in assumptions of our ways of life and their often unintended and unexplored consequences. Add Goetz and Hsu to that growing stack.
Neither of these books pretends to offer the last word on the subject of suburban Christianity. They raise more questions than they answer—questions, for example, about the effects of suburban development on the landscape that may have attracted us to the suburbs in the first place. It would be salutary to consider suburban gender narratives, the ways that suburban living shapes our understandings of masculinity and femininity, and to probe the deep economic structures that make suburbia not only possible but seemingly necessary. What about the labor relations we practice in our suburban homes, homes so often kept clean by someone who can’t afford to live in the suburbs? What vision of redeemed creation do we encode when we build houses that aren’t designed to last more than 75 or 100 years—or tear down 30-year-old homes to build bigger ones?
Still, Hsu and Goetz have offered a welcome alternative to tiresome and self-righteous preaching about the spiritual superiority of agrarian or urban life. For Christians living in suburbia—and for those of us who share in the sins of suburbanism from our perches in the country or the city—these provocative yet loving books may prove invaluable.
Lauren F. Winner is the author most recently of Real Sex: The Naked Truth About Chastity (Brazos).
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.
Christopher Shannon
A squandered heritage regained.
Long before the current clergy sex abuse scandal, a significant portion of American Catholics had already come to identify themselves as survivors. Viewed rhetorically, the response to this all-too-real current crisis follows the script of an earlier abuse scandal of somewhat more questionable veracity: Catholic education. American Catholics who came of age in the 1960s like to identify themselves, for better or for worse, as the people who were beaten by nuns. As comedy or tragedy, this story has been American Catholics’ chief contribution to late 20th-century American popular culture, as witnessed by the broad appeal of stage productions such as Do Black Patent Leather Shoes Really Reflect Up?, Sister Mary Ignatius Explains It All for You, Nunsense, and Late Night Catechism.
God and Man at Georgetown Prep: How I Became a Catholic Despite 20 Years of Catholic Schooling
Mark Gauvreau Judge (Author)
Crossroad
192 pages
$23.00
In God and Man at Georgetown Prep, Mark Gauvreau Judge writes as a survivor not of abuse, but of neglect. Coming of age in the 1970s, Judge missed out on the gory/glory days of tough-guy priests and ruler-wielding nuns. Drawing on the theological spirit, if not the anglophile cultural posturing, of the conservative Catholic William F. Buckley’s classic God and Man at Yale, Judge exposes and indicts the functional atheism that has shaped Catholic educational institutions in the decades following the Second Vatican Council (1962-65). Judge’s book is an unabashed plea for Catholics to recover the world they have lost and reclaim the birthright they have sold for the material comforts and cultural respectability of mainstream, middle-class American life.
As an approximate generational contemporary of Judge, I can attest that he knows the world of which he writes. Growing up in the Washington-Baltimore area, the son of a successful journalist who wrote for National Geographic magazine, Judge found himself in an élite Catholic educational milieu that fully embraced the liberal interpretation of Vatican II. Judge’s account of his education gives me new appreciation for provincialism: reform came a bit later to upstate New York, so I was spared the worst excesses of vanguard Catholic liberalism. Judge began his Catholic education at Our Lady of Mercy grammar school, run by the Sisters of Mercy in Potomac, Maryland; he continued on at Georgetown Preparatory School, the Jesuit prep school founded by John Carroll, first bishop of Baltimore, in the 1780s.
From 1850 to 1950, Catholic schools stood as the single most important marker of Catholic separatism in America. But by the 1970s, Catholic education had become a pale imitation of an already bland liberal humanitarianism. In one example, Judge cites We Follow Jesus, a third-grade religion book used at Mercy in the 1970s, which retells the Gospel story of Martha and Mary with Jesus simply saying “Now, Martha, do not worry too much about dinner; just do the best you can.”
If the Sisters of Mercy watered down the faith, Georgetown Prep directly undermined it. After centuries on the front lines of the Church’s war with modernity, the Jesuits had finally gone native. Pierre Teilhard de Chardin, the French Jesuit censured by Rome for his efforts to synthesize Catholic theology and Darwinian evolution, became an intellectual hero. Teilhard’s displacement of the cross for a progressive vision of humanity evolving toward an “Omega Point” in history fit all too neatly with New Age spirituality. At Georgetown Prep in the 1970s, Eastern mysticism trumped Catholic theology and situational ethics replaced traditional Catholic moral teaching, particularly in matters of sex.
Alas, Judge found more of the same at the non-Jesuit Catholic University of America in Washington, D.C., an institution founded to promote a national presence for Catholic intellectual life in America but now in open revolt against the teachings of the Church. Judge attended Catholic U. in the 1980s, at the height of the controversy surrounding Father Charles Curran, the moral theologian who lost his position for supporting the right of Catholics to express faithful dissent from the Church’s teachings on sexual ethics, particularly the ban on artificial birth control reaffirmed in the 1968 encyclical Humanae Vitae. Curran lost his battle but clearly won the war of popular opinion, with most students and faculty rallying to his side in the name of academic freedom.
Judge looks back on these developments with a heavy heart, but he concedes that the near apostasy of his Catholic educational institutions caused him little distress at the time. Like many young American males of his generation, he was less concerned with theology than with sex, drugs, and rock ’n’ roll. His drug of choice was alcohol, which he was able to control for some time through the alcoholic/workaholic discipline of an upwardly mobile East Coast professional. After graduating from Catholic U., Judge began a successful career in journalism, writing on popular culture, politics, and religion for mainstream outlets such as The Washington Post and left-of-center weeklies such as The Progressive and In These Times.
Despite this professional success, Judge eventually realized that he had lost control of his life to alcohol. Sparing us the details of his collapse and recovery, Judge simply states that he drank too much, did stupid things, and overcame his alcoholism with the help of Alcoholics Anonymous. In a similarly refreshing manner, he insists that most of the best times of his life involved alcohol in one way or another. Drinking with friends, staying up all night talking, laughing, listening to music and dancing—these are good things. Looking back on his recovery, Judge sees Alcoholics Anonymous as at least as much of a problem as alcohol itself. Rooted in the tradition of Protestant conversion narratives, the 12-step program of A.A. found one of its earliest advocates in Father Ed Dowling, a Jesuit priest who saw in it principles similar to the Spiritual Exercises of Ignatius of Loyola. According to Judge, in recent decades A.A. has, like Catholic schools, largely rejected its Christian roots; like other popular therapies, the secularized 12-step program has become an end in itself. To Judge’s credit, he refuses to define himself in terms of his disease.
Still, God and Man at Georgetown Prep is a conversion story of sorts—but a distinctly Catholic conversion story. Judge never officially left the Church, and he presents the “reversion” to his childhood faith less as a turn from sin to salvation than from indifference to commitment. The turning point in Judge’s life came not with his recovery from alcoholism but with the death of his father from cancer. Here again, Judge writes refreshingly against genre expectations. His father’s death leads not to emotional trauma but to an intellectual and spiritual awakening: “My father had been dead for several months before it dawned on me that he’d been a Catholic.”
Judge knew, of course, that his father had always attended Mass faithfully, but only by going through his father’s book collection after his death did he realize that his father had been a serious intellectual Catholic. Judge’s twenty years of Catholic education had failed to impress upon him the possibility that being Catholic had anything at all to do with the intellectual life. Catholicism was rules, doctrines, and Mass on Sunday. Exploring his father’s book collection, Judge discovered the intellectually rich and challenging Catholicism of G. K. Chesterton, Jacques Maritain, Josef Pieper, and Dietrich von Hildebrand. After reading the books that had shaped his father’s mid-century Catholicism, Judge came to a new self-understanding: “I am a member of a generation of Catholics raised after Vatican II who was cheated out of a Catholic education.”
Members of that generation will share in Judge’s delight at the recovery of his Catholic intellectual heritage. Catholics and non-Catholics alike will find in his account a model for an intellectual life firmly rooted in the particularities of one faith tradition, yet determined to speak to the world in a common language. In particular, Josef Pieper’s writing on hope as a historical virtue and his major cultural works, Leisure: The Basis of Culture and In Tune With the World: A Theory of Festivity, provide a philosophical framework sorely lacking in contemporary historical and cultural studies. Judge sees in the intellectual world of mid-century Catholicism not lockstep conformity to particular doctrines but rather an expansive affirmation of the beauty and goodness of God’s creation. Beginning in the 1960s, Judge contends, liberal American Catholics severed this affirmation from orthodoxy and thus reduced it to a kind of “humanism within the limits of the Democratic Party alone!”
This political dimension of the recent history of American Catholicism plays no small part in Judge’s story. In the work of a Washington-based journalist, it is hard to see how it could not. From father to son, the Judge family seems to have followed the now familiar trajectory from New Deal Democrat to Reagan Republican. Critical of crass free-market materialism and the Wal-Martization of American life, Judge nonetheless takes as his contemporary Catholic intellectual guides the solidly neo-conservative George Weigel and Richard John Neuhaus. The war on terrorism simply carries on the work of the war against communism; the real evils of communism/terrorism seem to excuse the real evils of the alternative regimes America has supported in the name of democracy. If liberal Catholics have shamelessly used the “consistent life ethic” argument advanced by Cardinal Joseph Bernardin to make abortion and capital punishment equivalent evils, conservatives have used opposition to the greater evil of abortion as license to support a whole range of lesser political evils clearly condemned by their erstwhile hero, John Paul II. Catholicism at its best has never fit neatly into American cultural and political categories. Even as Judge points his reader to a more classical Catholicism, he may provide some Catholics with ammunition for a political battle that, in the terms presently operative, is simply not a Catholic fight.
Christopher Shannon is assistant professor of history at Christendom College. His book Conspicuous Criticism: Tradition, the Individual, and Culture in Modern American Social Thought, has just been reissued in a revised edition by the University of Scranton Press.
Copyright © 2006 by the author or Christianity Today/Books & Culture magazine.
Julia Vitullo-Martin
The formative history of suburbia.
At a reunion held a few years ago by my husband’s family outside Baltimore, my brother-in-law, an architect, suggested we explore Guilford, a section of Northeast Baltimore where their Italian immigrant grandfather had done stonework which he regarded as the finest of his career. An elderly relative wanted no part of the expedition. Yes, grandpa had been very proud of his stone houses, walks, walls, and porter’s lodges, she recalled. But Guilford prohibited any Italian from moving in. It was “restricted.” No Italian, Jewish, or black families need apply. My young, well-educated Italian American in-laws—bankers, professors, lawyers—pondered the unwelcome idea that their hard-working grandfather had treasured having built houses that he himself had been forbidden to buy.
Yet for the decades between the Civil War and the Great Depression—the first heyday of suburban development in America—most upper- and upper-middle-class prime residential developments routinely discriminated in a fashion we now regard as reprehensible. A family’s having the money to buy a house wasn’t sufficient. It also had to be the right color and ethnicity, and attend the correct church. Deeds carried restrictive covenants that set forth a series of proscriptions that bound both buyer and seller, as well as subsequent owners. In addition to Guilford, Maryland, restrictive covenants governed such famous developments as Forest Hills Gardens and Great Neck Hills in New York, Colony Hills in Massachusetts, Park Ridge in Illinois, Country Club District in Kansas City, Palos Verdes in California, and hundreds of others across the country.
Some of the restrictions, particularly in the days before zoning, made eminent sense: no slaughterhouses, for example. No oil refineries, iron foundries, coal yards, hen houses, or reform schools. Some restrictions were a matter of taste, but surely enhanced property values, such as landscaping and set-back requirements. Other restrictions doubtless contributed to the deadening of suburbia that is so much criticized today: no stores, no theaters, no restaurants. But these are really policy considerations, having to do with personal preferences. Do you want to live in a quiet, serene, fairly uniform haven, or do you want to live in a lively, dense, urbane neighborhood? People of good will can and do disagree, and make different choices.
The pernicious restrictive covenants—not ruled unenforceable by the Supreme Court until 1948—had to do with race and religion. The most desirable developments were confined to white, Anglo-Saxon Protestants, preferably Episcopalian, with an understood hierarchy for everybody else—which meant that upwardly mobile Americans seeking the most desirable housing as a reward for their newfound wealth, education, and success were usually blocked if they were anything but white Protestants.
Now this little-remembered but immensely important practice has been given its own history by Robert M. Fogelson, a professor of urban studies and history at the Massachusetts Institute of Technology. To drive home just how extensive the practice was, Fogelson tells the story of an incident that occurred in Los Angeles in 1948. Singer Nat King Cole, one of the most successful entertainers of the 20th century, bought a 12-room house for $85,000 in Hancock Park, a restricted area. Hancock Park’s wealthy doctors, lawyers, and businessmen organized to keep Cole out. When the Supreme Court struck down restrictive covenants as unenforceable by the state, they decided to buy him out, telling him they did not want any undesirables moving in. “Neither do I,” said Cole. “And if I see anybody undesirable coming in here, I’ll be the first to complain.”
One of the most important intellectuals setting the stage for restrictive covenants, writes Fogelson, was Frederick Law Olmsted, Sr., long renowned as this country’s greatest landscape architect, designer of the finest American parks, most famously Central Park in New York City. Esteemed as a liberal who opened public spaces to the masses and as an innovator who devised transverses to separate traffic from pedestrians, Olmsted was also a class-conscious aristocrat who saw degradation and deterioration all around him in the 1860s—”the unmistakable signs of the advance guard of squalor.”
His solution was separation—to be applied to people in suburbs much as he had applied it to traffic in Central Park. Separate the bad from the good, the noxious from the clean, the tasteless from the tasteful. (This he called the “law of progress,” which would enhance the “cleanliness and purity of domestic life.”) In addition, ensure that the separation becomes permanent via agreements among property owners. “Suppose I come here,” he asked, writing about a suburban tract, “what grounds of confidence can I have that I shall not by-and-by find a dram-shop on my right, or a beer-garden on my left, or a factory chimney or warehouse cutting off this view of the water? If so, what is likely to be the future average value of land in this vicinity?” To emphasize its importance, he italicized his final sentence: “What improvements have you here that tend to insure permanent healthfulness and permanent rural beauty?”
In fearing change, 19th-century Americans were hardly being frivolous. As Fogelson points out, the late 19th century was a time of widespread civil disorder, brutal industrialization, financial panics, and unpredictable real estate markets. Elegant buildings were demolished and replaced by taller, uglier ones. He quotes a Unitarian minister in Cambridge, lamenting that “the want of permanence is one of the crying sins of the age,” and that Americans “are always getting ready to live in a new place, never living.”
As Olmsted noted, the point of his ideas was to ensure “tranquility and seclusion” and to prevent the “desolation which thus far has invariably advanced before the progress of the town.” On these matters he was a man of genius who set out the principles that still guide the best development: roads should be curvilinear, fitting into rather than destroying natural surroundings; a very few should handle through traffic, the others should be local; they should be beautifully landscaped, as should the front of all homes; the entrances to property should be distinctive, set off by wooden gates or stone lodges—stone being a crucial element used in all Olmsted designs.
Indeed, Guilford, Maryland, was a preeminent Olmstedian development, designed by the Olmsted Brothers, a Boston firm set up by the revered man’s sons. Laid out in 1903, Guilford was more stringently restricted than almost any predecessor development. A separate document of 23 pages banned nuisances on all lots, and banned businesses and multi-family housing on all but a few. Setbacks were required at the rear and sides of the houses as well as the front. A design review process allowed the Guilford Park Land Company to reject plans for “aesthetic or other reasons,” and to consider whether the house was in “harmony” with its neighbors. No house or lot could be occupied “by any negro or person of negro extraction,” nor did the company sell to Jews “of any character whatever.” In other words, ethnicity trumped class. A distinguished Jewish scholar at nearby Johns Hopkins University, for example, would be blocked from purchasing. These exclusions became part of the marketing campaign.
What Fogelson thinks of all this he pretty much keeps to himself, which is a disappointment. As a historian accustomed to casting a cold eye on the human condition, Fogelson in his younger years wrote on race, violence, riots, crime, and the disintegration of cities. His book of 2001, Downtown: Its Rise and Fall, 1880-1950, treats the destructiveness of the American pattern of separating business from residences. He concludes that the fall of downtown in the mid-20th century was due to America’s remaking of itself as a nation of suburbs—a bourgeois utopia defined (so he suggests in his new book) as much by what it excluded as by what it included.
So how are our once-restricted bourgeois utopias doing today? Pretty well, actually. Guilford is still gorgeous, as are Forest Hills, Great Neck, and Bel-Air, to name just a few. For some reason, Fogelson doesn’t mention what is obvious to anyone walking through these neighborhoods: the restrictive covenants governing physical amenities like landscaping remain. But the neighborhoods are now a vibrant mixture of ethnicities and probably religions. He does target Palos Verdes Estates, which is only one percent African-American and two percent Hispanic, even though its mother city of Los Angeles hasn’t had a white majority since at least 1990.
What does this mean? Have we really not made progress, even though the Italians and Jews who were closed out in the 20th century now own many of the most beautiful suburban houses in America? Does the paucity of blacks and Hispanics in Palos Verdes reflect intractable injustice and discrimination? Or will they be following the groups ahead of them, much as the Italians followed the Irish, and the Irish followed the Germans, who followed the English? Fogelson doesn’t tell us, leaving us to draw our own conclusions about the meaning of it all.
Julia Vitullo-Martin is a senior fellow at the Manhattan Institute in New York.
Kathryn Long
History and ‘The End of the Spear’.
Elisabeth Elliot once commented that she wrote her novel, No Graven Image, as a way to deal with her experiences on the mission field through fiction, especially the three years she spent among the Waorani, then known as “Aucas.” Margaret Sparhawk, the heroine of Elliot’s book, is a sincere young woman who struggles with the challenging and unexpected complexities she encounters in her efforts to live out the gospel among an indigenous people. Through Margaret’s experiences, Elliot explores the idea that there is much more to missionary work than meets the eye, in fact, a great deal more than the folks back home ever imagine.
Elliot’s novel offers a helpful cautionary note to viewers of The End of the Spear, a feature-length film about the story with which Elisabeth Elliot is most often associated and which she did much to memorialize: the 1956 killings of her husband and four other missionaries by Waorani warriors in the rainforests of Ecuador. Elliot’s 1957 book Through Gates of Splendor, along with a Life magazine photo essay and wide coverage in both the secular and the Christian press, made the deaths of Jim Elliot, Peter Fleming, Ed McCully, Nate Saint, and Roger Youderian the defining missionary martyr story for American evangelicals during the second half of the twentieth century.
The End of the Spear tells this story from new points of view, those of Nate Saint’s son Steve and one of the Waorani, named Mincayani. Mincayani (played by Louie Leonardo) is a composite character drawn from the life histories of several Waorani warriors but closely associated with the real-life Mincaye, one of the men who speared the missionaries. The film follows Steve (played as a boy by Chase Ellison and as an adult by Chad Allen) and Mincayani from Mincayani’s childhood experiences of tribal violence in the 1940s and Steve’s loss of his father in 1956 to a dramatic moment of confrontation and reconciliation as adults in the 1990s. Along the way, it portrays the love between a father and his son and recreates the efforts by Nate Saint and his friends to contact the Waorani that led to their deaths. It also brings the story up to date by sketching the subsequent peaceful contact by three women and a child who lived among these same people. The four were Steve’s Aunt Rachel Saint (Nate’s older sister, played by Sara Kathryn Bakker); a young Waorani woman named Dayumae (Christina Souza), who was Rachel’s language informant; Elisabeth Elliot (Beth Bailey); and Elliot’s small daughter Valerie (Laura Mortensen).
In the screenplay, Dayumae tells the Waorani, who turn out to be members of her extended family, about God’s Son, “who was speared and did not spear back.” The American women live the message by not seeking to avenge their slain family members and by nursing the Waorani through a polio epidemic. Young Steve Saint, his mother, and his sister ride out the epidemic with the tribe and make other visits. The Waorani choose to embrace Christianity and end their revenge killings. Rachel Saint lives the rest of her life among the Waorani. In 1994, when she dies, the tribe invites Steve to take her place, an invitation he and his family accept. Then comes the film’s climax.
The End of the Spear does a number of things well. Moviegoers, including evangelicals familiar with the story, learn that the name of this people-group is “Waorani” (or “Waodani”) and not “Aucas,” a Quichua word meaning “savages” and used as a slur.1 The movie is fast-paced, intense (including intense violence), and offers some beautiful aerial scenes of rivers and forests in Panama, where it was made. The focus on Mincayani enables the screenwriters to provide some context for the Waorani who killed the five missionaries, including the high homicide rates and patterns of revenge killings that characterized their culture. The choice of the Embera, an indigenous group in Panama, to play the Waorani (except for lead characters), reflects a desire to offer some authenticity to the portrayal of indigenous people. Although the explicit Christian message is muted, the film tells a new generation about five young men who cared enough about the Waorani to risk their lives.
The film has gotten mediocre reviews in the secular press, some because of its message-based content, some (correctly, I think) because of the lack of character development and other weaknesses in the script. For example, the word “missionary” is not mentioned until well into the movie, leaving the uninitiated to wonder why in the world the hotshot young pilot and his friends want to find and meet the elusive and hostile indigenous people. Even so, the film grossed about $4.3 million on its opening weekend, ranking eighth in U.S. box office receipts. Evangelicals, by and large, have responded positively. The main controversy among conservative Christians has centered not on the film itself but on the choice of gay activist Chad Allen for the dual roles of Nate Saint and the adult Steve Saint.
My own unease with the film has a different source. It goes back to the complexity of missionary reality. There is both much more—and sometimes less—to the history of missionaries and the Waorani than meets the eye in the film. In its effort to inspire and entertain, the film presents a story with all the complexities removed. In doing so, it also employs a great deal of fictionalization. The movie is honest about this, although most people will miss the disclaimer: at the very end of the closing credits, a brief statement acknowledges composite characters and fictionalized incidents. Of course, audiences recognize that history on the big screen is almost always fictionalized. At the same time, however, the film opens with the words, “From a true story,” and that phrase is prominent in advertising. The movie is based on the dramatized documentary, Beyond the Gates of Splendor, released in 2005. Unfortunately, in an attempt to appeal to a commercial audience, The End of the Spear loses much of the documentary’s charm and other strengths while sharing its weakness of glossing over large portions of the past fifty years.2
Much of the fictionalization in The End of the Spear is done to make the plot, which focuses on Steve and on Mincayani, correspond to the larger narrative of the missionaries’ deaths and the Waorani embrace of peace/Christianity. In essence, the screenplay adds the legend of these two characters to the familiar, and in some circles almost mythic, story of the five missionaries. Historical connections do exist, but not as the film presents them. As a boy the real Steve Saint spent many school vacations with his Aunt Rachel and the Waorani, but he became an influential participant in Waorani history only at about the point where the film concludes. He and his family did not relocate permanently to Ecuador as the movie implies, but lived there for about a year between 1995 and 1996. They have maintained their involvement through extended visits. Saint’s autobiography, published in connection with the movie and bearing the same title, reflects this. The real Mincaye, who is Dayumae’s half-brother, participated in many of the historical events narrated in the film and was one of the “Palm Beach” killers. However, other Waorani warriors played more prominent roles, which led to the creation of a composite character.3
Given the plot, it is logical that the Steve Saint character is introduced as an eight-year-old, frightened by the risks his dad is taking. The real Steve turned five the same month Nate Saint was killed. In the film, Steve secretly radios Aunt Rachel to find out from Dayumae how to say “I am your friend, your sincere friend” in the Waorani language. A phrase is given which the boy carefully repeats and writes down. He teaches his father, and these will become his father’s last words, as well as the words Steve later uses to reach out to Mincayani.
The historical record shows that the men did try to get phrases from Dayumae, including ones that they thought meant, “I like you; I want to be your friend.” Yet one real-life complication is that the Waorani are a kinship-based society and had no words in their language for friend or friendship. Contrary to the movie, Dayumae spoke no English. As a young teenager, she fled the violence of her people to become a peon or virtual slave on a hacienda at the edge of the rainforest. There she spoke lowland Quichua, the language of the Indians around her. When Jim Elliot visited her to learn phrases in Wao tededo, the Waorani language, he did not realize that she spoke a version of her native tongue corrupted by Quichua influences. Elliot was fluent in Quichua, but neither he nor the others could have known that Wao tededo bore no relationship to that language. The End of the Spear certainly portrays the complete inability of the missionaries to communicate with the Waorani during their first peaceful encounter. Even so, “I am your sincere friend” is an invented theme that obscures the vast cultural divide between the Waorani and the missionaries who wanted to meet them.
More curious is the way the film depicts two events that have been the subject of controversy and criticism over the years: the shooting of a Waorani man during the attack on the beach and the circumstances surrounding the polio epidemic that struck the Waorani in 1969.4
Both the movie and historical accounts agree that the men took guns when they established their base camp in Waorani territory. They thought the Indians would flee if shots were fired, but they also believed that firing, even into the air, should be a last resort. The search party that arrived after the attack found a bullet hole through the plane window and some signs of struggle. Later, when the Waorani who participated began to talk about what happened, it became clear that a bullet from one of the pistols fired in the melee either hit Dayumae’s brother Nampa in the head or grazed his head. Nampa, who was one of the attackers, died sometime later—anywhere from a few weeks to more than a year afterward; the time frame is unclear. Accounts generally connect his death to the attack, reporting that Nampa died of the bullet lodged in his head or from an infection related to the wound, though this, too, is disputed.5
Since 1974, critics have charged Rachel Saint and her sending agency, the Summer Institute of Linguistics (now SIL International), with trying to conceal the gunshot and Nampa’s death in order to make the missionaries look more heroic. In fact, neither Saint nor the SIL denied the shot. The End of the Spear plays into the critics’ hands by offering only the slightest visual nod to the shooting of Nampa. One scene during the spearings shows an arm pointing a pistol to the sky, while another reaches around to knock it down so the pistol fires horizontally rather than vertically. There is no indication that the shot hit anyone.
Most accounts appear to suggest that the shot was accidental, in the context of a struggle, but we may never know for sure. Expressing a note of ambiguity might have added to the power of a film emphasizing forgiveness, reconciliation, and ending violence. The Waorani have known about Nampa’s wounding and death all along; perhaps American moviegoers should have been given the same opportunity. In the end, some Waorani still found the overall decision by the missionaries not to use their weapons in self-defense a significant witness to the potential of Christianity as a mechanism for ending violence.
The choice of the 1969 polio epidemic as a turning point in The End of the Spear seems particularly odd. Steve Saint was eighteen at the time, and, contrary to the movie, not present. No foreigner (non-Waorani) was there except for Rachel Saint, and her role was a mixed one of sacrifice, bravery, and a hard-headedness that cost dearly the very indigenous people she loved. In the movie version, the Aenomenane, downriver tribal enemies of the peaceful Waorani, arrive with their sick seeking help. The illness is diagnosed as polio and a six-week quarantine is imposed on the village. The quarantine includes young Steve, still not more than nine or ten, his mother, sister, Rachel Saint, and Elisabeth Elliot. The missionary women, along with Dayumae, Kimo, and other Waorani, demonstrate love for enemies by caring for the polio victims. They improvise wooden teeter-totters to rock the victims and help them breathe. Meanwhile, Steve’s nemesis, Mincayani, continues to resist the way of peace. He hunts game and throws it away rather than feed traditional foes. As the epidemic passes, old animosities dissolve and a peaceful kingdom dawns. “The teeter-totters had stopped and with them the cycle of revenge.” From this point, the film skips twenty-five years to Rachel Saint’s death in 1994, when Steve is challenged to pick up the mantle from his fallen aunt.
All this may make for good cinema, but it is deceptive history. The polio outbreak occurred at a time when Rachel Saint, a few of her colleagues, and a handful of Waorani believers were engaged in a controversial effort to find and relocate formerly hostile groups of Waorani scattered across their vast traditional territories. Saint and her colleagues pushed the relocation because they feared the Waorani would not survive hostile encounters with oil crews who were exploring their territory. They also believed that consolidating the groups would facilitate Christianization. The Waorani who responded to these efforts did so because they wanted spouses, trade goods, and peace (which also pretty much summed up their understanding of Christianity).
In September 1969, when polio first appeared, there were approximately 250 Waorani crowded in or near Tiwaeno, the clearing where Dayumae’s family first met Elisabeth Elliot and Rachel Saint in 1958. (Elliot left the Waorani in 1961.) About 60 percent of them were from two waves of newcomers who had arrived within a little more than a year. Some had already faced contact illnesses such as severe respiratory infections in the new location and were weak due to food shortages. Sixteen people died of polio, all from among the newcomers. About the same number were left handicapped, some of them taken to outside hospitals or clinics for treatment. Two people were speared in revenge killings, and one of the perpetrators died mysteriously shortly after a spearing.
Rachel Saint has been criticized because she dragged her feet on immunizing the recent arrivals. More important, after polio was diagnosed she ignored doctors’ advice to immunize immediately because she was afraid that adverse reactions would lead to violence. For three weeks, as the disease spread, Saint refused to allow a missionary doctor and a nurse to fly into the clearing because of the danger of spearing. They finally came anyway and were the ones who designed makeshift treatments like the teeter-totters. Saint worked to exhaustion caring for the sick. She also risked her life, even breaking spears, to enforce the Christian ethic of peace by confronting warriors bent on revenge killings after polio victims died. A few Waorani converts did care for their former enemies, reinforcing the association of Christianity and peace. Nonetheless, it was a difficult time. Baï, an influential warrior, left with members of one group. He called Tiwaeno “a place of death.”6
The crisis highlighted Rachel Saint’s unwillingness to let any other outsider live and work with “her” tribe, even while population influx overwhelmed her attempts to serve as sole resource—medic, missionary, linguist—among the people. In partial response to perceived shortcomings, during the next five years the SIL would add four more staff members to the Waorani “team.” Two of them, Catherine Peeke and Rosi Jung, worked with informants to translate the New Testament into Wao tededo. Another, Pat Kelley, served as a literacy instructor and developed reading materials, while Jim Yost did an anthropological field study of the Waorani, the beginning of a series of important research projects. Yost and his wife, Kathie, who arrived with toddler Rachelle and had two more children during the eight years of their assignment, were the first nuclear family of outsiders to live among the Waorani.
Although these individuals almost never appear in any of the well-publicized stories, they invested significant portions of their adult lives helping the Waorani face the pressures of increasing contact with the outside world. They gave them tools—Scripture and literacy—for an indigenous Christianity. They helped negotiate Ecuadorian citizenship and land disputes and worked with missionary nurses to train native health care promoters. Meanwhile, tensions increased between the SIL in Ecuador and Rachel Saint, along with outside criticism of the organization. As reporter Amy Rogers noted in the Pittsburgh Post-Gazette, the film “glosses over great accomplishments and simmering controversies” when it fast-forwards through this period.7
In the climax of The End of the Spear, Mincayani takes the adult Steve to the beach where his father was killed. The Waorani warrior digs up a tin cup and a battered photo of Steve that Nate Saint carried in his plane (the photo having survived forty years in the rain forest).
“They didn’t shoot us,” Mincayani says, part of an intense exchange.
“Your father was a special man. I saw him jump the Great Boa [a Waorani spirit guarding the afterlife] while he was still alive.”
A flashback shows what has been alluded to earlier: angelic figures above the riverbank and light flooding the area as the men died. Radiance comes down and touches the dying Nate Saint. “I speared your father,” Mincayani confesses to Steve. Mincayani points a spear at his own chest and urges Steve to use it. In a moment of long pent-up rage, Steve wants to do just that. Yet Saint draws on the deeper power of forgiveness and faith. “No one took my father’s life. He gave it.” Mincayani and Steve are reconciled and find peace.
In 1989, 33 years after the killings, several Waorani converts who participated in the spearings began reporting that they had seen figures, or lights, and heard singing above the riverbank after the men were killed. Apart from this reference, which the film accepts without question, the rest of the final scene is fictionalized. The actual Steve Saint and Mincaye were never estranged. This is the stuff of Hollywood, and a perfect way to end a missionary drama for the folks back home. The deaths of the five missionaries in 1956 became an archetypal narrative of missionary sacrifice and heroism for evangelicals in the United States and around the world. As believers, we respond to an apparent sequel that is just as dramatic and unambiguous.
The reality is more complicated. The challenge of reconciliation for the Waorani was never with the missionaries or their family members; it involved finding a way to end the bloody vendettas among themselves and to coexist with former enemies. The End of the Spear vividly conveys the Waorani as agents of violence. The movie is less successful in portraying efforts by the Waorani themselves to make peace once the gospel was introduced.8 Nor is the theme of reconciliation extended to missionary relationships. Such deeply committed and determined women as Elisabeth Elliot and Rachel Saint found it easier to forgive their loved ones’ killers and live among them than to get along with each other. The film also avoids the painful issue of children forgiving fathers who abandoned them in order to risk and ultimately lose their lives, even for the best of motives.
In the final voiceover, Steve Saint speaks movingly of the gain out of loss that has come to his family. Mincayani has lived to be a grandfather, and a grandfather not only to his own children’s children but also to Steve’s. The movie is silent on the complicated gains and losses—along with peace—experienced by the Waorani since some first embraced a form of Christianity almost fifty years ago. This includes the struggle to retain their cultural identity in the face of enormous pressures, while at the same time not remaining frozen in the past. It does not recognize the quiet efforts of others in addition to Rachel Saint—Peeke and Jung, for example—to help the Waorani survive in the modern world and to embrace a Christianity that means more than simply, “Thou shalt not kill.”9 They, too, have experienced gains and losses, seldom neatly balanced. For all their imperfections, they have tried to model Christianity in the midst of the beauty and the mud and the bugs and the dailiness of jungle living.
No one movie can do it all. And good cinema is often inaccurate history, though perhaps the bar should be higher when narratives explicitly concern God’s work in the world. The producers of The End of the Spear sought the authenticity associated with a true story without the difficulties that real history also brings. They gave the audience a stirring account, but not as much as we needed to know.
Kathryn Long is associate professor of history at Wheaton College. She is writing a book on the history of Waorani/missionary contact.
1. “Waorani” is a plural or collective noun; “Wao” refers to one person and is the adjectival form. To avoid awkwardness in English prose, I have chosen to use Waorani throughout as both noun and adjective.
2. Beyond the Gates of Splendor (Bearing Fruit Communications, 2002; Twentieth Century Fox Home Entertainment, 2005); available on DVD.
3. Steve Saint, The End of the Spear (Tyndale, 2005). Two fairly romanticized sources that include some of the Waorani history depicted in the film are Ethel Emily Wallis, The Dayuma Story (Harper & Brothers, 1960); and Wallis, Aucas Downriver: Dayuma’s Story Today (Harper & Row, 1973).
4. Although it was written in the late 1970s, the most credible summary of criticism published in English, including these two issues, remains chapter 9, “The Huaorani Go To Market,” in David Stoll, Fishers of Men or Founders of Empire? The Wycliffe Bible Translators in Latin America (Zed Press, 1982). Stoll was no friend of Wycliffe or the SIL, but he did extensive research and refrained from the kind of spurious charges that have since been made. A Spanish source for extensive information on Waorani history, as well as criticisms against American Christians who have worked among the people, is Miguel Angel Cabodevilla, Los Huaorani en la Historia de los Pueblos del Oriente (CICAME, 1999). The quality of some of the interviews and materials Cabodevilla has compiled is uneven, but his editorial voice generally is perceptive and balanced.
5. All English accounts are translations from versions of the story told by various Waorani eyewitnesses. Some early variations may have resulted from the fact that Rachel Saint was still learning the difficult Waorani language and misunderstood some elements of the accounts. Saint’s account appeared in an epilogue to the 1965 edition of Wallis, The Dayuma Story; see also an abridged version by Saint, “What Really Happened, Told for the First Time” in Decision, January 1966, p. 11. Anthropologist James Yost and writer John Man conducted extensive interviews in April 1987 with Geketa, the leader of the spearing party, including a visit to the site of the attack. According to a translated transcript, Geketa indicated that the bullet entered just above Nampa’s eye and lodged there. An edited version of Geketa’s story appeared in the SIL film, produced in 1988, “Tell Them We Are Not Auca We Are Waorani,” where Geketa stated, “The man [missionary] shot Nampa.” Steve Saint reports a version where a Waorani woman grabbed the arm of a missionary with a gun. “Nampa was grazed by a gunshot and fell down hard.” He died a year later “while hunting.” See Saint, “Nate Saint, Jim Elliot, Roger Youderian, Ed McCully, and Peter Fleming, Ecuador, 1956: A Cloud of Witnesses,” in Susan Bergman, ed., Martyrs: Contemporary Writers on Modern Lives of Faith (HarperSanFrancisco, 1998), p. 151. The account in Saint’s book, The End of the Spear, is similar.
6. Wallis, Aucas Downriver, p. 100.
7. Amy Rogers, “Ecuadoran tribe transformed after killing of 5 missionaries,” Pittsburgh Post-Gazette, January 8, 2006. Among others, I was interviewed for this article. After a painful dispute, the SIL asked Rachel Saint to leave the Waorani work in 1976. She did and later retired from the organization to return to Ecuador as a private citizen and live among the Waorani.
8. For Waorani agency and the missionary contribution, see James S. Boster, James Yost, and Catherine Peeke, “Rage, Revenge, and Religion: Honest Signaling of Aggression and Nonaggression in Waorani Coalitional Violence,” Ethos, Vol. 31, No. 4, pp. 471-494. Some isolated violence has continued.
9. These have included U.S. missionaries and Latin American nationals representing Mission Aviation Fellowship, the Plymouth Brethren, HCJB World Radio, and the Christian & Missionary Alliance.
Thomas Albert Howard
Is a Catholic out of place on Wheaton’s faculty?
The Wall Street Journal did evangelical higher education and, just maybe, the task of Christian unity a favor when it published a front-page story on the plight of Joshua Hochschild (January 7, 2006). A philosophy professor at Wheaton College, Hochschild was dismissed from the faculty after converting to Catholicism. The president of Wheaton, Duane Litfin, ruled Catholic theology incompatible with Wheaton’s statement of faith, to which all faculty assent at the beginning of their careers and which they reaffirm upon signing their annual contracts, a customary practice at many evangelical colleges.
L’affaire Hochschild, as we might call it, is but the latest manifestation of a simmering conflict of opinion over how evangelical colleges should posture themselves toward the future. In many respects, the episode at Wheaton mirrors another celebrated incident from the 1980s, when the literary critic Thomas Howard (no relation, oddly enough) was obliged to resign from Gordon College in Wenham, Massachusetts (my home institution) after converting to Catholicism. Like Hochschild, Howard wistfully boxed his books, but his departure raised more questions than it settled. Hochschild’s departure raises similar questions.
Is an evangelical liberal arts college (i.e., not a seminary and not a church), and one that prides itself on intellectual engagement, served by a statutory environment that effectively excludes all Catholics, and indeed most non-evangelical Christians, from the faculty ranks? Having commendably avoided the seductions of secularism in the 20th century, do evangelical colleges—such as, say, Wheaton, Taylor, Gordon, and Westmont—now suffer from another problem: superattenuated retrenchment, a defensiveness increasingly unbecoming in a world in which many evangelicals look upon the legacy of Mother Teresa about as favorably as that of Billy Graham? By refusing even the possibility of a single Catholic faculty member—including self-described “born again” Catholics or those with deep sympathy for Protestant theology—are evangelical colleges failing to take seriously the biblical mandate for Christian unity? Would the prospects of genuine ecumenical work be improved if evangelicalism’s best and brightest had a chance to rub shoulders with a Catholic scholar or two during their college years?
While the Wall Street Journal article prompts such questions, it misconstrues an important aspect of the contemporary Christian intellectual scene. Its author attributes the firing of Hochschild to a “new orthodoxy” sweeping through church-related higher education, a novel vigilance to uphold the religious mission of Christian colleges. This rings true in some respects, especially for mainline Protestant or Catholic institutions trying to recover religious identities compromised by historical forces analyzed trenchantly in James Burtchaell’s The Dying of the Light (1998).
But the problem with many evangelical colleges is not necessarily the dying of the light, but rather hiding it under a bushel, a determined attachment to the certainties of a subculture derived from fairly recent historical experience at the expense of new, promising opportunities for theological depth and ecumenical engagement. Indeed, the phrase “new orthodoxy” for many evangelical scholars today, far from conjuring up strictures in hiring, will call to mind Thomas Oden’s recent book, The Rebirth of Orthodoxy: Signs of New Life in Christianity (2003). A Methodist theologian at Drew University dispirited by the trajectory of liberal Protestantism, Oden has long called for a “new ecumenism,” not the ecumenism in which social concerns often edged out doctrinal considerations, but a unity built around Nicene Christianity, a robust doctrine of the church, and reengagement with a shared apostolic and patristic heritage. Oden’s call for an orthodox ecumenism—one that elides while still recognizing the significance of 16th-century conflicts—has been borne out in numerous scholarly projects in recent years. The cumulative impact of these efforts on evangelical thought and culture has been estimable.
Consider, for example, the trends analyzed in Colleen Carroll’s The New Faithful: Why Young Adults are Embracing Christian Orthodoxy (2002). A journalist interested in the religious climate among young people today, Carroll documents the enormous interest in ancient, liturgical Christianity among younger, educated evangelicals—sometimes leading to conversion to Orthodoxy or Catholicism, more often leading to greater attentiveness to tradition and ecclesiology, almost invariably leading to criticism of stale Protestant-Catholic polemics and a weariness with the fearmongering anti-Catholicism that has pervaded much of twentieth-century evangelicalism.
And this brings us to the rub. The Hochschild case at Wheaton has a recognizable generational-cum-theological aspect, a conflict between those who want to circle the wagons around 20th-century evangelical doctrinal formulations (above all, a pinched definition of biblical inerrancy increasingly qualified or disavowed by evangelical theologians), encoded pointedly in faith statements, and those who believe that the fullness of Christian expression predates and transcends the wisdom of the last few generations. Put differently: on the one hand, younger faculty and many students (with some sympathetic administrators and trustees) increasingly feel that if evangelical institutions do not broaden their faith statements in the direction of orthodoxy (in Oden’s sense), they risk intellectual narrowness and impoverish students’ ability to act upon Scripture’s ecumenical mandate. On the other hand, many senior administrators, such as those at Wheaton, and many trustees (with some sympathetic faculty and students), equate tampering with existing faith statements with a dive onto the slippery slope of secularism. If colleges alter their faith statements, President Litfin of Wheaton writes in his recent book Conceiving the Christian College (2004), the ultimate destination is “entirely predictable”: “the institution will wind up just another formerly religious school, basically secular in reality if not in name.”
To be fair, Litfin’s worries are not unfounded: the evangelical schools that make up the Council for Christian Colleges and Universities (CCCU) don’t need to look very far to find examples of once-Christian colleges long estranged from their original mission. As I have become acquainted with robustly Christian institutions and those living off the capital of a former glory, I’m persuaded that the future lies with the former, not the latter. Judicious hiring practices and faith statements therefore remain of abiding importance, not only to ensure a clear mission but—and one can argue this on liberal grounds—to nourish a rich institutional diversity in higher education. CCCU colleges have contributed greatly to this diversity, not by “celebrating diversity” in the abstract, but by being attentive to their actual mission.
And yet—and yet. As Carroll’s The New Faithful and other analyses suggest, we are living in a new era. Not only are the anathemas, divisions, and stereotypes of the 16th-century breach breaking down all around us at last, but also the fundamentalist-modernist controversies of the early 20th century, which account for much of the embattled sense of present-day evangelicalism, are increasingly remote from current challenges. If one uses political clout, publishing notice, and church attendance as barometers of cultural authority, evangelicals are now in the driver’s seat with respect to certain aspects of American society. (It bears remembering that power corrupts, as Lord Acton famously said, and power that retains an embattled sense of powerlessness is, well, … Acton would have something pithy to say about this too.)
Signs of the new understanding across formerly hostile lines are plentiful and, for some, the catalogue below is a familiar one, but it’s important for Christian educators, not just scholars, to take these developments into account as they consider the future. Particular importance should be attributed to the following:
• The watershed of the Second Vatican Council (1962-65), especially the Council’s Decree on Ecumenism (Unitatis Redintegratio), which made clear that “both sides were to blame” for the “crisis” of the 16th century, that “truly Christian endowments” exist outside the Church of Rome, and that greater cooperation with “separated brethren” was a theological necessity.
• A massive shift in opinion over the past few decades among evangelicals in their attitudes toward Catholics, from viewing them as apostates and threats to the American way of life to partners promoting a “culture of life”—or what Timothy George of Beeson Divinity School has memorably described as an “ecumenism of the trenches.”
• The historic papacy of John Paul II and his efforts toward mutual understanding, expressed best in Ut Unum Sint (“That they may be one”), in which he even suggested that the Petrine Office is “open to a new situation” to promote ecumenical progress.
• Tremendous theological rapprochement, especially on the crucial doctrine of justification, the major point of contention during the Reformation. For many, the signing of the Joint Declaration on the Doctrine of Justification in 1999 by representatives of the Lutheran World Federation and the Roman Catholic Church signaled the beginning of a hitherto unimaginable theological era.
• A greater understanding of the “catholicity” of the Reformation itself, as promoted by leading American Protestant theologians such as Carl Braaten and Robert W. Jenson in their book, The Catholicity of the Reformation (1996).
• The shift of Christianity’s center of gravity from the Atlantic North to the Global South and an attendant necessity for cooperation between Catholics and Protestants, lest 16th-century-style conflicts repeat themselves to the detriment of a compelling witness.
• The willingness of key thinkers, Protestant and Catholic, to point out the closing gap between former divisions. Witness Karl Lehmann and Wolfhart Pannenberg’s The Condemnations of the Reformation Era: Do They Still Divide? (1990) and Mark Noll and Carolyn Nystrom’s Is the Reformation Over? (2005), extensively treated in these pages.
• Numerous collective efforts of unofficial theological cooperation, most notably in this country in the Evangelicals and Catholics Together initiative spearheaded by Richard John Neuhaus and Charles Colson.
• Greater efforts among some evangelical colleges to foster ecumenical understanding. The faculty at Gordon College, I’m proud to say, has begun relationships with St. Anselm’s College, a neighboring Benedictine institution, and Hellenic College, a Greek Orthodox college, for the purpose of conversation and mutual instruction.
• The considerable influence of what we might call the “Taizé ethos” among young people, and this French community’s commitment to serve as “a concrete sign of reconciliation between divided Christians and separated peoples.”
What do these developments add up to for evangelical higher education in general and for the maintenance of exclusionary faith statements in particular? How should schools proceed prudentially in this heady climate, faced with partisan A, who would like to abolish all faith statements in the name of academic freedom, and partisan B, who would reify current arrangements in perpetuity? Both partisans, I should reiterate, have good cause for their arguments and both recognize that genuine principles of intellectual integrity are at stake, not to mention practical concerns about alumni loyalty, faculty morale, student recruiting, and the like.
Let me suggest two provisional measures, which, although perhaps not entirely pleasing to the partisans, might at least create the necessary space for greater dialogue and understanding.
First, schools might consider establishing a working study group to examine the current faith statement, its rationale, and the intervening theological developments since its inception. For many evangelical colleges, present-day faith statements precede the aforementioned developments; many reflect lingering overtures to fundamentalist positions in the fundamentalist-modernist controversies of the twentieth century, which played out in a general Protestant milieu of entrenched anti-Catholicism. Such a study group could read together various documents, and present findings and recommendations to the faculty, administration, and board of trustees. Conversation often produces more conversation, and questioning can lead to questioning, but you have to start some place. Redoubled stasis is rarely the hallmark of dignified purpose.
Second, in light of the aforementioned developments, schools might consider an “exception clause” to current hiring practices. While maintaining current faith statements, this would recognize that certain scholars exist, Catholics but also Orthodox or mainline Protestants, who, while unable to sign the current statement, would not only respect but sympathetically engage the mission of the institution and offer themselves as valuable conversation partners. Such an exception clause could even be designed restrictively, requiring for any given candidate the assent of the faculty senate and key administrators. This would not swing open the floodgates, but it might create the statutory possibility of, say, a Catholic of good will donning cap and gown on evangelical campuses, countering the tendency of such institutions to become, as someone quipped, coddling cocoons of the like-minded. Students would benefit enormously, for they would have the opportunity to hear the actual idiom of a “separated” brother or sister and they would thereby gain greater understanding of the distinctiveness of their own faith. All too easily, I have discovered, evangelical students can finish their undergraduate years with misperceptions of Catholicism inherited from their subculture largely intact. He who knows only one, knows none, the poet Goethe said about language. The same applies to expressions of the faith; young evangelicals need encounters with non-evangelical Christians not just to understand “the other” but to understand themselves.
What is more, an exception clause would amount to a principled measure on behalf of the institution. In the history of higher education, indeed in the history of most institutions, statutory changes are provoked, belatedly and awkwardly, by crisis and controversy. One could readily imagine, for example, a faculty member converting to Catholicism and then suing a college for discrimination if forced to leave her post. Presently, the courts might give preference to the institution in such a case, but this is a trend in jurisprudence that colleges shouldn’t bet the farm on. Or, a college might act unbecomingly from pecuniary interest alone, after, for instance, some future study demonstrated that a significant number of ecumenically inclined (“new faithful”) parents were withholding their children and dollars from evangelical institutions for fear of a narrow education. (Alas, from anecdotal evidence, I know this is already taking place—and I have even heard promising young evangelical scholars express a preference not to teach at evangelical colleges, fearful of too restricted a range of theological outlook.)
Instead, it’s better to proceed upon theological principle and an astute reading of the times. For Christian educators, the virtue of prudence might sometimes demand an impassioned defense of the status quo, but this is not an inexorable law. What appears as a high-minded defensive strategy from present seats of authority risks appearing as unimaginative and narrowly preservationist from the standpoint of the future. And this precisely is the challenge (and opportunity) of evangelical educational leadership today. In addressing the challenge, it bears remembering that the task of leadership is not simply to express the loyalties of one’s constituents, but also to educate these loyalties in the direction of more capacious understanding and deeper propriety. Such actions might prove unpopular in the short term, but right action and popularity have always had a strained relationship.
The present challenge is especially pertinent to the current generation of educators who stand proudly in the reform or neo-evangelical tradition, associated with figures such as Carl Henry and Harold J. Ockenga. These thinkers, it will be remembered, rose to the occasion in the mid-20th century to challenge American fundamentalism for shortchanging the life of the mind and downplaying the need for social engagement. Carl Henry’s Uneasy Conscience of Modern Fundamentalism, in particular, stands out as a signpost of forward thinking in an otherwise uninspiring time for conservative Protestantism. Thankfully, great strides have been made with respect to intellectual seriousness and social engagement since the mid-20th century—and it bears noting that already in 1947 Henry spoke of a “truly spiritual ecumenicity” and the need to reconnect American evangelicalism to “the Great Tradition” of historic Christian orthodoxy.
Nevertheless, evangelical higher education still has an uneasy conscience to reckon with. The issue today is different from but perhaps not altogether unrelated to the one that Henry and Ockenga faced. In short, it’s a failure to understand that cultural authority necessitates greater magnanimity toward others, and that Christ’s words about Christian unity remain an imperative, not an option. Evangelicals need not fear the occasional non-evangelical Christian scholar in their midst, treating her like an infection to be excised. Rather, evangelicals should and can develop the institutional self-confidence to play the role of magnanimous host, recognizing in fact that there are certain crucial “other” voices that they should want among them. Indeed, what reflective evangelical parent in America today would not want the future Flannery O’Connor, G. K. Chesterton, or J. R. R. Tolkien to instruct their children?
But at an even deeper level, evangelical institutions should question the wisdom of current arrangements because they work against one of evangelicalism’s strengths: taking seriously the Great Commission. In Jesus’ high priestly prayer in John 17, He prays explicitly for the unity of the church: “I ask not only on behalf of these, but also on behalf of those who will believe in me through their word, that they may all be one.” Unity, and the fellowship it presupposes, makes truth attractive to those outside the fold; for, as Christ continues, “The glory that you have given me I have given them, so that they may be one, as we are one, … so that the world may know that you have sent me and have loved them even as you have loved me” (NRSV).
In the final analysis, the decision to welcome sympathetic Catholic scholars in the house of evangelical education should flow from the heart of the Gospel itself: from the evangelical concern about the Great Commission. Evangelism divorced from ecumenism, rightly understood, vitiates the cause it putatively serves. Evangelical liberal arts colleges are neither missionary agencies nor churches; they are not, in other words, on the front lines in proclaiming the gospel, baptizing and making disciples. But they are seats of intellectual growth, where young people can learn to think seriously and theologically; where ideas can be exchanged and improved upon; and where divisions within the church’s history might be understood and, with grace, overcome. Without an occasional flesh-and-blood Catholic on the faculty, this task is enormously compromised. And herein lies the cause of a new uneasy conscience.
Thomas Albert Howard is Associate Professor of History and the Director of the Jerusalem & Athens Forum at Gordon College. He recently published Protestant Theology and the Making of the Modern German University (Oxford, 2006).
Sarah Hinlicky Wilson
A female apostle? Impossible!
The Bible on your shelf doesn’t actually exist. No exact original of it is to be found in Greek, Syriac, or any other ancient language. It is, instead, the product of hundreds of compiled parchments and papyri, containing big blocks of text or little bits of it, some ancient, some more recent, some ancient but recently discovered. Along the way they got copied into uncials and minuscules, dubbed with names to inspire novels (Codex Sinaiticus; Philoxeniana), and now are signified by sigla in the clearinghouse Nestle-Aland Novum Testamentum Graece.
It would seem to be a straightforward work of science to sort, date, and judge each of the texts, and in many ways it is. There are rules for comparing scraps of majuscules and scraped palimpsests. The scriptural scholars of bygone eras help contemporary ones through their own questions scrawled in the margins of their Bibles. And it’s not too hard to recognize and correct the ever-so-slightly incorrect transcriptions of some sleepy monk in a tomb-cold scriptorium. But the letter is not copied alone; so are the spirit and the meaning. Text critics are inevitably exegetes, and aspiring exegetes must also be text critics. This is Eldon Jay Epp’s basic principle for biblical studies.
Which brings us to Romans 16:7, embedded in the oft-overlooked collection of greetings to various Christian luminaries at Rome. Here Paul hails his “relatives who were in prison” with him, “prominent among the apostles” and “in Christ before” he was. This impressive pair is Andronicus and his coworker. The latter is sometimes called Junia—thus the KJV, every other English translation up till the 1830s, and nowadays the NRSV. The lion’s share of recent English Bibles, though, give the name Junias, with the –s on the end. The RSV specifies Andronicus and Junias as “my kinsmen” and “men of note among the apostles”; the Good News generously adds a footnote after Junias suggesting the name “June”; the NIV—most widely read of all contemporary versions—offers no footnoted alternative to Junias at all. The matter at stake in the choice of names is the simple question asked of everyone upon entry into this world: Is it a boy or a girl?
Until about a hundred years ago, the consensus was universal. Junia was a woman. Every church father, without exception, thought so. Even John Chrysostom, not exactly famous for positive thoughts about the female sex, commented, “How great the wisdom of this woman must have been that she was even deemed worthy of the title of apostle.” However, in a curious twist of fate, the church a millennium and a half later concluded not that her wisdom was so great, but that, if she was indeed worthy of the title of apostle, then she wasn’t a she at all. The very liberal vanguard that exalted the historical-critical study of the Bible found the leadership of a woman unthinkable, and so made Junia into Junias, a man—even though there is not a single record of the name Junias anywhere in ancient Rome.
The switcheroo from female to male was possible, in the first place, because the apostle’s name appears only once, in the accusative form “Junian.” (Exegetes need to be not only text critics, but first-rate grammarians as well.) The suffix –n is found on both masculine and feminine nouns. The one textual clue to help choose between them, in this case, is an accent mark: an acute accent would mark the name as the feminine Junia, a circumflex as the masculine Junias. The earliest manuscripts carry no accents at all, but when later scribes do start inserting accent marks in their fresh copies, they always choose the acute and never the circumflex. The only variant that they display is to the name Julia. Even this mistake is telling: Julia is another woman’s name (in fact, the most popular Roman name for women), and probably first appeared when one of those notoriously sleepy scribes skipped ahead to Romans 16:15 and borrowed the name from there. The grammatical and even accidental choices of the medieval copyists reveal the whole interpretive tradition behind them.
And yet the masculine construal of the name—the conviction that an apostle must be a man—begat great scholarly ingenuity. Sometimes Roman surnames were contracted to shorter forms; an example is Patrobas in Romans 16:14, which is short for Patrobios. Junias, then, was proposed as a contraction of the attested Roman surname Junianus. There isn’t the slightest shred of evidence that this is what happened, yet somewhere along the way the contracted-Junianus theory turned into a sure thing. Epp documents how the idea grew from conjecture to certainty in its own kind of scribal-transmission error. It culminated in the 1927 Nestle Greek New Testament, where the distinctly masculine version of the name, complete with circumflex, was offered as the definitive and undisputed reading—even claiming the oldest unaccented texts in its defense! Only in 1998 did the standard Nestle-Aland and United Bible Society editions replace the masculine with the feminine name. Accordingly, few English translations reflect the correction.
Is it really possible that plain textual evidence could be so obscured by plain bias? If John Chrysostom, of all people, allowed that Junia could be a woman and an apostle at the same time, could the progressive leaders of the twentieth century be guilty of such blatant prejudice? Epp cites the report of Bruce Metzger’s Textual Commentary to the UBS (2nd ed.), from as recently as 1994, explaining the dispute about the name:
Some members, considering it unlikely that a woman would be among those styled “apostles,” understood the name to be masculine (“Junias,” taken as a shortened form of Junianus; see the Wörterbuch, pp. 70f.). Others, however, were impressed by the facts that (1) the female Latin name Junia occurs more than 250 times in Greek and Latin inscriptions found in Rome alone, whereas the male name Junias is unattested anywhere, and (2) when Greek manuscripts began to be accented, scribes wrote the feminine form of the name.
In other words, the Junia reading had textual evidence on its side; the Junias reading had none; yet until less than a decade ago, the latter still won the day.
Could Paul have called a woman an apostle? He certainly did not use the term lightly. He was compelled to defend his own apostolicity, as the last and untimely born, to the disciples of Jesus, whose friendship with the Lord automatically granted them apostolic status. It can only be the highest of Pauline praise to call Andronicus and Junia prominent among the apostles.
That he was capable of applying this praise to a woman is suggested not only by the textual evidence but by the context of Romans 16 as well. A woman and deacon by the name of Phoebe is entrusted with the letter itself. Seventeen men are greeted along with eight women (omitting Junia), but of the twenty-five, seven of the women are described as contributing the most to the churches, while only five men receive that distinction. Prisca is listed ahead of her husband Aquila, and in two places (vv. 6 and 12) four of the women are said to have “worked very hard,” the same verb Paul uses to describe his own apostolic ministry in 1 Cor. 4:12, Gal. 4:11, and Phil. 2:16.
The early church thought that Junia the woman was an apostle, yet remained indifferent to the implications of her status. The modern church disbelieved the apostolicity of any woman, and so ignored the hard evidence. Between the textual and contextual witnesses, in the interplay of exegesis and grammar, Epp draws the reasonable conclusion that there was indeed a female apostle named Junia. But, he notes, “human beings carry out not only textual criticism and interpretation, but implementation as well, and that makes all the difference.”
Sarah Hinlicky Wilson is a doctoral student in systematic theology at Princeton Theological Seminary.