What’s Wrong with the World

The men signed of the cross of Christ go gaily in the dark.

About

What’s Wrong with the World is dedicated to the defense of what remains of Christendom, the civilization made by the men of the Cross of Christ. Athwart two hostile Powers we stand: the Jihad and Liberalism.

Disenfranchising the Dead

Postmodernists hate the democracy of the dead. They hate the reality. They presumably also hate the phrase, coined famously by G.K. Chesterton. All those dead white males, pshaw!

Writing as I do here for a web site that bears the name of one of Chesterton's books, it would not behoove me to overlook a particular academic phenomenon that has recently come to my attention. Not that I would otherwise have overlooked it. Indeed, twenty years ago, when I was fighting the wars of the liberal arts with a (metaphorical) claidheamh mor in the English and Philosophy lounges of Vanderbilt University, the only thing that would have stopped me from blogging about it in a white heat was the minor historical fact that blogs did not yet exist. Still, a number of professors and fellow students would have gotten an earful--either as patient, sympathetic listeners or as perpetrators. That despite the fact that I had no strong contacts with the History department.

I have recently learned from sources that professors in the History department of Western Michigan University proscribe the use of the expository present tense in writing about historical personages, authors, or ideas. In other words: According to these professors, students are not supposed to write something like, "Aquinas famously tells us in his Summa Theologiae that 'the existence of God can be proved in five ways[.]'" Or "Moreover, though Aquinas regards the human soul as subsistent, he does not think of it as a substance in an unqualified way, but rather as a kind of incomplete substance[.]" These are two sentences taken at random from the pages of Ed Feser's Aquinas. The philosophers among my readers could of course cite hundreds, nay, thousands, of sentences like them from works in philosophy, where the expository present is (last I checked) still ubiquitous. Even, perhaps especially, when referring to people who are dead.

But in the new historical mode, all of that is o-u-t. Aquinas, rest his soul, is dead. Let the dead bury their dead. Those who wish to talk about him, if we must talk about him, should keep the coffin lid firmly nailed down and the Angelic Doctor firmly boxed into his own period. The very notion of the Great Conversation, where "he being dead, yet speaketh," where mind meets mind maugre the little barriers of time and space, is anathema.

It's not that I didn't understand in the abstract the way that the postmodern and historicist mind works. I just hadn't previously heard of the diabolically clever method of trying to ensure that the concept of the Great Conversation never occurs to students: Make sure that they never write, "Aquinas argues..." or "Aristotle claims..." or "Berkeley holds..." Chronological snobbery is now made visible in language. Lex orandi, lex credendi. If Aquinas, Aristotle, and Berkeley are allowed to speak to us now, we might have to take them seriously. If we write only, "Aquinas argued," "Aristotle claimed," or "Berkeley held," we can keep Aquinas, Aristotle, and Berkeley safely at arm's length. They were, after all, merely men of their times.

Since I prefer to end a post like this with concrete suggestions, here are a few, of different types:

--We should find out how widespread this pernicious nonsense is. If you think your Christian alma mater is the greatest thing since sliced bread and you contribute, find out if their humanities departments are destroying the concept of the Great Conversation and take that into account in your alumni giving decisions. "Commandment IX" on this page contains a bit of weird historicist pedantry that comes all too close to the proscription just described, though it does have one small loophole.

--Teach your children about the democracy of the dead and the conversation across time. Teach them how exciting it is. Arm them against people who would try to force them to write without using the expository present. Tell them you'll be proud of them if they resist. Tell them why this is important.

--Ditto for your students if you are a teacher at any level of education.

--If you know that some other department in your institution has professors who push this, tell students that you will back them up if they resist. You can be especially effective doing this if, e.g., the student is a major in your department and professors in your department encourage the expository present, or you are an outside member on the student's thesis or dissertation committee.

--Never, never think that things like this are trivial.

Comments (56)

Amen to all that, Lydia. Just for the record, we in the English Department here are absolutists on using the historical present when using the words and ideas of those long gone as well as those just published yesterday. I haven't heard any complaints from our majors about being instructed to do otherwise in their other classes (and they DO tell us when other professors do things differently), but I'll keep my ears open.

Thanks, Beth, I would have expected no less. I hope that the same is true of Hillsdale. I shd. specially ask someone about that.

I thought of actually quoting the "Commandment IX" which I cited in my practical applications. The only reason I didn't was one of flow. It is from "Ten Commandments of Good Historical Writing" by someone named Theron F. Schlabach and is posted at the web site of St. Thomas University (!) in St. Paul, Minn. Here are some pompous samples:

The "historical present" causes more confusion than it is worth. Sense of time and context is first among the historian's contributions. Writing of past events in the present tense is usually evidence that the author lacked appreciation for historical setting.

Historical essays and book reviews present special problems. But even the author's act of writing a book took place in the past, even if only a year or two ago. Thus, Hofstadter ARGUED, not "argues", in his Age of Reform. Hofstadter is now dead, and presumably cannot argue (present tense). Even if he were still living, we do not know that he has not changed his mind; authors do change their minds.

Schlabach (whoever he may be) does allow the following rather bizarre exception:

On the other hand, the book, if it is the subject of the verb, does always continue to make the same point, so that you do use present tense. Thus, Hofstadter's Age of Reform "argues," not "argued".

I am told that even that exception would not pass muster at WMU among the history profs.

Hmm. If I am referring to an actual *event* that happened in the past, then of course I use past tense: "John Henry Newman wrote The Idea of a University during the Victorian era." However, when I discuss his ideas/words from that book, then I use the historical present: "In it, he explains the purpose of higher education from a Catholic perspective" -- because he is always explaining it to us, right now, in the text we have. The event of writing was a discrete action of the past; the ideas contained in the writing are always present and active. This is not a difficult concept, and not merely conventional -- as you say in the post, when we relegate the ideas to the past, then it makes them seem unimportant to us in the present, which is foolishness at best. And our students today have a hard enough time with the concept of respect for the past; they tend to think that nothing older than they are is of much value. The appropriate use of the historical present can be a help to us in our attempts to change that thinking.

And by the way, I don't ever let my students write that a book or article argues something. Only people can argue (say, write, assert, etc.). If the author is unknown (say an editorial or something), they are to write "the author argues" . . .

And by the way, I don't ever let my students write that a book or article argues something. Only people can argue

Exactly! If Theron F. Schlabach is so darned concerned about precision, to the point that he pontificates about the fact that Hofstadter is dead, why does he recommend making a book the subject of the verb "argues"?

Miss Beth is absolutely correct: it is putrid prose to make the work itself the subject in a sentence. Dr. Schlabach's injunction about the past tense is simply absurd and promotes boring writing. His "exception" about making books the subjects of sentences violates The Historian Book Review Office Style Rule #10. As a former senior editorial assistant for said office, I accept that more or less as gospel. (And to boot, you can see that Dr. Spall's guide includes examples of the historical present.)

Agreed that this is important. I'm more on the side of the history department here, though I haven't thought about this idea at all till reading this post.

First of all, the argument against the "historical present" (a new phrase to me) doesn't strike me as post-modernism. It looks like good old-fashioned 19th-century historicism to me, and that's apparently how it's being defended. Conservatism traditionally has a strong historicist strain to it, too. Remember that was Leo Strauss's complaint against Edmund Burke. Studying historical thought in its concrete, historical context, rather than abstracting so-called universal principles and ahistorically applying them to the present: conservatives have traditionally supported the former and opposed the latter. Carl Schmitt is another good example of that.

It's not an attack on the Great Conversation (another new phrase to me). If we weren't interested in these ideas, we wouldn't be writing about them. To take your example, I'm sure that very few students of Aquinas study him from some purely disinterested motivation to learn about the times in which he lived. People study him because they're interested in his ideas. They're conversing with Aquinas. Just put his words in context, is all.

There are two extremes to avoid: quoting people out of historical context, and treating their words as if they apply only to their own time. I think that our tendency, in our Enlightened age, is to err towards the former. Avoiding the historical present in scholarly writing seems like a good corrective.

The other issue here is "the book argues that...." I'm definitely in favor of that formulation, without being against "the author argues that...."

As I understand it, there's a traditional distinction between two categories of "author": the implied author, and the actual, or historical, author. I think a lot of the confusion and error comes from conflating these categories.

The actual author is the real, flesh-and-blood person who sat down and wrote the book. He has a history. The implied author is a figure of speech that we create based on the text itself. He's determined completely by the text. If all we have is the book itself - for instance, if we don't know the identity of the author - then when we say "the author argues that...," we can only be talking about the implied author. Everything we say about this imaginary construct is derived from the book. Therefore, "the book argues that..." is actually more immediate, more "literal."

For example: Suppose that, unknown to the reader, there's a typo in the book, or maybe some intentional edit, that significantly affects the argument. That's the book's argument that's affected, and by definition the implied author's. But the actual author's argument is different. By attributing the book's argument to the actual author, we're making a substantive mistake in this case. In every case, even if we say, "the [actual] author argued," we're making a category mistake; of course we're usually right in substance, which justifies our conflating the two categories of author. And if we say "argues," then we're making a much worse mistake; as was said above, the actual, historical author argued the point when he wrote the book; he's not arguing it eternally.

I haven't heard of the Great Conversation before and assume it means something like the intellectual legacy of eminent thinkers which we study and with whom we try to maintain a fruitful 'dialogue' that continues through time.

The prohibitions against using the expository present tense (when writing about historical figures) are presumably driven by ideology rather than mere pedantry. Is this an attempt to censor ideas that are anathema to the postmodern mind - if there is such a thing - by leaving them inert in an historical context where they can do no harm?

It surprises me a bit that professors at American universities appear to pay close attention to questions of style when supervising dissertations. Apart from the usual reminders about the mischief of plagiarism, proper citation of sources and the like, none of my tutors gave me any advice about how to write a thesis. However, they did criticize sloppy writing, poor organization of material, lack of originality, etc., when the work was handed in.

Aaron, I doubt that anything I say will convince you, but does it not seem rather, shall we say, extreme historicism to insist that students abandon the historical present in writing about the ideas in an ancient book? Are you seriously of the opinion that Ed Feser is in some danger of confusing his readers (or himself) by placing Thomas Aquinas insufficiently in the proper context in the history of ideas and needs the "corrective" of eschewing the expository present?

As a philosopher, I find such an idea frankly risible. I've been surrounded by historians of ideas for twenty-five years, all of whom use the expository present. As I said in the main post, it is ubiquitous in philosophy. While of course I have encountered discussions that I consider to be anachronistic, the idea that anachronism should be countered by the draconian banning of the expository present never occurred to me and seems completely wrong-headed. The solution to anachronism is simply not to be anachronistic, to be a good, careful historian of ideas. A careful, traditional historian of ideas will not be anachronistic, yet he is the one most likely to have learned to use the historical present. By contrast, a contemporary scholar filled up with trendy nonsense is highly likely to be anachronistic even if (perhaps, ironically, especially if) he has been carefully taught only to say, "So-and-so said." The convention of the historical present has such an important good effect--namely, of making the interaction with the ideas vivid in the present--that I would not trade it for the world, much less for some imagined effect in making students less likely to be anachronistic.

And Aaron, I won't agree that this is old-fashioned historicism unless you can find an old historian recommending it long ago. It appears to be a relatively recent phenomenon. If you could find a non-Marxist professor from, say, sixty years ago recommending the proscription of the historical present, I would be willing to admit that in his case this was merely an extreme (and in my opinion wrong-headed) version of old-fashioned historicism. I wd. note that Schlabach seems to think that he is recommending some kind of a change--at least, a change from something that has or had at one time been a widespread convention. A new convention has now been put into place.

No, it doesn't strike me as an extreme demand to make of students. It seems like a possibly useful gimmick to help them avoid a mistake that most of us tend to make. But your replies seem at odds with your original article. There, you said that the professors are "trying to insure that the concept of the Great Conversation never occurs to students." That's accusing the professors of espousing one of the two obviously absurd extremes. That's why I agreed with you that this stuff is important. Now, though, you seem to be saying, "Well, of course all of us believe in the Great Conversation, but we don't need this semantic gimmick to remind us that we're participating from different rooms." It's not that you haven't convinced me; it's just that I don't think there's much substantive difference between your replies and what I think.

By the way, Marxism - historical materialism - is famously against historicism. See Walter Benjamin's Theses on the Philosophy of History, especially theses five through seven, for a concise explanation of the antagonism between historicism and historical materialism. (The theses are only one paragraph each.)

That's why I agreed with you that this stuff is important. Now, though, you seem to be saying, "Well, of course all of us believe in the Great Conversation, but we don't need this semantic gimmick to remind us that we're participating from different rooms."

I don't know who "all of us" is, but I know for a fact there are plenty of professors who don't believe in it, and I'd be willing to wager a good bit that most or all of the ones proscribing the expository present do not believe in it and are using this rule to push a historicism that is congenial to their ideology. If I seemed to be nicer in my recent comment, it was an attempt to be nice to _you_. Since I don't know anything about you or your approach to scholarship, I was giving you the benefit of the doubt that you are not an historicist (which I would not mean as a compliment--an historian is quite different from an historicist) and that you do believe in the Great Conversation. I'm willing to grant for the sake of the argument (though of course you could convince me otherwise) that you are merely wrongly advocating a use of a silly, trendy gimmick in the cause of avoiding real anachronism.

Schlabach's been through a lot of changes in the past few years. He was a Protestant peacenik in Nicaragua when Danny Ortega was wowing sophisticated New Yorkers with his designer sunglasses and Reagan was the most evil man in the world. He got his doctorate from Notre Dame's theology department when its ghastly anti-Catholicism had only barely started to ebb. Now he looks like a guy who's slowly becoming an orthodox Catholic and not entirely comfortable with the non-progressive baggage that goes along with the whole drawn-to-the-Eucharist convert-your-life thing. I'd give him a break.

Besides, I get more torqued about academic Catholics, writing for Catholic audiences, who dogmatically insist upon BCE/CE. In terms of subsidiarity, that seems both more noxious and more critical to remedy than whether Cdl Newman said or Cdl Newman says.

Aaron, your comments, taken with Lydia's specific example of Ed Feser's expositions, raise a rather funny contrast. As far as I can tell, a major reason Ed has been forced to write in his stuff not new and helpful developmental work in philosophy, but remedial baby-steps philosophy, is PRECISELY because modern thinkers, in particular modern philosophers and physicists, have unwittingly and almost uniformly failed to note that the philosophers of yore (especially pre-Descartes) were not making their arguments from the same conceptual framework as is used today. It is not that these philosophers are largely unaware of history in general, or even of the specific major events surrounding the lives of Plato and Aristotle (hopefully any educated philosopher knows those). Rather, these men simply have not taken into account the paradigm difference between thinking (for example) that formal and final causes are part of the explanatory picture, and the post-Newtonian picture that says physical agent causality is the only explanatory power we shall call science. They don't understand this difference in paradigm, because they work within an educational framework that rarely notes the sheer difference in paradigms, and virtually never seeks to actually GRASP the old paradigm in its full panoply (on the few occasions in which the difference is noted, the assumption that our modern one is not only preferable but wholly superior is not so much stated as imbibed).

But this is due to NOT being engaged in the Great Conversation as if the pre-Cartesian philosophers might have a worthwhile conceptual point of view, and to assuming that we don't have to consider their arguments for that point of view as if they might be, simply, TRUE. True for them, maybe (to be said condescendingly, of course), but not true for us, and not simply TRUE. I seem to remember Descartes (can't remember if it was in the Meditations or not) opening his argument as if there were nothing to be borrowed or used from the past, starting the conversation tabula rasa.

Needless to say, this all speaks of an ideology of thought: ideas that were proposed by the ancients were proposed in a context, and can be properly evaluated only within that context, so that they can pertain to us only in a very superficial way. This ideology rejects common human nature; it rejects that humans share the same basic rational and passionate capacities, the same basic aspirations for peace and betterment and long-term meaning. Thus this ideology assumes (without argument) that the argument of a Plato or Plotinus should not be considered as an argument about what is TRUE, true for us just as much as for his contemporaries, because it is rooted in a valid capacity to know reality as it really is and to communicate it. This ideology of skepticism is of course never established by argument (it cannot be, as it is self-defeating), but is always the backdrop behind the constant blather trying to erase human nature from today's part of the Great Conversation.

This current attempt at a mandate is of course a superficial, even trivial example of the problem, and on its own it would be of little note. But culture is composed of 10,000 little things, each one of which alone is trivial. Take away one, and you have the same culture. Repeat ad nauseam, and you do not. We have been engaged in a great effort, for 60 years quite openly (for some years before that mostly hidden), to take away one by one little bits of that Christian culture handed down by an accretion of thought and action over 2 millennia and more. We have already lost so much of our culture that we can no longer afford to treat individual instances as something to be ignored.

Both are pretty bad, but I'd say it's a case of apples and oranges. Remember, David, that my main post isn't about Schlabach. Schlabach just came up in the comments as an illustration that something like this isn't confined to Western Michigan U. My main post is about a completely secular department full of completely secular professors who just really don't love the past and who are in a doctrinaire fashion trying to separate their students from it. I'm sorry to say that Schlabach doesn't sound much better; look at his pontificating about the fact that dead people are dead and can't argue right now. However, if you really feel moved to give him a break, just try to see this requirement as applied by secular history departments all around the country as part of the death of the humanities. Because I really think that's how it is. The idea that the dead really can be co-conversants with us, that they are not just DSWMs, that their ideas should be alive for us and that we should take them seriously, is of the very essence of the humanities disciplines. Yet that concept has been severely under attack for decades by those who want to dismiss them as products of their time. This convention plays into that absolutely beautifully. Really beautifully. And I believe that for many professors who enforce it, it is intended to do just that. Hence it's a really important Bad Thing.

It sounds like another tired attempt to undercut the authority of our intellectual forbears by keeping them at a disdainful distance, the rationale provided being the demands of a phony grammatical rigor.

So I pulled out Newman's history of the Arians and found early in the first chapter such phrases as "Ancient writers inform us that ...which the Synodal letter describes as leading him to express contempt...Chrysostom's panegyric on the festival of his [Lucian's] martyrdom is still extant, Ruffinus mentions him in honourable terms, and Jerome praises his industry, erudition, and eloquence in writing..."

What a sloppy fellow, Newman. Why, you'd think people who begin sentences with "In the Bible, Jesus says..." were committing a heresy. But what those people know that these modern profs don't is that the truth speaks to us always, because it has no expiration date.

Of course, since Jesus who is the Word is now alive, and since the Bible is also His word, there is a delightfully correct sense (even under this new mandate) of saying "Jesus _says_ if you love me, obey my commandments".

But what those people know that these modern profs don't is that the truth speaks to us always, because it has no expiration date.

YES!

"I get more torqued about academic Catholics, writing for Catholic audiences, who dogmatically insist upon BCE/CE."

Yes, that's a drag, although in some cases that's probably a publisher's requirement, like dreadful gender-inclusive language.

Speaking of which, it seems that the current fad among Christian/conservative academics who don't want to go whole hog with gender-inclusivism is to switch back and forth between the generic male and the generic female pronouns. In some ways this strikes me as even more annoying, as it's so obviously a capitulation trying to avoid appearing as one.

Tony, your 7:04 post is right on the mark. If there's anything I've learned from reading Weaver, Montgomery, Berry, et al, it's that a) ideas have consequences, b) words mean things, and c) these two notions are inseparably related. Hence even seemingly small instances affect the whole -- it's the death by 1000 cuts.

Yes, that's a drag, although in some cases that's probably a publisher's requirement, like dreadful gender-inclusive language.

NM, I've never written a textbook, such as a history textbook, so it might be different there, but I've had great success resisting publishers on the "gender-inclusive language" thing. I think people trying to publish in history shd. do the same with the expository present. Maybe it's worse in history but my experience in philosophy with "gender-inclusive language" is that the claim that publishers won't let you avoid it is something of a bogeyman. I always encourage young philosophers to resist it, politely but firmly, and to make this clear before they sign any book contract. And, no, this isn't just "because I've already published." It just hasn't been that big of a deal. I've known of one publisher who wasn't asked before the contract signing and pushed a young philosopher at the proofs stage, but when it became evident that he would try to find some way to get out of the publication of the book altogether before backing down, the publisher backed down. And that was the worst case. All the others, it's been like, "Oh, is that in our house style guide? Well never mind, no problem. Sure, of course you can use generic 'he' and 'man'." In one case--not malice but just the left hand not knowing what I had discussed with the right hand--they tried to change it at the proofs stage anyway, but it was all resolved when I pointed it out.

Might not be that easy with BCE/CE or with this silly new policy, or it might be worse in history than in philosophy, but scholars should stand up to it anyway.

I have read through this discussion and, frankly, find it to be typical of the reasons scientists yank their hair out at the roots after spending time in the humanities building.

Somehow, we in the sciences have been able to get along just fine writing things like "Newton calculated that F = ma," without having to call Newton forth imaginarily into the present in order to affirm the value of his work or to converse about it. In fact, most of a scientist's colleagues would suspect he was a nut if he said, "Pasteur says...".

Of course, scientists do not usually communicate through books, but through highly stylized, rigorously formatted (IMRAD) "letters" called journal articles, in which present, past, and future tenses have specific implied meanings with respect to facts. And not only that, the use of tense is usually restricted in each section of the article. This way, the reader knows better what the writer means or intends to mean. It may be time for the humanities to catch on to this and agree on a universal style & format for scholarly works in those areas. But then, no doubt, that would just become another exercise in herding cats.

Lastly, probably the only water-tight argument to support the idea of stating in the historical present what someone has said in the past would concern the words of Jesus, the eternal Word made flesh.

I'm a lurker on this blog, and I don't have a dog in the tense fight. But I did take a class with Theron Schlabach, and he's about as far from a postmodernist as you can possibly be. He is also one of the most devout Christians I know.

And David, if you're going to diss him, at least get your facts straight.

Schlabach's been through a lot of changes in the past few years. He was a Protestant peacenik in Nicaragua when Danny Ortega was wowing sophisticated New Yorkers with his designer sunglasses and Reagan was the most evil man in the world. He got his doctorate from Notre Dame's theology department when its ghastly anti-Catholicism had only barely started to ebb. Now he looks like a guy who's slowly becoming an orthodox Catholic and not entirely comfortable with the non-progressive baggage that goes along with the whole drawn-to-the-Eucharist convert-your-life thing. I'd give him a break.

Theron got his Ph.D. in History from the University of Wisconsin. He is a peacenik, though, in that he is a Mennonite, and therefore a pacifist. I don't know where you are getting the Catholic stuff, but it is an unrecognizable description of his views, as he is squarely in the Mennonite tradition.

'scientists do not communicate usually through books, but through highly stylized, rigorously formatted (IMRAD) "letters" called journal articles in which present, past, and future tenses have specific implied meanings with respect to facts.'

Yes, and that's the difference between science and the humanities; the latter aren't limited to "facts." The humanities speak differently than the sciences. One might even say that one of the things that's gotten us into this mess is the effort to have humane studies speak and behave like the sciences.

I beg your pardon, Don't Really Know. I'm afraid I skipped straight from the Theron page by clicking on the Maintained By link to get to http://personal2.stthomas.edu/gwschlabach/, which sent me to Gerald's cv, his links to sources for this and that, and the copyright, contents, partial introduction and index to Gerald Schlabach's latest book, the one from Brazos, via Amazon. From these I conjectured wildly. And I have no idea what relation Gerald and Theron have, father/son, uncle/nephew. I'd guess father/son. Probably fewer Schlabachs than Brandts here in the U.S.; they must at least be cousins. I'm not an academic, just an interested bystander who generally finds that whatever Lydia's interested in is interesting to me too. I don't know how or why she picked this particular page of Ten Suggestions, but it seemed interesting that she was- oops, it seems interesting that she illustrates her point via a website hosted by a Catholic university, with a faculty member who's a recent convert to Catholicism, Benedictine oblate, and other interesting things. That Gerald is not the Schlabach who wrote this set of suggestions but is merely probably related to the guy who did is, I think, irrelevant: If those Ten Suggestions had been credited to Anonymous or to John Henry Cardinal Newman, they would still be appearing courtesy of, and evidently therefore endorsed by, the said Gerald, the Mennonite Benedictine.

So, that's enough of exploring the tangent. My, how I miss card catalogs. I didn't mean to threadjack, or to appear to confuse the two Schlabachs. And I do think that the refusal among Christians writing for Christians to acknowledge that Jesus is Our Lord in the way they mark their years is more fundamental than a fashion for thinking that Kant 'is' no longer with us. In both cases, it's a matter of their loss, Toots. But how much greater the loss to give up Jesus as Lord than Kant as lunch partner.

he's about as far from a postmodernist as you can possibly be

Well, see, let me explain this again, with a little more detail: One of the very same people who gave me the info. (from firsthand experience) about the WMU history department, sent along (presumably by googling) the page from St. Thomas U. as a further illustration of what he claims is a widespread prohibition in History, his personal experience being with WMU. Now, when I read that page by Theron F. Schlabach, I knew, I just knew, deep in my heart, that if I made it the centerpiece of my main post, there would be people coming from all over to defend it and to say that it wasn't what it appeared, had nothing to do with postmodernism or historicism, was just an attempt to help undergraduates with their writing, or something like that. I don't know exactly how I knew this. Perhaps it had something to do with the fact that it's a Catholic university, and I have a lot of Catholic readers and some Catholic blog colleagues. Maybe it has something to do with the fact that it's surrounded by all these other writing rules, most of which look pretty garden-variety. But based on this informed hunch, I deliberately didn't make Schlabach's Ten Commandments the centerpiece of my main post.

However, for the record, I stand by my characterization of what he has written there as pompous and wrongheaded. It positively makes me grind my teeth to read a professor of history tell us all that so-and-so is dead and cannot say anything, that he may have changed his mind, and that the historical present is more trouble than it's worth. If he's not a postmodernist (which I'm prepared to grant to someone who knows him) then

a) he's evidently playing along very helpfully with a trend that, I have reason to believe from other sources, is being effectively used by objectionable historicists for their own reasons, and

b) he's, on this point, a pedant of entirely the wrong sort--namely, a pedant whose pedantic innovations and "corrections" create a problem while trying to solve a non-problem.

And urging students to write that a book "says" instead of that the author "says" is just icing on the cake. (And, I'm told, an exception that wouldn't pass muster among professors in the department discussed in the main post.)

Why did the term 'postmodernism' get into this discussion? Regarding the description of Mr Schlabach: Would the phrase 'a Christian postmodernist' be an oxymoron?

As an afterthought to my previous comment: It's not altogether clear to me why (some) professors in the humanities would try to impose a ban on using the expository present tense when discussing the ideas of "all those dead white males". I understand it isn't just some curious pedantry that's at work. No doubt there's a deeper ideological incentive for the professors to purge some truth or other from the minds of students - but I don't know which.

It's difficult to plant universal scepticism as a grounding principle even in the minds of pliable students: it obviously falls victim to its own writ.

Would the phrase 'a Christian postmodernist' be an oxymoron?

I think that in reality that phrase is an oxymoron, but as a sociological fact there are people who do identify themselves as such. However, I'm willing to take the word of an acquaintance of that particular author that he isn't one of them.

The term "postmodernism" got in here because I introduced it. I introduced it because postmodernism is rampant in the humanities and because postmodernists are interested in doing mischief that includes reading old texts in terms of "race, class, and gender" (just for starters) and effecting a radical separation between contemporary students and traditional or ancient authors. (Except, of course, for those older authors whom they wish to revive, such as, I don't know--de Sade or something, or maybe some obscure female writer here or there.) My understanding from the source I have been getting information from is that such attitudes of race-class-gender interpretation of texts and history are exemplified in professors pushing this requirement. This makes perfect sense to me. The radical historicizing of ancient texts is a hallmark of at least one type of postmodernism.

It's difficult to plant universal scepticism as a grounding principle even in the minds of pliable students: it obviously falls victim to its own writ.

Well, it would be difficult if you were limited to doing it during only a few classes out of a full range of 4 years of undergraduate courses. But the reality is

(a) the planting and watering continues through most classes;
(b) it continues on in spades in graduate work;
(c) it is emphasized even more in training future teachers; and
(d) those teachers do the prep-work in grade school and high school, beating all sorts of common sense out of kids.

Because the principle so obviously falls victim to its own writ, the teachers make sure that students never examine the assumptions much. The principle is NEVER taught directly; it is always taught through indirect means. Thus the student is forced to absorb it by osmosis through constant exposure to its effects, rather than being taught it directly in philosophy class. Net effect: more than 80% of college graduates have absorbed it at least to a large degree, if not perfectly. Most college graduates take at least 10 years of post-college life to recover even a modicum of common sense, and unless they discover in an explicit manner how bamboozled they were, they NEVER recover fully.

On the "book says" versus "author says" controversy, what do you say when there's no author? If you have a text that was written over time by multiple authors/redactors, all with different and sometimes mutually contradictory "messages" to get across, each unaware that "his" text would later be modified, perhaps drastically, then isn't it nonsense to say "the author says"? That is, if you're referring to the (non-existent) actual, or historical, author, which seems to be the way the word "author" is used here, as opposed to the implied author.

And if "the book says" is more accurate in this extreme case than "the author says," then isn't it at least as accurate in less extreme cases? - or rather, specifically in those cases where our knowledge of the actual author's mental state at the time he wrote the text is inferred only from the text itself?

This is a separate question from whether students should be required to write that way. I'm just arguing that "the author says," even when that phrase makes sense, is both less precise and less accurate than "the book says."

Or if "says" is neutral enough not to be controversial, then substitute "argues," "tries to convince us," "claims," etc.

Yes, and that's the difference between science and the humanities; the latter aren't limited to "facts." The humanities speak differently than the sciences. One might even say that one of the things that's gotten us into this mess is the effort to have humane studies speak and behave like the sciences. - Nice Marmot

How well I understand your point! Wouldn't you say that the impetus propelling this is to put the humane studies on a vector towards progress? It is not surprising that historians, possessing something at least that could be called empirical data, would be behind it.

A question I have never satisfactorily answered is this. Do the humane studies ever progress towards any kind of definitive answer or do they simply go forever around and around the questions, on one merry-go-round after the next? From the outside it looks like this unending, personal orbiting is the entire point of the enterprise.

Take for example Edmund Husserl, an enormously talented philosopher. Did he make any progress towards a definitive answer to anything that has been built upon afterwards? Not that I can see. Even his finest student, Sartre, was able to rip most of it down by the gravitational force of his person in orbit. And what has become of Sartre's ideas? Nothing that I know of except for perhaps the old joke:

To be is to do - Socrates
To do is to be - Sartre
Do be do be do - Sinatra

Wouldn't you say that the impetus propelling this is to put the humane studies on a vector towards progress?

Which is a joke. The whole notion of "progress" in the humanities in the scientific sense is a confusion. Which doesn't mean that the humanities are a joke. The other day I saw someone who has a degree in history and would like to think of himself as a philosopher suggesting the incredibly pompous idea of having a web site that declared what has been "established" in philosophy, what is "now uncontroversial" or something of the sort, and could act as a basis for further research. I've seen big cases of science envy before, but this was one of the more ludicrous.

Aaron,

what do you say when there's no [single] author?

In extreme cases, where we _know_ there were multiple authors, etc., we do this, and it makes sense. The Talmud is a good example. The Book of Common Prayer. If one wants to talk about "the Bible" rather than a particular book of the Bible, or if the author of the book (e.g., I Chronicles) is unknown. And, the best example: A law. Which always has multiple authors, sometimes with conflicting goals. Which is why we say, "The law says" and other locutions.

However, please note my inserted word. If there is _literally_ no author, then you just have a bunch of marks or noises. This is why accidental chicken scratch in the dirt is not a text (there's no author) but cuneiform writing is.

And if "the book says" is more accurate in this extreme case than "the author says," then isn't it at least as accurate in less extreme cases? - or rather, specifically in those cases where our knowledge of the actual author's mental state at the time he wrote the text is inferred only from the text itself?

Absolutely not. It is precisely because the extreme case is extreme in specific and describable ways that "the law says" is more accurate. Where there is a known author, it is better to bear in mind that these are not the deliverances of machines or vague spirits-in-the-air or depersonalized forces or whatever. Newman says this, Aquinas says this. Where there are two or more named authors who truly published together, we say "Smith et al. argue" or "the McGrews argue." The glory of the humanities is contact with other minds, other persons, real persons, not bound by time and space.

Lydia, the law is actually in a different category than my example. With laws, the various legislators typically authorize the final product, knowing that they each intend different things. The concept of "author" used by all of us here really combines two concepts: contribution and authority. In my example, unlike in a law, the contributors do not authorize the final text, except for the last one to touch it; but even the last one to touch it wouldn't necessarily be called an author. We would never call Thomas Bowdler or Thomas Jefferson "the authors" of their respective Bibles.

So that's why I think you're wrong to say that there are multiple authors in my example. There is no historical author at all, in the sense relevant to our discussion. It doesn't make sense to say, based on the text, that any historical "author" (contributor) was "saying" something in the text, when he did not authorize the text at all. Similarly, if a legislator contributed to some bill but then renounced his support before it became law, I would not call him an author of the law, even if it contained words that he contributed.

On your view of "author says"/"book says" depending on single/multiple historical contributors: I think even Beth Impson's view seems more consistent than yours, though her view is obviously absurd. (If one were to take her prescription seriously, one couldn't even say, "General Motors claims that....") You accept that "the book says" is meaningful, accurate, and precise in some cases, and presumably meaningful, at least, in all cases; and at the same time, you apparently say "the (historical) author says" is more accurate and precise in others. I don't see how it can be, if all we know about the historical author's mental state at the time of writing comes from the text. It just seems that you're always adding another layer of uncertainty. On the other hand, if all you're saying is that one should use a certain gimmick to remind students that they're conversing with real, historical people, I've got nothing against that.

"On the other hand, if all you're saying is that one should use a certain gimmick to remind students that they're conversing with real, historical people, I've got nothing against that."

No. The gimmickry is on the other side. That it has intent and is pernicious doesn't make it any less gimmickry. Change words and phrases and eventually you change the way people think.

I was referring to clearly authored books and articles; an editorial, for example, doesn't have a name to it but is obviously authored by a particular person or persons ("The editors write . . ." not "The Herald News writes . . ."). I did not have in mind organizational white papers or propaganda, which would be fine to refer to as being "authored" by the organization. There just happen to be very few occasions on which my students would use such work so I wasn't thinking of it when I wrote my comment.

You accept that "the book says" is meaningful, accurate, and precise in some cases, and presumably meaningful, at least, in all cases;

No, actually, I think when one uses it one is saying something that is always rather imprecise and whose meaning is somewhat difficult to parse out. Which is why one uses it only in extreme cases.

I think your whole thing about "authorizing" is somewhat confusing, Aaron. You should stick to something clearer. You really don't believe that one should refer to an author where there is no mind behind the thing at all. Presumably you _don't_ refer to chicken scratch in the yard as "saying" something unless you think God or some Power (with a mind) is imbuing it with meaning.

Aaron, your argument would be similar to the following, obviously poor, argument:

There could be cases where one would say, "The city says"--for example, if the city is a party in a lawsuit and one is quoting from the legal brief filed on the city's behalf. Therefore, if one has never met Lydia McGrew and one's only clue to Lydia's state of mind on some subject is the first (signed) blog post one has read by Lydia, it would be just as good, if not better, to say, "The blog post at Extra Thoughts says" and to avoid referring to Lydia, who wrote the blog post under her own name.

I would hope that this parallel would be an obvious reductio of your argument that it's always at least as good, if not better, to say "the book says."

(And by the way, I don't know what other evidence about the author's state of mind you're talking about, but often it's going to be _other_ things written by that author, especially if you've never met him face to face. If it's always somehow "adding a layer of uncertainty" even to refer to the actual author, one would never do so even if one possessed _many_ letters, books, and other works written by a particular author.)

Side question for Lydia: you say you "just knew" that if you made Schlabach the centerpiece of your post, you would attract many of his supporters. However, if you just knew - i.e., you can't cite any specific evidence to justify your belief - wouldn't that contradict your evidentialism?

Irenaeus, I was partly being humorous. Let's call it a hunch, and I told you what my evidence was. Let's spell it out a bit more: It seemed to me not implausible that the college is one that has some sort of positive reputation. I didn't know that, but it didn't look like one that I've heard has totally lost its Catholic identity or anything like that. It therefore seemed possible to me that some people might wish to defend it. Moreover, because the policy Schlabach articulates has an exception (a stupid one, but still, an exception), unlike what is imposed here, and because it is surrounded by other ordinary writing advice, I realized that someone could make a case that he was just being an odd and somewhat pompous pedant. That, in fact, is what I now have even more reason to believe is the case. Put that together with possible defensiveness for a Catholic institution of higher learning, and I think I had _some_ reason to think that I would have dissidents if I focused too much on that page. Moreover, I wanted to make the stronger case and the stronger point and didn't think Schlabach's pontifications were the very best evidence for that purpose, for some of these same reasons. Hence I didn't want to feature him in the main post.

This isn't really very heavy stuff.

Husserl's finest student was Edith Stein.

The very notion of the Great Conversation, where "he being dead, yet speaketh," where mind meets mind maugre the little barriers of time and space...

I believe you've inadvertently stated one major aspect of the Logos right there. It is a little bit strange to have a "conversation" when there cannot be a response. Of course we can learn plenty of lessons from the dead and from history in general, so long as we correctly understand what the lessons mean and when they apply to the present.

They were, after all, merely men of their times.

Yet they still capture something fundamental about the world, or at least they do for many people. I wonder if it is was because they pushed the envelope of language and changed the way people think?

...if it is was...

That was obviously a mistake on my part, but it actually makes sense given the context. That's funny.

You shouldn't have said anything, Step2. I thought at first you had done it on purpose.

I can't remember where I found this originally but credit to Ed Feser for talking about Thomistic drinking songs on his blog

Should old Aquinas be forgot, and never brought to Mind?
Should old Aquinas be forgot, in days of Wittgenstein?
Can quiddity and haecceity, analogies divine,
Resolve the paradoxes of Willard Van Orman Quine?

Should symbols bleak replace the speech we learned at Mother's knee?
Or should we now reverse ourselves, and write the backwards E?
Can form and matter be preserved, and analyticity,
If we but put particulars for variables free?

Now Henry Veatch and Peter Geach we really must berate:
The subject and the predicate they leave to copulate.
Intensions pure we can't secure with Frege, Russell, Boole,
By treating good old Barbara with a novel kind of tool.

And Hesperus and Vesperus are entities distinct--
Or should we say, not this, but that they're analytically linked?
Shall we aver they're one indeed, with Smullyan, Church and Fitch?
Or should we moan "Ah, Quine alone can tell us which is which"?

A final clarification: I don't think I ever said that it was better to say "the book says" instead of "the (historical) author says." I said that it's both more precise and more accurate. That's true even for your supposed reductio. I don't see anything wrong with saying or writing "the author says" when that makes sense (as in your example), but I also don't see anything wrong with requiring students to write the other way as an exercise.

but I also don't see anything wrong with requiring students to write the other way as an exercise.

That reminds me of a problem that has cropped up on me recently. The organization I provide services to has had a galloping case of "keep it simple, stupid"-itis. It has directed all members to write in "plain language", which typically can be a good thing. It then sent out explanations that detail what they mean: simple, short sentences; no complex phrasings, etc. No "technical" terminology. No passive voice. You get the picture: assume your reader is capable of an 8th grade reading level, and has the same difficulty an eighth grader would have maintaining an interest in abstruse legal arguments, and cast your writing that way.

This naturally makes it essentially impossible for the employees to perform some of their tasks, which in part is explaining law and regulations to citizens (sometimes) and to lawyers and other professional practitioners other times. When Congress writes the laws, they don't engage in "plain language" practices, they don't write in simple sentences, and they SURE don't limit themselves to non-technical jargon - sometimes they make up their own new terms or new definitions of terms. How, then, will it be possible to explain the law to a client without engaging in some of the proscribed usages?

The "no passive voice" thing really gets me: I know that 80 years ago (and since) English teachers have laid down an absolute rule: "Don't use the passive voice." The rule is in Strunk & White, for example. But there is a little problem: it refuses to recognize reality. A pretty good share of perfectly sound writing uses passive voice, and indeed almost every good or great writer uses it not just a little bit, but with some frequency. Strunk & White use the passive voice in the very textbook that prohibits using the passive voice. Worse yet, in some cases the passive voice is precisely the correct, indeed ONLY proper way to convey the concept in its ideal form. For example, if you are a civil servant explaining the impact of a law on a citizen or class of citizens, you probably ought to use the passive voice a fair amount. You (as a member of the executive branch) are not responsible for the law, Congress is. The law is not a person causing its effects, the law is an impartial non-personal constraint on actions. Citizens receive the law passively, and thus speaking about that process naturally descends into passive voice. The active voice is stronger, more forceful. If what you are about calls for being weaker and less forceful, then it calls for the passive voice.

The Chicago Manual of Style admits in black and white that the rule "no passive voice" is now considered obsolete and is not a true rule of good writing, but will you find modern English teachers explaining this to students? Not in my experience.

Tony,

I do tell students to generally avoid the passive voice because it is wordy and harder to process. However, we go over examples of when it is appropriate and even stylistically "better" than active voice. One reason we emphasize use of active voice as much as we do is that passive is often the lazy writer's (and especially the lazy student writer's) way of avoiding the bother of finding out "who did what," which leaves readers feeling cheated and distracted; it's also quite often a way of avoiding taking responsibility. One of my children was an expert in passive voice before he was two -- "Mommy, my toybox was dumped out on the floor . . ." Good grief. I once had to edit an administrator's report for accreditation which was 90% passive voice -- but the point of the report was to tell who did what, not just that things were done. Worst editing job I ever had . . . and he didn't much like me by the time I got done asking him who did what 50 million times! So, yes, it has its appropriate and helpful uses, but most people use it badly and we are trying to help students to write clearly -- which means using active voice unless you have a good reason for the passive.

As for short, simple sentences -- goodness, even 8th graders have grown beyond Dick and Jane! In reality, too many short, simple sentences are much more difficult to process than well-structured longer ones. This is because short sentences require the reader to make the connections between ideas for himself (even transitions are not especially helpful because we "chunk" ideas by punctuation and the period forces us to form one complete idea, then move on to a new one); longer, well-crafted sentences make the connections much easier to process -- dependent clauses and various kinds of phrases stand in a particular relation to the main clause which allows the reader to grasp them more quickly.

I highly recommend a book by a Dr. Gopen (George? I can't recall the first name for sure off-hand) called The Sense of Structure. It was developed specifically for those who write in the "real world" to understand how readers process written information and how to take advantage of that. I use it in my editing classes and the students love it because it helps them see very practical ways to make their meaning clear to others.

p.s. I don't think most people realize how really badly most students write and how desperately we try to make any kind of dent in their lack of ability. Sometimes we give them "rules" that are too restricted but that will help them avoid certain particularly dumb errors just so we can focus on more egregious problems. Pray for us!

Re: the passive voice and its target(s)

"It was strongly suspected." This is the true gossiping, calumniating style. All verbal assassins speak in the passive voice, that, what they cannot prove, they may at last throw on public report. If you had said, "I suspected many months ago," though it would have led to a detection, you would have acted more like a man; and this might have been expected too, in a volunteer of your "determined zeal and firmness."

http://itre.cis.upenn.edu/~myl/languagelog/archives/002540.html

However, we go over examples of when it is appropriate and even stylistically "better" than active voice.

Any teacher who leans on students to use the active voice, but then explains that the passive voice is appropriate in some cases - and, better yet, goes into detail showing when and why and how - is fine in my book. I was ranting against the style nazis who insist that there is no place that the passive voice is allowable. I have had some of my documents edited by these nazis and they redline every single use of the passive voice no matter what, without ever once taking context into account. More than once, I have seen them take a perfectly reasonable passive expression, delete the passive usage, and insert the thought-free comparable active voice expression using about 10 extra words and ending up with a much worse construction. They were quite convinced that reversing a passive to its immediate active is all that was necessary, no matter how badly fractured that left the sentence.

Step2, it is well known that the passive voice is used by the cowardly to avoid responsibility. What is not as well known is whether the not-responsible ought to use the passive voice so as to not claim responsibility that isn't theirs.

Correction: Step2, it is well known that the cowardly use the passive voice to avoid responsibility.

Now, did that clarify the matter? Or did it effectively shift the emphasis away from connecting the passive voice with avoiding responsibility, and instead place it on connecting the cowardly with avoiding responsibility (practically a tautology), leaving the passive voice an insignificant bystander?

I know I said "final," but this question just occurred to me. How about students in cooking schools? Should they be forbidden from uttering such solecisms as, "The recipe says to fold in the egg whites next," and forced to say instead, "Julia Child says to fold in the egg whites next"? Of course we're assuming for the sake of this example that Julia Child is the sole, undisputed author of the recipe. After all, if "clearly authored books and articles" cannot "say" something, then neither can clearly authored recipes, right?

My point is that these are somewhat arbitrary conventions. Note: somewhat arbitrary, not completely. Pedagogical exercises aside, I think the best approach is to leniently follow the conventions.

Amen. One more reason I'm getting the heck out of Dodge--er, Western. Keep fighting the good fight, Lydia. Michelle and I will miss you. Stay in touch.

Thomas Hardy makes deliberate use of the passive voice in this (mediocre) poem to suggest a weak and shy man who lacks 'romantic courage' at a moment that might have changed his life. To quote the old adage, faint heart never won fair maiden.

Faintheart in a Railway Train

At nine in the morning there passed a church,
At ten there passed me by the sea,
At twelve a town of smoke and smirch,
At two a forest of oak and birch,
And then, on a platform, she:

A radiant stranger, who saw not me.
I queried, “Get out to her do I dare?”
But I kept my seat in my search for a plea,
And the wheels moved on. O could it but be
That I had alighted there!

Thanks, Travis!

Alex, I'm sorry to nitpick, but I don't actually see the passive voice there. "There passed" is not in the passive voice. If you imagine diagramming the sentence, you will see that the subject is "a church" (for example), and the church performs the action of "passing." In a sentence using passive voice, something other than the subject of the sentence performs the action upon the subject. Hence, "He was killed by an arrow." Or "The action is performed by something other than the subject."
