Tag Archives: Byron

For Dante, hope is the one thing needful.

On the lowest terrace of Ante-Purgatory — that is, the lowest possible level for a soul whose ultimate destiny is salvation — Dante and Virgil meet Manfred of Sicily. According to the (perhaps unjust) accounts by which the poet Dante knew him, Manfred had been a moral monster, excommunicated by the Church and denied Christian burial. Among other enormities, he had allegedly murdered his own father, brother, and two nephews, and attempted the murder of a third nephew. In other words, he would ordinarily have been condemned to the very lowest Circle of Hell, to the realm of Caïna, as one guilty of treachery against his own kin.

Manfred, however, repented at the moment of death — or perhaps it was not even repentance in the usual sense of confession and contrition. He says simply “I gave myself back” (io mi rendei) to God.

After my body had been shattered by
two fatal blows, in tears, I then consigned
myself to Him who willingly forgives.

My sins were ghastly, but the Infinite
Goodness has arms so wide that It accepts
who ever would return, imploring It.

. . .

Despite the Church’s curse, there is no one
so lost that the eternal love cannot
return — as long as hope shows something green.

— Purgatorio iii. 118-23, 133-35 (Mandelbaum trans.)

The choice of words is highly significant: not "as long as he repents" or "as long as he dies with the name of Jesus on his lips" or anything like that, but "as long as hope shows something green" (mentre che la speranza ha fior del verde). Manfred died in the hope of salvation, and it was that — rather than repentance per se — which saved him.

(Contrast Dante's Manfred with Byron's character of the same name. The abbot implores the dying Manfred to "Give thy prayers to heaven — / Pray — albeit in thought, — but die not thus," but Manfred, having spurned the fiends, spurns God as well. His last words, as the abbot begs him again to make "but yet one prayer," are "Old man! 'tis not so difficult to die.")

Manfred, the lowest of the saved, makes an interesting contrast with Virgil and the other virtuous pagans, the highest of the damned. The latter are “punished just with this: we have no hope and yet we live in longing” — and, as discussed in my previous post, one possible interpretation is that the “sin” for which the virtuous pagans are punished is also a lack of hope. Lacking the Christian revelation, they hoped for nothing higher than the Elysian Fields, and so that is all they receive. “I am Virgil,” the poet says later, in purgatory, “and I am deprived of Heaven for no fault other than my lack of faith.” Dante certainly seems to be portraying hope as the one deciding factor in the soul’s destiny. With it, even Manfred is salvable; without it, even Virgil is damned. That hope is the key distinction between purgatory and hell — between the suffering which saves and the suffering which does not — is reinforced by the inscription over the gates of hell, ending in the famous line “All hope abandon, ye who enter here.”

Having noticed this, I now find an emphasis on hope jumping out at me from many different parts of the Comedy. It is mentioned again and again in the first canto of the Inferno, when Dante confronts the three beasts. The leopard “gave me good cause for hopefulness,” but “hope was hardly able to prevent the fear I felt when I beheld a lion.” Then, when the she-wolf appears, “I abandoned hope of ever climbing up that mountain slope.” And of course every cantica ends in the word “stars” — a traditional symbol of hope.


I am not yet ready to comment on Dante’s ideas regarding hope — I want to go through the whole Comedy again and spend some time digesting it — but I just wanted to point out an aspect of Dante that I had never noticed before.


Filed under Christianity, Ethics, Literature

Contrarian verses

“Not to admire, is all the art I know
To make men happy, or to keep them so.”
Thus Pope quotes Horace; but had none admired,
Would Pope have sung, or Horace been inspired?
(These rhymes are clipped from Byron, every line:
For God’s sake, reader! take them not for mine.)


I’ll never see nor ever hear
A tree as lovely as Shakespeare,
Nor think that God shall ever make
A tree to rival William Blake.
Not since the tree that wrought Eve’s curse
Have leaves of green matched leaves of verse.


If cut worm lives,
It ne’er forgives.


What was once proved
Is now only imagin’d.


Two Principles in human nature reign;
Reason, to urge, and Self-love, to restrain.


What Browning meant I think I see
And understand — yet men there be
Whose grasp exceeds their wildest reach,
Who practice what they dare not preach,
Whose flesh is willing, spite of pride,
And for these, too, I think, Christ died.


Filed under Poetry

Some notes on the dark arts of rhetoric

The most effective put-down is one that employs — and deftly eviscerates — the very same terms which would ordinarily be used for praise. This is roughly a million times more effective than name-calling. Witness Byron’s masterful deflation of pretensions of immortality:

Pride! bend thine eye from heaven to thine estate;
See how the Mighty shrink into a song!

The power of these lines hinges at least in part on the choice of the word “song” — put at the end of a line for extra punch. This is the same word usually used to refer to fame as a kind of apotheosis (as in “to be immortalized in song”), but Byron makes it sound rather paltry — not by actually saying it is paltry, but by casting his verse in such a way that the reader is forced to presuppose it is paltry. The addition of that little word “a” is also a slick touch. How much less glorious it sounds to be immortalized in a song!

Another good example of this is in the film The Aviator, when Howard Hughes (Leonardo DiCaprio) says to Katharine Hepburn (Cate Blanchett), "Don't you ever talk down to me! You are a movie star — nothing more." By simply using the (usually positive) term "movie star" as an insult, he presupposes that both he and Hepburn already know that movie stars are contemptible — and presupposing your point can be much more effective than making it directly.


Walter Winchell mocked Nazis by calling them “Ratzis” (Rational Socialists?) and “swastinkers”. Now “Nazi” itself is enough of an insult. Likewise for liberals, feminists, and fundamentalists. If you can ridicule or denounce something whilst using the very same name that its supporters use, it’s far more effective than making up some derogatory term.

Likewise, it's usually better to embrace the common — even if hostile — terminology for what you support rather than insisting on something else. Groups that insist on politically correct euphemisms for themselves imply that they need euphemizing.

Insisting on special terminology for oneself or for one's enemies is a sign of weakness. The best way is to use common neutral language, pushing it very slightly in the direction of sarcastically imitating the terminology used by your enemies — but not too much, or you'll sound like you have a chip on your shoulder.


When you compare the president to a Nazi, your scorn for the president sounds shrill, but your scorn for Nazis sounds reasonable. Again, this is because your comparison takes it for granted that everyone knows Nazis are bad. If X is the real target of your scorn, don’t compare X to something worse; instead, find excuses to compare other things to X in a way that presupposes a negative opinion of X.

I once saw this comment on a blog: “You sound like a goddamn Christian with all that ‘People hate me because I’m awesome’ bullshit.” This may have been an effective put-down of its ostensible target (an atheist who would presumably object to being compared to a Christian), but it’s a far more effective put-down of Christians. (Corollary: Pro-religion commentators who compare outspoken atheists to religious fundamentalists are shooting themselves in the foot.)


These techniques are forms of sarcasm, which Studies Have Shown is more effective than direct criticism.

The psychologist Ellen Winner and her colleagues have shown that people have a better impression of speakers who express a criticism with sarcasm (“What a great game you just played!”) than with direct language (“What a lousy game you just played!”). The sarcastic speakers, compared with the blunt ones, are seen as less angry, less critical, and more in control. This may be cold comfort to the target of the sarcasm, of course, since criticism is more damaging when it is seen to come from a judicious critic than from a dyspeptic one (Steven Pinker, The Stuff of Thought, pp. 380-81).

Part of the power of sarcasm is that, to some extent, it only works if you're right. "What a great game you just played!" will be understood as a sarcastic put-down only if the listener already knows that he didn't just play a great game, or at least has some doubts.

Sarcasm disarms its target. There is no safe reply. If you say, “What a great game you just played!” and I respond defensively (“Come on, it wasn’t so bad!”), I’m implicitly admitting that you are right. I understand your comment to be sarcasm, which means I know you couldn’t have meant it sincerely, which means I know I played badly. If, on the other hand, I don’t get the sarcasm (or pretend not to get it) and respond with “Thanks!”, you can answer with a withering “I was being sarcastic.”


Filed under Psychology, Rhetoric

Two solipsisms

Though I'm generally leery of coopting the vocabulary of theism to refer to non-theistic philosophical ideas, I find it irresistibly appropriate that the King James translators chose to represent the name of God, the ultimate mystery, with the words "I AM" — existence, first person, present tense — a trinity of incomprehensibles economically expressed in three letters. Setting the issue of existence aside for the moment, I find that the other two — the two solipsisms — can profitably be considered together, that thinking about the one sheds light on the other.

I know, in theory, that there are billions of conscious minds in this world, each with its own subjective experience, but one of them, the one known as Wm Jas Tychonievich, stands out from the others because I am Wm — and by that I don't mean anything as simple as "Wm is Wm" or "Wm wrote this sentence." What I mean is that Wm is in some sense the person; that his is the point of view; that qualia are experienced only when he sees or hears or feels something; that when he closes his eyes the world goes dark; that while others have subjective experience in theory, only he has subjective experience in fact. Others no doubt feel that the same is true of themselves, and to some extent I can grant that. From Joe Schmo's point of view, Joe Schmo is the person; ditto for every other sentient animal in the universe. There's nothing special about Wm. And yet, and yet — there undeniably is something special about Wm. "From Joe Schmo's point of view" is a counterfactual. When I say, "What would it be like to be Joe Schmo?" (or a bat, or whatever), I'm imagining a possible world which is different from the real world — not different in any objective way that would be detectable by a third party, but different nonetheless.

The same goes for time. Time is a vast and perhaps infinite continuum, but one particular point on that continuum, the one called “the present,” is the time, more real than any other. While few people subscribe to literal solipsism with regard to persons, temporal solipsism is much more mainstream. Many people would be quite comfortable saying that the past and especially the future do not have actual existence, that only the present moment is real — an odd point of view, given that the present is infinitesimally small and impossible to pin down with any precision. And just as there is nothing objectively different about the person called Wm, there is nothing objectively different about any particular point on the timeline. Just as every person thinks he is the person, people have at every point in history thought that that point was the present — and are we going to say they were wrong? As I write this, I feel quite sure that the present is a point contained within the span of time referred to as the year 2009, but last year I felt just as sure about 2008. You can say, “In the past 2008 was the present year, but now it’s 2009,” but that doesn’t really mean anything. As used here, “in the past” and “was” are counterfactuals analogous to “from Joe Schmo’s point of view,” and the sentence means something like, “If the past were the present, then 2008 would be the present,” an uninteresting near-tautology. In fact 2008 “is” not the present (a fact which would be easier to express if English didn’t require every verb to have a tense).

If I say, “I wish it were tomorrow,” I’m wishing that something were different about the world — but, again, the difference is not anything objectively observable. If my wish came true, not a single objective fact about the history of the universe would be different. I’m not wishing that two days become one day or that the calendar skip a day; I don’t want to change the timeline of history at all. All I’m wishing is that a different point on the timeline be the time, the present, the only point in time which is really experienced (and only by the person). It’s a comprehensible wish, just as comprehensible as “I wish I were a bat,” but neither wish can mean anything at all unless you take for granted the quasi-solipsistic idea that there is one particular mind and one particular point in time which exist more directly — are more real — than any others. The wish is that a different mind, or a different time, be the special fully-existent one.

There are different ways to approach these two issues, varying degrees of actual solipsism. The most extreme solipsism, the kind we usually have in mind when we use the word, denies that anything or anyone other than “I” exists in any sense at all. The same extreme solipsism could be applied to time. An intro-to-philosophy staple is the speculation that perhaps the universe sprang into existence five minutes ago, complete with false signs of antiquity and false memories of a nonexistent past. We can whittle that five minutes down until we reach the logical limit: that perhaps the universe has no past at all, that perhaps the present moment is the only moment and all else is illusion.

A more moderate solipsism tries to have it both ways, usually with the help of possibly meaningless expressions like, “The past was once the present, and the future will be the present anon.” This takes advantage of the tense system of English to ascribe some sort of existence or reality (“was,” “will be”) to the whole continuum of time, while at the same time singling out the present moment as different, because it alone is the present. Various verbal tricks can be used to do the same thing with regard to persons, admitting the reality of other people and their subjective experience while still seeing oneself as somehow different and “What would it be like to be Bill Gates?” as a counterfactual. This kind of moderate solipsism seems to be what comes naturally to most people.

The final option is to reject solipsism completely, accepting that every person and every point in time is equally real and that the idea of one special person called “I” and one special time called “now” — the idea implied in the appropriately monotheistic words “I AM” — is an illusion. It’s probably a logically inevitable illusion — since, while my subjective experience and your subjective experience are both real and are both experienced, they cannot both be experienced by the same person — but an illusion nonetheless.

My natural tendency has always been to be more solipsistic about person than about time, and I think this is probably a near-universal proclivity for which there are good Darwinian reasons. A high degree of reality is ascribed to other points in time, particularly to the future, but not to other subjectivities — other points in time will be the present, but no other person will ever be “I.” This lopsided solipsism is where the fear of oblivion comes from: the horrifying realization that there will be “real” points in time at which the one “real” person no longer exists. That’s what we’re really afraid of when we fear death — not just that one particular person will cease to exist, but that, because that one person is “I,” the only “I,” subjective experience itself will cease with his death. When “I” dies, the whole universe might as well have come to an end. Philip Larkin expresses it well in “Aubade”:

This is a special way of being afraid
No trick dispels. Religion used to try,
That vast moth-eaten musical brocade
Created to pretend we never die,
And specious stuff that says no rational being
Can fear a thing it will not feel, not seeing
That this is what we fear — no sight, no sound,
No touch or taste or smell, nothing to think with,
Nothing to love or link with,
The anaesthetic from which none come round.

But if we are but consistent in our solipsism, approaching time and person in the same way, this fear turns out to be unfounded. Under pure solipsism, there is only one real person — but there is also only one real point in time, so the imagined future in which that one real person no longer exists is an illusion; it will never come. Pure anti-solipsism acknowledges that the future is real — but also that other minds are real and that each is equally an “I”; subjective experience will continue for as long as sentient minds — any sentient minds — exist.

Actually, pure anti-solipsism goes further. Even if there comes a time, as there almost certainly will, when all intelligent life is snuffed out, there is no need to fear oblivion, because time does not actually pass. Past, present, and future are all equally real, just as all points in space are equally real. The fact that life exists during this time period and not during that one is no more significant than the fact that it exists on this planet and not that one.

Acknowledging the past as fully real is probably the hardest part of maintaining a consistently anti-solipsistic point of view, because (again, for good Darwinian reasons) we naturally focus on the present and future, considering something not to exist at all if it exists only in the past. But the whole continuum of time is equally real. It exists as a unity. Designating a particular point “the present” doesn’t make everything to the left of it disappear. To adapt a catchphrase from my Mormon upbringing, everything that exists, exists for a time and for all eternity. Good poets like Byron remind us of this:

I die — but first I have possest,
And come what may, I have been blest;
(The Giaour, 1114-1115)

And so do bad poets like James Blunt:

And though time goes by, I will always be
In a club with you in 1973

Recently, in coming to terms with the death of someone who had been close to me, I found myself thinking in very Giaouresque terms: “She will always have existed.” Though it may sound like just so much grammatical prestidigitation, conjuring up an “always” where there is none, it was nevertheless only that — not turning to pipe-dreams of heaven or reincarnation, not settling for memory as a substitute for existence — that in the end had any real power to console. Eternity isn’t about things continuing for an infinite span of time; it’s about the recognition that all time is equally real.


Filed under Philosophy, Time

No freedom to fight for

I’m only going to talk about the first line of this untitled poem by Lord Byron, but it’s short enough that there’s no good reason not to quote the whole thing.

When a man hath no freedom to fight for at home,
Let him combat for that of his neighbours;
Let him think of the glories of Greece and of Rome,
And get knock’d on the head for his labours.
To do good to mankind is the chivalrous plan,
And is always as nobly requited;
Then battle for freedom wherever you can,
And, if not shot or hang’d, you’ll get knighted.

There’s much to like in this little ditty — it’s clever and quotable — but for me it’s that first line that’s the real stroke of genius, and an excellent example of what poetry is all about: namely, using language in such a way that its surface characteristics (rhyme and rhythm most obviously, but also syntactic and lexical quirks) harmonize with, reinforce, and even add to the meaning it conveys.

"When a man hath no freedom to…" sounds like it's talking about a man whose freedom is limited — another way of saying "When a man is not free to…" — and you expect it to be followed by a verb phrase indicating what he is not free to do. When the next words are "fight for," another possible interpretation becomes salient (it would be the only interpretation if the clause ended there), but the first is still possible. (You might say, for example, that a man who has been barred from military service "hath no freedom to fight for his country.") It's not until "fight for" is followed by "at home," rather than by the noun object that the first interpretation requires, that the reader is forced to reanalyze the syntax, realizing that "freedom to fight for" is actually a phrase of the same type as "work to do" or "new worlds to conquer."

Byron’s got the right idea, but in my opinion he doesn’t lead the reader far enough down the garden path before forcing the syntactic reframe. If I were Byron, I would have put a line break between “fight” and “for,” and then followed “for” with something which the reader could misinterpret as being its object. Here’s how I might have written the first stanza:

It is said when a man has no freedom to fight
For his country and people and home and birthright
Will all lose their appeal. Then crusading he goes
To win other men’s freedom from other men’s foes.

The garden path, whether my version or Byron’s, isn’t there just for the hell of it, but is central to the meaning of the poem. After getting the mistaken idea that we’re talking about a man who lacks freedom, the reader realizes, with at least a little bit of subconscious surprise, that, no, the man in question actually has no freedom to fight for — which means that he does have freedom, as much freedom as he could possibly want, that he is in no danger of losing it, and that he is therefore not free to fight for… wait, how’s that again? The lack of freedom we encountered on the garden path comes back to get us.

Because people — some people, anyway — don’t just want freedom, they want the experience of fighting for freedom. Maybe Greece needed Byron, but it’s much more obvious that Byron needed Greece; and if we ever have a world where all people everywhere are granted freedom and liberty, the Byrons of the world will be going crazy, itching for a fight, and feeling — however paradoxical it may seem — unfree.

The dynamic Byron described is very clearly at work in the modern West — not only in the obvious case of the American neocons fighting for Iraqi democracy, but in that of pampered classes in any number of countries agitating on behalf of their local oppressed (or not-so-oppressed) minorities. Trying to write something of my own in the spirit of Byron’s first line, I came up with the following:

How the masses grew restless and got out of hand
In their anger at having no rights to demand!


Filed under Literature