is triggering,” a young woman has recently been captured crying out on video. I’ll
say. This past week, on the University of North Carolina campus at Chapel Hill, that selfsame young woman was so triggered by an image at a pro-life display that she allegedly hauled off and punched one of the young men staffing the exhibit.
I must say “allegedly,” as there is a misdemeanor assault charge to that
effect, and until she is proven guilty of it in America one always is careful
to say “allegedly,” even when, as in this case, there is video of the event.
Still, “allegedly” is the right word within the confines of the American judicial system. What jumped off the page—or rather jumped out of the video—at me was that, even as she was about to hit the fellow, she had the presence of mind to exclaim, “This is triggering!” That bit seemed odd to me, for I had always assumed that the way something triggering worked would be more subliminal, i.e., something you really couldn’t identify until later, perhaps even
through counseling or the like, as having been the triggering factor. But she did indeed identify it even as she, again allegedly, launched a barrage of punches on the guy who was standing there near the display.
Now, that’s pretty triggering! And it didn’t take long for me to ponder this and consider that if a mere image on an A-frame sign could trigger someone up like
that, how much more powerfully words might do so. After all, Socrates didn’t
walk around carrying image-rich red-figure or black-figure vases, though both were
readily available in the Athens of his day, trying to trigger people up, but he
walked around merely equipped with probing words and ideas. Words and ideas meant to keep freedom safe both through instruction, for that was in part what he was about, and through a sense of devotion both to individuals and the community at large. His chats were trigger-rich, at least if Plato’s accounts of them are
even halfway correct representations of the original events. I imagine Plato
wrote the Socratic dialogues more to catch the spirit of that great unpublished
philosopher than to try to conjure from memory the exact words, or even to
capture Socrates’ own recollection of them, which could have been recounted to Plato privately, over a glass of wine or two.
What’s my point? My point is really a question. Is triggering all that bad, especially on a college campus? I mean, when I went to college, my professors challenged me to think, to ask questions that made me uncomfortable, to consider issues from angles and vantage points that had hitherto been foreign to my way of thinking. In fact, my best professors were more trigger-happy than Billy the Kid, maybe even than Al Capone. One professor, Mary Schweitzer De Grys, who taught an anthropology course I took on South American cultures (quite a large topic), forced me to ask hard questions about class structure and urban development, and in so doing provoked a level of compassion in me for South America’s urban poor (and, synecdochically, all impoverished people) that I had not hitherto known. Dr. De Grys was, I suppose, what would now be called a triggerer. And, as this is Teacher Appreciation Week, it is appropriate that I, all these years later, thank her for it.
My friend, the philologist whom I mention from time to time, may or may not have been a spy for the United States government, and if he were, then at one point perhaps he was more than a mere triggerer—perhaps he was a triggerman. I’m not entirely sure about that, but nowadays he seems to have all the marks of at least a triggerer, insofar as he is an educator and seems to take his role as one quite seriously. On that note, I shall close these natterings, with the hope that that same friend doesn’t get punched in the face for challenging his students. Yet, now that I think about it, if he should, chances are that he will have deserved it anyway.
It is just too easy to become jaded these days. The last two blogs have perhaps revealed a bit of my personal frustration about living in an age where it seems that ideas (admittedly only ideas)—sometimes known as “values,” such as truth, goodness, justice, which Plato called the “forms”—are no longer valued by folks so much. Rather, personal goals seem to come first, no matter what they might be. In other words, what is deemed valuable is any individual’s personal agenda, and facile applause follows achieving that, with little thought given to the value of that enterprise or its value to the common good. The idea of community is lost, it seems, or at least placed far behind the notion of the individual’s personal growth, even if that growth is in a direction that just may in fact be harmful to those around that individual, or at the very least, in conflict with what had hitherto been regarded as transcendent values.
Assuming I am even partly right about what I have suggested above, then one might have every right to ask the following tough question: “How can I, in the face of changing values or, better put, the devaluation of traditional values, do or even say anything of value?” And I spent some time thinking about this very thing this week, and it came to me that there really is only one thing that one can do to make a difference in an Orwellian world such as I have described.
That difference can be traced, I’m sorry to say, to a source. I say sorry because the notion of any source aside from the individual is, these days, rather unpopular. The individual, it is believed, has the capacity and, more importantly, the right to determine for him or herself what is right, or should I say to determine what is right for him or herself. These palindromic notions seem, as I hint at in the opening paragraph, to be essentially the same thing. But for those of us who might want to suggest a different, less popular and, yes, I’m afraid, traditional perspective, we will look to find the source that I speak of.
That source is a mountain. Not one of the seven hills of Rome, not Athens’ Mars Hill, not Dharamsala in the Tibetan Himalayas, not even Mt. Zion in Israel. No, it is a much smaller “mountain,” really only a hill, one you probably have never heard of, known as Har HaOsher. It lies between Capernaum and Gennesaret, where once, it is said, an itinerant rabbi spoke something called the Beatitudes. These teachings can be summed up with any one of a number of quite positive words like grace, compassion, even love. Among those summary words, however, one stands out to me: redemption. They are redemptive teachings, blessings on those who seek to practice even a fraction of them. That rabbi broke that blessing into bite-sized pieces. They’re not hard to do; they don’t lie “beyond the sea, that thou shouldest say, ‘Who will cross the sea, get it and proclaim it to us so that we may follow it?’” No, “the word is very nigh unto thee, in thy mouth, and in thy heart, that thou mayest do it.”
If there is a solution to a world whose values are in dissolution, then, it seems to me, the way through the chaos may just be to speak redemption, to show compassion and kindness to everyone we encounter. That rabbi did that very thing when the world he inherited was in at least as much disarray as ours is today. He chose to bless, to redeem. Perhaps we can, too, if we put our mind to it. After all, if we look for it, that redemptive word may just be very nigh unto us, already in our mouths and our hearts. And if it is, perhaps we should just speak it, for redemptive speech might be the first step toward a better world, precisely as it was quite a long time ago on a hill in Galilee.
Coincidentally, I was in a hotel shuttle with a couple who hail from Oskarshamn, Sweden. “What a small world,” I said. “One of my favorite authors, Axel Munthe, comes from there.”
“Oh, yes,” they said, “we love Axel Munthe.” They were on their way to Disneyland, but I was on quite another errand: consulting for a Californian liberal arts college.
“It’s a small world after all,” I said, not being able to resist, once I had discovered where they were heading. Chuckles all round.
But the essence of today’s blog is yet another coincidence. Not that seeing my old friend from high school was coincidental, for it was not. Indeed, a few weeks before we had planned the rendez-vous at a restaurant on the San Clemente pier; and what spectacular views of the Pacific coast can be seen from that pier! And the conversation was loaded with coincidences, too, if you believe in that sort of thing, for it takes a certain kind of faith to believe in coincidence. I haven’t that faith; I rather invest mine in Providence.
A quick synopsis of the conversation with John: life, family—kids in particular—jobs. And that is when it got interesting—how he had gotten his current position through a labyrinth of coincidences. And mine, too, I said. How I had come to be writing what I am writing now—no, I shan’t tell you, my reader, as that must remain between me and John until it is completed—and so much more. My work in California, and the potential for more where that came from, and on and on. All of which was loaded with coincidences, coincidences that can, in my view, best be explained by Providence, as it seemed that some of them were so coincidental as to suggest the intervention of a divine hand, a divine plan.
“As you know, I am a moral agnostic,” John said, and then he added with a wry smile, “Probably the only happy agnostic I know.” I agreed that he is one of the few truly content moral agnostics that I know. And I agreed that he is moral, for he is. He lives by a moral code. And in spite of his clearly moral posture, a friend had, he shared with me, given him a copy of Lee Strobel’s The Case for Christ. I told John about an old friend of mine, a doctor also named John, who had read that book and become a Christian.
“Yet,” I added, “I think you would enjoy C.S. Lewis’ Mere Christianity more. It’s really written for moral agnostics.” I then recapitulated a bit about C.S. Lewis’ life and his connection to J.R.R. Tolkien and the other Inklings.
We parted, John generously picked up the tab, and I got in my car and thought of what I should have added, of course, about morality, for I agreed with him that these days our society needs a good dose of morality and its twin sister civility. But what I didn’t state as clearly as I might have is that morality must have a source, an authority outside of ourselves, for if morality just comes from within us, one person’s morality could look very different from that of another’s. One person might justify stealing or lying or coercing or bullying and even casting aspersions on someone as means to a greater end, while another might see lying or the other nasty behaviors just enumerated as wrong under nearly all circumstances, or even all circumstances. In other words, as Lewis shows deftly in Mere Christianity, we are ourselves not the buoys or the stars, and we are certainly not the compass or the magnetic poles. We are, rather, the ships, or better the pilots of our own ships, and sailing out of line can damage or even sink our neighbors’ ships, too. Without doubt we, as captains, can and sometimes must use dead reckoning to sail, but that would only be on a cloudy day when we can’t see the sky and have misplaced our compass. So, being moral is great—good ship captains are welcome—but it is necessarily derivative. And then the question becomes, derivative of what source? And that source does in fact matter very much. Do we really want it to be textless, ever-shifting cultural groupthink? Are there not founts (maybe Cicero, Plato, Aristotle?) or an even higher source (perhaps the Ten Commandments?) that speak to our moral formation better than pop music, reality T.V. shows, Dear Abby, or the op-ed page?
Alas, I neither got that far in my thinking nor did we in our conversation. Why not? I would like to say it was only because I had a plane to catch, but in reality it was because I am not as mentally quick on my feet as I would like to pretend I am. Yet it was a delight to see an old friend, and a joy to think through the need for civil discourse in a world so fallen, so in need of kindness, so lacking in grace and forgiveness. But there I go again, sounding like someone lamenting, “In my day it was much better…” But maybe, just maybe it was, and the only way back to that day or an even brighter and better one is to find, once again, our moral moorings and, most importantly, the Source that gives those moorings their authority. Not that it was all perfectly clear even “in my day,” but maybe just knowing that it is there at all can be our first step toward what Plato calls “the good,” as we navigate in these waters that have of late become choppy in terms of morality and simple civility. But the faith to get through it, to find the moorings, and to act on their teachings—that’s where coincidence ends and Providence begins.
It is fun to go on a college campus, even that of a college you never attended. It reminds you how privileged you are. You are walking across a mall that famous scholars have walked across and, more than just famous scholars, so have some of society’s great leaders. The campus that I was on this week for a breakfast with some old friends who are heavily engaged in the academic enterprise once held the soles of the shoes of visiting lecturers such as LBJ, Margaret Thatcher, Desmond Tutu, and Ronald Reagan. So it is that a college or a university campus has a way of making you feel small, small in a good way—small, as in part of something greater than yourself—young and fresh, and eager to learn, whatever your age might be.
Yet today you find two dangerous, and perhaps not unrelated, trends developing on college campuses. These were among the otherwise quite pleasant topics of conversation that I had during my breakfast with old friends when I found myself visiting a local college this past week. The friends and I had been in a Think Tank, or if not quite that, a talent cluster in which, years ago, we had spent a few weeks thinking together about how best to lead, considering leadership in a variety of settings. And now we had all grown in different directions but, on the invitation of one of us, we were once again sitting and talking delightfully in a campus dining establishment, enjoying a delicious breakfast and a rich, multifarious and, even for a few moments, disturbing conversation.
I say disturbing because we happened to light upon a ghastly topic, which is one of the two trends that I mentioned above: campus rape. We agreed that it is much more widely reported now than it had been even fifteen years ago when we had been in our select group together. And that, of course, was good. We agreed, too, that in the current climate the alleged aggressor was more or less guilty until proven innocent—not a good thing but perhaps apotropaic or at least admonitory. We spoke about the relative lack of a moral code among college students today, with relative being the operative word, as the notion behind the phrase “it’s all relative” (an old phrase now) had, over the last twenty years, not just gained ground but flat-out triumphed. Then we all laughed, as we knew that now we, too, sounded “old,” as we once thought people in their fifties sounded when we were in our twenties or thirties.
But sadly we only brush-stroked part of the solution to the current amoral climate. Let me define “amoral” here before I try to address the solution. By amoral I mean not simply that rapes happen on a college campus, but that many young men and women, whether of religious upbringing or not, nowadays are swift to engage in premarital sex. I’m not saying that premarital sex didn’t happen when I was in college—indeed, it did, as my generation found itself in the midst of the so-called sexual revolution. But I am saying that the trend toward premarital sex as the norm that began then has by now supplanted, by and large, even the attempt at chastity. Fewer people come to college with a moral foundation that was forged in their homes; or, if they do, their parents would seem conveniently to have left out the idea that sex is a special thing to be enjoyed by a married couple, not by just any two people who find each other attractive.
Why? Sociologists and many journalists would say that this is the case, at least in part, because the parents themselves had sex before they were married, whether with each other or multiple other partners. Now parents would seem to feel it is hypocritical to tell their children that they should be married first. Besides, many may reason, that kind of legalistic thought is old-fashioned, not part of today’s mainstream thought, whether that be simply the popular morality one hears espoused at a Starbucks on a Saturday morning or that one might hear in a mainstream church. And we want to be in the mainstream, we want to keep in step with our environment, to do what the world around us is doing. Right?
Let me now return to the setting of the delightful breakfast, delightful in every way except, of course, the sad moment when we considered campus rape. It seems to me that the current way of dealing with the vast problem of campus rape is to create a thoroughgoing legalistic culture, with “Report It!” reminders everywhere adorning a college campus—on T-shirts, on posters, on the university webpage—all prompts to the young person that she (or occasionally he) needs to let the authorities know if something dreadful has occurred. Certainly that is important, as the gathering of proof must be done almost immediately after a violent act such as sexual assault.
But to get at the underlying causes—to prevent rape from happening in the first place—that seems to me to be something that should ideally first come from a home environment that teaches young folks that their bodies are not commodities to be “had” by another or “used” by themselves, even if the use is intended to be the beginning of a beautiful relationship. That is still “use,” maybe even abuse. Secondarily—and this, too, runs counter to mainstream thought—perhaps another arena in which discussions about one’s body and one’s sexuality might come into play could be a college classroom, via literature. If a student has the opportunity to read Virgil’s fourth Aeneid and have a robust discussion about it, maybe, just maybe, he or she can see the unintended consequences of a relationship founded on sex (what Dido saw as marriage, Aeneas saw as a fling). If those same students might read C.S. Lewis’ The Four Loves, or read about tragic love in Shakespeare’s Romeo and Juliet or the humorous circumstances of courtship in Love’s Labour’s Lost, then real conversations might be held on a college campus—conversations between friends, flowing from classroom to dormitory—about love, whereby love might be distinguished from lust and so on. I know in my college that very thing happened. I can remember Plato spurring conversations about ideas, Aristotle about virtue, Augustine about life’s journey and God’s call.
“Take away those great books,” I said as I directed the discussion to the second topic that I referred to above, “and you take away the opportunities for rich and meaningful conversations. You’ve changed ‘liberal’ education to ‘illiberal’ education. As learning becomes more and more career-oriented, we should expect our young folks to see their education as merely a means to an end, and their bodies, too, as merely something to be used with a view to a goal—even a good goal, such as a loving relationship. That good goal of the loving, perhaps even monogamous relationship,” I waxed on, “parallels the good goal of eventual gainful employment. But the means by which each is achieved—that makes all the difference.”
I was done. As you may have guessed, I had managed to throw a wet blanket over an otherwise delightful social event. I succeeded in wiggling my way out of the momentary yet deafening silence that followed my disputation by making a quip about my penchant for biking just about everywhere and my friends thinking it is because I’ve had a DUI. They laughed about that heartily. But I meant what I had said. The solution to our social ills cannot rely exclusively on the moral formation that may or may not occur in the home. Years ago that environment may have been the incubator of virtue; it is no longer. Rather, it may be that the last bastion of moral formation lies in books, books with great ideas and great ideals, perhaps out-of-fashion but never out-of-date. These ideals, shared via literature with many of the great men and women who came before, might just make us feel small in a good way, a part of something greater than ourselves, and eager to keep on learning, whatever our age may be.
I have a good and richly devout friend who says no one but God can really change anyone. All change, he insists, must come from on high. Well, at some deep, theological level, he may just be right. But in the world in which I live, I’ve seen a lot of things help one at least to see the need for change, and therefore, I think, it may be useful to look carefully at my friend’s formula. Maybe there are a lot of different ways that God changes people. Could he do so through other people, especially those involved in one’s life in certain key ways?
Long ago (in 1372, to be precise) Boccaccio wrote to Petrarch, suggesting that he had been put on the right path by none other than Petrarch himself. That path, Boccaccio states, is the “ancient path” that Petrarch had traced out with so much vigor and talent that “he could not be stopped by any obstacle or even by the difficult road.” Petrarch was, in fact, Boccaccio’s teacher. And what Boccaccio had learned from Petrarch was presumably the same thing that another teacher of rhetoric, a millennium earlier, had tried to teach his students: the path of virtue, a path opened by rhetoric and persuasion. That ancient teacher was named Cicero, the Roman statesman/philosopher par excellence. But more on him another time.
For now, I would prefer to return to my friend’s central premise, namely that God alone can transform someone. Again, that may be true in a theological sense, but in a practical sense, I think I agree with Boccaccio: education can transform, and a great teacher in particular—one who need not be a Petrarch or a Cicero—plays a peculiar role in that transformational work. Thus, what is known as a liberal arts education can produce some startling and quite valuable results.
Indeed, I would say that the most valuable thing I own is not my great-great-grandmother Lucy Hughes Jones’ teapot or her not-quite-Welsh (really Bavarian) cheese plate or even the old black trunk that transported them both, but my liberal education. At Dickinson I read Milton for the first time, and he taught me to understand what faith was long before I had faith to speak of. Plato led me to think about the best things—he called them forms—and he did so in his original Greek. Shakespeare taught me how to laugh, to care, to love and even to speak and write more dexterously. And Richard Wright made me at least a bit more aware of what it is like to be scared, to make mistakes, and to understand such fear and error by looking through a poignantly pathetic character’s eyes.
And these were just the literature classes. I took an anthropology class, too, that educated me as to how poor so many folks in this world are. Subsequently, I would myself go to China and, later, Ethiopia and understand in person what I had read about and studied years before. And history, what can I say about that? I learned to love history from a great professor named Leon Fitts. He could bring Rome alive like no other. For another history class, I wrote a paper about my family’s history. Was that the prototype of The Curious Autobiography? I’m not sure, but I think it may have had something ultimately to do with the scribbling down of that collection of tales. And Latin. Where do I start? Where do I end? If, in the manner of the forty-third verse of Virgil’s second Georgic, I had a hundred tongues and a hundred mouths, could I ever truly explain?
What changed me the most? While I agree with my devout friend that encountering and wrestling with God is the most transformative moment one can have, one of the most important ways change has come to me is through the echoing ideas that found a permanent seat in my mind during my college years. In any case, I know the answer to a question a bit different from the one that opens this paragraph. That question is simply what the most valuable thing I own might be. I can say without hesitation that that most prized thing is my liberal arts education—not the degree itself but the degree to which it changed the way I think—for by it I learned to embark on Boccaccio’s (or was it Petrarch’s?) ancient path and to appreciate life’s journey along the difficult road.
“You see,” I recall him saying as we stood on the dank stairwell of Dickinson College’s Old East Building located at the northeast end of the campus mall, “It is very simple, pal. Either there is one or there is not.” The one he was referring to is, of course, God. Dr. Philip Lockhart had the uniquely Presbyterian knack—to wit, the Westminster Shorter Catechism—of taking the difficult and reducing it to something highly condensed and yet entirely comprehensible.
In response to my query based on the conversation that Dr. Lockhart and I had on that old stairway, Roz replied, “Yes, of course, yes, yes, of course I do.” Roz, along with her husband and nephew, just happened to sit next to me in an airport restaurant in Toronto, where I spent a large portion of the day waiting for my sempiternally delayed plane. Indeed her response was enthusiastic: “I am a Jew. Of course I believe in God.”
I had only asked that basic theological question because I was offering her a slice of the story of Elaine Jakes, a story quite improbable—well, you know if you’ve read the book. For the fact that Elaine had been, mostly at different times, a Jew, Chinese, and African American reveals how each individual vignette elicits the annoying question as to whether the sum of the details of her story could just be coincidence. Frankly, it is just easier to explain if the person you’re telling it to begins with at least a hint of faith—in Roz’ case a good bit more than a hint.
Thus could I relay more confidently one of those improbable stories from the book, and thus did she smile, even chuckle, with amusement and delight. “But how did it happen that you became a writer?” she asked. “How could you decide to become a writer and study Greek and Latin, no less, in college? These are not highly marketable subjects.”
“That same professor,” I said, “Phil Lockhart, directed me to listen to the voices of the past, to hear what the ancients could tell me not only about history and art and battles but about honor, and justice, and bravery. ‘The words of Plato, Cicero, and Virgil,’ old Dr. Lockhart so sagely said, ‘resound eternally. Learn Latin and Greek so that you can press your ear to the pane of glass and hear them for yourself.’ And he was right, of course. Dr. Lockhart was, like my mother, always right.”
Roz, a lawyer by trade raising a son of her own, was astounded, “So a teacher, a single teacher made such a big impact on you?”
“Yes,” I said, “and so did and still do the voices he referred to that I was able to hear through the glass pane. I can still hear Dr. Lockhart’s voice as if it were yesterday. And I learned enough Greek and Latin in college to begin to hear those other, older voices pretty well.”
“But how did you happen to take Latin or Greek in the first place?”
“Well, this is the part that requires some measure of the faith we spoke of earlier, for it, too, involves an improbable string of coincidences. I wound up in Latin simply because on one solitary evening no less than three people—an Alpha Chi Rho fraternity brother whose name escapes me, a future college president named Chris Reber, and his roommate, Russ Fry, if I am recalling his name correctly after so many years, all told me to take Latin instead of waiting a semester for a spot in French to open up. ‘The prof is great,’ they all said independently of one another; ‘You simply have to take Latin!’ or something to that effect.”
And that prof was, of course, none other than Phil Lockhart. “He and the voices behind the pane of glass,” I continued, “all left quite an impression on me, ever directing me to higher moral ground, better thoughts, nobler action. Plato taught me something like faith, Cicero, honor, and Virgil, compassion, I think. Perhaps, Virgil taught me a bit more than just compassion; perhaps they all taught me more than those solitary ideals. And Dr. Lockhart …,” I paused, “taught me not only how to read them and understand their words but also how to write and speak and think.”
“I wish my son would have such a teacher and experience of college.”
“I hope,” I said, “that he does, too. I hope that he gets a chance to hear the voices behind the pane.”
“I wish I could have that kind of education myself,” she added. “Where can I learn something of this? Do you have a podcast?”
I think it was Roz who asked about the podcast—or was it the woman sitting next to me on the plane? In any case, it was now twice in one day that someone had asked me this question, for at breakfast Marci, a kind woman from Pennsylvania staying at our lovely bed and breakfast in Toronto (Elegant Cabbagetown), had asked the same question.
“Alas, no, but I think that The Curious Autobiography tells a lot of the story and can direct you toward some of the ideas and ideals I spoke of earlier.”
“I will buy it and read it!” Roz said enthusiastically.
Oddly enough, Marci had said the same thing at breakfast. Marci and Roz, if you are reading this now, I hope you can hear Elaine’s voice behind the glass. Her ideas and even her perception of life are built upon the great thoughts of the past. She is there, right now, just beyond the windowpane, sharing a pot of tea with Dr. Lockhart. Unless I am mistaken, it just may be that Cicero or Plato is sitting there with them.
Now I don’t often mention the same person, in this case Mother Teresa, in a blog within a two- or three-week sequence because it seems to me a tad tautological to do so. But this time, I suppose, I have to, because two things happened this week that made me think of Platonic forms. One had to do with socks that the football player Colin Kaepernick wore during a recent game. The apparel in question portrayed police officers as pigs. For the sake of clarity, I quote here from Josh Peters’ USA Today article that records the words spoken by the executive director of the National Association of Police Organizations, Bill Johnson:
“It’s just ridiculous that the same league that prohibits the Dallas [Cowboys] football club from honoring the slain officers in their community with their uniforms stands silent when Kaepernick is dishonoring police officers with what he is wearing on the field.”
It seems to me that I needn’t quip, “He has a point.” I imagine that is self-evident to any sound thinking person. Rather, let me quote Kaepernick’s Instagram response:
“I wore these socks, in the past, because the rogue cops that are allowed to hold positions in police departments not only put the community in danger, but also put the cops that have the right intentions in danger by creating an environment of tension and mistrust.”
I shall leave Mr. Kaepernick’s misunderstanding of the proper use of commas aside for the moment; to point that up would be petulant. Rather, let us look at what he is saying. He seems to me to be stating that the exception is more important than the rule. Put another way, he is suggesting that the particular is more important than the form. And that worldview also explains why he won’t stand for the national anthem. The exceptions to the basic goodness of his country have, in his mind, become of greater weight than the idea or ideal that the country could possibly represent. Even any semblance of such an ideal is absent. It’s gone.
Where did it go? I suppose it followed Hemley Gonzalez, who went to India in 2008 “to take some time off and get in touch with [his] compassionate and creative side.” How thoughtful of him. Hemley (if I may) “decided to split [his] time in India between backpacking and volunteering, giving them two months of [his] time and energy; it was then that [he] discovered the serious medical negligence that had been taking place for quite a while inside the organization [Mother Teresa’s Missionaries of Charity] and began to document and report the abuse [he] witnessed.” Now I can only imagine that our friend Hemley hitherto had enjoyed very little familiarity with a third-world nation. Hemley had probably not gone backpacking through Ethiopia. What a novel concept: touring a place where people are dying every day from starvation. And then, just to ensure that he’d done his part for humanity, good-deed-doing Hemley documented the “abuses” of those trying desperately to help the starving and afflicted. Novel indeed.
I need not go on. Our friend Hemley made in 2008 the same mistake that Colin Kaepernick is making now merely by putting on his socks in the morning and sitting through the national anthem. Both decided, at some fundamental level, that the form of goodness (in Kaepernick’s case, merely the country, but in our friend Hemley’s case, God Himself and the vicar of God on earth—in this case Mother Teresa; wow!) is less important than the particular exceptions to the rule. Among the abuses Hemley found that the heirs of Mother Teresa (for she was no longer living in 2008) had perpetrated was a dearth of adequate utensils for the treatment of the very ill of Calcutta; they were discovered to have reused some of the equipment that had not been sterilized to Hemley’s satisfaction. I can just imagine Hemley’s version of Jiminy Cricket, perched right on his shoulder, saying in his ear, “Good heavens, Hemley! What abuse you’ve stumbled upon! Report it! Go tell it on the mountain, over the hills and everywhere! Yea, even from the Himalayas!”
I can only hope to sit next to Hemley the next time I fly. He reminds me of a businessman I once met on an airplane. I got a one-day pass to a fight club and, in my mind at least, I smacked him: but you’ll have to go back to another blog for that account. In the meantime, I recommend to our friend Hemley that the next time he wishes to get in touch with his compassionate side—which side is that, right or left?—or his creative side, he simply go to the local humane society for the former and an art exhibition for the latter. As for our quarterbacking friend, I think he needs to grow up. (As an aside, may I say that I would love to have been lucky enough to be the patriotic blitzing linebacker who has him in his sights the next time they step on the gridiron together?)
More importantly, I would wish that both gentlemen take the time to read Plato’s Euthyphro, for they both are Euthyphro. They have particularized the moment. They have forgotten that there are ideas that transcend any individual event. And perhaps they are emblematic of the way many of us, in America at least, think today. So I close with that thought, not a one-day pass to the fight club, but simply the notion that too often nowadays we are so quick to wish to glut ourselves with an ephemeral “correction” of a particular instance of injustice that we lose sight of the greater good, the form of Justice itself. This Labor Day, which represents the hard work of our citizens rather than any particular instance of it and is therefore a Platonic holiday, may we think about something nobler, something ennobling, something to which to aspire that we might inspire others. Mother Teresa did that, and tomorrow, though she was merely a fallen human being like the rest of us, she will be made a saint. Let us look to the hills whence cometh our strength. Hemley, come down from there! Like Mr. Kaepernick, you’re simply out of your league.
Moral relativism is not new. It has been around since Gorgias of Leontini (in Sicily) arrived in Athens in 427 BC, and really even before that. In his Protagoras, Plato interpreted the dictum of the dialogue’s namesake, “Man is the measure of all things,” as an advocacy of moral relativism, i.e., the view that any human being is capable of determining what truth is from a personal vantage point. In other words, from the mid-fifth century BC on, Protagoras’ view was established as a rival to the notion of a moral absolute, an early form of phenomenalism suggesting that a single individual deems true what is true for that person. It would quickly devolve into the essentially nihilistic view expressed by the Sicilian sophist Gorgias in his now lost (but preserved piecemeal in two other sources) treatise entitled On Non-Existence, which suggests that nothing exists (i.e., being has no existence) or, if it were to exist, what it consists of would be impossible to know, explain, or understand. (Coincidentally, these are the very opinions most of my agnostic friends advance about God.)
The most important aspect of Gorgias’ argument—what he has successfully transmitted to the modern age—is that there is no such thing as an objective point of view, for each individual’s point of view is precisely that—individual. And that is where his argument dovetails with Protagoras, and it is on that confluence that I want to focus this blog, for I met a man in Italy who happened to be advancing essentially the same argument as that of Gorgias and Protagoras.
Now a disclaimer: normally these kinds of conversations happen to me on an aircraft, but this time it was at a bar. Still, the argument, which I am paraphrasing here, was worthy of any aircraft: it was stated in very anti-Platonic terms (but of course, as it is essentially a sophistic argument) that since there is no objective vantage point, all moral codes are constructs. No one can say whether any is better than another or, for that matter, which is good at all, since even the notion of good is a construct. Put metaphorically, there is no “north”; there is only an agreed-upon direction that many folks say is north, but if even one person should say that north is not north, then there can’t be a true north. Or, even if there is a true north, it is not knowable, as each person interprets the direction “north” in his or her own way.
On this view, the question of what north is ultimately becomes a preference—do I find north preferable or not? I may have my own ideas about north, but those are just my ideas, constructed for me, most likely, out of the worldview that I inherited. So, even if I say I prefer my interpretation of north I cannot discount another person’s interpretation of north, which might really be east, or south, or west, or some other direction. I cannot say to that person, “No, if you go west when you’re intending to go north it will be quite dangerous for you. I really want to dissuade you from taking the wrong direction.”
And the reason one should not do that, according to the view of the man at the bar, is that we ourselves actually can’t possibly “know,” however certain we may feel about it, where north really is; we only know what we prefer about what is called north, and we may like (or simply be habituated to) our own “north,” but we have to recognize that someone else’s west might serve just as well as a north as our own does.
This sounds clever and, at first blush, even generous. Let’s start with the positive: it is generous and very “non-judgmental”—so much so, though, that even when it sees someone going the wrong way, it does not intervene, on the principle that true north is not a knowable concept. To press the north analogy just a bit, one might say, “After all, true north is not precisely magnetic north, which itself differs from grid north. So, who is to say what ‘north’ really is anyway?” And thus it is that the person who has thoroughly adopted this mindset cannot intervene when someone is going the wrong way, for he should not presume to know that his own way is the right way. He prefers his direction, but it is only a preference.
The only comfort I can find in this argument, really, is that it is an old one; as Solomon wrote (though obviously not in Latin), nihil novum sub sole, and he was right: there is nothing new under the sun. The relativistic argument has been recycled nowadays and fobbed off as new, sc. post-modern. But really it is very un-modern, a bit humdrum, and in any case very old. And it is also countered not only by the obvious—that we do exist and that there is such a thing as life, liberty and happiness, honor, dignity and worth—but by the fact that north itself does exist, entirely independent of us, our point of view, or even whether or not our compass should be working properly. What we call “north” may vary both in terms of precisely where it is (as magnetic north does move a bit) and by what it is called—the Chinese (Mandarin) word for north is Bei, the Japanese is Kita (though the symbol [北] for both is virtually the same, since the Japanese calligraphic kanji is based on Chinese Hanzi), the Hebrew is tzafon, the Hungarian is északi; yet despite all these differences, north is, in the end, indeed northward, however tautological that may sound. Since that is true, it is especially important to call attention to the direction in which north lies when we find a person heading west but thinking that he is going north, a person who, we know, is clearly sailing into dangerous waters.
Thus it is not ethnocentric cultural superiority to say to the cannibal that it is simply wrong to kill and eat one’s fellow human being. Nor is it a matter of going too far to say that if one sees a woman being beaten by a man, it is good, even necessary, to intervene. It is not wrong to tackle a bad guy who is running from the police, not wrong to prevent a terrorist from being successful in his attack (if it should fall to one’s lot to be in a position to do so), not wrong to stop any act of sheer evil. It is not the case that we should say to ourselves, “But I can’t know what the precise motives of that person happen to be, nor can I say that this or that person’s version of right and wrong is the same as my own, so I can’t and shouldn’t intervene.” We are not hardwired to conform to the non-interventionist “prime directive” of the old Star Trek series—the consistent failure to observe which, by the way, made Captain Kirk the admirable hero of the series; indeed, do we not innately wish to do precisely what Kirk does?
Thus we are born with an internal compass that suggests to everyone from every culture a sense of right and wrong, and those of us who can recognize true north actually have a kind of moral obligation—for we ultimately believe in morality, that morality is something given to us by a higher power, by God himself—to direct lovingly, wherever possible, those who are so far off track, whose moral compass is so broken, that they are likely to render harm to themselves or others. Is that ethnocentric cultural superiority? Someone might try to make that argument, but the moral code I am referring to as “north” has been transmitted by the votes of what G. K. Chesterton calls the “Democracy of the Dead,” handed down in many cases by wise teachers like Socrates, Jesus, Gandhi, Martin Luther King Jr., and, most recently, Mother Teresa of Calcutta. All of those individuals had a pretty good idea of the direction in which north lies. And what has been demonstrated for us by their example is instilled in us, ultimately, by God.
In closing, what can we learn from my friend at the bar? Well, first, we should recall that his ideas are not new: they are very old. They devolve from Protagoras and Gorgias. Second, we can learn that while being empathetic and seeking to understand, as best one can, the point of view of another is certainly a good thing—love your neighbor as yourself is an unqualified command—that does not mean that to do so we must deny our God-given internal compass. (And one should be very careful here, for if we deny it long enough, we may corrupt it or simply lose it, as so many of those who have joined the ranks of ISIS clearly have.) Rather, let us gauge our journey by the North Star, which means, from time to time, if we are following the internal compass aright, we may even have to direct others trying to find their way on the same path on which we are going. I am heading north; please feel free to join me.
 This treatise, by the way, enjoys the highly ironic title Περὶ τοῦ μὴ ὄντος ἢ Περὶ φύσεως, which, when translated, means “‘On Not Being’ or ‘On Nature’,” the latter of which the former clearly undermines.
In a recent article about the terribly difficult subject of suicide, Gretchen Winter and Gillian Mohney write, “the suicide rate went up 24 percent between 1999 and 2014 … according to the CDC.” The article notes that one particular group, middle-aged women, saw a huge uptick in suicides: 63%. Among males, the middle-aged also saw a large increase, some 43%. The authors go on to quote Jane Pearson, chair of the Suicide Research Consortium at the National Institute of Mental Health: “We don’t know why. We would like to know why. Knowing it’s going up, we are concerned, but we are not surprised because we have seen this trend happening.” Pearson, who in the article suggests stress is a root cause, added that more research is needed, especially because suicide is currently the tenth leading cause of death in the United States.
Another person consulted in the article, Dr. Russell Rothman, who serves as Assistant Vice Chancellor for health research at Vanderbilt University Medical Center, noted that this particular increase comes even after numerous public health initiatives targeting suicide in particular. He, too, is quoted: “The fact that we’re seeing increasing rates particularly among women—it’s probably multifactorially related to economic pressure, social pressure, our culture around acceptance of suicide.”
I can brook the facile and probably only partially accurate assumptions about economic pressures and social pressures (whatever those might be), I can even pass over Jane Pearson’s surprising lack of surprise at the rise in rates, but this last bit, the “acceptance of suicide” is what jumped out at me. It must have jumped out, too, at the authors of the article, for they took the liberty of adding in square brackets [and stigma] before “of suicide.” Yet that is apparently not what Assistant Vice Chancellor Rothman said.
It struck me precisely because I found myself wondering how it is that suicide has become acceptable. It hasn’t always been so. For the great Catholic writer G.K. Chesterton, suicide is “the sin … the ultimate and absolute evil.” He expands on this somewhat unsympathetically, viewing it as “the refusal to take an interest in existence; the refusal to take the oath of loyalty to life. The man who kills a man, kills a man. The man who kills himself, kills all men.”
By citing Chesterton’s indictment of the overarching notion of suicide, I do not wish to suggest that we, who in terms of greater knowledge of mental health issues have the advantage of living several generations after Chesterton, should fail to acknowledge the complexity of the issue or in any way lack sympathy for someone who struggles with depression or other manifestations of mental illness that sadly can lead to suicide. What, à la the seventeenth-century scholar Robert Burton, might have been deemed by Chesterton merely a bout of melancholy is now correctly recognized as depression, bipolar disorder, or some other serious struggle with mental illness, or even a physical illness that may cause mental problems. As a tragic and famous example of the latter, one may take the case of Robin Williams, whose death was the result of his struggle with Lewy body dementia. It is very likely not merely that he was depressed because of his illness and thus decided to end it all, but rather that a manifestation of one of the frightening hallucinations associated with the disease compelled him to take his own life. This example and others like it reveal that what we call suicide is not a monolithic, black-and-white issue but, like many things, once studied closely proves to be highly complicated.
Martin Luther understood as much. Luther’s view, recorded in 1532 in his Tischreden, was no doubt regarded at that time by the Church and the non-church alike as one more heterodox link in a long chain of heresies: “I am not of the opinion that those who kill themselves must be in our minds considered ‘damned.’ My reasoning is based on the idea that they do not kill themselves of their own volition but are simply overcome by the power of the Devil.” Luther’s view perhaps summarizes best in theological terms what I am trying to convey here in human terms. Indeed, as portrayed in perhaps the most poignant moment in the film Luther, Luther’s assessment, put into practice, is decidedly humane. If you haven’t seen the movie and this topic is one close to your heart, I would urge you to click on this link.
Yet even though Luther’s interpretation of suicide is gentle and reveals how complicated the issue is, that does not mean the concept or idea of suicide—Plato’s word for “form” is idea—can be viewed merely as an alternative or a choice. And thus, what Dr. Rothman seems to regard as modern society’s de-stigmatization of suicide might be an aspect or result of the way that suicide has been promoted as an alternative to pain. One thinks of the late Dr. Jack Kevorkian, who, even though he helped end human life—or rather precisely because he did so—is held up as a hero by so many. Laws in many states now allow physician-assisted suicide. Such a way to escape pain has of late been touted as a reasonable alternative to living a life deemed less than worth living.
While there is certainly no single source for the shift in the modern posture toward suicide, some discussions of it undoubtedly have been more influential than others. A few years before Kevorkian, in the late 1970s, Peter Singer, then a professor of Ethics at Monash University and now the Ira W. DeCamp Professor of Bioethics at the University Center for Human Values at Princeton University, took on the issue of the taking of human life. By virtue of his exalted status, an endowed professorship at an Ivy League university, it might not be an overreach to say that Dr. Singer could be viewed as “America’s ethicist.” In the academic world, his ideas, perhaps more than those of any other individual scholar, have shaped the current American ethos, the American moral climate. (I have corresponded with him a time or two by e-mail.)
On the topic of suicide, Dr. Singer is in favor not just of suicide, but even of the taking of one human life by another to alleviate pain. I haven’t time to rehearse all of Dr. Singer’s arguments here; they are easily found online. At bottom, Singer’s views come down to a kind of practical hedonism. Why are Dr. Singer’s views so widely held today? Why has America (and much of the West) shifted away from the notion of life being sacred to pleasure being sacred? In part, it has to do with mere pragmatism. Most of it, though, has to do with this: we have allowed ourselves to become removed from any sense of story, any sense that we are part of a larger narrative, a saga that has meaning the way a joke has a punch line or a story has a moral. When we remove ourselves from that way of viewing life, there can be no morality that isn’t merely practical: hence the title of Dr. Singer’s most famous book, Practical Ethics. Such ethics are situation driven, based on practical outcomes. Singer’s position is that of moral pragmatism in the extreme. If pain can be eliminated by one’s taking one’s own life, then suicide is acceptable.
Let’s look at a bit more of Chesterton’s rant against suicide for just one moment: “The Christian attitude to the martyr and the suicide was not what is so often affirmed in modern morals. It was not a matter of degree. It was not that a line must be drawn somewhere, and that the self-slayer in exaltation fell within the line, the self-slayer in sadness just beyond it. The Christian feeling evidently was not merely that the suicide was carrying martyrdom too far. The Christian feeling was furiously for one and furiously against the other: these two things that looked so much alike were at opposite ends of heaven and hell.” And thus Dante’s seventh circle. Yet perhaps Chesterton might better have said that there is simply meaning in suffering and that avoiding it at all cost is to deny that meaning.
Martin Luther’s gentle response is surely more humane than Chesterton’s harsh condemnation. Chesterton’s observation, if unlikely to help the person struggling with mental illness, nevertheless may usefully address those who prefer to sit back and theorize, who find acceptable the current acceptance of suicide that Singer’s 1979 book precipitated or at least anticipated. Can’t we find a better way than death, even if that better way should prove to be less “practical”? If we wish to do so, we shall, at some point, have to acknowledge that there is meaning in suffering; that suffering itself is not simply to be avoided at all cost; and finally, that we most certainly are a part of a grander narrative that gives meaning to our individual stories. Here’s to a proper ending to a wonderful story, the wonderful story that tells a tale of and for each and every one of us!
http://www.livescience.com/52682-what-is-lewy-body-dementia.html. I know this from firsthand experience, as Elaine Jakes died from this disease. Her hallucinations varied from the frightening—so frightening that already in her mid-50s she was being awoken by them from her deep sleep—to the benign. As an example of the latter, in the months before she moved in with us, she frequently thought she saw “the admiral” wandering about her house, which was then just across the street from our own; the admiral came with her and made frequent visits to our home, where she died roughly five years later.
 “Ego non sum in ea sententia, ut penitus damnandos eos censeam, qui se ipsos occidunt; ratio est quia sie thun es nit gern, sed superantur Diaboli potentia …” [my translation of lemma 222, 7 April 1532].
 In all fairness, Professor Singer and/or his followers might dispute what I regard as a hedonistic impulse implicit in his work. It might be fairer to qualify that impulse as one particular manifestation of hedonism (nowadays associated with a refined palate), namely Epicureanism. While the modern idea of being an “epicure” is not an aspect of Practical Ethics, the notion central to Epicureanism’s historical teaching, specifically the avoidance of pain, is very much an aspect of Singer’s work.
Ah, the infamous “lost art.” One could fill in a number of notions after the three dots in the above title. A few phrases or words come to mind: kindness, gentility, non-electronic friendship. Less serious, too: tea brewing, whittling, even for many of us, gardening. Yet here I would submit for your consideration, letter writing.
This week I had a singular experience. I received in the mail a single packet of four letters, one of which was a thank-you note written to Elaine Jakes by my beloved high school teacher, Zinaida Sprowles, whose first name means “belonging to Zeus.” And godlike she was, for Mrs. Sprowles, who is mentioned in the Curious Autobiography (p. 101), was the under-appreciated gem of the New Hope-Solebury High School faculty. Originally a Latin teacher, Zinny (for so she was called) was, by the time I had her in school, nearing the end of her career. By then they had phased out Latin (such was the trend then, as the administration could see no use for it) and relegated the tenured, and therefore not able-to-be-fired, erstwhile Latin teacher to teaching English courses, though they allowed her to retain the honors students’ section of what amounted to the best college preparatory courses at New Hope-Solebury, classes that were essentially Great Texts (or what is sometimes called Western World Literature). I was not an honors student, and thus I had no access to that track or to Mrs. Sprowles, unless she happened to teach a regular English elective.
Fortunately for me, she did just that, but it was the second term of my junior year. Hitherto I had known Mrs. Sprowles only from the school hallways. Yet, having met with Mr. Karl Richter, the school’s guidance counselor, with his help I constructed a schedule that included a strange elective—strange for me, that is, because I was a numbers kid, excelling in Physics and mathematics and a member of the geekily (but sadly all too fittingly) “Mathletic Team.” The elective in question was “Detective Literature,” and it focused almost entirely on the works of Arthur Conan Doyle and the figure of Sherlock Holmes. It was taught by none other than my hallway-only acquaintance, Mrs. Sprowles.
Class by class, Mrs. Sprowles vivaciously led discussions on the characterization of Holmes or Watson, Doyle’s craft in writing, the tension, climax, and resolution of each work, the construction of plausibility, and the list goes on. I had never encountered a teacher of this caliber. Why, I wondered, was she the only teacher in New Hope-Solebury who had no desk, no classroom? Was it some kind of less than subliminal message from the administration? In any case, she was the self-styled peripatetic pedagogue, though she was far more academic and Platonic than she was categorical and Aristotelian. In fact, that is what made Mrs. Sprowles so profoundly delightful: she was not someone who observed and put things into boxes; rather, she was utterly academic, someone who sought the highest origins and deepest forms.
And that is what must have frightened the administration of New Hope-Solebury High School in those days: the fear that students would become so enamored of learning that they would follow this peripatetic pedagogue just anywhere she might happen to meander in her academic wandering. Indeed, some of us did. Having used whatever influence she had left with Mr. Richter, she managed to squeeze me into her honors class (even though I had been, outside of math and physics, a grade-wise dishonorable student). She led me and the rest of that senior seminar to the theater of Dionysus, where through our reading we witnessed the Oresteia and came to understand the importance of justice and democracy. We would follow her to the ancient agora, where we could overhear Socrates speaking with the young, all too self-righteous, and overconfident Euthyphro in front of the Stoa Basileios. And, like all truly great educators, Socrates among them, she was misunderstood by the higher-ups.
This is the area, I think, in which Elaine Jakes and Mrs. Sprowles would have fundamentally connected, for both were educators of a similar ilk, all too often misunderstood by all but their students. Yet the letter that Mrs. Sprowles wrote was never sent, presumably because it fell into a crack in the desk or was covered over by two days’ worth of mail, and, by the time Zinny found it, it was too late to send. Why, then, did she keep it all those years? That I can never know. But I am glad that her daughter took the time to send it to me, along with three other letters written by a very young version of myself—a first-year college student at Dickinson—to his former high school teacher and inspiration, Mrs. Sprowles.
I’m not writing to say that I thought, when I read them, that my own three letters were well written or conveyed anything more than sincere appreciation to a wonderful teacher, or even that Mrs. Sprowles’ note to Elaine Jakes is anything to write home about. Rather, these four letters collectively reflect something bigger, something that is actually worth writing home about: the lost art of letter writing. It is truly a lost art, for art is an aspect of letter writing, as it involves several artistic choices.
First, one must find the right stationery. If one chooses a note format, as Mrs. Sprowles did in her unsent note to Elaine, one must ensure that the card befits the occasion, even if it is blank inside. Then there is the issue of penmanship. Here I’m afraid I fail miserably. Even my finest penmanship is shoddy at best, and I blame my fourth-grade self for snickering and treating as trivial the lessons of Mrs. Hendrickson, my teacher that year, who labored relentlessly to get me and one or two others in the class (was it Mickey? Todd?) to write more legibly. Then there is content, which of course is the most important bit. Yet even that comes out differently with a pen on paper than it does in a computer. It is not correctable on paper: one must get it right the first time.
And this is an art, an art with which I was confronted by a former generational iteration of myself. In case you’re wondering, other than the penmanship and a poor choice of stationery, I did okay. But Mrs. Sprowles’ note was far more meaningful. How good it was to see her handwriting again after so many years. How rich and thrilling to know that she had cared enough to write my mother a note—a note I would never have known about had Mrs. Sprowles actually ever sent it. And that is the key part of the art, the production of the artifact of the epistle itself.
Memory is such a funny thing: it allows us to record in some deep recess of the brain a meaningful event, and never let go. It is something like hope, but backwards. In his Confessions, Augustine demonstrates the power of memory by going back in time to his childhood and his life as a young adult and rendering it all in seven lovely and quite memorable books. But then in the eighth book he begins to shift the notion of memory around so that with the final five books he has reoriented his own and the reader’s mind as he engages ideas that are otherworldly, heavenly. The way he does this is to anchor himself and the reader in the past by memories, one upon another. Mrs. Sprowles’ short note did that for me this week, and my mind looks forward to an otherworldly hope of sitting for tea with her again in a place far away that some of us call Home. I hope she has some of her delightful cinnamon buns with that tea, for I recall the last time we met we enjoyed them together, yet another sweet memory; but a sweeter hope.