Sunday, October 4, 2015


Immanuel Kant was, in my judgment, the greatest philosopher who has ever lived, but he is very far from being my favorite philosopher.  For sheer beauty, wit, depth, and ironic distance from the philosophical bog, as Emily Dickinson would have called it, I prefer Kierkegaard.  My text for today [it is, after all, Sunday] is this brief passage from the coruscatingly brilliant Preface to Kierkegaard's Philosophical Fragments:

"It is not given to everyone to have his private tasks of meditation and reflection so happily coincident with the public interest that it becomes difficult to judge how far he serves merely himself and how far the public good.  Consider the example of Archimedes, who sat unperturbed in the contemplation of his circles while Syracuse was being taken, and the beautiful words he spoke to the Roman soldier who slew him: nolite perturbare circulos meos.  [do not disturb my circles -- ed.]"

All but overwhelmed by persistent pain, I have decided to contemplate my circles and leave it to others to decide whether such meditation serves the public good.  My topic today, as it has been on many other days, is how one ought to study philosophy, how one ought to read philosophers, and -- by extension -- how one ought to write philosophy.

My answer to these questions places me in conflict with contemporary professional philosophers, at least in the American academic philosophical world.   To state my conclusion as simply as I am able, I believe that in studying philosophy, you would be well advised to devote your time to reading the writings of the great philosophers, and that it is imperative to read the entire books they have left for us, not merely those passages in which they appear to be addressing some problem that interests you.  When it comes time to put your own thoughts into written form, you should undertake a systematic book-length consideration of the problems or topics that seize you, rather than confining what you have to say to brief essays suitable for publication in the professional journals currently admired by the inhabitants of the bog.

This answer, as I say, puts me at odds with most professional philosophers in the American academy.  When I was young, an aspirant for admission to the guild was required to write a doctoral dissertation, which was understood conventionally to be the length of a short book -- perhaps 75,000 to 100,000 words.  It goes without saying that very few dissertations actually were short books, and fewer still found publishers.  Not every garage band becomes The Beatles, after all.  But the dissertation was understood to require a breadth of learning, a care in scholarship, and a quality of sustained argument that distinguished it from the seminar papers that by then one had cranked out in such profusion.

At some point, when I was no longer paying close attention to the profession, the practice arose of substituting for the dissertation three "publishable" papers on related subjects.  These papers were to be modeled on the articles that were regularly published in professional journals, and as the competition for entry-level jobs intensified, students were encouraged actually to try to publish one or more of their "dissertation" essays, in hopes of improving their chances on the job market.

Seemingly as part of this fundamental change in the requirements for the degree [although there may be no connection here -- I simply do not know], professors stopped assigning entire books in their courses, and took to assigning selections -- a chapter here, a handful of pages there -- as though trying to communicate that Descartes or Kant or Hobbes really would have written journal articles, if only there had been journals in which to publish them.  Now that I have become somewhat more deeply embedded in the UNC Chapel Hill Philosophy Department [to use the term of art for reporters assigned to front-line fighting units], I have taken to asking the students I encounter what they are reading in their other courses and seminars.  For the most part, it seems, they read recent journal articles or selections from the classic canon of full-scale philosophical books.  There may not be a graduate student in the department who has been asked to read the entire Critique of Pure Reason, and I would bet that not one of them has plowed through all three books of A Treatise of Human Nature.

So what?  Let me attempt a reply that rises above the level of a shocked "Well I never!"

Great philosophers, as I have often observed, see more deeply on occasion than they can say.  They grasp complex conceptual relationships that may actually exceed the capacity of their received philosophical language to articulate.  A fruitful engagement with the mind of a great philosopher is therefore not merely an effort to understand what the philosopher intended to say, but also a struggle to make connections among parts of his or her text that allow one to bring to the surface and clarify one of those deep insights.  The philosopher may actually believe that the several parts of his or her text cohere comfortably, but we, coming later and with the benefit of hindsight, may recognize things going on conceptually that the philosopher either did not fully see or could not clearly state.

Let me give just two examples, taken from my own encounters with great texts.  The first example comes from David Hume's A Treatise of Human Nature.  As even the most casual students of Hume know, far and away the most famous argument in the Treatise is Hume's sceptical critique of causal inference -- the critique that awoke Kant from his "dogmatic slumbers."  That argument is found in Book I, Part iii, Section iii of the Treatise, "Why a Cause is Always Necessary," and occupies a mere three pages of text. 

Having demonstrated that we have no rational ground for asserting the necessity of connection between an event and its supposed cause, Hume goes on later in Part iii to ask whence we derive this notion of necessary connexion.  Hume's answer occupies the eighteen pages of section xiv, although the heart of it can be found in the first few pages of the section.  The key, not to dive too deeply into the weeds, is a category of mental representations that Hume labels "impressions of reflexion."

A professor of philosophy these days would, I imagine, think it satisfactory merely to assign sections iii and xiv to the students in his or her class.  If the class were being taught at the graduate level, the professor might even go so far as to assign some additional sections from part iii, as background.

But in all likelihood, unless the professor had a better philosophical education than he or she was offering his or her own students, that professor would be blithely ignorant of the fact that the category of "impressions of reflexion" was actually invented by Hume to explain the passions of love and hatred, desire and aversion, subjects not mentioned until Book Two of the Treatise.   A student who does not read the entire Treatise will never really understand what Hume is talking about.

But why not therefore just beef up the assignment with a few selected pages from Book II, or even, if one really thinks it necessary, from Book III?

Because to do so would be to deny the student the opportunity to make his or her own connections and interpretations, drawing perhaps on parts of the Treatise that I, or some other professor, did not consider provocative or suggestive or dispositive.  It would thus deny the student the opportunity to become -- a philosopher.

A second example, this one rather more serious [and also, I fear, a bit more complex to explain], comes from Kant.  A central philosophical impulse driving Kant's philosophy was his desire to make the deterministic physics of his day compatible with the freedom underpinning our actions as moral agents.  His somewhat formulaic solution was to confine Newton's laws [and Euclid's] to the realm of things as they appear to us in space and time [phenomena, so called], reserving the realm of things as they are in themselves [or noumena] for moral agency.  In organizing the extraordinary philosophical undertaking in which he would demonstrate all of this [while also making room for aesthetic judgments and heaven knows what else], Kant thought he had found a way to show that the concepts we employ in our scientific analysis of phenomena -- causation, substance, and the rest -- could have possible, consistent, meaningful application to the realm of noumena, so long as we did not make Leibniz's mistake of supposing that such application yields knowledge.

All was well, in the Kantian scheme of things, so long as one remained at a relatively superficial level [superficial for Kant, that is to say -- profound and deep for everyone else!].  But when Kant was in the depths of writing the Deduction of the Pure Concepts of Understanding, the most important passage in the Critique of Pure Reason [an effort that I have elsewhere on this blog compared to Gandalf the Grey's wrestling with the Balrog in the depths of the Mines of Moria, a struggle from which he emerged transformed as Gandalf the White], Kant fundamentally changed his analysis of the nature of concepts.  One of the clear implications of that change was that concepts such as substance and causation do not have even possible application to the realm of noumena.

And that knocks Kant's "resolution of the conflict between free will and determinism" into a cocked hat.

This problem is so serious that it calls into question Kant's entire ethical theory.  Kant himself never realized it, and neither, so far as I can tell, have any serious Kant commentators save myself.  [This is my blog, damn it, and you are just going to have to allow me to channel Mr. Toad!]

You see, Kant is so hard that for a long time, until I came along, the only person writing in English who had ever attempted books on both the First Critique and Kant's ethical theory was the Scotsman H. J. Paton, who, unfortunately, never saw a sentence by Kant that he did not unthinkingly endorse.  So people have gone on writing about Kant's ethical theory without the slightest awareness that there might be a problem.

So not only is it a very bad idea to read snippets of Kant -- the Second Analogy from the First Critique, or the famous four examples of the Categorical Imperative from the Groundwork of the Metaphysics of Morals -- it is even a very bad idea to read just Kant's theoretical philosophy without his moral philosophy, or vice versa.

Enough said.

Saturday, October 3, 2015


Please recall my severe warnings of personal incompetence.  I really meant them.

To Wallace Stevens:  Thank you.  Even at my age I ought not to have forgotten the invasion of Czechoslovakia.  Sorry about that.

Here is Nocomment's comment:  "Normally I find your analyses to be pretty accurate. This time, however, I believe you have missed something.

Having idly stood by while the US meddled and muddled in just about every Middle East country, Russia (once the proud sponsor of its OWN autocratic regimes there) cannot bear to have its last tinpot dictator taken out by the Yankees. I believe it is the US which has forced its hand. Going into Syria is not to prop up a pro-Soviet regime (Afghanistan) but to assert its rights to play at the same poker table. As the US expands its Dark Force throughout the galaxy, we will see increasing pushback from China and Russia. And as they travel down the same road as the US, expect some collisions. And proxy wars.

I'm not sure Obama (or you) are right in predicting another quagmire for Russia in a far-off outpost. Obama's hardly one to talk. What you've missed is that Russia will get out when the US gets out of Syria. Their quagmire is our quagmire."

I agree pretty much with your characterization of the situation in the first part of your comment, but I have a somewhat different view of the overall state of play.  As I see it [influenced as I am by the world view of my old friend and University of Chicago colleague Hans Morgenthau], the American imperium expanded into the power vacuum created by the collapse of the Soviet Union, and is now being challenged by Putin's attempt to re-expand Russia's influence.  I doubt that Putin is motivated simply by pique.  I would guess that his expansionist moves are connected with the weakness of the Russian economy.  Nor do I think [or guess, more accurately] that Putin will find it so easy to pull out of Syria once he commits forces there, even if the United States manages to withdraw under the color of an "international coalition."  That was my reason for invoking the Tar Baby. Putin is not more skillful than the Americans are, so far as I can see, and it will be politically very costly for him to exit from Syria, having failed to prop up Assad and to defeat ISIL [as I am convinced he will fail.]

But I repeat, I really do not know, nor do I even have the right to claim any plausibility for my guesses.  I suppose we shall have to wait and see.


Having successfully misidentified a line from Plato as having been written by Aristotle, I am emboldened to offer an opinion about something I really know nothing about.  That is, as I understand it, the raison d'être of blogging.  Herewith, therefore, my take on the decision by Vladimir Putin to thrust Russian military forces into the complex Syrian civil war.  I think it is only fair to point out that I have never set foot in any of the countries I shall be mentioning [save for the United States] and do not speak, read, or write any of the languages used by the residents of those countries.  Caveat lector.

Since the end of the Second World War, the United States and Russia have pursued quite different imperial paths.  Russia, dba [doing business as] the Soviet Union, expanded its empire almost exclusively by incorporating contiguous territories along its eastern, southern, and western borders.  At its height, the Soviet Union spanned eleven time zones and bestrode the Eurasian land mass like a colossus.  Not once during the entire post-war period did the Soviet Union engage its military forces anywhere that was not contiguous to its homeland.  Only twice that I can recall did the Soviet Union commit major military forces in a foreign action.  The first was the 1956 invasion of Hungary [one of whose many other consequences was bringing the Jesuit philosopher Zeno Vendler to Harvard as a graduate student], which took place by way of Ukraine, then a Soviet Socialist Republic.  The second was the disastrous ten-year Afghan War, launched through the contiguous SSRs of Turkmenistan and Uzbekistan.

The United States, in contrast, has not hesitated to send its military forces across the globe, to Korea, to Viet Nam, to Panama, to Grenada, to Afghanistan, to Iraq, to Syria, among many other places.  At the present time, the United States maintains well over 200,000 Armed Forces personnel in nearly one hundred fifty countries. 

Driven by a desire to reestablish some simulacrum of Soviet glory, and eager to direct the attention of Russians away from a disastrous economy in deep depression, Putin first made characteristic Russian moves into Crimea and Ukraine.  But now he has made the fateful decision to thrust Russia militarily into a region not contiguous to the homeland, even as conceived in its most expansive Soviet moment.

I will offer a prediction [which, you must understand, is scarcely worth the corner of the Cloud that it occupies]:  This will not go well for the Russians.  They will begin with surgical air strikes, which will weaken the anti-Assad forces and thereby strengthen ISIL.  Ineluctably, Putin will be drawn to supplementing his air force with "boots on the ground," first as target spotters, then as Special Forces, then as regular forces.  Like Br'er Rabbit in Joel Chandler Harris' story The Tar Baby, Putin will become more inextricably entangled in Syria the more he struggles to extricate himself from an unsuccessful military adventure.

I suspect this is what Obama had in mind during his extraordinary press conference yesterday when he described the Russian move as having been made not from strength but from weakness.

Friday, October 2, 2015


On the basis of consultation with a close friend and my sister, both of whom have suffered what I am suffering, I have tentatively concluded that the excruciating pain I have been experiencing for two months and more is caused [in some sense of caused] by polymyalgia rheumatica [PMR].  If I do have PMR, it can be treated more or less miraculously with low doses of prednisone.

I have pulled off a major coup and landed an appointment with UNC's Rheumatology Clinic for next Friday rather than their originally scheduled date of January 20th, 2016.  The Mayo Clinic website [a major resource for self-diagnosticians] mentions three blood tests that serve as markers for PMR.  With some difficulty, I persuaded my primary care physician [whom I have now managed to replace] to order the blood tests.  They did not show the markers.  A major downer.

However, the specialist I shall be seeing next Friday tells me in a message that those blood tests are not "diagnostic" for PMR.  What is "diagnostic" is successful treatment with prednisone.

In other words, if they give you prednisone and you feel better, they conclude decisively that you did indeed have what they gave you the prednisone for.

Unless I am mistaken, that is the reasoning that underlies the Hopi Rain Dance.

I am glad I have available to me the latest advances of modern medicine.

Thursday, October 1, 2015


I should like to offer a special thank you to Magpie for his kind words about Moneybags Must Be So Lucky.  I have always thought that Moneybags is, pound for pound [or page for page] the best thing I have ever written, and it distressed me deeply that, in the words of David Hume, it "fell dead-born from the press."  Indeed, it was the failure of that book to gain any notice at all [save for a nice review in The Village Voice, of all places, by George Scialabba] that turned me away from writing for a while and into administration of a sort.  I am thrilled that all these years later my little book is finding an appreciative reader.

By the bye, do all of my blog followers understand the literal meaning of the metaphor "fell dead-born from the press"?  Since I suspect not everyone does, I shall explain.  [I really am the sort of person who could cast a pall over the most delightful occasion!]

In Hume's day, women did not give birth lying on their backs.  They sat on pieces of furniture called birthing stools, using the force of gravity to assist in the birth, much as other mammals do.  Hence the verb "fell."  The image is that just as a stillborn fetus falls dead as the mother gives birth, so Hume's Treatise fell dead from the press when it was published.  Our modern metaphor "dead on arrival" probably has a similar origin, but absent the image of the birthing stool.

Since it is Hume's style, above that of all other philosophers, to which I have aspired throughout my life, I take comfort in the fact that when he published, as a young man, the greatest work of philosophy ever written in English, the early reviewers were not kind.  I can still recall walking up and down the aisles of the stacks in Widener Library in 1956, pulling down copies of eighteenth century English journals and searching for reviews of Hume's Treatise.

Well, enough of strolling down memory lane.  Thank you, Magpie.

Wednesday, September 30, 2015


I have it firmly fixed in my mind, although I do not have the textual reference, that Aristotle somewhere observes that shit does not have a form.  [The Greek scholars among you can help me out.]  Now, my work, for all of my life, has consisted of intuiting conceptual or argumentative forms and then trying to articulate them clearly and transparently.  Perhaps this is why I find it so difficult to write about the current political scene.  Bernie Sanders is the only political figure in America today whose utterances can support a conceptual analysis of any sort.  The remainder is burlesque, low artifice, or vulgar evil. 

I am sure you will understand the problem this poses for a philosophical blogger.

Tuesday, September 29, 2015


The young Marx famously wrote, "Philosophers have hitherto only interpreted the world in various ways; the point is to change it."  To which I am inclined to respond, "Fine for you to talk, Mr. World-Historical, but what about us poor schlubs whose greatest efforts do not even register on the global needle?"  Day after day, I sit in my apartment yelling at the TV when political commentators say dead stupid things.  When I cannot stand it anymore, I retreat into elaborate fantasies of magical powers with which I redistribute wealth or reverse global warming or put a piece of adhesive tape over Chris Matthews' mouth.

My latest fit of TV-yelling was triggered by some nameless member of a panel of opinionaters on Hardball, Chris Matthews' MSNBC show.  At issue was the phony "scandal" created by some opponents of Planned Parenthood, who cooked up deceptively edited videos that purport to reveal Planned Parenthood employees heartlessly bargaining over the body parts of aborted fetuses to be used in medical research.  Carly Fiorina, the failed Hewlett Packard CEO, has made it the centerpiece of her right-wing lunge for the Republican presidential nomination.

"Stupid, stupid, stupid," I shouted at my inoffensive TV set, as the idiot on the Hardball panel went on about how terrible it was that a woman in one of the videos should casually eat her lunch and sip wine while talking about fetal body parts.

So, here is what I would have said if, magically, I had been transported to the MSNBC set and had by the exercise of unimaginable powers forced Matthews to shut up for ten minutes.  Since this must all seem not only pathetic but also incomprehensible to my overseas readers, a few words of explanation are called for.

Planned Parenthood is a private non-profit organization, almost a century old now, that provides reproductive health services and associated medical services [cancer screenings, etc.] to women.  It has an annual budget of about one billion dollars, half of which comes, in one form or another, from federal and state governments.  Its important role in providing contraception to women and in performing abortions has made it the target of so-called "pro-life" forces in American politics.  Carly Fiorina is a businesswoman who worked first for the Lucent Corporation and then as CEO of Hewlett Packard, the IT giant.  As head of HP, she initiated the acquisition of Compaq, which turned out to be a disastrous mistake, causing the HP stock to lose half its value and bringing about her firing.

So much for background.  Here is what I would have liked to have the opportunity to say to that brain-dead member of the MSNBC panel ["brain-dead" does not, of course, uniquely identify her, but that is another matter.]

It is often said that in the first year of Medical School, the students must start to think of themselves as doctors, just as the first year of Law School is devoted to getting the students to think like lawyers.  Thinking of oneself as a doctor means, among many other things, adopting an utterly unnatural attitude toward the human body -- an objectifying, de-sensitizing attitude that enables the newly formed doctor to engage in such activities as physical examinations and invasive operations calmly, routinely, scientifically, and without gagging or throwing up.  One of the ways in which medical schools accomplish this is by setting the student to work, right at the beginning of the very first semester, dissecting a cadaver.  If you can even allow yourself to think about it, there is something profoundly unnatural and unsettling about picking up a scalpel and cutting into a corpse.  Those first cuts are bad enough, even with the rest of your cadaver team there to cheer you on, but imagine what it feels like to dissect a liver, a penis, an eyeball, a brain.

Doctors steel themselves for these experiences by breathing deeply, gritting their teeth, making crude locker room or funeral parlor jokes, and in every way they can denying the appalling reality of what they are doing.  Years later, those who become surgeons routinely cut open living, breathing, bleeding people.  We need them to do this, we want them to do this, because our health and our very lives depend on their ability to do so calmly, deliberately, even casually.  So a surgeon who is up to her elbows in the chest cavity of a patient, blood flowing all around, will chat about the latest episode of NCIS or discuss what she had for dinner the previous night. 

It is a profound mistake, not to say a stupid mistake, to conclude that the surgeon is heartless, or amoral, or insensitive to the humanity of the patient.  This is, after all, a highly skilled woman who is prepared to work for thirty-six hours without rest to save the life of a desperately ill patient.

Which brings me to the woman having lunch while talking about salvaging for medical research the body parts of an aborted fetus.  It is medically extremely valuable to have those body parts available for research.  Countless life-saving medical advances have resulted from such research.  But the process of cutting open a dead fetus to preserve the appropriate organs is, viewed from a normal human perspective, appalling, revolting, unthinkable -- just as appalling, revolting, and unthinkable as it is to perform open heart surgery or to remove a brain tumor.  The only way one can engage in such activities is to grow an emotional carapace that protects you from your ordinary human responses.

So when that woman on the MSNBC panel went on about how horrid it was for a Planned Parenthood employee to sip her wine and eat her lunch while discussing the harvesting of fetal organs, all I could do was shout at the TV set, "Stupid!  Stupid!  Stupid!"

Fat lot of good that did.