
Critical remarks on academia and philosophy


Socrates is a good role model for philosophers.

I was recently skimming old notes of mine and came across the following thoughts from a two-volume book on "humanism" I wanted to write in 2006, as a graduate student in philosophy. In retrospect, I find them hilarious, naïve, and overly idealistic. But in some respects admirable. Many of them I still agree with. (For instance, the insistence on intuition.) Their irreverence is refreshing, if at times a little offensive. I've deleted some of the repetitions and inapt formulations, but on the whole I like the critical attitude they display towards academia and contemporary philosophy, even if now and then they're pretty unfair. I also like the alternative conception of philosophy I outlined near the end, a sort of anti-academic conception.


(The thoughts remind me of a little story I wrote around the same time called "Imprisoned in an Ivory Tower.")


I had begun the chapter with introductory remarks on contemporary society, which I link to below. But the main point of it all was that I was unhappy with how philosophy tends to be done today, and with how "complacent"—politically, too—intellectuals usually are. I have no idea who my audience was supposed to be, since general readers wouldn't have been interested and academics would have been alienated.


***


...Having tantalized you with these thoughts on selfhood, I’m going to proceed to the main topic of this chapter, namely ‘Academia as a Case-Study in Stuntedness’. I choose academia as my target for two reasons: I’m intimately acquainted with it, and its framework is such that the people who spend their lives in it have more opportunities than anyone else in society to realize themselves in all their potential. Especially their intellectual potential. And yet they usually don’t: they tend to be somewhat unambitious, intellectually and spiritually stunted. They are über-specialized and complacent. Their attitude toward intellectual matters is not entirely healthy, I think. They are at the same time not serious enough about such matters and too serious: on the one hand, they seem to have lost the desire to really understand the world, choosing instead to write masses of technical papers [I was thinking of philosophers] in an effort to build up their curricula vitae; on the other hand, they approach their professions in the spirit of drudgery, as if writing articles were a necessary chore, a necessary part of the academic game. There is no spirit of play[1] or spontaneity in any of their writings; they just write and write, like automatons, because it’s something they have to do to rise in the academic world. Ironically, they themselves are sometimes the first to admit it’s all a meaningless “game”; yet they continue to go along with it. Of course, if they don’t, they might not be promoted, or they might lose their jobs.


The ancient quest for truth has suffered a rather ignominious demise, at least temporarily. Consider any of the postmodernist texts that proliferate in academia. Like Kenneth Gergen’s book The Saturated Self, which, when it was published in the 1990s, was heralded as a godsend by philosophical nihilists everywhere.

...Most of the cherished beliefs that undergird the traditional goals of research and teaching are in eclipse. Some consider the demise of the traditional assumptions to be an event little short of catastrophe; to part with the longstanding ideals of truth and understanding is to invite chaos, first in the academic world and then in society more generally. Others feel an innervating sense that history is at a turning point, that a new and exciting era is in the making.
...The crisis in the academy about beliefs in objective knowledge has profound implications for beliefs about the self... In large measure[,] [traditional beliefs about the self] derive their credibility from the assumption that they are objectively true. But...there is ample reason to doubt such claims in the case of human personality. The present crisis in the academy presses the argument to its radical extreme. We are not dealing here with doubts regarding claims about the truth of human character, but with the full-scale abandonment of the concept of objective truth. The argument is not that our descriptions of the self are objectively shaky, but that the very attempt to render accurate understanding is itself bankrupt. And if objective accounts of human personality are beyond possibility, then why continue the search for human essence? Whatever we are is beyond telling.[2]

Gergen’s book is not only among the silliest I’ve ever read but is “saturated” with self-contradictions. Its continued popularity testifies to widespread intellectual despair. As does the popularity of Jean-François Lyotard’s version of postmodernism:

“How do you prove the proof?” or, more generally, “Who decides the conditions of truth?” It is recognized that the conditions of truth, in other words, the rules of the game of science, are immanent in that game, that they can only be established within the bounds of a debate that is already scientific in nature, and that there is no other proof that the rules are good than the consensus extended to them by the experts.[3]

Most Anglo-American philosophers recognize that Lyotard’s famous book, like postmodernism in general, displays an impressive lack of intellectual integrity. (I explain that remark in the appendix.) Analytic philosophers usually still believe in such quaint notions as truth, intellectual rigor, deductive reasoning and standards of rationality, and recognize logical fallacies when they see them.[4] Indeed, they are fanatically committed to logical rigor—which is, in a sense, precisely the problem. For their commitment is so obsessive that it manifests itself in an exclusive concern with scholarly minutiae, mind-boggling technicalities, preciseness for the sake of preciseness.[5] While their substantive positions are not as idiotic as the postmodernist’s...one is hard-pressed to find their substantive positions. When they exist, they’re often hidden beneath a lot of verbiage. The point is no longer to find the truth about the self, or the mind-body problem, or ontological confusions, or—in the sphere of social science—the causes of a given historical event, or the workings of the economy; the point is to debate endlessly about methodology, or about what past thinkers really thought, or about the proper definition of a concept, and in general to indulge in mere intellectual exercises. Few people are confident enough anymore to attack substantive questions head-on, and few are presumptuous enough to think their work will have lasting influence. Effectively, therefore, even academics[6] with intellectual integrity don’t believe in real ‘truth’, since they don’t act on that belief.


“But the world is so complicated these days!” I hear someone shout at me. “There are so many theories, so many things to sort out! To get clear on the kind of ‘substantive truth’ you’re talking about is too hard!” First of all, ‘hard’ doesn’t mean ‘impossible’. No doubt it is exceedingly difficult to attain even a passable understanding of something like the self; it is not impossible, though. There are truths.[7] The successes of natural science have demonstrated this, at least beyond a reasonable doubt. How else could scientists act on the world—make correct predictions, etc.—if not by having understood it in some way or other? How else could the application of psychological theories result in the restoration of an individual’s self-esteem if it weren’t that the theories manifested some insight into the way the psyche really works? Or will it be denied there is such a thing as the ‘psyche in itself’? It will, by many anti-realists. Such people oppose representationalism, or the correspondence theory of truth, with neo-idealist arguments like ‘Reality is socially constructed’—thus, incidentally, confirming (some version of) the correspondence theory of truth. For their own theory is intended to portray ‘reality’ in propositional form, namely the ‘reality’ that there is no world-in-itself, no reality independent of interpretation. Any denial of truth is self-refuting in this way, not to mention contrary to common sense (which upholds the existence of facts, or states of affairs, whose truth doesn’t depend on whether people are aware of them as facts). Philosophical truths are admittedly not as purely ‘representational’ as natural-scientific truths, in that they don’t refer to (i.e., portray) physical states of affairs, but they too are explications of the world that can be evaluated on the basis of cross-cultural standards of reason.


I’m not going to delve deeply into questions about the nature of truth right now. My point is only that the goal of understanding truth is implicit in all theorizing, even that which denies truth, and that there is at least potentially a ‘best interpretation’ with regard to any object of analysis. The possibility of one is implicit in the fact that we can rank opinions in terms of rational justifiability. For example, the belief that the global economy is kept in motion by millions of fairies who sprinkle pixie-dust on money is less justified than the belief that international exchange-rates have a lot to do with it. The second interpretation is better than the first, which means that it’s possible for interpretations to be ranked according to their justifiability, which means there is potentially a best interpretation of economic functioning. Doubtless we’ll never realize this ideal interpretation, this ‘truth’, in practice, since there are too many variables that come into play, but we can approximate it. And we can approximate it to ever-greater degrees. –Unless, of course, we lack the will to truth.


[...] If academics really cared about truth they would take an interdisciplinary perspective on things, to see how they all fit together. The philosophy of language bears on epistemology; psychology is relevant to sociology; anthropology can contribute to the philosophy of history. Everyone knows this, but almost no one acts on it.


Admittedly, the situation isn’t really anyone’s fault. Ultimately, systemic factors are responsible for it. It is impossible to untangle all the specific causes—there are too many of them, and social science rarely has recourse to the reliable methods of natural science[8]—but their general outlines are intuitively obvious. As scientific technologies and the global economy have grown in complexity, greater specialization between industries and between jobs in a given industry has been necessary. This has led to greater economic productivity,[9] which has stimulated the growth of industry and thereby intensified the division of labor, in a self-reinforcing cycle. Contemporaneous with these developments has been an intensification in the division of labor between the ideological (academic) professions, for the changing material conditions have led to a correspondingly more intricate and nuanced understanding of the world. –This is just elementary Marxism. As social relations have grown more complex and interdependent, so have relations between ideas. At the same time, there has been a growth in population, a migration to the cities, an extension of university education to more and more people, and thus an expansion of the university. All this has meant more specialization between disciplines and more competition between practitioners of these disciplines (competition for jobs, for prestige, etc.). To ‘get ahead’, academics have to compete, and they have to compete according to the rules of the system, one of which is ever-greater specialization—until the system has finally reached the point where an economist, for example, can lament that


Economics has...become so broad and so complicated that, within the fields, one group of specialists barely speaks the same language as the Ph.D.s across the hall. And so much of what is published seems more to proselytize for an ideology than to make sense of the chaotic world.... It’s no wonder that a single economic development can be interpreted as a godsend or a disaster, depending on the interpreter’s frame of reference.[10]

And yet, from the scholar’s perspective, by ‘specializing’ he is only being intellectually honest and thorough. For each subfield in each discipline is so complicated that he has to be extraordinarily cautious and subtle in his reasoning, and he can’t be too ambitious because his colleagues will invariably find a way to contradict him. The more ambitious he is, the more opportunities there are to find weaknesses in his arguments. Unless he is already famous, people will probably treat his ambitiousness as naïve. So the best, and apparently most ‘disciplined’, course is to write narrow papers and gradually build up an extensive CV, at the same time contributing to the evolution of knowledge ‘in his own small way’ through relatively unambitious but respectable (closely reasoned) articles.


In light of all this, where is the academy headed? Well, clearly the process I've outlined will continue. No end is in sight. The global economy is growing every year; universities are still expanding, hiring more professors and more specialized professors all the time. And scholars, virtually of necessity, continue to do what the system asks of them, so that they can feed their families and hopefully get recognition. It seems like nothing can stop this evolution except some kind of economic or military catastrophe, such as a nuclear war. [...]


There may be other, happier possibilities, but I’m not competent to speculate. What is certain is that the university system as it exists now in America will straggle on for at least a few more decades, as our society straggles on. Costs will continue to rise, the quality of education will continue to decline, and the ‘research’ that is pumped out annually will become more and more repetitive, tedious, technical, and ‘careful’.


The professional demands of academic life, together with the complexity of the subject-matter, are not the only causes. The cynicism, the discontentedness, that wafts through the cultural milieu, or rather saturates it, such that even the inert matter of an office-building or a university’s student-center is soaked in it, is also a factor. I mentioned this discontentedness earlier, in a slightly different context. Professors are no more immune to it than the average blue-collar or white-collar worker. All the impersonal vastness of the contemporary social world, which, by means of newspapers, television, constant financial imperatives and so on, confronts the individual as an impenetrable objective force—as a kind of malicious Other that is unaware of his existence but nevertheless bombards him daily with its inhuman demands, its sensory overload, its frenetic movement, its pervasive cruelty and senselessness—all this external compulsion and insensitivity, which makes each of us acutely aware of the littleness of his life and turns him into an insecure yet self-defensively egomaniacal little narcissist, pushes us into a strangely dissatisfied self-complacency. We’re so unhappy and dehumanized that we finally settle into our imposed routines without much resistance—resignedly, as it were—content to do what we all but have to do, yet at the same time not content with that at all. And so we passively embrace the one-sidedness, the alienation, of modern life, since there seems to be no other option—and since, at any rate, the nature of modern life has sapped our vitality. The result, in much of academia, is disregard for the very goal of all analysis, namely understanding the world. And if people went into academia with the bright attitude of the disinterested truth-seeker, they emerge from it disabused of idealism—including self-idealism. For, far from attaining their intellectual ends, they have had to play the game of petty scholarship. And they have ended up not realizing all their talents, or their desires.


Maybe you’ll say that if most academics don’t make more of themselves, it is because they don’t have much talent to begin with. To some limited extent, you might be right. Hegel was on to something when he said “the inner is the outer”. José Ortega y Gasset, that old warrior on behalf of all that is good in man, would have subscribed to this interpretation, judging by the following passage:


For me, then, nobility is synonymous with a life of effort, ever set on excelling oneself, in passing beyond what one is to what one sets up as a duty and an obligation. In this way the noble life stands opposed to the common or inert life, which reclines statically upon itself, condemned to perpetual immobility, unless an external force compels it to come out of itself. Hence we apply the term mass to this kind of man—not so much because of his multitude as because of his inertia.
As one advances in life, one realizes more and more that the majority of men—and of women—are incapable of any other effort than that strictly imposed on them as a reaction to external compulsion. And for that reason, the few individuals we have come across who are capable of a spontaneous and joyous effort stand out isolated, monumentalised, so to speak, in our experience. These are the select men, the nobles, the only ones who are active and not merely reactive, for whom life is a perpetual striving, an incessant course of training....[12]

Ortega would say that most academicians live common, inert lives, condemned to immobility. So would Schopenhauer, Kierkegaard, Nietzsche, and many others. These thinkers were contemptuous of the “scholar”—and even more so of the “mass man”—for his apparent lack of talent and existential ambition. They thought it was in his nature to be self-satisfied (in the bad sense), lazy.


I cannot agree with such opinions, fun as they are. They’re unscientific. Too easy. And too self-congratulatory on the part of the person who opposes himself to the academy, and to the masses. [...]


Schopenhauer wasn’t completely justified in all the scorn he heaped on “philosophasters”. Many of them, of course, are extremely intelligent, very dedicated to their craft, to intellectual integrity, and to the search for knowledge. They simply go about it, I think, in the wrong way, in an overly specialized way. It may be true that the academic system as a whole demands specialization and conformism; still, individuals can break out of the mold. Individuals can fight against stuntedness, can work passionately to synthesize disparate spheres of analysis and at the same time facets of their mind. And the more people do it, the more people will follow them, perhaps eventually bringing about a far-reaching change in the decadent world that is academia. [Ha! A remarkably naïve opinion.]....


—In the end, though, not much can be done by individuals to remedy the ‘academic epidemic’. The system’s logic is inexorable, and people are currently too complacent anyway. It’s a pity, because these intelligent people have a lot to offer the world. Even apart from the pursuit of knowledge. They could, for instance, contribute also to the pursuit of justice. They could focus their energies on exposing the crimes of the government (à la Noam Chomsky), or they could work together for social change. Chomsky is right that intellectuals, with all their freedom, their financial security, their intelligence, their instinctual hatred of bigotry and, to some extent, conservatism, have a moral obligation to advance the banner of humanity. One needn’t argue for that claim; it’s pure common sense. Intellectuals can and should be at the vanguard of progress. They should band together and launch assaults on the kind of idiocy that, say, opposes stem-cell research and gay marriage; they should aggressively advocate for renewable energy, for political transparency, for human rights. Above all, they should use their resources to counteract the propaganda of the mainstream media...


Continental philosophers like Sartre and Herbert Marcuse used to deplore the analytic philosopher’s lack of social engagement. This Anglo-American philosopher himself tended to be less politically engaged than his continental counterpart, but even his theorizing was divorced from material conditions and concrete applications. It was, so to speak, a flight from reality, from commitment to any social agenda, from even the attempt to understand philosophy’s relation to social conditions. Nothing was more foreign to the spirit of this philosophy than the Hegelian and Marxian sentiment that “the world of immediate experience—the world in which we find ourselves living—must be comprehended, transformed, even subverted in order to become that which it really is”.[14] The order of the day was to separate philosophy’s domain from that of other disciplines (psychology, sociology, etc.), to reduce its sphere of influence—(Wittgenstein, for example, proclaimed loudly and repeatedly that philosophy “leaves everything as it is”, that its purpose is only to “show the fly the way out of the fly-bottle” so that it can go on living its life not vexed by philosophical questions)—to reduce it to a mere sustained analysis of language, in the hope that such an analysis would dissolve all philosophical questions (which were supposed by many to be unanswerable, to be pseudo-questions arising from confusions in language). These analytic thinkers (e.g., in the ’40s and ’50s) wanted to ‘purify’ philosophy, to remold it in the image of symbolic logic, or at least of close, careful reasoning about elementary linguistic utterances. The logical positivists of the ’30s even went so far as to deny the meaningfulness of statements in metaphysics, ethics and aesthetics, as being “unverifiable”. (A statement, they thought, is meaningful if and only if it is verifiable.) In a way, their project was similar to Edmund Husserl’s: both programs—the analytic and the phenomenological—were ‘foundational’, in the sense that they wanted to ‘get back to basics’ and build philosophy up from the ground (or knock it down, as the case might be).


But Husserl’s method was very different from theirs, and in the end, I think, more fruitful. It was (is) in fact similar to the method of all great philosophy as such, namely that of intuition. It is intuition taken to an extreme, raised to the status of a self-conscious method.[15] Of course, in his attempt to make phenomenology scientific, Husserl formulated many doctrines that were not explicitly held by earlier philosophers; nevertheless, virtually all “great” philosophers would have respected the intuitive core of phenomenology, since their own ideas have been arrived at primarily through intuition (though not necessarily phenomenological introspection) and their analyses have tended to proceed along intuitive lines (embellished discursively). Thinkers as diverse as Descartes, Kant, Hegel, Schopenhauer, Kierkegaard, Nietzsche, James, Bergson, Heidegger, Poincaré, Sartre, Merleau-Ponty, and Kripke—and many others—have explicitly relied on intuition in their investigations. Analytic philosophy, however, distrusts intuition, as being too ‘private’, ‘subjective’, and ‘obscure’.


It’s difficult to say what it is about intuition that makes it so fruitful and essential to philosophy—indeed, to all critical thought—but the point is that it somehow seems to permit an approximation of ‘direct insights’ into a phenomenon. It is sort of a bypassing of logical processes, or rather of such processes on the conscious level (for they must occur unconsciously[16])—a cutting-through them and getting to the ‘thing itself’. The insights that are thus made possible are not always wholly true or adequate—they may be one-sided—but the fact remains that great philosophy is merely a discursive elaboration on, and embellishment of, fruitful intuitions. And these pithy intuitions animate all the great philosophers’ writings. One has but to read them to recognize the truth of that claim. (Kant’s writings are a good example.) It is odd, then, that analytic philosophers are usually contemptuous of ‘intuition’. They treat it as some sort of mystical, unscientific notion that has no place in real philosophy, when in fact it is the very heart and soul of real philosophy. Ironically, I can use one of their own against them: Saul Kripke, a universally acknowledged genius. In both his Naming and Necessity and his Wittgenstein on Rules and Private Language, he states more than once that the whole force of his argument is intuitive, and that unless the intuition is grasped the argument will not be understood. Schopenhauer had said the same thing more than a century earlier. Without the preliminary, pithy intuitions, philosophy would be an empty husk, an endless flow of empty discursiveness.[17]


As I said, one of the goals of analytic philosophy used to be the somewhat masochistic one of making philosophy self-contemptuous—that is, of turning it into a mere instrument of “therapy”, which would allow us poor bedeviled intellectuals to calm our roving minds so we could get on with ordinary life. Philosophy was conceived as (arising out of) sheer confusion. Said Wittgenstein: “A philosophical problem has the form: ‘I don’t know my way about’”. Also: “Philosophy only states what everyone admits”.[18] It cannot contribute positively to life, it cannot help right wrongs, it has nothing to say about anything that really matters to people. Alternatively, some philosophers imagined that their task, in the modern age, was simply to adjudicate disputes between the disciplines, to clarify which explananda came within the purview of which discipline, and to clarify the epistemological relations between the sciences. Philosophy’s task, then, was primarily methodological. It could perhaps settle, for example, the sociological debate between methodological individualism and holism, and it could set up models of how the sciences work, but it couldn’t say anything about substantive questions. –Here again, philosophy was aloof not only from all social activism but even from any substantive question about society and humanity.


To read about the history of philosophy in the twentieth century is like reading about the identity-crises of a schizoid person. In this modern, technological age, philosophy has had a hard time finding its niche. In America it has often denigrated itself vis-à-vis the ‘hard’ sciences, while in Europe it used to ignore the sciences and concern itself with the human condition, the condition of the ‘spirit’. In recent times, its self-skepticism has mutated into a particularly virulent strain, in Rortyan relativism. Academic philosophy is no longer as tormented as it once was about its place relative to other disciplines, but this is largely because it no longer has any sort of a ‘united front’, no clear agenda or direction of progress. It is floundering, flailing its arms about everywhere, perhaps as a signal for help. Everyone is doing his own thing, in his own little area with his own little clique of interested colleagues. Far from philosophy’s having become more self-confident, more ambitious, since the days of logical positivism and Wittgensteinian ordinary-language-analysis, it has become so atomized, technical and isolated from the world (from life) that people outside the academic bubble are hardly even aware there was once something called “philosophy”.


Not only is it specialized, though; it has also become more superficial than it used to be, despite—or because of—its unprecedented logical rigor. This development is the logical conclusion of the discursive methods of analytic philosophy, which have come to dominate the scene. Excessive formalization was always a fetish of thinkers like Bertrand Russell; they wanted to bring formal logic, symbolic logic, or at least obsessive exactitude, into all spheres of philosophy (and science), in the hope that this would make them more objective and precise. It may possibly have made them more precise, but I doubt it made them more objective. In any case, the more one emphasizes form, the less one emphasizes content. The result has been that writers pay more attention to the discursive surface of an argument than to its intuitive content. Symbolism is used all the time—x, y, z and so on (“Necessarily, x is part of y at t iff x and y each exist at t, and x’s temporal part at t is part of y’s temporal part at t”[19])—because this makes arguments look more mathematically precise; arguments that have already been stated clearly are restated in the jargon of symbolic logic, for no apparent reason other than to exclude non-specialists; unnecessarily abstruse terminology is used, again to prove one’s credentials (to prove that one is ‘part of the gang’); the writer cites an enormous number of articles in his paper, to show he is aware of the literature; arguments are evaluated on the basis of their consistency rather than their plausibility. This last trend is the worst, for it leads to real shallowness. Often the reader loses a sense of what is being argued about, of what precisely is at stake, because the writer rarely keeps the discussion on the level of easily grasped, intuitive concreteness. I have actually listened to my professors and fellow students debate for hours about metaphysical issues, such as the presentism/eternalism debate or the perdurantism/endurantism debate (see below), even as they obviously had no concrete idea of what was being talked about—because the debates themselves are fairly empty, excessively abstract. Which is because the concepts are not backed up by intuitions. All that matters is consistency, not intuitive plausibility or real understanding. The exclusive concern with consistency leads philosophers to take seriously positions that are positively perverse, positions that anyone with a shred of common sense would laugh at.


For example, David Lewis’s famous theory of modal realism, proposed decades ago, is still debated, despite its absurdity. For the benefit of those readers who have not had to endure a course in contemporary metaphysics, I’ll briefly describe it. It was proposed as a solution to the question of how modal claims can be true or false. For instance, what is it that determines the truth of the statement “It’s possible there is extraterrestrial life”, or “A square circle necessarily doesn’t exist”, or “I will probably not go to class tomorrow”? Given that modal statements (i.e., statements involving the concepts of possibility and necessity) can be true or false, what is it that determines their truth or falsity? Intuitively we usually have no problem judging their truth-values, but intuition, in and of itself, is not a very rigorous method. Moreover, what does it mean to say that a certain thing is possible (or necessary, or impossible)? What exactly is being said? Various theories have been offered, but David Lewis’s is the most famous. Its basic idea is that for every true modal proposition, there is a world somewhere—a real world, a concrete world like our own, which is, however, spatiotemporally unconnected to our universe—in which the state of affairs that the proposition posits actually does exist. And that world is exactly like ours except for the one difference posited by the proposition. So if I say, “It was possible for me not to have been born”, that proposition is true because there is a world somewhere in which I do not exist. Similarly, the statement “I will possibly become a novelist” is true because in some world I will in fact become a novelist. –“But how,” you ask, “can you become a novelist in another world if you’re here in this world right now?” Good question. Evidently Lewis’s theory is also committed to the idea that we have ‘counterparts’ in other worlds—in fact, an infinite number of them (for there is an infinite number of possible states of affairs, and thus worlds). That is, the only reason why the concept of possibility can be used to describe my situation is that there literally are other Chris Wrights on other planets spatiotemporally unconnected to our universe. Billions of them, trillions of them. And the meaning of modal claims about me is related to my counterparts. “I may become a novelist” means (roughly) “I have a counterpart in a possible world who will become a novelist [or who is a novelist]”.


You’re right that this is all pure fantasy, concocted by nerds. Unfortunately it isn’t treated as fantasy by these nerds. They actually take it seriously. The arguments that David Lewis brought against alternative theories of modality were powerful, so his colleagues have felt compelled to come to terms with his own ideas. And they devote books to this Lewisian drivel! They cannot see that if they are apparently forced to take his theory seriously as a way to account for modal truths, there must be some gigantic, foundational flaw in their method of philosophizing.


And of course there is: they are too fond of formalization, and of mathematical preciseness. Lewis’s theory is useful in various technical ways relating to modal logic and possible-worlds semantics. Even better: it seems internally consistent. Most philosophers don’t agree with it, since it’s wildly implausible, but they take it seriously. What they should do, I think, is to ‘explain away’ all questions that permit of answers as silly as Lewis’s. I’ll clarify what I mean by ‘explain away’ in a minute.


Not only are philosophers preoccupied with formalization at the expense of content and intuitive understanding; they also have a love of classification, of labels of all kinds. Indeed, they have internalized their society’s and their profession’s atomized structure so completely that they delight in setting up conceptual divisions for their own sake. This has been a characteristic of analytic philosophy almost since its inception. (I say ‘almost’ because its founder, Gottlob Frege, was a pioneering genius with that rarest of traits among philosophers: intellectual common sense. His thinking was more intuitive than discursive, so it was sensible and fruitful.) One could perhaps argue that this delight in labels has been a characteristic of philosophy since its inception, and hence is not merely a modern trait. That may be true. Even so, contemporary philosophers have raised it to a new level. Their sophistries, their labyrinths of casuistry, benumb the mind. I said a moment ago that philosophers now are no longer as concerned as they once were to clarify, for example, the differences between philosophy and psychology, and between psychology and sociology, etc. This could be seen as a positive development. Unfortunately, there are many, many other ways in which one can manifest one’s love of labeling. In whatever subfield of philosophy you research, you’ll find that most writers care more about differentiating opposing positions than actually finding the truth.


For instance, there is currently a debate in metaphysics (or there was when I was writing this paragraph) over the question of how it’s possible for an object to persist through time—i.e., to undergo changes while remaining the same object. In what sense does it remain the same and in what sense does it change? A candle melts but it remains the same candle, even as it somehow changes as well. How can this phenomenon be explained? The dominant positions are “perdurantism” and “endurantism”. The perdurantist thinks the candle is temporally extended: it is actually ‘its’ entire lifespan; it has temporal parts just as it has spatial parts. So the answer to the question is that modifications can occur in its temporal parts, as they can in its spatial parts, while the candle itself persists (or “perdures”). The endurantist, on the other hand, thinks the candle is wholly present in each moment. It doesn’t have temporal parts. It is able to change in that its properties, like the candle itself, exist only at specific times. Rather than being, say, red simpliciter, the candle is red relative to a particular time. In other words, there is no such thing as being red; there is only being red at (a particular time).[20] –Now, in their preoccupation with drawing distinctions between these positions, writers have failed to see that both perdurantism and endurantism don’t really answer the main question. It is unenlightening to say (with the perdurantist) that, “strictly speaking, an object is its past, present and future; each temporal stage is just a part of it—and so, while there can be differences between its temporal stages, it itself remains the same object”. What verbal legerdemain! It says nothing. The real question is what it is that connects the object’s instantaneous temporal slivers with each other, such that in their aggregate they form one persisting object. Nobody, as far as I can see, has given an answer. Instead, everyone has focused on how to differentiate the two positions, thereby remaining on the level of sophistry.


This example illustrates something else too: the triviality of the questions that occupy many philosophers. A few of these questions are interesting enough to ponder for short periods of time, perhaps even to write a paper or two on, but to spend years of one’s life wondering how the candle can stay the same candle from t to t′ borders on insanity. It strikes me as a kind of masochism. In any case, this sort of philosophy is not an active, passionate search for truth; it is a quibbling, a kind of pettiness. Far from elevating one’s outlook on life, which philosophy should always do, it narrows, stultifies the mind, and stunts the personality. –With regard to problems like these, the philosopher should take Nietzsche’s advice: “treat them as a cold bath—quick in, quick out”.


My conception of philosophy is quite different from the popular one. I should note, though, that it represents only my personal opinion, and that I don’t think it is necessarily the ‘correct’ one. Philosophy is an unusually broad and indefinable discipline, which is why philosophers so often agonize over its proper place in relation to the other sciences and humanities. Of no other discipline could so many irreconcilable definitions be offered; only philosophy is amorphous enough to persist through millennia, changing with the times. Therefore, no definition is ‘absolute’. However, I prefer to look at philosophy in a certain way, because it strikes me as a ‘lofty’ conception that does justice to the richness of the philosophical tradition.


First of all—to repeat—I think the method of intuition is essential to real philosophy. I like Heraclitus’s advice: “It is wise to listen not to my words but to the Logos”. He understood that words, in themselves, are deceptive, shallow. On the level of words, his own philosophy seems utterly self-contradictory, as does that of Hegel and that of certain Eastern religions. What is important is not the words in which the thoughts happen to be expressed but the thoughts themselves. The intuitions. Words are useful in that they allow the thoughts to be communicated and thus argued for and against, but without the primary, non-discursive intuitions, philosophy would be only the series of intellectual exercises it has become. In truth, the process of doing philosophy should consist of a constant interplay between intuition and ratiocination, the intuitions guiding us and the reasoning honing our insights to relative exactitude. An overemphasis on either element vitiates philosophy. Intuition without reasoning gets you mysticism; reasoning without intuition gets you academic philosophy.


Bergson once said that “the essence of philosophy is the spirit of simplicity. [From whatever point of view we look at it,] we always find that its complication is superficial, its constructions mere accessories, its work of synthesis an illusion: to philosophize is a simple act.”[21] I think he may have overstated his case there, but his essential insight is sound: all the complexity of philosophy, of real philosophy, is supported by a foundation of utterly simple intuitions—as all intuitions are simple, being nothing but momentary intellectual perceptions. This doesn’t mean it’s easy to understand precisely what an intuition is, for, on the contrary, that is extraordinarily difficult. Nevertheless, any thinker who knows what it is to have an intuitive insight into a problem—an imaginative, ‘perceptive’ insight, as opposed to a purely discursive ‘insight’—knows that intuitions, whatever they are, are fairly simple and yet often profound (or fruitful). Much academic philosophy, on the other hand, is extremely complicated yet superficial.


More generally, by “philosophy” I understand something that the intelligent, informed layman can comprehend, something he can be moved by. Technicalities, logical symbolism, formalization and so on are at best supplementary tools; philosophy ought not to revolve around them. It ought to be nothing more nor less than an intellectually honest engagement with the perennial questions of life. It ought to inspire, as Nietzsche inspires, Marx inspires, Kant inspires; it ought to broaden your vision, impel you to think on your own. As Aristotle said, philosophy arises out of pure wonder. It is man’s original and instinctual attempt to intellectually assimilate the world—his primordial impulse to ask questions, to bring order out of chaos. All the sciences emerged from it. They are extensions of it.


Thus, as I see it, philosophy does not have only one special function, as the old analytic philosophers thought and some still think today. It does not exist only to provide an epistemological ‘grounding’ for the other sciences, or to show how certain questions are merely confusions arising from language, or to oversee methodological issues in other disciplines. It is neither subordinate to the other sciences nor truly ‘superior’ to them. Insofar as it can be distinguished from them at all, its distinction is its breadth. No program can be imposed on it, in the mode of Wittgenstein, in some a priori-ish way; nor is its place ever ‘fixed’ vis-à-vis other sciences. It is fluid; it evolves with the times, going wherever society’s drive to knowledge wills it. It adopts both meta-level and first-order perspectives, depending on what questions are asked.

Moreover, it bleeds into the other sciences. It borrows from psychology and physics while undertaking its own phenomenological and logical investigations. At times it must cede its former territory to newly developed disciplines with their specialized techniques, but it never relinquishes its right to an examination of their results for logical carelessness or for further insights into the human condition. It is the most self-conscious and self-critical of disciplines, for self-consciousness and self-criticism are among its methods. Poetry and fiction and nonfiction can each exemplify philosophy in its own unique way.


For example, Percy Bysshe Shelley was, in a way, a philosopher. Not only in his prose, but also in his poetry. So were Wordsworth, Milton, Shakespeare, Whitman, Dickinson, even Wilfred Owen. And many others. Analytically they weren’t rigorous thinkers, but the spirit animating their writings was philosophical. Wonder, awe, despair at universal absurdity, joyfulness in living, the drive to understand. The true philosopher has a mind so expansive he is often dissatisfied with himself; and his dissatisfaction drives him to push the boundaries of thought and life. He may be called a “genius”, but he is really just a thoughtful person who, because he can’t find contentment in ordinary life, spends his time contemplating himself and the world. Indulging his fascination is what makes him happy.

In this broad sense, philosophy is rebellion against the status quo. And philosophers are necessarily rebels. They have to be, because they are individuals—individuals who, moreover, have something important to say. Social conditions are never modeled on ‘truth’, so the thinker is bound to come into conflict with them, since he pursues knowledge and criticizes that which is false. Especially in a world where alienation is the norm, where power-structures suppress the individual’s development as well as his understanding of truth, philosophy must take on the character of rebellion. It must be socially critical and engaged; but it must also be lived—passionate, for it requires passion to subvert ossified ways of thinking and being.

In its confrontation with the apologetic character of ideologies and the oppressive character of social structures, philosophy is nothing but the “critical theory” associated with thinkers like Adorno, Horkheimer, Marcuse, Habermas and Fromm. [That’s sort of true, but it is much too narrow a conception. “Critical theory” per se was rather silly and masturbatory; what I really meant was all “theory” that is critical of present society.] Critical social theory is one manifestation of philosophy. Theorists used to debate the epistemological status of critical theory (and Marxism), some arguing it was merely another ideology as ‘uncritical’ as others, while its adherents argued it was different from, say, bourgeois ideologies, in that it was more scientific and signified a union of theory and praxis. I think its adherents were right. While critical theory, like all theory, is indeed guided by particular interests, and to that extent is as partisan as all ideologies, its interests are precisely societal understanding and emancipation. Bourgeois ideologies, by contrast, serve the narrower and less ‘philosophical’ or ‘universal’ interest of maintaining the social order, of defending it against those who have the temerity to point out its flaws. Critical theory—i.e., the enterprise of understanding society—can be more quasi-scientific in its methods than bourgeois ideologizing because its goal is to understand, not to defend the status quo; but given that understanding must criticize and dissect anything that militates against understanding, critical theory necessarily involves criticism of the oppressive and opaque elements of the social order. Which means that its interest coincides with the interest of mankind, its emancipatory interest.[22]

Be that as it may, my point is that philosophy, being the broadest, oldest, most ambitious, and arguably the most ‘human’ of disciplines, should reclaim its rightful place at the forefront of human knowledge, life, and progress. We should stop wondering what is and what is not philosophy—how to delimit its sphere of influence, as if it has a clearly defined subject-matter. It doesn’t. It isn’t even really on the same level as the other sciences. It exists in its own realm of thought, which extends from art to religion to science even as it transcends them all. In this age of money and specialization, we forget philosophy at our peril. We ‘fragment’ it at our peril. If philosophy is in danger of perishing, so is the individual.

I have often heard the term “cutting-edge philosophy” tossed around by my professors. They exhort their graduate students to read up on the most recent scholarship so that they can contribute to cutting-edge philosophy, cutting-edge metaphysics. And every time I hear that term, I feel myself become momentarily hollow, as if a draft of nihilism has passed through me. It’s an uncanny feeling, an emotion that makes me hate life, like extreme boredom. “Cutting-edge philosophy” is an oxymoron, a self-vitiation, like “contemporary classical music” or “that mediocre poem the Bhagavad-Gita”—a manifest contradiction-in-terms. The more common this attitude towards philosophy becomes—this ‘scientistic’, ‘technologistic’, ‘mathematical’, ‘research-paper-ish’ attitude, which uses words like ‘cutting-edge’—the more I know that philosophy is in danger of dying. And that idealism itself is undergoing something of a crisis. Philosophical scholars may be adept at reasoning, but their technical jargon as much as their actual positions proves that their approach, on the whole, is not quite correct. It’s fine to think that the activity of doing philosophy is merely ‘interesting’ and ‘entertaining’, as most professionals think; I don’t condemn that attitude. I say only that when it becomes as widespread as it has, it signifies a degradation of philosophy. The motto is no longer Husserl’s “Zu den Sachen selbst!” (“To the things themselves!”); it’s Zum Wörterbuch! (“To the dictionary!”) Discursion, analytics! Intuition—be gone with you!


...Without intuition, what you get is mostly the “philosophasters” of whom Schopenhauer spoke, people like Daniel Dennett, David Armstrong, Karl Popper, Hilary Putnam, Derek Parfit. (See the chapter on genius.)


And the specialists. Nietzsche had this to say about “specialists”:


Almost always the books of scholars are somehow oppressive, oppressed; the “specialist” emerges somewhere—his zeal, his seriousness, his overestimation of the nook in which he sits and spins, his hunched back; every specialist has his hunched back. Every scholarly book also mirrors a soul that has become crooked; every craft makes crooked.[23]

To be sure, a few paragraphs later he rightly qualified that negative assessment. There is much that is good and honorable in the “specialist”, for example his honesty, his intellectual integrity, his dedication to his craft. Nevertheless, at least in his current philosophical manifestation, he cares more about arguing for its own sake than about truth...

Philosophy as I interpret it is unique. It is totally affirmative of our humanity, not only in its depth and spiritual significance but also in its breadth. Its breadth mirrors, more or less, the breadth of the individual’s potential. While a person who occupies himself exclusively in philosophical pursuits is stunted, in the cognitive sphere at least he is not. Or he need not be, because in order to do good philosophy—original, interesting philosophy, which, as such, synthesizes—he has to be partly educated in the other sciences as well. And the humanities. Synthesis, breadth, is so definitive of philosophy that the latter must draw from all areas of knowledge in order to ‘get the job done’. The narrow thinker is necessarily not a good philosopher. Hence, true philosophy, far from being a victim of academic specialization, should serve as an antidote to it.

In other words, in the academician’s effective disregard for (broad) philosophical truth, he disregards also his human potential—what can be called his existential truth, his psychological truth. Conversely, I might go so far as to say that the ideal philosopher is a well-rounded person, simply because philosophical insights can be gained from living life to the fullest. Nietzsche would have been a better thinker had he experienced a woman’s love; he might then have softened the exaggerated elements of his philosophy, and in any case he would have had a more comprehensive outlook on life. Hegel would have been a better philosopher had he been more sympathetic to Newtonian physics, that is, to science (as opposed to the pseudo-science that mars the Phenomenology and his later works). –In general, the breadth and depth of philosophy, and its universality (across cultures) as well as its life-affirmative character, suggest that philosophers should aim to be the vanguard of humanity, of its potential. They should ‘lead the way’; they are most qualified to, and their vocation demands that they do.

But if, in pursuing this ideal of philosophy, one finds oneself becoming stunted from too much thought, my advice is to temporarily, or even permanently, resist the lure of ‘philosophical truth’ and reform one’s ways. If immersion in analytic philosophy becomes dehumanizing, one should take a break from it. Take up poetry or music or sports, or something else. Self-realization, self-affirmation, is, as I see it, perhaps the foremost imperative in life; not even the search for knowledge trumps it. Should the two ever conflict, the first has priority over the second. For knowledge is not an end in itself. It has value only insofar as it affirms one’s humanity, either directly or indirectly (as in the beneficial practical effects of science). Nietzsche was right that truth is not intrinsically valuable, and that treating it so can be dangerous. What beauty would there be in life if everyone devoted himself at all times to dry understanding? How could we affirm life in such circumstances? What a desolate thing it would be! To have an appetite for knowledge you have to be willing once in a while to live in ignorance.


But insofar as you remain a philosopher, you should follow the twin imperatives to subvert and to integrate. Or, in less nocuous language: analyze and synthesize. (I prefer ‘subvert’ and ‘integrate’, though.) Philosophy ought to engage with society rather than imprison itself in an ivory tower. It ought to be readable and compelling, not boring and unambitious—and Lewisianly perverse. If you find yourself getting lost in endless technicalities—or getting bored to tears—then you’re probably approaching the question wrong.

Indeed, that’s how I monitor myself and my arguments when doing philosophy. If I’m getting overwhelmed by the complexities of a thought I’m pursuing (or if I find an idea totally uninteresting), then I decide the thought is probably wrong. Ockham’s razor, after all. The simpler, the better. “The essence of philosophy is the spirit of simplicity.” At such times I take a break from thinking about the problem and then return to it later with a fresh mind. But instead of plunging right back into the argument, I revert to intuition. I sit back, spend hours imagining the problem—trying to ‘picture’ it—non-discursively approaching it from various angles, until suddenly a ‘picture’ of the solution hits me. I can’t really describe the process, but the intuition itself is always very simple. Then I embellish it in arguments. When, again, I lose my way in the arguments, I sit back for a while or pace the room while waiting for a new intuition that will show me the way out of the muddle. At such moments I know I’m doing philosophy—I can feel the philosophy. It’s when I get lost in tedious ratiocination that I realize I’ve traded philosophizing for ‘sophisticizing’ (for lack of a better word).


For example, one of the ways I know that traditional and contemporary approaches to ontology are wrong is that they all lead to casuistry—over-subtleties, debates about such abstruse concepts as “uninstantiated universals” and “Lewisian possible worlds”. The correct approach will cut through all the shit, the centuries of verbiage; it won’t argue with it on its own level. (The later Wittgenstein understood this, but the sloppiness of his thinking handicapped him.) It will explain the deeper sources of the confusion—therapeutically, as it were. The linguistic or the phenomenological sources. Ontological questions like “Do properties exist, and if so, in what sense?”, or “Do numbers exist? What are they?”, are fundamentally misguided. They are hardly even real questions, as I hope to explain in volume two of this book.


This is the sort of circumstance in which Wittgenstein-esque therapy is valuable. Insofar as one has to separate the substantive questions from the word games, Wittgenstein’s project of “showing the fly the way out of the fly-bottle” is a genuinely philosophical project. His “ordinary language” approach has value, not only as a corrective to academic technicalities but especially because the emphasis on our ordinary use of language is a promising way to illuminate the basic intuitive conflicts and cognitive dissonance that give rise to many philosophical (particularly metaphysical) questions. Especially in this age, this crisis of philosophy, brutally honest therapy is necessary. We have to reinvigorate philosophy, make it an existential and vital activity again. But this involves not only cutting the dead wood from the live; it also entails making philosophy relevant to society. We have to show how ethics can settle concrete doubts about life, and how the death of religion is not the death of values. We have to engage with politics through Marxism and other theories of history. We have to make action philosophical. Philosophy, after all, grows out of social conditions; it’s only natural that it should also contribute to them.

Social activism, if justified philosophically/‘scientifically’, is itself a form of philosophy, for it is emancipation. It is the real-ization of understanding. Or, at least, we ought to think of it that way. Plato did. He wanted a Philosopher-King! Like Marx, he wanted to unite theory with praxis. Why do we ignore his vision? If we’re going to call ourselves philosophers, we ought to do philosophy. Actualize it. We ought to take up arms against ignorance, like Noam Chomsky and Richard Dawkins; we ought to fight all bigotry and conservatism. If we have the talent, we ought to go into politics—subvert the system by means of the system. “To ruthlessly criticize everything existing!”[24] is the philosopher’s creed. Apathy, complacency, is philosophical treason.

In a civilization that has become fragmented into a thousand ideologies and is in danger of perishing someday in a nuclear holocaust, perceptive critical analysis—translated into activism—is imperative. Not only is it not ‘obsolete’, as the postmodernist would have it; it is essential. [...] We need people who can unite the erstwhile ambitiousness of European thought with the analytical carefulness of Anglo-American thought. Otherwise philosophy—and probably the social sciences in general—will become obsolete, and the grand tradition will disintegrate in the nihilism of a Kenneth Gergen.


*


I’m just sketching a vision here. I can’t go into details, since this book is aimed at the public rather than merely the academy. [?!] My purpose in this volume is not to ‘explain the world’ but only to remind people (including myself), in my inadequate way, of old ideals, ideals that are on the defensive now that capitalism is at its zenith. Truth, creativity, love, poetry, music, community, etc., all of which have been corrupted through commercialism and the culture it has spawned. The only way to bring them back—to overcome the atomization and bureaucratization of life—is to understand the dynamics of society and to manipulate social progress along an enlightened path. Not to obstruct it, as conservatives would do, but to push it even harder to realize its potential. Especially its economic potential. Marx was the first to see the possibilities unleashed by capitalism, the productive possibilities, which are (or will be) the basis for amazing cultural opportunities...

[An epigram I was going to put somewhere at the end of all this: To alienate an alienated world is just to be human.]


[1] By that term I’m not referring, e.g., to the undignified cutesiness of a Daniel Dennett.
[2] Kenneth Gergen, The Saturated Self (New York: Basic Books, 1991), p. 82.
[3] Jean-François Lyotard, The Postmodern Condition: A Report on Knowledge (Minneapolis: University of Minnesota Press, 1979), p. 29. The pernicious influence of Wittgenstein is evident in that quotation.
[4] For example, the main argument running through Gergen’s book can be stated as follows: people are confused nowadays; therefore, there is no such thing as truth.
[5] Herbert Marcuse already remarked on that over-preciseness in One-Dimensional Man (Boston: Beacon Press, 1964). Criticizing the Austinian style of linguistic analysis, he asks (rhetorically), “Are exactness and clarity ends in themselves, or are they committed to other ends?” (p. 176). But compared to contemporary philosophers of language, who can spend dozens of pages writing about practically nothing, Austin was virtually an avatar of ambitiousness. Incidentally, later I’ll try to explain the irony that despite the contemporary philosopher’s obsessive concern with logical analysis, his positions, such as they are, are frequently muddled and superficial.
[6] I mean people in the ‘soft sciences’, not the natural sciences.
[7] I’m talking about truths beyond facts like ‘George Bush is the president of the United States (unless I’ve simply been having a nightmare for the last seven years)’. Presumably not even postmodernists would deny that fact. (But this already shows that there are ‘objective truths’ of some elementary kind.)
[8] For example, experimentation and the formulation of general laws. The basic reason for this deficiency in social science is that its subject-matter—namely, society and its interactions with the human psyche—is not purely deterministic, as is the subject-matter of the natural sciences (quantum mechanics notwithstanding). The psyche somehow ‘softens’ causality, such that laws, testable through experiments, cannot accurately describe it.
[9] From Adam Smith’s day (and even before then) it has been known that “the division of labor, so far as it can be introduced, occasions, in every art, a proportionable increase of the productive powers of labor [i.e., productivity]”. See The Wealth of Nations (New York: Oxford University Press, 1998), p. 13.
[10] Quoted in Gergen, op. cit., p. 83.
[11] Until either the earth is consumed by the sun or it collides with a massive asteroid.
[12] Ortega, The Revolt of the Masses (New York: W. W. Norton & Company, 1957), p. 65.
[13] Actually, on one reading, that saying isn’t contrary to Marxism. On the way I intend it, though, it is.
[14] Marcuse, op. cit., p. 123.
[15] “[Phenomenology’s] essential way of proceeding is by intuition.” Sartre, The Transcendence of the Ego (New York: The Noonday Press, 1957), p. 35. The precise definition of intuition has always been controversial, but these remarks will suffice for now: an intuition “is an act of consciousness by which the object of investigation is confronted, rather than merely indicated in absentia. Thus, it is one thing merely to indicate the Eiffel Tower (merely ‘to have it in mind’, we say), and another thing to confront the indicated object by an act of imagination or perception. The indicative act is ‘empty’; the intuitive act of imagination or perception is ‘filled out’.” Ibid., p. 110.
[16] See chapter three.
[17] Cf. Kant’s epigram: “Concepts without intuitions are empty, intuitions without concepts are blind.” His understanding of ‘intuition’ was different from mine, but the quotation aptly expresses my own views.
[18] Wittgenstein, Philosophical Investigations, §123 and §599.
[19] That’s a simple example, from Ted Sider’s paper “Four Dimensionalism”.
[20] Cf., e.g., Mark Hinchliff, “The Puzzle of Change”, Noûs Vol. 30, Supplement: Philosophical Perspectives, 10, Metaphysics (1996): 119–136.
[21] Quoted in H. Stuart Hughes, Consciousness and Society (New York: Vintage Books, 1958), p. 200.
[22] See Julius Sensat, Jr., Habermas and Marxism (Beverly Hills: Sage Publications, 1979), chapter two.
[23] The Gay Science, translated by Walter Kaufmann, §366.
[24] That was the motto of the early Marx.
