Tuesday, September 30, 2008

The Truth of an Illusion

Yesterday, in my introduction to philosophy classes, I taught my students about the three big theories of truth: the correspondence theory, the pragmatic theory, and the coherence theory. These three theories represent the basic positions on what truth is, although hermeneutic theorists like Heidegger and Gadamer certainly offer their own visions, albeit in a less formal manner. In any case, I was listening to Terry Gross this afternoon, and she had on Bill Maher and Larry Charles, the star and director of "Religulous." One of the things Terry Gross asked was whether religion, even if one admits that it is a bunch of stories, can still be useful. Maher thought not, for ethical reasons (because of all the bad things that have been justified using religion), while Larry Charles thought not, for theoretical reasons: the stories are not true, so one should not believe them. 

Of course this brings up a question for me: what is Larry Charles' view of truth? I would hazard a guess that it's the correspondence theory. Like most of the so-called "new atheists" (Hitchens, Dawkins, Harris), he probably thinks of truth as the correspondence of our ideas to reality. If they do correspond, then everything is good (enter modern science); if they don't, everything is bad (enter religion). This basic view entails a whole bunch of other theories, like the theory of representation: to "know" something is to represent it faithfully, to re-describe "reality" in language. So when a biologist analyzes something into the language of evolutionary biology, they are representing reality faithfully in a different language - one that hopefully helps us to understand the world a bit better. 

There is only one small problem with the correspondence theory of truth, a problem that, when it is pointed out, makes correspondence theorists merely shout louder: they beg the question of what "reality" is. In other words, if you define "true" by reality (in this theory, it is reality that makes a proposition or view true), you have not answered the question of what reality is like. If you try to answer that question, you immediately have to come back and say that this definition of reality is the true one, and not that one. But if you use the conclusion as a premise, and then the premise as the conclusion, you're merely begging the question. 

In the case of Larry Charles' views, this would mean that however he defines "reality" (which probably excludes certain types of "supernatural" phenomena - an "immanent frame," as Charles Taylor puts it), he is merely assuming it is true, and then defining what things count as true based on this assumption. Why should truth be based on that particular assumption rather than another? If it's not argued for and defended, we'll never know. And to me, that is one of the major drawbacks of the correspondence theory of truth. Most of these theorists do not argue for their picture of reality, because they want "reality" to be some inert thing that is absolutely untouched by anything human. It is just "there" and there's nothing you can do about it, is the attitude. 

One of my basic problems with this attitude is that if you do not argue for your view of reality (and you can't with the correspondence theory - it would be arguing in a vicious, not a virtuous, circle), then you're apt not only to be a totalizer, but to lump together all things that merely "seem" similar. Why? Because of a habit of thought. If you don't argue for reality, if you don't think about your own assumptions toward it, how well would you be able to reach behind yourself and see how your own history, social location, economic standing, etc., help to color how you think of other things? If you are not used to doing this, why would you be able to be hermeneutically sensitive in your definitions of anything? 

Take religion. The very idea that there is one thing called "religion" is ludicrous. First, the history of the term is interesting. It developed in the 17th century precisely to describe a certain kind of war, and so trying to decide what is religious or not is from the outset framed in terms of Protestant and Catholic disputes (also, since this is the case, for most of history no one thought of themselves as "religious"). But how can you lump everything together? Religions are so different as to be unrecognizable, and even within religions the diversity is so great that I probably have more in common with atheists than I do with a lot of Christians. The standard of Protestant Christianity becomes the standard for all "religion" (a standard which is itself a misrepresentation of Protestant Christianity). Yet someone like Larry Charles thinks that because his assumption of reality does not include this so-called "religion," this "religion" has no place in life. 

To me, this is just a case of "the superstition of science scoff[ing] at the superstition of faith," to quote James Anthony Froude (himself a famous apostate and Carlyle biographer). To hold a view of reality without argument is just as egregious as believing in so-called "myth." In either case, one is standing on a foundation that ultimately does not have recourse to reasons and argumentation. This very well may be the human condition, but in my view, we ought just to be up front about it, and change a vicious circle into a virtuous one, going beyond any such "correspondence" theory of truth. 

And if Maher and Charles want to know what a Christian like me thinks about "religion," then I offer this quote from Kierkegaard: "to stand on one leg and prove God's existence is a very different thing from going down on one's knees and thanking him."

Monday, September 29, 2008

Absolute and Relative - Truth, Morality, Anything...

While questions of "absolute right and wrong" are not as pressing these days as they were, say, in the 90's (the heyday of Focus on the Family and other "moral majority" groups), thinking about "absolutes" remains an interesting and fruitful exercise. I recently came across one of Richard Rorty's arguments on this point. And like most of Rorty's work, it is grand, dismissive, and extraordinarily interesting. 

Rorty begins by mentioning that many people attack him on this particular point: they say he denies there is any concept we call "truth," and is thereby a relativist. What he denies, instead, is the reality-appearance dichotomy, and the attendant correspondence theory of truth that this dichotomy implies. His detractors think that any theory of truth besides the correspondence theory leads us down the path toward relativism (especially a pragmatic theory like Rorty's). 

But that doesn't mean he doesn't believe in truth, or so he maintains. Truth is surely an absolute notion. He gives two examples: we don't say "true for me but not for you," or "true then, but not now." Clearly, the geocentric view of the solar system is untrue, and never was true, absolutely and with no preconditions. But, he then says, "justified for me but not for you" is a common locution, and one most of us are quite happy to go along with (although not for everything). And the thing is, "justification" is the application of truth - or at least it goes along with it quite strongly, as William James points out. In fact, Rorty argues, justification does indeed seem to be something that always accompanies any claim to truth. From this Rorty draws a conclusion I've often thought about myself. 

His conclusion is this: granted that truth is an absolute notion, the application of this concept is always relative to the situation we are in. The criterion for applying the concept "true" is relative to where we are, our limitations, and our expectations. At the same time, the nature of truth is certainly absolute. But if this is the case, what is the point of a theory of the nature of truth? Rorty sees none. If we only encounter truth in application to relative situations, what is the point of specifying "absolute" truth? We never encounter it, so even if we saw it we wouldn't know that we were looking at it. 

I have to be honest here. I should have written this like eight years ago, when I first encountered James Dobson's "Right versus Wrong" campaign. At the time I thought, "sure, in general we have a good idea of right and wrong. But absolute right and wrong? How could we ever know that? We are never in situations that are clear enough, never in situations that present themselves to us so cleanly. Who has ever had to make a decision a.) with full information, and b.) with full moral certitude? How would we even get this certitude, since by definition 'absolute' means something unconditioned by how we think about it? And what could that be?" 

I believe this line of thinking comes from a specific source for me, and on this Rorty agrees. Orthodox monotheists (Jews, Christians, and Muslims) basically view God in this way. They say, "God has indeed been revealed to us - but we can never fully capture God conceptually, given our limitations." Even the most fundamentalist Christians recognize this point, at least in principle. And indeed, this is a point that has been driven home to me throughout my life. We use the language of Scripture, but we also recognize that our intellectual capacity falls short of understanding it. Calvin makes this point repeatedly when he says that God "condescends to us," speaking to us like a wet-nurse, with concepts like the Trinity, and salvation, etc. 

What does this mean philosophically? I'm not entirely sure, except that perhaps Christianity ought not to be so hostile to pragmatists, or at least to their notion of truth. Perhaps there is something we can learn from Rorty and other pragmatists, who insist on the "useful" as the basic category of thinking. 

Thursday, September 25, 2008

Ad venalicium: on the "free" in free trade

Arthur Hugh Clough once said that "thou shalt not covet - but tradition approves all forms of competition." An apt saying in a strange time, no doubt. And in this strange time I have been trying to reformulate my own view of things with big capital letters, like the "Economy," "Free Trade," and all such other concepts. I do believe my views have shifted a lot since my youth, and I figured I might set them out clearly here - at the very least as an exercise for myself. 

Growing up I listened to Rush Limbaugh with my parents. I remember driving in the summer, from noon to three, hearing him denounce Bill Clinton and trumpet his view of economics. His view is rather simple and elegant: enable "free enterprise" and the rising water will float everyone's boat. This is "supply-side" economics (the "trickle-down" stuff is not really an economic position), i.e. if you bolster the producers and employers, you will bolster everyone. This does have prima facie plausibility, if you think about the reasons a business might expand. There are two basic reasons a business expands: the cost of production goes down, or there is an increase in demand. The main routes to a decrease in the cost of production would be paying employees less, figuring out better techniques of production, or the lessening of other "exogenous" factors (exogenous just means those factors that do not have to do with the market per se). The worst exogenous factor, according to Limbaugh and most supply-siders, is government intervention - in the form of taxes and regulation. And so the argument goes: instead of decreasing production costs by paying employees less (although this too is argued - against the minimum wage, for example), one should get the government off the backs of the employers. 

At the heart of this view is the basic neoclassical position that the market is the most efficient instrument for the allocation of scarce resources. The reason is the principle of marginal utility, or marginality. This principle states that, all things being equal, as an individual consumes more and more of something, it becomes less desirable (or, as economists say, its "utility" is lessened). Thus, after the first can of caviar that I have, there will be decreasing returns on my enjoyment of additional cans. This principle makes sense: the more you have of something, the more "normal" it becomes, the less of a treat it is, the less desirable for itself it becomes. 
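If it helps to see the caviar point numerically, here is a toy sketch in Python - entirely my own illustration, not anything from the economics literature, and the 1/n falloff is an arbitrary assumption chosen only to show the shape of diminishing returns:

```python
# Toy model of diminishing marginal utility. The 1/n falloff is an
# arbitrary assumption for illustration; nothing hangs on its exact form.

def marginal_utility(n, scale=10.0):
    """Utility gained from the n-th unit consumed (n >= 1)."""
    return scale / n

# Each additional can of caviar is worth less than the one before.
for can in range(1, 6):
    print(f"can {can}: marginal utility = {marginal_utility(can):.2f}")
```

Whatever falloff one assumes, the point is only the shape: the first can delivers the most enjoyment, and each subsequent can delivers strictly less.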

From the supply side, the principle of marginality is similar. The idea is that as a producer expands production to meet increasing demand, each additional unit costs more to produce than the ones before. Consider the caviar. As I eat more and more caviar, and the caviar fishermen expand their operation, it will take them more and more resources and energy to get enough caviar to fill my demand. At the same time, the price will go down, since the supply goes up and my desire goes down. 

The outcome of these two movements is what economists call a "competitive equilibrium." It means that demand has equaled supply. Alfred Marshall was perhaps the most famous and one of the earliest economists to talk about this. Now, if one thinks about this, it's great. It means that, left on its own, the market does everything we need. It allocates things based on what individuals want, and all the productivity in an economy is put to use. It also means that there is no way to increase one player's welfare without decreasing another's (what economists call the "Pareto optimum"). It is a world where the market - not the government or some other thing - allocates all the resources out there in the most efficient manner possible. 
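The two movements above can be sketched numerically. In this toy Python illustration (again my own; the linear curves and their parameters are made up for the example), demand falls with price, supply rises with it, and we bisect on price until the two quantities meet:

```python
# Toy competitive-equilibrium sketch. The linear demand and supply
# curves and their coefficients are arbitrary assumptions.

def demand(price):
    # Quantity demanded falls as price rises (diminishing utility).
    return max(0.0, 100.0 - 2.0 * price)

def supply(price):
    # Quantity supplied rises with price (rising marginal cost).
    return max(0.0, 3.0 * price - 25.0)

def equilibrium(lo=0.0, hi=100.0, tol=1e-6):
    """Bisect on price until excess demand is (near) zero."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if demand(mid) > supply(mid):
            lo = mid   # excess demand: price must rise
        else:
            hi = mid   # excess supply: price must fall
    return (lo + hi) / 2.0

p = equilibrium()
# With these made-up curves, 100 - 2p = 3p - 25 gives p = 25, q = 50.
print(f"equilibrium price ~ {p:.2f}, quantity ~ {demand(p):.2f}")
```

The bisection stands in for Marshall's price mechanism: whenever buyers want more than sellers offer, the price is pushed up, and vice versa, until the market "clears."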

There is only one slight problem, a small caveat that makes a big difference. This is the caveat "all things being equal" (ceteris paribus). What does this mean? It means that the very concept of a market is an abstraction, a utopia, a pie-in-the-sky world where no one really lives. The "market" correcting itself is a fiction - a pious fiction, perhaps, but a fiction nonetheless. Let me explain why. 

The major reason is that this world is filled, not with pure competition, but with "oligopolies." This is a nice economic term for an asymmetrical allocation of productivity between producers. Monopolies and "duopolies" are examples of this, and of course most free-marketers would say that you need really strong anti-trust laws to make sure these don't occur (almost the only place they say the government should intervene). However, in our economy, and the international economy, it is clear that in fact oligopolies are the norm, not the exception. Even though we heap praise on small business as the backbone of the economy, there could never really be market equilibrium, for the very real reason that there is no perfect competition, and there never will be. The reasons are simple: history, geography, and politics. 

Let's take an example: Microsoft. There is a concept economists now use called "path dependency." This is a fancy way of saying that you only have so many options. Microsoft is not considered a monopoly because it's not as if they make all the software or hardware. They just make the main operating system people have to use. Imagine you're a small business owner who is dependent on computers. You can either go with Mac - which has much less software and compatibility - or with Microsoft, which has the basic system most people use. In perfect competition this would not be the case. You would have lots of choices, and you would pick the one that is the most efficient. Instead, because of a contingent thing like the poor marketing of Apple in the 80's, you only have one choice, and not even the most efficient one. This is a path dependency. 

The second major thing about oligopolies is that they involve asymmetries not merely in production and allocation, but in real political power. Political power is one of those things neoclassical economists don't like to talk about (it cannot be quantified), but in the real world (not the world of ceteris paribus) political power, and power asymmetries in general, really affect the outcome of the market. This of course is seen with the recent Wall Street debacle, and with organizations such as the IMF and WTO, which continually reinforce the trade advantages of developed countries (by, for example, not demanding the dismantling of agricultural subsidies). In fact, even with regional trade agreements that are supposed to be "free," like NAFTA, there is no free trade. Barriers are lessened to a degree (and disproportionately for less-developed countries), but the fact remains that for all the free-trade talk, there is very little free trade at all. 

Which brings me to the title of this blog. I'm not a socialist. In theory, free markets are great. What I take issue with is the notion that any market really is free, or really can be. Supply-siders might retort that "it's because of government!" But it is a fundamental misrecognition of power to think that somehow those in power (such as the powerful in oligopolistic markets) will just give it up for an ideal that has never been seen in the history of the world (the ideal of market equilibrium). 

Then again, maybe I'm wrong, and we should say with Nietzsche that "the lie is a condition of life." Or maybe we should buck up, admit that human decision-making has more to do with the allocation of resources than we acknowledge, and then figure out what good judgment entails. Just an idea.