'Rationality' Without God (or Gods)

Some people assert that, without a God to guide things, we cannot trust our own thoughts and reasoning, and have no way (or at least no reason) to believe any of our conclusions. There are a few variations on this, of course; I'll tackle the most formal.

The "Evolutionary Argument Against Naturalism"

I'll liberally cut from the Wikipedia article here to summarize this:

Alvin Plantinga's argument attempts to show that the concepts of naturalism and evolution are logically incompatible because the probability of non-theistic evolution producing reliable cognitive faculties is low... Plantinga's argument begins with the observation that our beliefs can only have evolutionary consequences if they affect behaviour. To put this another way, natural selection does not directly select for true beliefs, but rather for advantageous behaviours... Plantinga points out that innumerable belief-desire pairs could account for a given behaviour; for example, that of a prehistoric man fleeing a tiger:

"Perhaps Paul very much likes the idea of being eaten, but when he sees a tiger, always runs off looking for a better prospect, because he thinks it unlikely the tiger he sees will eat him. This will get his body parts in the right place so far as survival is concerned, without involving much by way of true belief... Or perhaps he thinks the tiger is a large, friendly, cuddly pussycat and wants to pet it; but he also believes that the best way to pet it is to run away from it...

The basic idea here can be paraphrased something like: "Mother nature doesn't care what we believe as long as it keeps us alive long enough to reproduce - she'll favor a useful false belief over a less-useful true one. So evolution would be expected to produce a creature that believes a huge pack of lies."
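For reference, Plantinga states this core claim probabilistically. Writing R for "our cognitive faculties are reliable", N for naturalism, and E for "our faculties arose through evolutionary processes", his thesis is that

\[
P(R \mid N \wedge E)\ \text{is low (or inscrutable).}
\]

From there he argues that anyone who accepts both N and E acquires a defeater for R - and hence for every belief those faculties produce, including belief in N and E themselves.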

The first problem with the EAAN

Of course, the second statement doesn't really follow from the first - at least, not in a strong and compelling way. The argument depends on a false belief being at least as likely to be useful as a true one. But is that really the case?

David Gerrold, the science fiction author, addressed a related issue in one of his novels. In it, a character discusses how maps, even our mental ones, are not the territory they represent, and can never be. (Philosophers call this "the map-territory relation".) As he points out, "We don't necessarily want accurate maps, we want useful ones. But accuracy is extraordinarily useful."

That's a critical point. The more accurate a map is, the more likely it is to be useful. A map of New York's subway system will usually be quite abstract and hide a lot of detail - but the details it does show had best be correct. For example, it might not be accurate as to scale, and not show the true distances between the stops, but for many purposes this isn't important. Now, if it shows a nonexistent stop - or hides an existing one - by that very fact it becomes dramatically less useful than an accurate map for the purpose of navigating around New York by subway.

By the same token, the more accurate our mental "maps" of the universe around us are, the more likely they are to be useful. And that does have evolutionary consequences. We would expect that, at least in the environments we evolved in, our mental maps would tend to be pretty useful... and accuracy and usefulness are strongly correlated.
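To make that correlation concrete, here's a toy Monte Carlo sketch of my own - nothing here is from Plantinga or Gerrold, and the one-dimensional world, the Gaussian "map noise", and the "close enough to eat" threshold are all assumptions chosen purely for illustration. An agent walks to wherever its internal map says the food is; the noisier the map, the less often it ends up close enough to eat:

import random

def survival_rate(map_error, trials=10_000, reach=1.0):
    """Fraction of trials in which a belief off by ~map_error still
    lands the agent within `reach` of the food."""
    survived = 0
    for _ in range(trials):
        food = random.uniform(-10, 10)              # true location of the food
        belief = food + random.gauss(0, map_error)  # the agent's "map" of it
        if abs(belief - food) <= reach:             # walked close enough to eat
            survived += 1
    return survived / trials

for err in (0.1, 0.5, 1.0, 2.0, 5.0):
    print(f"map error {err:4.1f} -> survival rate {survival_rate(err):.2f}")

Running it, survival falls off steadily as the map gets noisier. An agent whose map was systematically wrong in just the right compensating way could still survive - that's Plantinga's point - but random inaccuracy is punished, and those compensating errors have to be paid for in complexity, which brings us to the next point.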

Moreover, you can easily imagine arbitrarily complex chains of 'false beliefs that have useful consequences', such as the fancies Plantinga describes above, but evolution must economize. Resources are finite and the more complex the system the harder and more expensive it is to develop (and the more ways development can go wrong). Poison, for example, is a very useful weapon and defense, but it's biologically quite expensive to produce and rarely arises unless an organism really, really needs it. (Note: I'm not saying that evolution directs toward a goal, but if an ecological niche can only be filled by something poisonous, that's what you'll find there, if anything.)

It's worth noting that the examples Plantinga comes up with are all overly complex and Rube-Goldberg-esque; evolution certainly does produce such structures, but it also boasts examples of efficiency that far outstrip most of the machines we can produce.

The second problem with the EAAN

The second problem (at least, for a theist) with the EAAN is that it's actually true - so far as it really goes.

The first part of (my biased paraphrase of) the EAAN is correct - if a belief is useful (at least in the typical environment an organism encounters), it will be selected for regardless of its accuracy. As we've established, accurate ideas are more likely to be useful than false ones, but by no means is that a guarantee. Another science fiction author, Robert Heinlein, put this well: "Delusions are often functional. A mother's opinions about her children's beauty, intelligence, goodness, et cetera ad nauseam, keep her from drowning them at birth."

We have plenty of examples of 'naturally warped' thinking in our own experience. For example, this strange, incomprehensible delusion - common among the majority of women - that men are somehow sexually attractive and women are not. ( :-> ) Of course, the utility of such attitudes is fairly obvious. (Though there are other potential evolutionary strategies.) So we would expect a certain amount of irrational (or, at least, non-rational) thinking just because it's so useful in propagating the species. But that's only part of the story. The human body is a remarkable machine, but it's far from perfect and has structural flaws and weaknesses. Should we expect the human mind to be dramatically different?

The thing is, people really are generally bad at thinking logically [1]. People routinely, in large numbers, completely fail to correctly assess the risks associated with various dangers. They buy into superstition and magical thinking; they follow fads, join witch hunts, embrace claptrap and other horrific practices; and so on.

It takes effort for people to think logically about the world and to examine it carefully while putting aside preconceptions. And when we do, we find things we didn't expect - things that strike us as really weird. (I can recommend Carl Sagan's book The Demon-Haunted World for a discussion of both the human tendency toward fallacy and the ways to counteract it.) This is kind of a problem for the people who put forth this argument, if you ponder it for a moment. They are the ones claiming that our reasoning processes were deliberately designed and fashioned by (at least) one master craftsthing, so the fact that our thinking is so easily derailed is kind of a sticking point. By their own lights, our thinking ought to be substantially clearer and better.

Of course, the actual case is more-or-less what we'd expect from an evolutionary origin of humanity: A mind that's flexible enough to learn and correct errors - good enough for day-to-day operations in the kind of environments we've lived in - but nevertheless prone to fallacies and illogic when tired, excited, or careless.

But that is not a concession that rational thought is impossible. Indeed, as I noted above, we can, with effort, think logically and come up with sound, reliable, testable, and tested answers about the world.

There's also another consideration. Even if you consider it possible that you might be mistaken or illogical about many things, you still have to assume that it's possible to be right at least some of the time, about some things. Otherwise you descend into solipsism.

I think Woody Allen put this in the most succinct and forceful way: "Is knowledge knowable? If not, how do we know this?"

The Converse Argument

Increasingly, it appears that people's basic cognition tends to lead them toward supernatural accounts of the world: belief in spirits, creationism, and the like arises from the natural styles of thinking people employ, rather than from any actual truth of such notions.


[1] That article is, in fact, an excellent example of the problem. It's a detailed and well-researched (and well-referenced) article discussing cognitive errors that can make people believe weird things. And I found it on a site promoting ESP. Oy.