Tuesday, July 31, 2012

Other minds and the challenge of solipsism

To: Anthony K.
From: Geoffrey Klempner
Subject: Other minds and the challenge of solipsism
Date: 31 January 2008 12:59

Dear Anthony,

Thank you for your email of 21 January, with your second essay for Possible World Machine, entitled 'Lost in Logical Solipsism'.

Regarding your question about Philosophy Pathways e-journal. There is nothing in particular that I look for, other than quality. Target length is 2500 words, but I have published pieces as short as 1000 and as long as 4000.

Essay

This is quite effective in inducing a state of 'aporia' (not knowing what to say) in the face of the problem of other minds.

I am not going to give you a lecture on my view of the other minds problem, but there are two points I would make in response to what you have written which you perhaps have not considered.

The first concerns your hypothesis that other persons are 'programmed drones', or 'robots'.

You can buy toy dogs which are pre-programmed with a wide variety of doggy-like responses. But human technological ingenuity only goes so far. Pretty soon, the child learns all the responses and then it's just routine, nothing to be surprised about.

Similarly, super-intelligent Martians could build a pre-programmed 'toy human' with many, many more responses. Serving coffee in a bar is simple. Responding to chat-up lines is also simple. How many 'original' responses does a girl need?

This suggests a science-fiction scenario as the basis for a sceptical hypothesis. All my empirical knowledge is not sufficient to disprove the sceptical hypothesis that I am 'conversing' with a 'human toy' made by super-intelligent Martians. Say, I am the last surviving human being after a nuclear holocaust. Kindly Martians have made a 'world' for me, to stop me from getting lonely.

But this is *not* the other minds problem. It is merely a version of inductive scepticism, the same variety as, 'How do I know that I am not dreaming in the Matrix?'

What does raise the other minds problem in an acute way is the widespread belief that a human being is ultimately a biological organism, whose brain is naturally constructed to 'run a program', and it is by virtue of its running a program that we are able to say that the subject has 'thoughts', 'feelings', 'sensations'. These are just terms which refer to the behaviour of an entity which has the capacity to process information in the way human beings do.

And I am just one example of this kind of entity.

The problem is that I know -- or seem to know -- that there is 'something extra' in me. I don't just exhibit pain behaviour, I feel *pain*. This term *pain* (with emphasis) is not a term in the public language we all use, because its intended referent is an object which only I can know, what Wittgenstein calls a 'private object'. A large part of the 'Philosophical Investigations' is dedicated to refuting the idea of the private object. (Start at paragraph 243 and read forwards for 100 paragraphs or so, then go back to paragraph 202 and read to 243.)

This brings us to my second point.

You said that you 'know' that you have a mind. But do you? How do you know this? Suppose I suggested that you were only 'given' a mind five seconds ago. (Don't worry about how this happened. It doesn't need God. Just suppose that your mind 'sparked' into life and before that you were a zombie like all the others.) All that you 'remember' before that was given to you at the same time.

I mean, if we are seriously considering that you have a 'private object' that no-one else has, then we have to consider the possibility that the 'you' which existed six seconds ago was just like the girl in the coffee shop so far as *your* (the present 'you') knowledge is concerned.

Hence, the claim that solipsism 'shrinks to solipsism of the present moment'.

Another way of looking at the problem is this. Define a 'zombie' as an individual which biologically has all that is required to have 'thoughts, feelings and sensations', in the sense in which these refer to brain processes and observable behaviour. Then, the problem of solipsism can be stated as follows: I know that I am not a zombie but I don't know that other persons are not zombies.

The difficulty is that, by hypothesis, your *zombie double* would say this too. You have identical body and brain states. The only difference is that you have the 'private object' while your zombie double doesn't. Then why is it that your zombie double insists that he has one? And if he can insist that he has a private object when he doesn't, then maybe you should consider the possibility that your insistence has a different explanation than the one that first occurred to you.

This is not a refutation of solipsism, but rather makes it an even more uncomfortable theory to believe. On the other hand, you might rebound from the absurdity of the notion that you did not have a mind six seconds ago, and come to the conclusion that speculation about whether others 'have' a private object like yours is absurd.

This seems in fact to be your conclusion: I can't 'prove' that others have minds, and yet I 'know' that they do.

But this still doesn't defeat the problem. So long as you believe that 'pain' etc. ultimately refers to a private object, you are stuck in the solipsist's predicament; any assertions of confidence that you may make in your 'knowledge' of other minds are just bravado.

All the best,

Geoffrey