A standard philosophical question.

How can we determine that we actually exist as full bodies in the world we experience? It is impossible to prove that a person is not simply a brain being fed immense amounts of stimuli to simulate a world that does not truly exist.

Or for that matter, it may be possible that we are simply artificial intelligence software in a very complex computer.

After all, even if we really are flesh-and-blood creatures, we're still one level removed from perceiving reality. You don't really experience the world; you experience a world created in your mind from the sensory input it receives from your body. Everything you think you sense is actually a recreation that is unique to you.


Back when I lived in New York, I was accosted by some homeless guy offering to sell me a cauliflower in a really old jar of tomato sauce.

He kept insisting that it was Hitler's brain, that he had been in the Army during World War II, and that he'd stolen it shortly before being discharged.

He was such a good salesman and I'm such a sucker that I bought it.

Damn, did Hitler's brain taste good! It needed a little oregano and salt, but it was quite tasty indeed!

René Descartes' famous argument is, in fact, of little use against the brain-in-a-jar scenario.

After envisioning all possible deceptions and doubts, Descartes concludes that the one thing he can be sure of is that he is a 'thinking thing'.

Hey! Guess what? A brain-in-a-jar is a 'thinking thing' too!

You have to look elsewhere (to Kant or Wittgenstein, for example) for some decent arguments against this brand of solipsism. Descartes did go on from the 'Cogito' to argue for the reality of the external world, but his reasoning led him to substance dualism, which few modern philosophers find credible. An excellent discussion is available under "Undetectable Illusion."

One could easily imagine an artificially conscious robot philosopher pondering a similar 'chip-on-a-test-bench' problem. In fact, just such a scenario is used by one of the crew in the sci-fi comedy film Dark Star to convince an intelligent and willful bomb that it shouldn't trust its sensory data, and therefore hasn't necessarily received the order to detonate.

Quite sensibly, after some consideration, the bomb decides this attempt to "teach it phenomenology" is just more untrustworthy sensory input. "You are bad data," it says, and happily blows them all up.
