NOTE: unfortunately, it's come to my attention that this was previously brain-in-a-jar. Oops.

I came across the brain in a vat thought experiment in an Artificial Intelligence textbook. Philosophers invoke the brain in a vat to support arguments that turn on the difference between experience and reality, perception and objectivity.

The idea is that your brain has been removed at birth and placed lovingly into a vat of goo capable of sustaining and nurturing it just as your skull would. Then the diabolical scientists connect some kind of magical simulation device that feeds your brain false input from your false body and its interactions with the false simulated universe, which your poor brain cannot recognize as false. (Yes, like The Matrix.)

One particularly relevant example is the Correspondence Theory of Belief, which concerns the connection between internal states (thoughts, memories) and external states (reality). It contains (at least) two views on the question, "do internal representations of reality actually have any meaning?" (A very important question for an AI agent, as you can imagine.) One view is called wide content, the other narrow content. Wide content holds that internal states are intrinsically linked to the specific objects to which they refer. Narrow content disagrees, holding that the internal state is merely an intrinsic aspect of the person's beliefs. I could be interpreting this wrong, but the difference seems subtle to me: given the statement, "This rutabaga is really good," wide content says there is an intrinsic connection between your mind and that rutabaga, while narrow content says there isn't, and that it's all in your head. The brain in a vat seems useful for backing the narrow content theory.
