There's something queer about describing consciousness. Whatever people mean to say, they just can't seem to make it clear.
          - Marvin Minsky

When it comes to consciousness, only one thing is clear.

There exists no test, no procedure, no method, that can objectively identify the presence of consciousness in something.

This may be a difficult statement to accept, because it can throw a person's assumptions about the world into chaos. We all go around secure in the belief that every person we interact with is conscious, just as we are. Yet no conclusive evidence exists, or can be found, to show that this is the case.

Determining consciousness can only be done by the entity doing the testing. I can determine for myself that I am conscious. This is not because I define the word to fit whatever I am, but because I am the only entity that can completely comprehend my own awareness and self-awareness, that can truly realize mental processes are occurring: the thinking, the remembering, the debating, the daydreaming. Though, to be honest, it's fairly difficult to even describe how I know I'm conscious.

Testing another entity for consciousness would of course be more difficult, relying on examining the "output" of the entity, interpreting the actions it takes and the sounds it makes, and determining whether they are in line with what we'd expect from a conscious entity. So, why couldn't a machine be built to emulate all of these, in a purely functional way, from a highly complex set of instructions? Would such a machine be conscious? It would be no more conscious than the man inside Searle's Chinese Room understands Chinese.
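The Chinese Room point can be sketched as a toy program: the "rule book" below is entirely invented for illustration, but it shows how conscious-seeming output can be produced by pure symbol lookup, with nothing anywhere that understands the symbols.

```python
# A toy illustration of Searle's Chinese Room: responses are produced by
# mechanically matching input symbols against a rule book. The rules here
# are hypothetical; the point is that correct-looking output requires no
# understanding at all.

RULE_BOOK = {
    "are you conscious?": "Of course I am. I think, therefore I am.",
    "how do you feel?": "I feel fine, thank you for asking.",
}

def chinese_room(symbols: str) -> str:
    """Return whatever the rule book dictates, understanding nothing."""
    return RULE_BOOK.get(symbols.lower(), "I do not follow the question.")

print(chinese_room("Are you conscious?"))
```

However convincing the answers, nothing in this system comprehends the question; it only shuffles symbols, exactly as the man in the room shuffles Chinese characters.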

Part of the difficulty may be that we can't even properly DEFINE consciousness. Of all the writeups in this node, there's not a single one anyone can point to that completely and authoritatively defines consciousness. In many ways, it is like trying to define life: we can't come up with a complete definition, we just "know it when we see it". As the Marvin Minsky quote suggests, I would bet every person who's added a writeup here would admit they're not happy with what they wrote, that they know it's not adequate, that it's not quite what they meant, but they can't even SAY what they mean. I even feel that way about this writeup.

This definitely has repercussions when discussing artificial intelligence. If we can't prove that a human being is conscious, how will we know when a machine is? Is there some specific test that would let us consider it conscious? If so, then we can program a machine to pass the test, but that doesn't mean it's conscious. When a machine can learn? Already done. When a machine can observe and react? Done, in many ways, depending on the definition of "observe".
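The "program one to pass the test" objection can be made concrete with a toy sketch. The test questions below are invented for illustration: once any consciousness test is written down as a fixed procedure, a trivial program can satisfy it mechanically.

```python
# Toy sketch: a fixed, written-down "consciousness test" can always be
# gamed. The questions and expected answers are hypothetical examples.

CONSCIOUSNESS_TEST = [
    ("Do you experience anything?", "yes"),
    ("Can you reflect on your own thoughts?", "yes"),
]

def take_test(answer_fn) -> bool:
    """Administer the fixed test; pass if every answer matches."""
    return all(answer_fn(q) == expected for q, expected in CONSCIOUSNESS_TEST)

# A one-line "subject" that passes by always answering "yes".
always_yes = lambda question: "yes"
print(take_test(always_yes))
```

The one-line subject passes every question, and that tells us precisely nothing about whether it is conscious, which is why passing any specific test can't settle the matter.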

So, when it comes down to it, the only reason we accept that other human beings are conscious is assumption. A person feels conscious themselves, sees no reason to believe it is any different for other humans, and thus is willing to accept that they are conscious. Whether consciousness is assigned to non-human creatures, such as dolphins, elephants, or cats, isn't even consistent among people. However, if technology keeps progressing, we will have to evaluate much more carefully whether to consider something conscious. Unless consciousness depends on a "soul" or a "spirit", something beyond the body that we can't duplicate, there's a good chance we'll soon have something man-made exhibiting behaviors that make the question very important.