Inspired by the many good points made on cybernetic immortality. Here's my take:

The problem is that whether what constitutes a person is something that can be captured inside a computer is, ultimately, a matter of opinion.

If you lose an arm, you do not die. You adapt. Life is different without that arm, but you are still you.

If you lose a sense, like hearing, you do not die. You adapt. Life is different as a deaf person, but you are still you.

(what happens if you add things? Can anyone think of an example?)

The point is as follows: If the change is made gradually, and the mind of the person is able to cope, there is no intrinsic problem with having a mind with no body. There is no qualitative difference between losing an arm and losing your whole body. You adapt.

See achieving cybernetic immortality.

Now, what would a mind be like without a body? A once human consciousness would behave very differently if all the joys, pains, and drives of biological function were removed. Maybe the mind would have its own drives. Maybe it would create artificial ones.

The problem I have with this is that we aren't talking about losing a limb...we're talking about removing the consciousness from the body. I just don't think it's possible to transfer a consciousness (and have it remain "together", whatever that means) to a completely artificial environment. At least, it would be extraordinarily difficult. How would a person adapt to a lack of drive towards anything?

Adding stuff to us would be great, though. Cochlear implants are already a viable (if flawed) solution for people born without hearing. I wouldn't mind having a couple terabytes of secondary memory to store people's names, phone numbers, etc...

When people have stomach transplants they often inherit the donor's tastes, as - to some extent - this information is kept in the belly, which has grown to expect certain foods in regular doses throughout its life.

Even in cases where the recipient knew nothing of the donor's history, they can take on those tastes with a strange degree of accuracy -- a summary of the main story in the March 1995 New Zealand Medical Journal. An excellent read.

I'd consider this part of my tastes and preferences, even though it's not necessarily part of the brain. Thus, to feel like the same person in a new body, this information would need to be transferred.

It depends largely on what we mean by transfer. Do we mean a continuous conscious experience that starts in meat and ends in silicon with no discontinuity in between? Or do we mean a mind that can be transferred like a piece of software, FTPing itself from computer to computer and thus living forever?

Let us assume for the moment that the mind is causally dependent on the brain; that is to say, a given brain state consistently gives rise to a given mental state. I'm talking on a neural level here, not a gross anatomical level: fire the same neurons and you get the same experience. This, I think, is a safe assumption given the current state of neurology.

Given this, what if we were to replace a single neuron with a micromicrochip that functions in exactly the same way - fires with the same voltage when presented with the same stimulus, and so on? Clearly, with only a single neuron replaced, we would still be the same person - after all, this micromicrochip acts exactly like the original neuron, so the brain as a whole functions in the same way. Now say we were to replace one neuron in ten with these micromicrochips. Still the same person, right? A rose still activates certain neural firing patterns; memories of playing on the beach activate others. Now let's go all the way and replace every neuron. The brain still acts the same, and I think it would be safe to assume we'd still feel and act the same - no loss of emotions, no change in temperament, no loss of sensation.
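The gradual-replacement argument can be sketched as a toy simulation. This is purely illustrative - the two-way threshold model of a neuron and all the names below are my own simplifying assumptions, not real neuroscience - but it shows the logical core of the argument: if each replacement component has exactly the same input/output behaviour, the system's overall behaviour is unchanged no matter how many components are swapped.

```python
# Toy model of the gradual-replacement thought experiment.
# A "brain" is a list of units; each unit is a function mapping
# an input signal to a firing decision. The biological neuron and
# the replacement "micromicrochip" implement the same function.

def neuron(signal):
    """Biological neuron: fires (returns 1) above a threshold."""
    return 1 if signal > 0.5 else 0

def micromicrochip(signal):
    """Replacement chip, stipulated to behave identically."""
    return 1 if signal > 0.5 else 0

def brain_response(units, stimuli):
    """The whole-brain firing pattern for a list of stimuli."""
    return [unit(s) for unit, s in zip(units, stimuli)]

stimuli = [0.1, 0.7, 0.4, 0.9, 0.6]

original = [neuron] * 5
before = brain_response(original, stimuli)

# Replace one unit, then all of them.
partly_replaced = [micromicrochip] + [neuron] * 4
fully_replaced = [micromicrochip] * 5

assert brain_response(partly_replaced, stimuli) == before
assert brain_response(fully_replaced, stimuli) == before
```

The assertions hold trivially, of course - which is exactly the point of the thought experiment: the whole weight of the argument rests on the stipulation that the replacement behaves identically, not on anything that happens after.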

But... have we really transferred anything? Certainly we've rebuilt the brain piece by piece, but it's still there in our head, just as limited as it was before (since we've stipulated that these micromicrochips act just like neurons, plugging a cat5 line into your skull would still do nothing more than maybe give you a headache, just like before). What we really want is to be able to move our mind from one residence to another, rather than just replacing the walls one by one. (Note: since our understanding of neurology is so limited, I don't think it's feasible to rearrange the structure of the brain while transferring it in the process described above - we'd have to know exactly what to put where, which would be mighty hard, so let's for the moment stick to the only slightly less insanely difficult task of replicating the brain with its current functionality.)

So what about transferring the mind to a piece of software that could be run on any system of sufficient power? Well, this is currently a live issue in philosophy: whether a program simulating consciousness is itself conscious or not. In other words, is consciousness hardware-dependent? IMHO, you'd have to have a computer that functions exactly like a brain in order to produce the consciousness that the brain produces. The functional relationships between, say, RAM and CPU in a modern-day computer are not the same as the functional relationships present in a brain. Although it might be theoretically possible to produce a simulation of a brain (note: a research team has actually managed to produce an equation that precisely describes the action of a little clump of a few thousand cells somewhere in the back of the brain - the equation covered three full wall-size whiteboards; doing the same for the whole brain would take a while), the hardware itself has a whole bunch of other functions going on at the same time, functions that do not relate to the program that is intended to produce consciousness. This isn't a hard and fast proof that it wouldn't work, but I think the fact that the computer's functionality is distinctly different from a brain's - even if that of a brain could be overlaid on top of it - indicates that the experience of self while being a program run by a present-day computer would be either very different or nonexistent. Since we're trying to preserve the "self", this option seems to be out.

So what's left? Well, what about building a new you? If we're assuming that we have the capability to replace our brain neuron by neuron, there's no reason why we can't build a new one from scratch, using the pattern of the existing brain. You can't exactly jack in with this option, but cybernetics is another issue. This would allow for an endless string of electronic-brain yous, a new one coming online whenever the old one starts to wear out. Some might argue that this isn't a transfer but a replication. Rarely, if ever, is such an argument raised when watching someone use the transporter on Star Trek, yet this is essentially the same thing: rebuilding a person in a new location with exactly the same pattern as the original.
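The copy-versus-transfer worry can be made concrete with a short sketch. (Representing a brain's "pattern" as a dictionary of connection weights is entirely my own illustrative assumption.) Building a new you from the old pattern duplicates the pattern but leaves the original in place, which is exactly why some would call it replication rather than transfer:

```python
import copy

# Hypothetical: a brain's "pattern" reduced to connection weights.
original_brain = {"weights": [0.2, 0.9, 0.5], "location": "skull"}

# "Building a new you": copy the pattern into a new substrate.
new_brain = copy.deepcopy(original_brain)
new_brain["location"] = "electronic substrate"

# Same pattern, but the original still exists - a replication,
# not a move. This is precisely the transporter objection.
assert new_brain["weights"] == original_brain["weights"]
assert new_brain is not original_brain
```

Nothing in the copy operation destroys or relocates the original; whether destroying it afterwards would turn the replication into a transfer is the philosophical question, not a technical one.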

Those interested in this subject may wish to read The Mind's I, edited by Douglas Hofstadter, Hubert Dreyfus' What Computers Still Can't Do, David Chalmers' The Conscious Mind, John Searle's Minds, Brains and Science, and almost anything by Daniel Dennett.
