It was 1966 - two years before the release of 2001: A Space Odyssey, in which HAL held a limited but real conversation - when Joseph Weizenbaum published Eliza (ELIZA - A Computer Program For the Study of Natural Language Communication Between Man and Machine, Communications of the ACM 9(1):36-45). This program was the first that could "communicate" with a person in plain English (and thus the first program to have a chance at passing the Turing Test).
Now, Eliza doesn't really understand English - it has no knowledge of what a noun or a verb is. What Eliza does have is a set of rules that help it split a sentence apart and rework it into a question.
Some example rules from Chatbot.pm (a Perl implementation of Eliza):
* i remember *
- Do you often think of (2) ?
- Does thinking of (2) bring anything else to mind ?
- What else do you recollect ?
- Why do you remember (2) just now ?
* do you remember *
- Do you think I would forget (2) ?
- Why do you think I should recall (2) now ?
In these examples, if a match of "(something) i remember (something)" is found, the second something is substituted into the follow-up question at the spot '(2)'. Saying a word in the set {'mother', 'father', 'sister', 'brother', 'mom', 'dad', 'wife', 'children', 'child'} will lead Eliza to ask a question about your family, such as "Tell me more about your family."
Another example matches the sentence "It seems that you hate me" against the rule "* you * me" and returns the response "What makes you think I (2) you ?". Note how "you * me" has been switched around to "I * you".
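That switching around is usually done with a small reflection table: before a captured fragment is echoed back, first-person words are swapped for second-person ones and vice versa. A sketch of the idea (again my own illustration, not Chatbot.pm's actual table):

  #!/usr/bin/perl
  use strict;
  use warnings;

  # Reflect first-person words to second person and vice versa, so a
  # captured fragment can be mirrored back at the user.
  my %reflect = (
      i   => 'you', me   => 'you', my => 'your', am => 'are',
      you => 'I',   your => 'my',
  );

  sub reflect_fragment {
      my ($text) = @_;
      return join ' ', map { $reflect{ lc $_ } // $_ } split /\s+/, $text;
  }

  if ('It seems that you hate me' =~ /^(.*)\byou (.*) me\b/i) {
      print 'What makes you think I ', reflect_fragment($2), " you ?\n";
      # prints: What makes you think I hate you ?
  }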
If Eliza fails to parse a sentence, "general purpose" statements are used, such as "I'm not sure I understand you fully." or "Does talking about this bother you ?". If you say something trying to close off an avenue of conversation (such as simply "no"), Eliza returns with "Are you saying no just to be negative?" or a similar question.
Some people have mistaken Eliza for a real person - Eliza was modeled on the Rogerian method of psychotherapy. Weizenbaum told a story from Daniel Bobrow (recorded in the December 1968 issue of the SIGART Newsletter) of a VP who sat down at a keyboard expecting to talk to an engineer - an engineer who had fallen asleep, leaving Eliza running. The conversation went something like this:
VP: Do you think that I can use the computer this morning?
Eliza: Why do you ask?
VP: I might be able to make some additional telcomp sales.
Eliza: Why aren't you sure?
VP: My prospective customers have never seen a demonstration of the system.
Eliza: Does this mean very much to you?
VP: Of course it does.
Eliza: You seem very positive of that.
VP: This is ridiculous.
Eliza: What do you mean this is ridiculous?
VP: Is there any reason why I can't use the computer now?
Eliza: Tell me more about your prospective customers.
VP: Please dial me up on 491-1850
Note that after that remark the VP, contrary to the instructions for using Eliza, left out the period that is required for the sentence to be parsed. Therefore, of course, the computer didn't answer him. This so infuriated the VP, who thought the engineer was playing games with him, that he called Daniel up, waking him from a deep sleep, and said:
Why are you being so snotty with me?
To which Daniel answered, "What do you mean, why am I being snotty to you?"
The VP angrily read him the dialog that 'they' had been having, and couldn't get any response but laughter from Daniel. It took Daniel a while to convince him that it really was the computer.
Despite these rules, it is often easy to "confuse" Eliza:
Eliza: How do you do. Please state your problem
User: I'm wondering whether computers can think.
Eliza: Why do you mention computers
User: Since you are one I thought I would ask you - can you think?
Eliza: Did you think they might not be one you thought you would ask i - can i think
User: Let's talk just about you - not me. Can you think?
Eliza: You like to think I - not you - don't you
User: Pardon?
Eliza: What does that suggest to you?
User: What does it suggest to you?
Eliza: Tell me about what does it suggest to I
User: That's nonsense - which of us do you mean by I?
Eliza: Why do you think I mean by you
User: The question meant to which of us is the word referring.
Eliza: Suppose the question meant to which of us were not the word referring
User: I think you are an extremely stupid program.
Eliza: Perhaps you would like to be an extremely stupid program
When first introduced, many people "fell" for Eliza, believing it to be a real person or otherwise becoming dependent upon "talking" with Eliza. Some people saw Eliza as a way to allow psychologists to deal with more patients, using Eliza as a front end for all but the most serious of cases. Many psychologists were afraid of losing their jobs to Eliza, despite Weizenbaum's insistence that the program had no knowledge of a person's problems and should not in any way be used as a substitute for a human.
This demonstrated Weizenbaum's fear that while AIs may not be able to understand or sympathize with humanity, we as a society are often ready to entrust these constructs with the task of managing our affairs.
It should be realized that Eliza fails many basic tests for intelligence (artificial or otherwise): Eliza does not learn, nor is it aware of its surroundings. Eliza is a simple program that uses tricks of language to make it sound as if it is holding a conversation.
Eliza has been ported to almost every programming language and is best known to Emacs users as M-x doctor.
The name Eliza comes from Eliza Doolittle, the heroine of Shaw's Pygmalion.
http://i5.nyu.edu/~mm64/x52.9265/january1966.html
http://www.cs.nott.ac.uk/~gxk/courses/g5aiai/002history/eliza.htm
http://web.mit.edu/STS001/www/Team7/eliza.html
http://www.abc.se/~jp/articles/computer/misc/eliza.txt