It was 1966, two years before the release of 2001: A Space Odyssey, in which HAL held a limited but real conversation, when Joseph Weizenbaum published Eliza ("ELIZA - A Computer Program for the Study of Natural Language Communication Between Man and Machine", Communications of the ACM 9(1):36-45). This program was the first able to "communicate" with a person in plain English (and thus the first program to have a chance at passing the Turing Test).

Now, Eliza doesn't really understand English - it has no knowledge of what a noun or a verb is. What Eliza does have is a set of rules that help it split apart a sentence and rework it into a question.

Some example rules from Chatbot.pm (a Perl implementation of Eliza):
* i remember *
  • Do you often think of (2) ?
  • Does thinking of (2) bring anything else to mind ?
  • What else do you recollect ?
  • Why do you remember (2) just now ?
* do you remember *
  • Do you think I would forget (2) ?
  • Why do you think I should recall (2) now ?
In these examples, if the input matches (something) i remember (something), the second something is substituted into the follow-up question at the spot marked '(2)'. Saying a word from the set {'mother', 'father', 'sister', 'brother', 'mom', 'dad', 'wife', 'children', 'child'} will lead Eliza to ask a question about your family, such as "Tell me more about your family."

Another rule matches the sentence "It seems that you hate me" against the pattern * you * me and returns the response "What makes you think I (2) you". Note how the "you * me" is switched around to "I * you".

If Eliza fails to match a sentence, "general purpose" statements are used, such as "I'm not sure I understand you fully." or "Does talking about this bother you ?". If you say something trying to close off an avenue of conversation (such as simply "no"), Eliza returns with "Are you saying no just to be negative?" or a similar question.
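
To make the mechanics concrete, here is a minimal sketch of that matching loop in Python. This is not Weizenbaum's code or Chatbot.pm - only the patterns and canned responses are taken from the examples above, and everything else is an illustrative assumption:

    import random
    import re

    # Each rule pairs a decomposition pattern with reassembly templates.
    # "(2)" marks where the second matched fragment gets substituted, as
    # in the Chatbot.pm rules quoted above. The real program also ranks
    # keywords by priority; this sketch just takes the first match.
    RULES = [
        (r'(.*)\bi remember\b(.*)',
         ['Do you often think of (2) ?',
          'Why do you remember (2) just now ?']),
        (r'(.*)\byou\b(.*)\bme\b(.*)',
         ['What makes you think I (2) you ?']),
    ]

    # "General purpose" statements for input no rule matches.
    DEFAULTS = ["I'm not sure I understand you fully.",
                'Does talking about this bother you ?']

    def respond(sentence):
        text = sentence.lower().strip('.!? ')
        for pattern, templates in RULES:
            match = re.match(pattern, text)
            if match:
                reply = random.choice(templates)
                # Fill each (n) slot with the n-th captured fragment.
                for n, fragment in enumerate(match.groups(), start=1):
                    reply = reply.replace('(%d)' % n, fragment.strip())
                return reply
        return random.choice(DEFAULTS)

    print(respond('I remember my dog.'))
    # -> Do you often think of my dog ?
    print(respond('It seems that you hate me.'))
    # -> What makes you think I hate you ?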

Some people have mistaken Eliza for a real person - Eliza was modeled on the Rogerian method of psychotherapy. Weizenbaum retold a story of Daniel Bobrow's (recorded in the December 1968 issue of the SIGART Newsletter) about a VP who tried to talk to an engineer who had fallen asleep at a keyboard, leaving Eliza running in his place. The conversation went something like this (the VP speaks first, then Eliza):

Do you think that I can use the computer this morning?
Why do you ask?
I might be able to make some additional telcomp sales.
Why aren't you sure?
My prospective customers have never seen a demonstration of the system.
Does this mean very much to you?
Of course it does.
You seem very positive of that.
This is ridiculous.
What do you mean this is ridiculous?
Is there any reason why I can't use the computer now?
Tell me more about your prospective customers.
Please dial me up on 491-1850

Note that with that remark the VP broke the instructions for talking to Eliza and left out the period that the program requires to parse a sentence. Therefore, of course, the computer didn't answer him. This so infuriated the VP, who thought that the engineer was playing games with him, that he called Daniel up, waking him from a deep sleep, and said: "Why are you being so snotty with me?"
To which Daniel answered, "What do you mean, why am I being snotty to you?"

The VP angrily read the dialog that 'they' had been having, and couldn't get any response but laughter from Daniel. It took Daniel a while to convince him it really was the computer.

Despite these rules, it is often easy to "confuse" Eliza (here Eliza speaks first):

How do you do. Please state your problem
I'm wondering whether computers can think.
Why do you mention computers
Since you are one I thought I would ask you - can you think?
Did you think they might not be one you thought you would ask i -can i think
Let's talk just about you - not me. Can you think?
You like to think I - not you - don't you
Pardon?
What does that suggest to you?
What does it suggest to you?
Tell me about what does it suggest to I
That's nonsense - which of us do you mean by I?
Why do you think I mean by you
The question meant to which of us is the word referring.
Suppose the question meant to which of us were not the word referring
I think you are an extremely stupid program.
Perhaps you would like to be an extremely stupid program

When first introduced, many people "fell" for Eliza, believing it to be a real person or otherwise becoming dependent upon "talking" with Eliza. Some people saw Eliza as a way to allow psychologists to handle more patients, using Eliza as a front end for all but the most serious cases. Many psychologists were afraid of losing their jobs to Eliza despite Weizenbaum's insistence that the program had no knowledge of a person's problems and should not in any way be used as a substitute for a human.

This demonstrated Weizenbaum's fear that while AIs may not be able to understand or sympathize with humanity, we (society) are often ready to entrust these constructs with the task of managing our affairs.

It should be realized that Eliza fails many basic tests for intelligence (artificial or otherwise) - Eliza does not learn, nor is Eliza aware of its surroundings. Eliza is a simple program that uses tricks of language to make it sound like it is holding a conversation.

Eliza has been ported to almost every programming language and is best known to Emacs users as M-x doctor.

The name Eliza comes from Eliza Doolittle in Pygmalion.


http://i5.nyu.edu/~mm64/x52.9265/january1966.html
http://www.cs.nott.ac.uk/~gxk/courses/g5aiai/002history/eliza.htm
http://web.mit.edu/STS001/www/Team7/eliza.html
http://www.abc.se/~jp/articles/computer/misc/eliza.txt

In 9th grade, I was taking French, and we had to do some kind of project. It was kind of like a science fair, except French and sans science.

So what I decided to do for my project was translate that old Eliza program into French.

I had the thing running on a TI-99/4A with Extended BASIC, and I had the voice synthesizer module that I borrowed from a friend. So I translated all that junk into French, phonetically spelled French at that, and for all the rules for English that didn't fit into French, I made new ones for whatever the French equivalent was.

You see, the secret of how Eliza works is basically this: It scans what you type to it, looking for pronouns and common verbs, and turns them around so that "I" becomes "you", "you" becomes "I", "are" becomes "am", etc. So if you type "You are a dork", it can come back with something like "What makes you think I am a dork?" and seem almost intelligent to someone who's not in on the trick. There's more to it than just that - there are a lot of little tricks like that built into the program - but that's the gist of it.
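
As a rough sketch of that swap (illustrative Python, not the original TI Extended BASIC, and the word table is my own abbreviation):

    # Word-by-word pronoun/verb swapping. Note it is deliberately naive:
    # "you" always becomes "I" even where "me" would be correct, which is
    # exactly the kind of shortcut that makes Eliza's replies come out
    # garbled when you press it too hard.
    SWAPS = {'i': 'you', 'me': 'you', 'my': 'your', 'am': 'are',
             'you': 'I', 'your': 'my', 'are': 'am'}

    def reflect(phrase):
        return ' '.join(SWAPS.get(word, word) for word in phrase.lower().split())

    print('What makes you think ' + reflect('you are a dork') + '?')
    # -> What makes you think I am a dork?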

Well, this thing kind of blew the teacher's mind. You have to realize, this was 1983, so hardly anybody knew anything about computers, and here was a computer that seemed capable of carrying on a sort of conversation in French.

The best part though was when I loaded up the English version, because suddenly everyone could actually understand what the thing was saying. Within minutes there were probably a hundred people gathered around my stupid little TI-99/4A, all yelling and telling me what to type. I'd type it in, then everybody would be real quiet so they could hear what it had to say.

Naturally, being mostly high school kids, they'd say "Tell it to fuck off!" And, knowing something about how the program worked, I would type in something that I knew would get a good reaction from the program..."I think you should fuck off".

And it would come back, in that goofy synthesized voice, with something along these lines:

"Do you think I should fuck off because you would like to be able to fuck off?"

People were ROFL without even knowing what ROFL was.

At that time, computer generated speech was still relatively new, and hooking it up with something like stupid old Eliza just killed. I suppose nowadays a project like that would probably get yawns.
