
ELIZA: a real example of a Turing test

In honour of Alan Turing, mathematician, cryptanalyst, and progenitor of computer science, we wanted to provide you with a demonstration of one of the areas in which his work has had an influence on the English language.

The Turing test, ‘a test for intelligence in a computer, requiring that a human being should be unable to distinguish the machine from another human being by using the replies to questions put to both’, is commonly regarded as the barrier that a computer program must break to be considered an artificial intelligence. Though he didn’t use the word himself to describe it, the test was set out by Turing in his 1950 paper ‘Computing Machinery and Intelligence’, published in the journal Mind.

Perhaps fortunately for us, no computer has yet passed a Turing test. Turing himself predicted that by the end of the 20th century computers with about 120MB of memory – a modest specification by then, as it turned out – would be able to fool 30% of human interrogators, but more recent predictions place that event a few decades hence. This has not, however, stopped researchers in the field of artificial intelligence from creating software that attempts to simulate intelligence, and we are presenting an early example of such an attempt here.

Does ELIZA do little, or does she succeed?

ELIZA is a piece of software that attempts to simulate the conversation of a Rogerian psychotherapist. It was written in the mid-1960s by Joseph Weizenbaum, a professor of computer science at MIT, and it was the subject of a journal article published in January 1966. You can find a working copy of it further down this page, thanks to the JavaScript version of ELIZA created by Norbert Landsteiner.

Of course, ELIZA does not pass a Turing test. If you interact with it, it does not take long before it becomes obvious that you are talking to a machine.

The principle behind ELIZA’s natural language processing is simple: incoming phrases are parsed into their constituent words, which are then inserted into a set of stock responses according to a fairly simple rule set. In some conversations it quickly becomes obvious that the responses are mechanical, yet in others the illusion of an intelligent reply can be maintained through several iterations of its operation.
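To make that mechanism concrete, here is a minimal sketch of the idea in Python. The keyword patterns, reflection table, and response templates below are invented for illustration – they are not Weizenbaum’s original script – but the shape is the same: match a decomposition pattern, reflect the user’s pronouns, and slot the captured fragment into a stock reassembly template.

```python
import random
import re

# Pronoun reflections so an echoed fragment reads naturally
# ("my coffee" becomes "your coffee"). Illustrative subset only.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your",
    "am": "are", "you": "I", "your": "my",
}

# Each rule pairs a decomposition pattern with reassembly templates.
# The final catch-all pattern guarantees some response is always given.
RULES = [
    (re.compile(r"i need (.*)", re.I),
     ["Why do you need {0}?", "Would it really help you to get {0}?"]),
    (re.compile(r"i am (.*)", re.I),
     ["How long have you been {0}?", "Why do you think you are {0}?"]),
    (re.compile(r"(.*)"),
     ["Please tell me more.", "How does that make you feel?"]),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words in the captured fragment."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(sentence: str) -> str:
    """Build a stock response from the first rule that matches."""
    for pattern, templates in RULES:
        match = pattern.match(sentence.strip())
        if match:
            template = random.choice(templates)
            return template.format(*(reflect(g) for g in match.groups()))
    return "Please go on."

print(respond("I need my morning coffee"))
```

A handful of rules like these already sustains a short exchange, which is exactly why the illusion collapses over longer conversations: once a reply exposes the template, the mechanical nature of the responses becomes obvious.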

If ELIZA has a simple rule set governing its operation, one might imagine that a better piece of intelligence-simulating software would simply need a more complex set of rules. Unfortunately, as researchers in the decade following ELIZA’s creation found, this approach merely creates ever-more-complex and unwieldy rule sets without delivering the sought-after improvement in performance. More recent attempts at natural language processing have therefore abandoned the rule-set approach, and instead use large corpora of language samples, such as our own Oxford English Corpus, to derive probable meanings and responses to their inputs based on previous use of similar language. One of the most famous examples of this kind of machine is IBM’s Watson, a supercomputer application that took part in the American TV quiz show Jeopardy! in 2011.
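The corpus-based idea can be illustrated with a toy example (this is a simplification for illustration, not how Watson or any production system actually works): rather than hand-writing rules, count how often words follow one another in sample text, then prefer the continuation the corpus has seen most often.

```python
from collections import Counter, defaultdict

# A tiny invented "corpus"; real systems use billions of words.
corpus = (
    "the cat sat on the mat . the cat ate the fish . "
    "the dog sat on the rug ."
).split()

# Count, for each word, how often each following word occurs.
follows = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    follows[current_word][next_word] += 1

def most_likely_next(word: str) -> str:
    """Return the word most frequently seen after `word` in the corpus."""
    return follows[word].most_common(1)[0][0]

print(most_likely_next("the"))  # "cat" follows "the" more often than any other word here
```

Scaled up to enormous corpora and far richer statistics, this is the shift the paragraph above describes: behaviour derived from previous use of similar language rather than from an ever-growing hand-built rule set.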

So try interacting with ELIZA below. The main text area holds the conversation; enter your response to each ELIZA sentence in the text box below it and click the ‘talk’ button. For a bit of entertainment, see how it reacts to song lyrics or poetry. I was particularly taken with its responses during a conversation in which I ‘Rickrolled’ it with the lyrics of Rick Astley’s 1987 hit Never Gonna Give You Up.

As a technologist, part of me looks forward to the time in which I can access computer power capable of passing a Turing test. But there is still another part of me that is glad that time has not come. I can close down a web page containing a test failure like ELIZA without guilt, but would I be able to do the same with one containing a personality I’d cracked jokes with, made friends with, or even fallen in love with?

The opinions and other information contained in OxfordWords blog posts and comments do not necessarily reflect the opinions or positions of Oxford University Press.