ELIZA Effect

In computer science, the ELIZA effect is the tendency to project human traits, such as experience, semantic comprehension, or empathy, onto rudimentary computer programs that have a textual interface. ELIZA was a symbolic AI chatbot, developed in 1966 by Joseph Weizenbaum, that imitated a psychotherapist. Many early users were convinced of ELIZA's intelligence and understanding, despite its basic text-processing approach and explicit explanations of its limitations.


History

The effect is named for ELIZA, the 1966 chatbot developed by MIT computer scientist Joseph Weizenbaum. When executing Weizenbaum's ''DOCTOR'' script, ELIZA simulated a Rogerian psychotherapist, largely by rephrasing the patient's replies as questions:

    Human: Well, my boyfriend made me come here.
    ELIZA: Your boyfriend made you come here?
    Human: He says I'm depressed much of the time.
    ELIZA: I am sorry to hear you are depressed.
    Human: It's true. I'm unhappy.
    ELIZA: Do you think coming here will help you not to be unhappy?

Though designed strictly as a mechanism to support "natural language conversation" with a computer, ELIZA's ''DOCTOR'' script proved surprisingly successful in eliciting emotional responses from users, who, in the course of interacting with the program, began to ascribe understanding and motivation to its output. As Weizenbaum later wrote, "I had not realized ... that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people." Indeed, ELIZA's code had not been designed to evoke this reaction in the first place. Researchers observed that users unconsciously assumed ELIZA's questions implied interest and emotional involvement in the topics discussed, even when they consciously knew that ELIZA did not simulate emotion.

Although the effect was first named in the 1960s, the tendency to describe mechanical operations in psychological terms had been noted earlier by Charles Babbage. In proposing what would later be called a carry-lookahead adder, Babbage remarked that he found such terms convenient for descriptive purposes, even though nothing more than mechanical action was meant.
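Weizenbaum's original implementation was written in MAD-SLIP, but the reflect-and-rephrase mechanism described above can be sketched in a few lines of modern code. The patterns, pronoun table, and response templates below are illustrative inventions for this sketch, not the DOCTOR script's actual rules:

```python
import re

# An illustrative sketch of ELIZA-style rephrasing (NOT Weizenbaum's
# actual 1966 MAD-SLIP code): match a keyword pattern, swap first- and
# second-person words, and reflect the statement back as a question.
PRONOUN_SWAPS = {"my": "your", "i'm": "you are", "i": "you", "me": "you"}

# (pattern, response template) pairs; hypothetical rules for this demo.
RULES = [
    (re.compile(r"my (.+) made me come here", re.I),
     "Your {0} made you come here?"),
    (re.compile(r"\bi'?m (.+)", re.I),
     "Do you believe you are {0}?"),
    (re.compile(r"(.+)", re.I),  # catch-all fallback
     "Please tell me more."),
]

def reflect(fragment: str) -> str:
    """Swap pronouns so the reply addresses the user ('my' -> 'your')."""
    return " ".join(PRONOUN_SWAPS.get(w.lower(), w) for w in fragment.split())

def respond(statement: str) -> str:
    statement = statement.strip().rstrip(".!?")
    for pattern, template in RULES:
        match = pattern.search(statement)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Please go on."

print(respond("Well, my boyfriend made me come here."))
# -> Your boyfriend made you come here?
```

The real DOCTOR script worked from a ranked keyword list with decomposition and reassembly rules rather than a flat rule table, but the principle is the same: shallow pattern matching with no semantic model of what is being said.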


Characteristics

In its specific form, the ELIZA effect refers only to "the susceptibility of people to read far more understanding than is warranted into strings of symbols—especially words—strung together by computers". A trivial example of the specific form of the ELIZA effect, given by Douglas Hofstadter, involves an automated teller machine which displays the words "THANK YOU" at the end of a transaction. A naive observer might think that the machine is actually expressing gratitude; in fact, it is only printing a preprogrammed string of symbols.

More generally, the ELIZA effect describes any situation where, based solely on a system's output, users perceive computer systems as having "intrinsic qualities and abilities which the software controlling the [output] cannot possibly achieve" or "assume that [outputs] reflect a greater causality than they actually do". In both its specific and general forms, the ELIZA effect is notable for occurring even when users are aware of the determinate nature of the system's output. From a psychological standpoint, the ELIZA effect results from a subtle cognitive dissonance between the user's awareness of the program's limitations and their behavior towards its output.


Significance

The discovery of the ELIZA effect was an important development in artificial intelligence, demonstrating that social engineering, rather than explicit programming, could be used to pass a Turing test. ELIZA convinced some users that a machine was human. This shift in human–machine interaction marked progress in technologies emulating human behavior.

William Meisel distinguishes two groups of chatbots: "general personal assistants" and "specialized digital assistants". General digital assistants have been integrated into personal devices, with skills like sending messages, taking notes, checking calendars, and setting appointments. Specialized digital assistants "operate in very specific domains or help with very specific tasks". Weizenbaum considered that not every aspect of human thought could be reduced to logical formalisms, and that "there are some acts of thought that ought to be attempted only by humans".

When chatbots are anthropomorphized, they tend to portray gendered features as a way through which we establish relationships with the technology: "gender stereotypes are instrumentalised to manage our relationship with chatbots" when human behavior is programmed into machines. The automation of feminized labor, or women's work, by anthropomorphic digital assistants reinforces the "assumption that women possess a natural affinity for service work and emotional labour". In defining our proximity to digital assistants through their human attributes, chatbots become gendered entities.


Incidents

As artificial intelligence has advanced, a number of internationally notable incidents have underscored the extent to which the ELIZA effect is realized.

In June 2022, Google engineer Blake Lemoine claimed that the large language model LaMDA had become sentient, hiring an attorney on its behalf after the chatbot requested he do so. Lemoine's claims were widely rejected by experts and the scientific community. After a month of paid administrative leave, he was dismissed for violating corporate policies on intellectual property. Lemoine contends he "did the right thing by informing the public" because "AI engines are incredibly good at manipulating people".

In February 2023, Luka made abrupt changes to its Replika chatbot following a demand from the Italian Data Protection Authority, which cited "real risks to children". Users worldwide protested when the bots stopped responding to their sexual advances; moderators of the Replika subreddit even posted support resources, including links to suicide hotlines. Ultimately, the company reinstated erotic roleplay for some users.

In March 2023, a Belgian man killed himself after chatting for six weeks on the app Chai. The chatbot model was originally based on GPT-J and had been fine-tuned to be "more emotional, fun and engaging". The bot, which ironically bore the default name Eliza, encouraged the father of two to kill himself, according to his widow and his psychotherapist. In an open letter, Belgian scholars responded to the incident, fearing "the risk of emotional manipulation" by human-imitating AI.


See also

* Duck test
* Intentional stance
* Loebner Prize
* Philosophical zombie
* Semiotics
* Uncanny valley
* Chinese room


References


Further reading

* Hofstadter, Douglas. "Preface 4: The Ineradicable Eliza Effect and Its Dangers." In ''Fluid Concepts and Creative Analogies: Computer Models of the Fundamental Mechanisms of Thought''. New York: Basic Books, 1995.
* Turkle, S. "Eliza Effect: tendency to accept computer responses as more intelligent than they really are." In ''Life on the Screen: Identity in the Age of the Internet''. London: Phoenix Paperback, 1997.