Chinese room

The Chinese room argument is a thought experiment devised by John Searle (1980) to refute the strongest claims of strong AI (and, by extension, of functionalism).

One belief of strong AI is that if a machine passed a Turing test, it could be regarded as "thinking" in the same sense as human thought. Put another way, proponents of strong AI hold that the human brain is a computer (of a sort) and the mind nothing more than a program. They believe, furthermore, that systems demonstrating these abilities help us to explain human thought. A third belief, necessary to the first two, is that the biological material of the brain is not necessary for thought. Searle summarizes this viewpoint, which he opposes, as follows:

The computer is not merely a tool in the study of the mind; rather, the appropriately programmed computer really is a mind, in a sense that computers given the right programs can be literally said to understand and have other cognitive states. (Hofstadter and Dennett, 353)

Thought experiment

In the Chinese room thought experiment, a person who understands no Chinese sits in a room into which written Chinese characters are passed. The room also contains a book with a complex set of rules (established ahead of time) for manipulating these characters and passing other characters out of the room. This is done on a rote basis, e.g., "When you see character X, write character Y". The idea is that a Chinese-speaking interviewer passes questions written in Chinese into the room and receives the corresponding answers, so that from the outside it appears as if there were a native Chinese speaker in the room.
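The rote character of the rule book can be sketched as a simple lookup table. This is only an illustrative toy, not a program that could actually pass a Turing test: the rules and Chinese strings below are invented placeholders.

```python
# A minimal sketch of the room's rote rule book as a lookup table.
# The entries are invented examples; a rule book adequate to a real
# conversation would be vastly larger and more complex.
RULE_BOOK = {
    "你好吗？": "我很好，谢谢。",      # "How are you?" -> "I'm fine, thanks."
    "你叫什么名字？": "我叫小明。",    # "What is your name?" -> "My name is Xiaoming."
}

def operator(characters: str) -> str:
    """Apply the rules mechanically: match the input, emit the output.

    The operator never interprets the characters; an unmatched input
    simply produces a fixed fallback string, exactly as a rule book
    might specify.
    """
    return RULE_BOOK.get(characters, "请再说一遍。")  # "Please say that again."

print(operator("你好吗？"))  # -> 我很好，谢谢。
```

The point of the sketch is that `operator` manipulates the strings purely by form: nothing in the lookup depends on what the characters mean.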

It is Searle's belief that such a system could indeed pass a Turing test, yet the person who manipulates the symbols would understand Chinese no better than he did before entering the room. Searle tries to refute the claims of strong AI one at a time by positioning himself as the one who manipulates the Chinese symbols. The Chinese room assails two claims of strong AI. The first claim is that a system which can pass the Turing test understands the input and output. Searle replies that, as the "computer" in the Chinese room, he gains no understanding of Chinese by simply manipulating the symbols according to the formal program, here the book of rules. The operator of the room need not have any understanding of what the interviewer is asking, or of the replies he is producing. He may not even know that there is a question-and-answer session going on outside the room.

The second claim of strong AI to which Searle objects is that such a system explains human understanding. Searle asserts that although the system functions, in this case passing the Turing test, there is no understanding on the part of the operator; the system therefore does not understand, and so cannot explain human understanding.

The core of Searle's argument is the distinction between syntax and semantics. The room shuffles characters according to the rule book; that is, its behaviour can be described as following syntactic rules. But on Searle's account it does not know the meaning of what it has done; it has no semantic content. The characters do not even count as symbols, because they are not interpreted at any stage of the process.

That syntax is insufficient to account for semantics is perhaps less controversial than the question of what must be added to syntax in order to account for semantics. Searle lists consciousness, intentionality, subjectivity and mental causation as candidates. Any adequate theory of the mind must be able to explain intentional states. Searle is at pains to point out that the mind is a result of brain function. He rejects dualism, insisting that mental states are biological phenomena.

Formal argument

In 1984 Searle produced a more formal version of the argument of which the Chinese Room forms a part. He listed four premises:

Premise 1: Brains cause minds
Premise 2: Syntax is not sufficient for semantics
Premise 3: Computer programs are entirely defined by their formal, syntactic structure
Premise 4: Minds have semantic content

The second premise is supposedly supported by the Chinese Room argument, since Searle holds that the room follows only formal syntactical rules, and does not understand Chinese. Searle posits that these lead directly to three conclusions:

Conclusion 1: No computer program by itself is sufficient to give a system a mind. Programs are not minds.
Conclusion 2: The way that brain functions cause minds cannot be solely in virtue of running a computer program
Conclusion 3: Anything else that causes minds would have to have causal powers at least equivalent to those of the brain

Searle describes this version as excessively crude. There has been considerable debate about whether this argument is indeed valid. These discussions centre on the various ways in which the premises can be parsed. One can read premise 3 as saying that computer programs have syntactic but not semantic content, and so Premises 2, 3 and 4 validly lead to conclusion 1. This leads to debate as to the origin of the semantic content of a computer program.
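On that reading, the entailment of conclusion 1 by premises 2, 3 and 4 can be sketched formally, here in Lean. The predicate names are inventions of the sketch, and premise 2 is given the strong reading noted above ("a purely syntactic entity has no semantic content"), which is itself one of the contested parsings:

```lean
-- A sketch of how premises 2-4, so parsed, entail conclusion 1.
-- Predicate names are illustrative; "PurelySyntactic" encodes the
-- strong reading of premise 2 that the debate turns on.
example
    (Entity : Type)
    (Program Mind PurelySyntactic HasSemantics : Entity → Prop)
    -- Premise 3: programs are entirely defined by their syntactic structure
    (p3 : ∀ e, Program e → PurelySyntactic e)
    -- Premise 2 (strong reading): syntax is not sufficient for semantics
    (p2 : ∀ e, PurelySyntactic e → ¬ HasSemantics e)
    -- Premise 4: minds have semantic content
    (p4 : ∀ e, Mind e → HasSemantics e)
    -- Conclusion 1: programs are not minds
    : ∀ e, Program e → ¬ Mind e :=
  fun e hp hm => p2 e (p3 e hp) (p4 e hm)
```

The derivation is valid under this parsing; the philosophical dispute is over whether the strong reading of premise 2 is what the Chinese room actually establishes.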


There are many criticisms of Searle's argument. Most can be categorized as either systems replies or robot replies.

The systems reply

Although the individual in the Chinese room does not understand Chinese, perhaps the person and the room, considered together as a system, do. Searle's reply is that someone might in principle memorise the rule book; they would then be able to interact as if they understood Chinese, but would still just be following a set of rules, with no understanding of the significance of the symbols they are manipulating.

The robot reply

Suppose that, instead of a room, the program were placed in a robot that could wander around and interact with its environment. Surely then it could be said to understand what it is doing? Searle's reply is to suppose that, unbeknownst to the individual in the Chinese room, some of the inputs he receives come directly from a camera mounted on a robot, and some of his outputs are used to manipulate the robot's arms and legs. Nevertheless, the person in the room is still just following the rules and does not know what the symbols mean.

The brain simulator reply

Suppose that the program instantiated in the rule book simulated in fine detail the interaction of the neurons in the brain of a Chinese speaker. Then surely the program could be said to understand Chinese? Searle replies that such a simulation would not have reproduced the important features of the brain: its causal and intentional states.

But what if a brain simulation were connected to the world in such a way that it possessed the causal powers of a real brain, perhaps linked to a robot of the type described above? Then surely it would be able to think. Searle agrees that it is in principle possible to create an artificial intelligence, but points out that such a machine would have to have the same causal powers as a brain; it would be more than just a computer program.


