Science

Consciousness in Humans and Robots

Posted 14 Jun 2002 at 21:51 UTC by steve

KurzweilAI has republished an old but good article by Daniel Dennett called Consciousness in Humans and Robots. The article presents and addresses many of the reasons given by AI skeptics for the "impossibility" of creating a conscious machine. Like many articles on the problem of consciousness, it avoids the issue of actually defining what might be meant by the word. But it's quite interesting and definitely worth the time to read, in any case.


Consciousness vs Machine States, posted 16 Jun 2002 at 04:29 UTC by pantera » (Observer)

An earlier article, Consciousness an Electromagnetic Field?, put forward an interesting and unorthodox theory that explains some phenomena which had not been accounted for before.

This time we are presented with the idea of consciousness in humans and robots. Most of the arguments going back and forth run in circles, which suggests scientists will never come to an agreement until they are all proven wrong (or right, depending on which way you look at it).

Simply put, consciousness is the system that gives us the ability to recognize and be aware of the surrounding environment and to act appropriately upon it. When someone is asked the question "Are you conscious?", what is really being asked is whether the person is able to receive information from the surrounding environment, process it, and respond accordingly. There are a few ways we receive information: visually (recognizing patterns), by touch, by smell, by taste, and by listening (recognizing various sound frequencies).

After processing the information, we act in the environment. We are aware of the environment because we keep track of it, sorting (remembering) it by time, events, or arrangements. So, according to the previous environment state and the newly received information, we act or respond either physically (using our hands, legs, and/or body) or by sending out patterns of sounds (i.e. speaking).

As for myself, I have made an attempt to map everything a human does into multiple states.

The same thing has been applied to many programs that appear "intelligent" or have "consciousness". When we build robots, we arm them with these intelligent programs so that the robot can keep track of its internal and external states and autonomously respond or act in the environment upon receiving new information from its receptors.

My main idea was to understand human consciousness in terms of computer science, while keeping a robotic design in mind. A rough sketch of that sense-process-act loop follows.
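To make that loop concrete, here is a minimal sketch written as a tiny Python state machine: receive a percept, store it in memory (keeping track of the environment over time), and pick a response from the previous state plus the newest information. The names (Percept, Robot, the memory list) and the toy response rule are illustrative assumptions only, not any existing robot framework.

from dataclasses import dataclass, field


@dataclass
class Percept:
    """One piece of information received from the environment."""
    channel: str   # e.g. "vision", "touch", "smell", "taste", "sound"
    value: str


@dataclass
class Robot:
    # Internal state: what the robot remembers about the world,
    # stored in time order (the "sorting by time, events, or arrangements" above).
    memory: list = field(default_factory=list)

    def sense(self, percept: Percept) -> None:
        # Receive new information from a receptor and remember it.
        self.memory.append(percept)

    def act(self) -> str:
        # Decide on a response from the previous state plus the newest percept.
        if not self.memory:
            return "idle"
        latest = self.memory[-1]
        if latest.channel == "sound" and latest.value == "are you conscious?":
            return "say: I am receiving, processing, and responding."
        return f"respond to {latest.channel}: {latest.value}"


if __name__ == "__main__":
    robot = Robot()
    robot.sense(Percept("sound", "are you conscious?"))
    print(robot.act())

Running it prints a response derived from the stored state and the latest input, which is about all the definition above requires.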

Semantics, posted 16 Jun 2002 at 14:23 UTC by steve » (Master)

If you define consciousness, as you did above, it's much easier to do something useful. The problem is getting anyone else to agree that you've defined consciousness correctly. Your definition sounds similar to the one proposed by Philip Johnson-Laird in his book, The Computer and the Mind - he defines it (I'm paraphrasing here) as an operating system at the top of a hierarchy of processors, some of which relay messages about the world and others which transmit signals about how to act within the world.
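For what it's worth, that hierarchy can be sketched in a few lines of Python: perceptual processors relay messages upward, a top-level "operating system" reads them and issues commands, and motor processors transmit the resulting signals. The class names and the trivial routing policy are my own illustrative assumptions, not anything taken from Johnson-Laird's book.

class PerceptualProcessor:
    """Relays messages about the world up the hierarchy."""
    def __init__(self, name):
        self.name = name

    def relay(self, observation):
        return {"from": self.name, "message": observation}


class MotorProcessor:
    """Transmits signals about how to act within the world."""
    def __init__(self, name):
        self.name = name

    def transmit(self, command):
        return f"{self.name} executes: {command}"


class OperatingSystem:
    """Top of the hierarchy: reads relayed messages, issues motor commands."""
    def __init__(self, sensors, effectors):
        self.sensors = sensors
        self.effectors = effectors

    def cycle(self, observations):
        messages = [s.relay(o) for s, o in zip(self.sensors, observations)]
        # Trivial policy: act on the first message received, otherwise wait.
        command = f"react to {messages[0]['message']}" if messages else "wait"
        return [e.transmit(command) for e in self.effectors]


if __name__ == "__main__":
    top = OperatingSystem([PerceptualProcessor("vision")], [MotorProcessor("arm")])
    print(top.cycle(["obstacle ahead"]))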

But there are all sorts of other definitions out there. Some say consciousness is a mental mechanism for coping with complex social interactions by allowing the mind to imagine what others are thinking by analogy to its own thoughts. Some that it simply means to be alert or aware of the world around us. Some that it means to be self-aware. Others say it is a "court of appeals" for resolving internal conflicts that arise within the brain which might otherwise result in endless loops or deadlocks (if you've ever read the Marvin Minsky/Harry Harrison sci-fi book, The Turing Option, this idea plays into the plot at one point when the AI under construction repeatedly gets stuck because of a lack of such an appeals process).

If you ask a psychologist, they'll point out that you are considered conscious when you are daydreaming or dreaming in your sleep, even though you may not be aware of sensory data from the real world (and there are other "altered states" of consciousness that can be induced by hypnosis or drugs). William James defined it as a "river of awareness" containing our thoughts and sensations. A fair number of psychologists, on the other hand, have maintained that there is no such thing as consciousness, or have suggested that if you can't measure or directly analyze a thing, it's not suitable for scientific inquiry in the first place.

That's why I was a little disappointed the Dennett article didn't define the term - once you agree on a definition, there's a much higher chance of learning something useful about it.

What would it take... ?, posted 17 Jun 2002 at 22:43 UTC by pantera » (Observer)

What would it take for all the thinkers and scientists to come to a mutual agreement, in the name of progress, on what consciousness is?

No agreement necessary..., posted 18 Jun 2002 at 02:57 UTC by The Swirling Brain » (Master)

What would it take for all the thinkers and scientists to come to a mutual agreement, in the name of progress, on what consciousness is?

It's easier to disagree than to agree. Wouldn't you agree?
