11 Dec 2004 motters   » (Master)

Over the last couple of weeks I've been adding more bits to the robocore simulation. I've now got a much better value assignment system going, so that in the simulation the robot is able to learn that the colour of an object has a strong discriminative value (it predicts whether the object tastes good or bad), while its texture and shape don't. It's quite a nice system which would work with any kind of categorisation.
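To give a flavour of how that kind of value assignment can work, here's a minimal Python sketch (the names and structure are purely illustrative, not the actual robocore code): keep a running reward estimate for each feature value, and treat a dimension whose values end up far apart in reward as the discriminative one.

import random
from collections import defaultdict

class ValueAssigner:
    """Learns which feature dimensions predict reward (taste)."""

    def __init__(self, dimensions, learning_rate=0.1):
        self.dimensions = dimensions  # e.g. ["colour", "texture", "shape"]
        # value[dimension][feature] is a running estimate of the reward
        # received when that feature value was present
        self.value = {d: defaultdict(float) for d in dimensions}
        self.lr = learning_rate

    def update(self, features, reward):
        """features maps dimension -> observed value; reward is in [-1, 1]."""
        for dim, feat in features.items():
            old = self.value[dim][feat]
            self.value[dim][feat] = old + self.lr * (reward - old)

    def discriminative_value(self, dim):
        """Spread of learned values within a dimension: a big spread means the
        dimension reliably separates good-tasting from bad-tasting things."""
        vals = list(self.value[dim].values())
        return max(vals) - min(vals) if vals else 0.0

# Toy world: red things taste good, blue things taste bad,
# texture and shape have nothing to do with taste.
va = ValueAssigner(["colour", "texture", "shape"])
for _ in range(500):
    colour = random.choice(["red", "blue"])
    obj = {"colour": colour,
           "texture": random.choice(["rough", "smooth"]),
           "shape": random.choice(["round", "square"])}
    va.update(obj, 1.0 if colour == "red" else -1.0)

for dim in va.dimensions:
    print(dim, round(va.discriminative_value(dim), 2))
# colour ends up with a much larger spread than texture or shape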

The part which is currently occupying me is speech understanding and production. I'm trying to make a model of the way speech works which is reasonably realistic, such that damage to the maps or connections within Broca's or Wernicke's areas produces similar aphasias to those seen in people after brain damage or strokes. Fundamentally, speech production is just motor control. As the philosopher John Searle says, "I open this flap in the bottom half of my head and a racket comes out". Rather than trying to explicitly represent words I'm just having the system detect phonemes - the small components of speech which correlate directly with specific movements of the larynx and mouth.
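Purely as an illustration of treating speech production as motor control over phonemes (a sketch only; the gesture parameters below are invented, and send_to_articulators is a hypothetical motor interface, not part of my system):

PHONEME_GESTURES = {
    # Each phoneme maps to a rough articulatory gesture rather than to a word.
    # These parameters are invented for illustration, not real articulatory data.
    "p": {"lips": "closed", "voiced": False},
    "b": {"lips": "closed", "voiced": True},
    "a": {"jaw": "open", "voiced": True},
    "s": {"tongue": "alveolar", "voiced": False},
}

def send_to_articulators(gesture):
    # Hypothetical motor interface; here it just prints the command.
    print("motor command:", gesture)

def speak(phoneme_sequence):
    """Speech production as motor control: issue the gesture for each phoneme."""
    for ph in phoneme_sequence:
        gesture = PHONEME_GESTURES.get(ph)
        if gesture is not None:
            send_to_articulators(gesture)

speak(["b", "a", "s"])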

The first experiment just involves repetition. The system listens to some speech and then tries to speak exactly the same sentence itself. This may sound trivial, but it actually involves learning complicated sequences of phonemes.
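The simplest way to illustrate repetition is a sketch like this (again just illustrative Python, not the real system): chain each heard phoneme to the one that followed it, then replay the chain from the start.

class RepetitionLearner:
    """Repeats a heard phoneme sequence by chaining each phoneme to its successor."""

    def __init__(self):
        self.next_phoneme = {}  # phoneme -> the phoneme that followed it
        self.start = None

    def listen(self, phonemes):
        self.start = phonemes[0]
        for current, following in zip(phonemes, phonemes[1:]):
            self.next_phoneme[current] = following

    def repeat(self, max_length=50):
        produced = []
        ph = self.start
        while ph is not None and len(produced) < max_length:
            produced.append(ph)
            ph = self.next_phoneme.get(ph)
        return produced

learner = RepetitionLearner()
heard = ["h", "e", "l", "o", "w", "r", "d"]  # toy sequence with no repeated phonemes
learner.listen(heard)
print(learner.repeat())  # ['h', 'e', 'l', 'o', 'w', 'r', 'd']

A first-order chain like this falls over as soon as the same phoneme occurs twice in a sentence, which gives some idea of why learning these sequences isn't as easy as it sounds.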
