Interviews

Robots: Brain-Machine Interfaces

Posted 14 Aug 2009 at 09:55 UTC (updated 14 Aug 2009 at 15:25 UTC) by mwaibel

As reported a few days ago, insects have an amazing vision system, far surpassing any of our current sensing technology. Charles Higgins from the University of Arizona has now told the Robots podcast how he taps into the nerve cord of dragonflies to use them as extremely powerful sensors for his robots (compare his work on Neuromorphic Vision). The picture above shows an earlier version of the robot, with an onboard moth serving as a sensor in a closed-loop control system, alongside a dragonfly. The same episode also features an interview with Steve Potter at the Laboratory for NeuroEngineering at Emory and Georgia Tech, famous for his Hybrots (hybrid robots). Rather than interfacing with existing animals, he grows neural circuits in Petri dishes and hooks them up to the sensors and actuators of robots. Potter describes the resulting semi-living animals, or animats, and discusses both the technical and ethical implications of this technology. Tune in to this episode on brain-machine interfaces, or listen to two previous, related episodes on building robot flies and manufacturing insect-sized robots.
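To make the closed-loop idea concrete, here is a minimal conceptual sketch in Python of a control loop in which firing rates recorded from a biological sensor steer a differential-drive robot: the robot turns toward whichever side shows stronger neural activity. The function names (read_spike_rates, set_wheel_speeds) are hypothetical placeholders and the signals are simulated; this illustrates the general technique only, not Higgins' or Potter's actual systems.

# Conceptual sketch of a biologically driven closed-loop controller.
# All interfaces here are hypothetical stand-ins for real acquisition
# and motor hardware; spike rates are simulated with random numbers.

import random
import time


def read_spike_rates():
    """Stand-in for a neural acquisition step: return the firing rates (Hz)
    of a left-tuned and a right-tuned visual neuron (simulated here)."""
    return random.uniform(0.0, 50.0), random.uniform(0.0, 50.0)


def set_wheel_speeds(left, right):
    """Stand-in for the robot's motor interface."""
    print(f"wheels: left={left:.2f} right={right:.2f}")


def control_step(base_speed=0.5, gain=0.01):
    """One loop iteration: steer toward the side with stronger activity."""
    left_rate, right_rate = read_spike_rates()
    turn = gain * (right_rate - left_rate)  # differential-drive turn command
    set_wheel_speeds(base_speed - turn, base_speed + turn)


if __name__ == "__main__":
    for _ in range(10):  # short demo loop
        control_step()
        time.sleep(0.1)

In a real brain-machine interface the read_spike_rates step would be replaced by spike detection on amplified electrode recordings, and the gain and mapping would be tuned to the particular neurons being tapped.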
