Robots: Brain-Machine Interfaces

Posted 14 Aug 2009 at 09:55 UTC (updated 14 Aug 2009 at 15:25 UTC) by mwaibel

As reported a few days ago, insects have amazing vision systems that far surpass any of our current sensing technology. Charles Higgins from the University of Arizona has now told the Robots podcast how he taps into the nerve cord of dragonflies to use them as extremely powerful sensors for his robots (compare his work on Neuromorphic Vision). The picture above shows an earlier version of the robot, with an onboard moth used as a sensor in a closed-loop control system, and a dragonfly. The same episode also features an interview with Steve Potter at the Laboratory for NeuroEngineering at Emory and Georgia Tech, famous for his Hybrots (hybrid robots). Rather than interfacing with existing animals, he grows neural circuits in Petri dishes and hooks them up to the sensors and actuators of robots. Potter describes the resulting semi-living animals, or animats, and discusses both the technical and ethical implications of this technology. Tune in to this episode on brain-machine interfaces, or listen to two previous, related episodes on building robot flies and manufacturing insect-sized robots.
