Recent blog entries for Masse

For Christmas I received a Robosapiens. It's a little remote-controlled humanoid figure with about 7 degrees of freedom. (I didn't count them.) I've been sworn to NOT modify it, so, clearly, I had to think of another way to accomplish my nefarious ends. I came up with using cameras and infrared lights tied to a central computer. These camera/IR control points would be placed to cover Robosapiens' operational area, grabbing images and sending commands via IR link. The entire capability of the remote computer could be used to accomplish image processing and control, all without modifying the unit at all. The location of the robot could be determined from the cameras, with only one IR port used for commands. (Whichever is closest to him.) A basic setup would require:

1) A camera, and the ability to grab images using some preferred programming language.
2) An IR controller. This will require an output from the computer - perhaps a microphone jack.
3) Robosapiens' command language. This might be available from the manufacturer.
4) Software to accomplish the image processing. At its most fundamental, it would return Cartesian location and angular heading.
5) Software to generate a path to follow. It could be as simple as a predetermined path.
6) Control software to follow the desired path. It would compare where the robot is to where it should be next.
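Steps 4 through 6 above could be sketched as follows. This is a minimal sketch, not a working driver: it assumes the image-processing stage can already find two markers on the robot (say, colored dots on its head and tail), and the command names (WALK_FORWARD, TURN_LEFT, etc.) are placeholders for whatever Robosapiens' actual IR command language turns out to be.

```python
import math

def pose_from_markers(head, tail):
    """Step 4 (assumed marker-based version): turn two marker
    centroids from a grabbed image into (x, y, heading)."""
    x = (head[0] + tail[0]) / 2.0
    y = (head[1] + tail[1]) / 2.0
    # heading points from the tail marker toward the head marker
    heading = math.atan2(head[1] - tail[1], head[0] - tail[0])
    return x, y, heading

def next_command(pose, waypoint, turn_tol=0.2, reach_tol=5.0):
    """Step 6: compare where the robot is to where it should be
    next, and pick a (placeholder) IR command to send."""
    x, y, heading = pose
    dx, dy = waypoint[0] - x, waypoint[1] - y
    if math.hypot(dx, dy) < reach_tol:
        return "STOP"  # close enough to this waypoint
    # angle error to the waypoint, wrapped into [-pi, pi]
    err = math.atan2(dy, dx) - heading
    err = math.atan2(math.sin(err), math.cos(err))
    if err > turn_tol:
        return "TURN_LEFT"
    if err < -turn_tol:
        return "TURN_RIGHT"
    return "WALK_FORWARD"

# Step 5, the path, can be as simple as a predetermined waypoint list:
path = [(50, 0), (50, 50), (0, 50)]
```

The main loop would then just grab a frame, compute the pose, and send `next_command(pose, path[i])` out through whichever IR port is closest to him.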

Most of the robotics work I've done has been on immobile, 5-degree-of-freedom robots. I developed drivers that went on to be used in a couple of other students' work for the Rhino, a small, toyish robot designed for use in education. This machine was controlled with an old VME setup and was later ported to QNX (a real-time Unix-like operating system). As mentioned earlier, CWRU has a project-based class, where we worked on the Rhino and a Motoman. There were some interesting projects, such as following a laser with a sensor in order to measure the robot's lengths, part sorting using image processing, and cutting circles from foam with a soldering iron. The final project was to implement part manufacturing: use AutoCAD to draw something, divide it into slices, and then have the robot create the slices from foam. While each piece worked (I think), it never was coordinated to work all together from start to finish.

Currently, I'm looking at biologically inspired AI. The idea is that presently, there are certain tasks that robots just aren't very good at, like path finding. The most advanced robots currently can't move 200 miles (in the desert) without falling in a ditch or something. (See the DARPA Grand Challenge for details.) The Mars rover moves something like 100 feet per day. Your dog, however, can easily run half a mile through a forest to find you if you call it loud enough. And it can do it even if you have a friend trying to stop it, meaning it's pretty fast. How does the dog's brain do that? If we (humanity) knew that, we might be able to make our machines (robots) do it too.

CWRU has a couple of very interesting classes on robotics:

Robotics 1: Grad level, covers theory - kinematics, inertia, control.
Robotics 2: Grad level, entirely project-based class w/ Prof. Newman.
Autonomous Robotics: Build LEGO robots to compete in an egg hunt.
Mechatronics: (New this year) Covers mechanical, control, computer science, and electrical issues in automation. Robot soccer is the class highlight. Prof. Newman again.

Of course, there are many related classes to cover control, embedded design, microprocessors, vision and the like.
