"PEDRO" the quadruped robot got a lithium polymer battery last week courtesy of eBay. I had been putting off buying a heavy and expensive large capacity LiPo battery until I had got the walking algorithms working a bit better and I could truly test the ability of the robot to take the 500g or so of a heavy battery, but now batteries are available on eBay for not as much money, I just took the plunge and bought a 3000mAh 3 cell battery pack and charger for the robot.
The battery weighs just 215g, and cost approximately AU$50 delivered. The charger can charge at up to 5A and will do a balance charge, monitoring the individual cell voltages.
Next I had to fit it to the robot. The original design called for a larger pack slung underneath the body, neatly held in by the spacers between the side plates, so the new smaller pack needed some extra holes and some plastic brackets to hold it in;
Once the battery was on and tested OK, I had to return to the firmware for the AVR micro and write some code to support turning the unit on and off via the onboard MOSFET and soft on/off button. Also needed was some code to talk to the onboard ADC chip, which has the battery cell voltages on its inputs, so I could implement an automatic low-battery cut-out and (hopefully soon) remote viewing of the battery level over the WiFi link. Of course it didn't go smoothly...
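The cut-out decision itself is simple once the cell voltages are in hand. Here's the idea sketched in Python rather than the actual AVR C, with an assumed per-cell threshold (not my real firmware's figure):

```python
CELL_CUTOFF = 3.3  # assumed per-cell cut-out voltage (V); illustrative only

def battery_ok(cell_volts):
    """Low-battery check on the individual cell voltages read via the ADC.
    Comparing each cell rather than the pack total catches an unbalanced
    pack where one cell sags well before the others."""
    return min(cell_volts) > CELL_CUTOFF
```

In the real firmware this check would run periodically and latch the MOSFET off once it fails, rather than flickering the power as a sagging cell recovers.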
Once again the USB logic analyser came in very handy for debugging the SPI comms between the micro and the AD7490 chip (a 16-channel, 12-bit converter). You can also see the JTAGICE mkII from Atmel connected (an expensive gadget, but very handy for debugging AVR code).
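Once the SPI transfers were working, unpacking the results was the easy part. As I read the AD7490 datasheet, each 16-bit word read back carries the channel address in the top four bits and the 12-bit conversion code in the rest; a sketch in Python (the reference voltage is an assumption):

```python
VREF = 2.5  # assumed ADC reference voltage (V)

def decode_ad7490(frame):
    """Split one 16-bit AD7490 result frame, as read over SPI, into its
    channel address (4 MSBs) and conversion voltage (12-bit code)."""
    channel = (frame >> 12) & 0xF
    code = frame & 0x0FFF
    return channel, code * VREF / 4096.0
```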
I've also been working on the walking algorithms, and various other software and hardware bits and pieces such as improving the communications (the WiPort wifi module doesn't seem to work as well as I'd hoped - it needs really good signal strength or every now and again it will drop out for up to a second or so). Once I've got some of these wrinkles smoothed out and the walking nice and smooth I'll post some more video.
Finally "PEDRO" the quadruped robot has taken it's first steps outside of simulation... Admittedly it suffers from the shakes and falls over continually, but it is still a milestone.
I finished the Microsoft Robotics Studio control service to the point where it can send out regular servo position updates over the wifi link to the robot, taking direction from the quadruped differential drive service which implements the walking algorithm.
Here's a short video of the shaky movements;
Obviously there is a lot to improve upon. The balance and stability control aspects of the software, although they worked OK in the simulation environment, are clearly not working too well on the physical robot. I will definitely improve these, but at the moment I'm more concerned with making the servo movements a lot smoother than they are now.
I was initially sending servo position and speed updates (via a SYNC WRITE command on the Dynamixel network to the AX-12 servos) at only 10Hz, and the first thing I tried was increasing the rate of updates. Even increasing the rate to 50Hz or even 100Hz didn't improve things much. I also removed the speed control code and used an AX-12 speed register setting of zero, which represents maximum speed, with little effect. A quick look at the AX-12 data stream using the completely cool and now indispensable logic analyser showed the root cause of the problem - relatively large amounts of jitter in the timing of the AX-12 packets.
So next I'll be working on improving this aspect so the robot can have smooth movements. Although some people have managed to cram inverse kinematics and other trig-type calculations into Atmel AVR micros similar to those on the PEDRO robot, I would rather keep the robot platform as independent from the control methodology as possible, to allow completely novel approaches to be implemented on desktop computers and quickly tested. So step one will be to use more accurate timing within the Microsoft Robotics Studio code. Rather than use the TimeoutPort mechanism provided, which according to online sources can have 16-20ms jitter (apparently improved in MRDS 2.0), I will change the code to use timing based on the multimedia timer, which should give much more accurate results.
Mind you I expect this will only be step 1, because there are many sources of varying latency in my robot system - wifi, processing delays on board the robot, SPI comms between the main micro and the AVR handling the servo comms etc. So I am thinking I might put a queuing system onto the AVR which handles the servo comms, and rather than just passing data onto the servo bus as soon as it is received, I might actually decode the AX-12 packets and buffer them so I can resend them out at precise intervals. If the buffer holds only a few packets then the delay between the PC end and the actual robot movement won't be that great (maybe 100ms?). I will have to make the rate that the packets are taken from the buffer and sent out to the AX servo bus adjustable with a long term average, so that the micro can tolerate timing differences between the PC end and the micro. For example if the PC end is sending out packets on an average interval of 20.4ms and the micro expects packets at exactly 20ms and is taking packets out of the queue at that rate, then the queue will eventually empty. If the micro measures the average arrival timing of the incoming packets and adjusts its interrupt driven timer for precisely sending out packets accordingly, then the buffer will stay more or less at its intended size.
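The rate-tracking buffer described above can be sketched quite compactly. A Python sketch of the idea (the class shape, 20ms nominal interval and smoothing gain are my illustrative choices; the real version would live in the AVR's interrupt handlers):

```python
class ServoPacketBuffer:
    """Buffer decoded AX-12 packets as they arrive and re-send them at a
    precise interval that tracks the sender's long-term average rate, so
    the queue neither drains empty nor grows without bound."""

    def __init__(self, nominal_interval=0.020, gain=0.05):
        self.queue = []
        self.interval = nominal_interval     # current send-out period (s)
        self.avg_arrival = nominal_interval  # long-term average packet spacing
        self.gain = gain                     # smoothing factor for the average
        self.last_arrival = None

    def on_packet(self, packet, now):
        """Called when a packet arrives from the PC side (jittery timing)."""
        if self.last_arrival is not None:
            delta = now - self.last_arrival
            # exponential moving average of the sender's real interval
            self.avg_arrival += self.gain * (delta - self.avg_arrival)
            self.interval = self.avg_arrival  # retune the send-out timer
        self.last_arrival = now
        self.queue.append(packet)

    def on_timer(self):
        """Called by the interrupt-driven timer every self.interval seconds;
        returns the next packet to put on the AX-12 bus, if any."""
        return self.queue.pop(0) if self.queue else None
```

With this scheme a PC sending at an average of 20.4ms is matched automatically: the measured average converges to 20.4ms and the timer follows it, so the buffer depth stays roughly constant.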
I thought I would post some info on some of the work that has gone into making the quadruped robot "PEDRO" walk in the Microsoft Robotics Studio simulation environment.
Here's a video which briefly shows the latest hardware progress before moving on to demonstrations of some elements of the control software. The robot hardware and onboard firmware are now basically ready - though as you can see in the video I'm only up to manually moving one joint at a time until some more PC end software is written.
The control software is written in C# and implemented as robotics studio services. Using Microsoft Robotics Studio has allowed the control software to be tested out on a simulated robot before being let loose on the real nuts and bolts hardware. It is amazing how often a simple software bug seemed to make the simulated robot do complicated yoga poses that I'm sure would have led to stripped gears on the real bot...
The most important element of the control software is the inverse kinematics routines. These take a foot position vector relative to the robot body and work out the necessary joint angles to put the foot into that position. Once this important function is in place, you can then make a leg step simply by moving the foot position through the desired curve, and the leg joint angles are all taken care of by the inverse kinematics.
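As a rough illustration of the idea, here is a minimal IK routine for a 3-degree-of-freedom leg (yaw at the hip plus hip and knee pitch). The link lengths and frame conventions here are assumptions for the sketch, not PEDRO's actual geometry:

```python
import math

L1, L2 = 0.08, 0.08  # assumed thigh and shin lengths (m); not PEDRO's real dimensions

def leg_ik(x, y, z):
    """Joint angles (yaw, hip, knee) placing the foot at (x, y, z) in the
    leg frame: x forward, y sideways, z straight down below the hip.
    hip is the downward pitch of the thigh from horizontal; knee is the
    interior angle between thigh and shin (pi = straight leg)."""
    yaw = math.atan2(y, x)
    r = math.hypot(x, y)               # horizontal reach after the yaw turn
    d = math.hypot(r, z)               # straight-line hip-to-foot distance
    if d > L1 + L2 or d < abs(L1 - L2):
        raise ValueError("foot target out of reach")
    # law of cosines gives the knee, then the hip offsets the thigh above
    # the hip-to-foot line by the triangle's angle at the hip
    knee = math.acos((L1*L1 + L2*L2 - d*d) / (2*L1*L2))
    alpha = math.acos((L1*L1 + d*d - L2*L2) / (2*L1*d))
    hip = math.atan2(z, r) - alpha
    return yaw, hip, knee
```

Feeding the returned angles back through forward kinematics reproduces the requested foot position, which is the property the stepping code relies on.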
After the inverse kinematics was working well, the next step was simply adding some other ways to set the foot positions by setting a body movement relative to the ground. Although this sounds complicated, to move the body forward relative to the ground, for example, all you have to do is adjust all the feet position vectors backwards a bit, and again, the inverse kinematics takes care of the rest. Body rotation is handled much the same way; a method is provided to rotate the body which actually rotates the four feet position vectors (or however many feet are actually on the ground at the time) around the body centre.
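In code terms these body-motion helpers are almost trivial. A Python sketch for illustration (coordinate conventions assumed; only the grounded feet would be passed in):

```python
import math

def shift_body(feet, dx, dy):
    """Move the body by (dx, dy) relative to the ground: every grounded
    foot's body-relative position vector shifts the opposite way."""
    return [(x - dx, y - dy, z) for (x, y, z) in feet]

def rotate_body(feet, angle):
    """Rotate the body by `angle` about its centre: the grounded feet
    rotate by -angle around the body origin in the horizontal plane."""
    c, s = math.cos(-angle), math.sin(-angle)
    return [(c * x - s * y, s * x + c * y, z) for (x, y, z) in feet]
```

The inverse kinematics then turns the adjusted foot vectors into joint angles, so the "move the body" operations never deal with joints directly.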
The actual stepping motion is created by choosing a target for the foot position using the current movement direction and current desired body rotation. The path the foot follows during the step is an elliptical one so that the foot moves almost directly straight up off the ground at the onset of the step motion and similarly comes down at footfall almost vertically. This should help in clearing obstacles (or carpet etc.). Once the target foot position is chosen and the step is started, the software updates the foot position and servo rotation speed using the calculated path every iteration. Meanwhile the other three non-stepping legs are being moved by the body motion functions - ie. every iteration the body stance is shifted in the desired movement direction.
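The elliptical path can be parameterised very simply. A sketch (my own parameterisation for illustration, not the exact code in the service): the horizontal sweep stalls at the endpoints while the vertical term does not, which is what gives the near-vertical lift-off and footfall.

```python
import math

def step_path(start, target, height, t):
    """Foot position at phase t in [0, 1] along a half-ellipse from
    start to target, lifting to `height` at mid-step."""
    phi = math.pi * (1.0 - t)  # pi at lift-off, 0 at footfall
    mid  = [(s + e) / 2 for s, e in zip(start, target)]
    half = [(e - s) / 2 for s, e in zip(start, target)]
    x = mid[0] + half[0] * math.cos(phi)
    y = mid[1] + half[1] * math.cos(phi)
    # ground level interpolates start->target; the sine term adds the lift
    z = mid[2] + half[2] * math.cos(phi) + height * math.sin(phi)
    return (x, y, z)
```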
Another important component of the control software is the methods I have added for calculating the robot's current centre of mass. This is calculated using the known current position of all the joints and the mass of each segment. It is important to know the centre of mass in static walking so that stability can be maintained by shifting the body pose before a foot is lifted off the ground. For example if a particular foot is selected as the next to step, then the body centre of mass should be compared with a triangle formed by the other feet. If the centre of mass falls vertically within the triangle, then the robot will be stable with that foot lifted off the ground. If not, then the body stance can be shifted before the step is started so that the centre of mass falls within the stability triangle.
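Both halves of that stability check are short in code. A minimal sketch (mass distribution and a same-sign signed-area point-in-triangle test; the segment data shape is my own illustration):

```python
def centre_of_mass(segments):
    """Weighted average of segment positions.
    segments: list of (mass, (x, y, z)) pairs for each link."""
    m_total = sum(m for m, _ in segments)
    return tuple(sum(m * p[i] for m, p in segments) / m_total for i in range(3))

def in_triangle(p, a, b, c):
    """True if 2D point p falls inside triangle abc - i.e. the vertical
    projection of the centre of mass lies within the support triangle
    formed by the three feet staying on the ground."""
    def side(p0, p1, q):
        return (p1[0] - p0[0]) * (q[1] - p0[1]) - (p1[1] - p0[1]) * (q[0] - p0[0])
    d1, d2, d3 = side(a, b, p), side(b, c, p), side(c, a, p)
    return (d1 >= 0 and d2 >= 0 and d3 >= 0) or (d1 <= 0 and d2 <= 0 and d3 <= 0)
```

If `in_triangle` fails for the chosen foot, the body stance is shifted first (using the body-motion functions) until it passes, and only then is the step begun.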
The step order is critical for stable walking, and I found that for forward motion, stepping the rear then front leg on one side followed by the rear then front leg on the other side resulted in the best stability. Currently PEDRO's walking algorithm adapts this basic pattern to whichever quadrant the desired direction of travel falls in. For example, if the desired direction is pretty much side stepping to the right, then the best foot step order will be front left, then front right, then rear left, then rear right. I have also worked the desired rotation into the algorithm which picks the next foot to step, and if the desired rotation is more dominant than the translation movement, then the best order for rotation takes over, which is stepping around the clock - i.e. if rotating clockwise, then step front left, front right, rear right, rear left.
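The quadrant-and-dominance selection above boils down to a few comparisons. A sketch of the shape of it (the leg labels, sign conventions and reversed orders for the opposite quadrants are my illustrative assumptions):

```python
import math

def step_order(direction, rotation):
    """Pick a foot step order from the dominant motion component.
    direction: (vx, vy) desired translation (x forward, y right);
    rotation: signed rate, positive = clockwise (assumed conventions)."""
    vx, vy = direction
    if abs(rotation) > math.hypot(vx, vy):
        order = ["FL", "FR", "RR", "RL"]   # clockwise: "around the clock"
        return order if rotation > 0 else order[::-1]
    if abs(vx) >= abs(vy):
        order = ["RL", "FL", "RR", "FR"]   # forward: rear-then-front per side
        return order if vx >= 0 else order[::-1]
    order = ["FL", "FR", "RL", "RR"]       # sidestep to the right
    return order if vy >= 0 else order[::-1]
```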
This method is working quite well in practice; however, I would like to replace it with something a little less scripted. I did try choosing the next foot to step based on which foot could possibly step the furthest in the desired direction, however this wasn't particularly successful. I won't do any further work on this right now though, because it will be far more exciting to test out the walking algorithms as they are on the real robot, and this is hopefully not too far off. I have to finish the software service which communicates with the robot using TCP/IP, passing on the joint movement messages to the onboard controller. With a bit of luck the real robot will walk just as successfully as the simulated one, but I'm tipping there will be problems to solve...
Controlling robot simulation with Xbox wireless controller
Recently I bought a wireless Xbox controller purely for use with Microsoft Robotics Studio, to remote control the PEDRO quadruped robot and steer it around the floor. To get to this pretty simple goal, a lot of steps are needed - I already had a simulation of the robot going in Robotics Studio, and even walking, but in a very 'canned' sequenced fashion, with all the joint angles moving between key frames that are hard coded. See this blog post for a video of the rather unsteady robot simulation walking in a straight line. This method won't do if you want to control the robot in an unplanned fashion where the direction and speed of walking are able to change at any time, so a more sophisticated approach is needed.
Since then I have written several robotics studio services to support the robot. One communicates with the real physical robot (still in progress), another controls the simulated robot. These two implement the same 'generic contract' as robotics studio terms it, so other services can be built to interchangeably communicate with either the real or simulated robot without any recompilation. A third robotics studio service implements the 'generic differential drive' contract so that it can accept differential drive control signals from other services which use this contract. This service contains all the software which implements the new walking algorithm capable of rotation, walking in any direction and varying the speed of the robot. The output of this service is essentially a continuous set of joint angles which can get sent to either the real robot control service or the simulation robot service.
The beauty of allowing the walking service to accept differential drive control signals (also called skid steering) is that potentially many different services can be connected up to control PEDRO, even though the original authors of those services have never even seen the robot. The drawback is that there is not much control available with just a left and right motor speed signal for a 14-degree-of-freedom quadruped robot! However I will allow more complicated forms of control in the software which will work with services I create, as well as the more standard differential drive control. Another cool little side benefit is that the standard dashboard service supplied with robotics studio accepts input from the mouse or game controllers like the wireless Xbox controller, and so can be used to remote control your robot with a convenient wireless handheld device. Because of this, I had to buy one to try it out!
I'm still working on a video which describes the technical detail behind the actual walking algorithms, and I'll post it when it's done. Of course making the robot walk in simulation and making the real robot actually walk across your floor might be two very different things - and this is something I'll hopefully find out pretty soon, as the hardware has progressed really well.
The physical robot is now fully wired up with all the AX-12 servos connected up to the onboard controller which is communicating over WiFi quite well. I can manually drag a slider up and down and move individual joints one at a time using a slightly modified version of Scott Ferguson's DynaCommander software. I haven't fitted any battery to the robot yet, so it is still tethered to a power supply, because I would like to put a dummy weight equal to the battery weight (nearly half a kilogram for the one I want to use) onto the robot to make sure it can lift and carry this weight comfortably. This way if the robot can't in practice support the heavy battery I can still source and fit a lighter battery option with less run time.
I'm currently working on the service which will replace the simulation and communicate with the real robot. When that's tested and debugged it will be really interesting to see if the walking software that works so well in simulation can work as well in reality!
Quadruped robot nearly together, and MRDS simulation work
Hi everyone, here's an update on the recent work I have been doing on assembly of the quadruped robot design, and also some adventures using the Microsoft Robotics Developer Studio package. I also talk a bit about some experimentation with casting plastic parts using silicone moulds.
As you can see from the picture, the robot is nearly completely assembled now; the only parts remaining are the mounting of the two cameras on the head. No wiring up of servos has been done yet, and the battery pack hasn't been mounted (or purchased), but still it's looking more and more complete. The shoulder brackets you can see attaching each leg to the body are actually urethane plastic parts cast in a silicone rubber mould which I made using the original milled plastic version of the shoulder brackets. The milled HDPE version took me about 3 hours to machine, so I was looking for a faster way to make a few of these things. I have never done any plastic casting before, and this is a fairly tricky part to start with I think, as it needed a two-part mould and had features like countersunk holes which I chose to try to include in the casting. The brackets didn't come out as well as I'd hoped, with difficulties around the holes, bubbles in the casting and various imperfections all over, and they needed a lot of cleanup, but they are functional.
As an interesting aside, since making these I discovered an interesting company on the net, Shapeways, who have an online 3d printing service where you can upload a 3d model, and get an almost instant quote to have it 3d printed in a choice of a few different materials. If you click on the Gallery and search for AX-12 you can see my parts... Price seems fairly reasonable, so I have actually got an order in for a set of shoulder brackets for the quadruped robot, as a concept test to see if the material is strong enough and whether this could be a good robot prototyping resource.
The feet for the quadruped are a simple design based on half a squash ball (would you believe!) which will give a uniform contact area with the ground during movement. I had originally intended to use some force sensors I have within the feet, however I have instead made a simple version of the foot which has no sensor at present. During the construction of the feet I did have a good idea (I think): use a low-cost gas pressure sensor and make the squash ball hemisphere airtight - that way force from any direction on the ball will result in a pressure change and should be able to be sensed. That's an idea for the future at the moment.
I have been working recently on desktop software support for the quadruped, specifically Microsoft Robotics Developer Studio services to support the quadruped. Although I don't plan to limit the PC end software to MRDS alone, I did want to evaluate the use of Robotics Studio as a means of simulation (especially to develop some simple walking gaits initially) and also perhaps as a rapid software prototyping platform which might enable me to get some interesting behaviours going quite quickly using services developed by other people linked to the hardware of the quadruped robot via a custom MRDS service I would write. It is early days and still very much a work in progress, especially since C# is not my native tongue yet, but I got a physics simulation going and also a very simple walking motion (using the crawl gait).
Here's a video of the simulation environment running;
The crawl gait I have used is pretty unstable, and it doesn't take much to knock over the robot in the sim. However I intend to improve it by adding some inverse kinematics and a better method of controlling group moves of servos. I also need to add proper support for the differential drive service in MRDS, which will enable simple steering and control of the robot using Xbox wireless controllers and even the Nintendo Wiimote. Of course all this is essentially just to demonstrate the capabilities of the robot platform (and have a bit of fun with it) - eventually I would like to implement a brain-like hierarchical memory network with the grand aim of teaching it to walk without classical control concepts.
Well, it's been a good 6 months since I received the laser cut acrylic panels for the quadruped robot and last posted progress on this blog. In that time I've got married, moved house, and started fixing up both the old and new houses (still in progress)! Hopefully I can be forgiven for slow progress on the robot...
Finally I have machined a couple of the plastic parts I needed to assemble the robot and made a start on assembly. First I needed to move the milling machine from the old house (no small job) and get it set up again.
These two images show the assembly progress versus the CAD model of the whole assembly. I have made two of the custom plastic brackets for the AX-12 servos and have fitted them to the front left shoulder. There are two matching brackets required for each shoulder (click on these images for higher resolution).
I've also done the angle bending on the top acrylic cover and used velcro patches to hold it on. I was very happy with how the bends came out, though the job is a lot simpler than the bending needed for the head. I was also happy with the machining on the shoulder brackets, though it was very tricky to do, involving a lot of planning ahead about how best to grip the part in the vice so it was held firmly enough. It took about 3 hours each for the two I've done so far, and there are 8 needed in total...
Here is a close-up of the HDPE shoulder brackets. The two brackets needed for each shoulder are a mirror image of each other, forming a left and right pair. While the standard Robotis brackets available in the Bioloid kits might do the job, these custom brackets allow for the minimum gap between the two servos, with just enough clearance as the shoulder rotates.
Just to check on how much weight he's gaining: the scales say 857 grams so far. The target weight for the whole robot including battery is around
The next steps involve machining the other 6 shoulder brackets needed, thermoforming the head plate, drilling the holes needed in the shin parts, and making the feet. Because I have a fair bit on the go at the moment, it is just a matter of grabbing an hour here or there to progress the build, but at least it's getting exciting with real physical assembly taking place!
First servo moves over WiFi, and Asimo in Melbourne
It has been a full two months since the last post here; it has been difficult to find much time recently to work on the Quadruped4 robot.
Finally though I reached a point where I felt I had achieved enough progress to warrant a blog post. I have written enough firmware for the two Atmel processors on the Dynamixel Robotics Controller to get AX-12+ servos moving under control of a laptop connected via WiFi.
I also had to change the small test application I had written to send AX-12 commands via serial to use a TCP/IP socket connection, but once this was done, I had a servo happily moving back and forth as I dragged the mouse over a dial control. May not seem like much, but this is a pretty important milestone I think. If you look really hard at the fuzzy video, you can see the temporary wires tacked on to add a JTAG interface on the secondary micro (secondary micro handles the 1Mbit/s servo comms). I had a fair bit of trouble getting the SPI link between the two micros going well, and a borrowed JTAG debugger was very handy.
On another note, this week Honda's ASIMO robot appeared in Melbourne as part of an Australian tour. Of course I had to go see it even though the show would have pretty much zero technical content! Here is a bit of a sample of some video I took during the show as I sat among all the kiddies.
Back to the Quadruped4 robot, and the next step in construction. Now I have enough electronics and firmware going to move servos over WiFi I think it is time to start on the physical build. I have the design finished to a buildable level I think, and in the interests of getting something going sooner rather than later, I will start on construction. I am quite happy with the attached render of the latest cad model - it looks kind of like it is on the surface of Mars...
Today the Lantronix WiPort wifi module on the Dynamixel Robotics Controller board was successfully tested!
Initially I had quite a time getting the wifi module going - mainly due to confusion regarding the two serial ports (3.3V logic level) the WiPort provides. On the controller board I had designed port 0 to run directly into the main AtMega2560 micro, and the secondary port 1 to run through a RS232 level converter IC and to a DB9 serial connector on board. This port was intended to be used as a serial configuration port only, a means to initially configure the wifi module, and to act as a fall back connection for fault diagnosis if the wifi link was not working for any reason.
Unfortunately the Lantronix documentation for the module, although it mentions both serial ports, doesn't really draw any distinction between the operation of them - as I read the documents, either port could support their serial configuration console.
So, when I fired up the board and tried to enter the 'xxx' required to drop the module into console mode from the DB9 connector - nothing. I went through all sorts of debugging steps because I didn't know whether the module, my PCB or the RS232 level converter circuitry was at fault, all to no avail. I did think that perhaps the serial configuration console feature was only available through port 0 (which is connected to my Atmel micro, which currently has no firmware in it to speak of...), however this was harder to test.
I resolved to write some firmware to just pass through a connection from the FTDI USB chip on board to the WiPort module transparently. This way I could use hyperterminal to talk to Port 0 of the WiPort as if I was connected directly to it. I then ran into more difficulties - the USB port was not working - not even getting recognised on the PC at all!
Eventually I found an incorrectly loaded resistor (wrong value...) in the USB circuitry that was preventing the FTDI chip from functioning correctly. Once that was fixed, the USB sprang into life, and I could open HyperTerminal connected to COM4 (assigned to the USB serial). I wrote some quick firmware for the transparent link from USB to the WiPort - basically any characters received from either port are passed across to the other. Turned it all on, entered the 'xxx' to jump to the WiPort console, and voila! Success. I could access the menu for setting up all the parameters in the WiPort.

Next came some rapid learning about ad hoc and infrastructure modes in WiFi networks. I had initially wanted to use an ad hoc connection between my laptop and the robotics controller so the whole setup would be portable outside my house, however I couldn't get this to coexist with the infrastructure mode I use at home with my wireless router. Basically I would have to reconfigure my laptop WiFi every time I wanted to use the robot... not very satisfactory. So I ended up just configuring it to use the home network router, which I suppose I will have to take with me if I want to demonstrate the robot elsewhere.
Now that I had the WiFi module basically going, I thought of the next test... Since I had firmware in the Atmel to pass comms through from USB to the WiPort, I could telnet from my notebook using HyperTerminal to WiPort port 0, which would get passed through to USB, which I had open again in HyperTerminal on the laptop. This worked! I effectively had an elaborate loopback between HyperTerminal windows, and could type in either window and see the results in the other. I then wanted to see whether the WiPort would support a simultaneous connection to port 1 at the same time as port 0, so I hooked up a serial cable from the laptop to the DB9 connector on the board, another HyperTerminal on COM1, and a fourth HyperTerminal window telnetting to WiPort port 1. Amazingly, all this worked as well! The image above illustrates this crazy scheme: two simultaneous telnet sessions, one looped back through an FTDI USB serial connection, the other looped back through the DB9 connector onboard the controller to the laptop's COM1 serial port!
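The pass-through firmware really is as simple as it sounds. Here's the idea sketched in Python with a pyserial-style port interface (the real thing is AVR C shuttling bytes between the two UARTs; the function and port names are just for illustration):

```python
def pass_through_poll(usb, wiport):
    """One polling pass of the transparent bridge: any bytes waiting on
    either port are copied straight across to the other. The firmware
    version just runs this forever between the FTDI UART and WiPort
    port 0."""
    if usb.in_waiting:
        wiport.write(usb.read(usb.in_waiting))
    if wiport.in_waiting:
        usb.write(wiport.read(wiport.in_waiting))
```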
So now that I have tested some large slabs of the Dynamixel Controller, it is time to get it to actually do something halfway useful. I think the next step is to write firmware to link to the Robotis Dynamixel AX-12+ servo network, and then get some software on the laptop to move a servo over the WiFi (or even the USB) link. When that happens I am sure I will have to post a blog entry (even though others may not be as excited about the event as I am...).