Older blog entries for shimniok (starting at number 83)

2 Jun 2014 (updated 3 Jun 2014 at 06:16 UTC) »

AVC: Heading Errors, Path Following Woes

A little over two weeks to go as I write this. That's not much time.

Testing uncovered a couple of obvious problems. The robot is experiencing heading errors and the pure pursuit path following is still working poorly.

Heading Error

I'm seeing unpredictable heading deviations that possibly suggest heading drift. On three test runs, the robot's ending position was wrong, rotated counterclockwise from where it should have been.

The problem manifests most apparently in the first leg of the square course. In three autonomous runs plus one manual run, I could see in the GCS that heading was drifting off.

Time to look at logs and see what is going on. Looking at a plot of the log data you can see some problems right away.

The correct initial heading, from the starting point to the first waypoint, was around 254 degrees. This plot shows activity from about 1 second after logging starts.

The red line, GPS heading, won't report a correct heading until the robot hits about 2-3m/s, which occurs just after 2 seconds into the run. My code won't use GPS heading until then, relying only on the gyro, which is corrected for bias at the starting line.

The blue line shows the time-lagged heading estimate based on gyro and GPS heading. The green line is the current heading estimate: the gyro is used to compute a new heading, which is then error-corrected against the time-lagged estimate.
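As an illustration, here's a minimal Java sketch of a time-lagged correction scheme like the one described above. The class name, ring-buffer approach, and buffer length are my assumptions for illustration, not the robot's actual code:

```java
// Hypothetical sketch: integrate the gyro for the current heading, keep a
// ring buffer of past estimates, and when a (lagging) GPS heading arrives,
// compare it against the buffered estimate from the matching moment and
// apply the same correction to the current estimate.
public class HeadingEstimator {
    private final double[] history;   // ring buffer of past heading estimates
    private int head = 0;
    private double heading;           // current estimate, degrees

    public HeadingEstimator(double initialHeading, int lagSamples) {
        heading = initialHeading;
        history = new double[lagSamples];
        java.util.Arrays.fill(history, initialHeading);
    }

    // Called every gyro update: integrate rate (deg/s) over dt (s).
    public void gyroUpdate(double rate, double dt) {
        heading = wrap360(heading + rate * dt);
        history[head] = heading;
        head = (head + 1) % history.length;
    }

    // Called when a GPS heading arrives. GPS heading lags, so compare it to
    // the oldest buffered estimate and shift the current estimate by the
    // same error.
    public void gpsUpdate(double gpsHeading) {
        double lagged = history[head];   // oldest entry = time-lagged estimate
        double error = wrap180(gpsHeading - lagged);
        heading = wrap360(heading + error);
    }

    public double heading() { return heading; }

    public static double wrap360(double a) { a %= 360.0; return a < 0 ? a + 360.0 : a; }
    public static double wrap180(double a) { a = wrap360(a); return a > 180.0 ? a - 360.0 : a; }
}
```

The key point of the lagged buffer is that the GPS reading is compared against what the estimator believed back when that GPS fix was actually measured, not against the present estimate.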

Both blue and green lines drift off the initial heading. Whether this is actual vehicle motion or gyro drift I don't know yet and will have to investigate.

What's crystal clear is that the gyro and Kalman heading estimates are almost 5 degrees off from the GPS heading in this first stretch of the course. That's pretty terrible, and may explain what I witnessed in field testing.

What's going on? Possibly the code is fusing in GPS data before it converges to the correct heading. Or maybe I need to include bias as a state in my Kalman Filter. Also, I'm using a new uBlox GPS with a 4Hz update rate instead of the previous Venus GPS's 10Hz, so GPS heading will have less influence over gyro bias.
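For what it's worth, here's a rough Java sketch of what a two-state (heading plus gyro bias) filter could look like. All names, noise values, and the simplified covariance bookkeeping are illustrative assumptions, not my actual filter:

```java
// Hypothetical 2-state Kalman filter sketch: state is [heading, gyro bias].
// Predict integrates the bias-corrected gyro rate; update corrects both
// heading and bias from a GPS heading measurement.
public class HeadingBiasFilter {
    public double heading, bias;          // degrees, degrees/second
    double pHH = 1, pHB = 0, pBB = 1;     // covariance entries
    final double qHeading = 0.01, qBias = 0.0001, rGps = 4.0;  // guessed noise values

    public HeadingBiasFilter(double h0) { heading = h0; }

    // Predict with the raw gyro rate; the estimated bias is subtracted out.
    public void predict(double gyroRate, double dt) {
        heading = wrap360(heading + (gyroRate - bias) * dt);
        // P = F P F' + Q with F = [[1, -dt], [0, 1]]
        pHH += dt * (dt * pBB - 2 * pHB) + qHeading;
        pHB -= dt * pBB;
        pBB += qBias;
    }

    // Correct with a GPS heading (only valid above ~2-3 m/s, per the post).
    public void update(double gpsHeading) {
        double y = wrap180(gpsHeading - heading);   // innovation
        double s = pHH + rGps;
        double kH = pHH / s, kB = pHB / s;          // Kalman gains
        heading = wrap360(heading + kH * y);
        bias += kB * y;
        double oldPHH = pHH, oldPHB = pHB;
        pHH = (1 - kH) * oldPHH;
        pHB = (1 - kH) * oldPHB;
        pBB = pBB - kB * oldPHB;
    }

    public static double wrap360(double a) { a %= 360.0; return a < 0 ? a + 360.0 : a; }
    public static double wrap180(double a) { a = wrap360(a); return a > 180.0 ? a - 360.0 : a; }
}
```

The appeal of the bias state is that the filter learns the gyro's drift rate from the GPS heading over time, rather than relying solely on a one-shot bias calibration at the starting line.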

But what of that strange spike in the green and blue plots at just after 3 seconds? These spikes are all over the plots for my runs. It appears in the gyro estimate, current heading estimate, and time-lagged estimates.

The plot above shows the spikes pretty clearly. It also shows the Z-axis gyro values. I don't see any significant correlation between the estimate spikes and the gyro values, so apparently the spikes aren't caused by shock.

There may simply be a bug in my computations. The gyro-only estimate is particularly bad with massive spikes appearing at seemingly random intervals.

That's the first place I'll start looking. The Kalman Filter smooths out this noise, and error correction reduces the effect on the current heading estimate, so in the end the spikes don't appear to gravely affect it. But I don't see how a robot can hope to win with these kinds of errors present.
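One common source of random-looking spikes like these is subtracting two headings without handling the 0/360 wrap, which produces errors near 360 degrees whenever the two headings straddle north. That's only a guess at the bug, but it's cheap to check:

```java
// Illustrative only: comparing a naive heading difference against a
// wrap-safe one. A naive subtraction across north looks like a huge spike.
public class AngleUtil {
    // Naive difference: blows up when headings straddle the 0/360 boundary.
    public static double naiveDiff(double a, double b) {
        return a - b;
    }

    // Wrap-safe difference, normalized into (-180, 180].
    public static double diff(double a, double b) {
        double d = (a - b) % 360.0;
        if (d > 180.0) d -= 360.0;
        if (d <= -180.0) d += 360.0;
        return d;
    }
}
```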

Path Following Instability

Path following is unstable. The robot has a hard time staying on the path. I tried a few different lookahead distances for the virtual 'rabbit' the robot chases around the course.

With a 2m lookahead distance, the robot experiences quite a bit of cross-track error, particularly after the 2nd waypoint, with some fairly wide oscillation. This is what I saw several months ago and have procrastinated dealing with.

A longer lookahead distance should improve stability. I tried 3m, which produced no appreciable change. With a 4m lookahead, the robot didn't turn fast enough to follow the path, but it was much more stable.

To verify the other extreme, I tried a short 1m lookahead; as expected, path following oscillated and was nearly unstable. I can try further tuning of the lookahead distance, sure.
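For reference, the steering computation at the heart of pure pursuit looks roughly like the following Java sketch. The bicycle-model steering conversion and all the names are my own illustration, not necessarily the Bus's code, but it shows why a longer lookahead steers more gently: curvature scales as 1/L.

```java
// Pure pursuit steering sketch for a bicycle-model (Ackermann) vehicle.
public class PurePursuit {
    // Given vehicle pose (x, y, heading in radians) and a lookahead point,
    // return the steering angle (radians) that arcs the vehicle through it.
    public static double steerAngle(double x, double y, double heading,
                                    double lx, double ly, double wheelbase) {
        double dx = lx - x, dy = ly - y;
        double lookahead = Math.hypot(dx, dy);
        // Angle to the lookahead point, relative to the vehicle's heading.
        double alpha = Math.atan2(dy, dx) - heading;
        // Pure pursuit arc curvature: kappa = 2 sin(alpha) / L.
        double curvature = 2.0 * Math.sin(alpha) / lookahead;
        // Convert curvature to a steering angle for the given wheelbase.
        return Math.atan(wheelbase * curvature);
    }
}
```

Because curvature falls off with lookahead distance, a 4m rabbit commands shallower steering than a 1m rabbit for the same angular offset, which matches the stability-versus-responsiveness tradeoff seen in testing.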

Meanwhile, Josh Pieper of Savage Solder fame suggested lag as a possible problem. This paper [HEREDIA-2007-ADVANCED ROBOTICS.pdf] suggests that instability can arise when there's a lag between sensor reading and control actuation, caused by computation delay and mechanical delay. I verified this a while back with simulations in Processing.

I'm seeing about 1.4ms of lag between the gyro read and when the steering is updated. The robot updates the steering position every 50ms. The servo is an entry-level Futaba S3003 with a typically slow transit speed of 60° in 0.19s.

I'm not yet sure what the answer is for path following. Until I get the heading issues figured out there's little point in working on this issue.

Syndicated 2014-06-02 19:00:00 (Updated 2014-06-03 05:18:07) from Michael Shimniok

2014 AVC: Entry Video

Here's my Data Bus proof of concept video for the 2014 Sparkfun AVC.

Syndicated 2014-05-29 23:00:00 from Michael Shimniok

22 May 2014 (updated 29 May 2014 at 22:14 UTC) »

AVC: Pose and Map Display

To fix the path following algorithm on Data Bus I have to know what the robot is thinking.

To know what it's thinking, the robot is now sending some new data in its telemetry stream to the GCS.

  • All waypoints
  • Index of the next waypoint
  • Lookahead position

The GCS now opens a map window which scales and displays the data above as well as vehicle pose (position and heading).

But before I got to that point, when I first started coding up the map window, it was immediately obvious that the lookahead point was wrong. Below, the green dots are waypoints, the tiny yellow dot in the upper right is the rover, and the blue dot is the lookahead point.

Lookahead point. You're doing it wrong.

I was computing the position of the lookahead point relative to the previous waypoint, but failing to add the coordinates of that waypoint.

In my attempt to fix the problem I mistakenly added the relative distance to the robot's position. The map display made it obvious what was wrong. The lookahead point kept getting farther and farther away.
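In sketch form, the fix amounts to adding the along-segment offset to the previous waypoint, not to the robot's position (names here are hypothetical, not my actual code):

```java
// Sketch of the coordinate-frame fix: the lookahead point lies along the
// segment from the previous waypoint toward the next one, so the offset
// must be added to that waypoint's coordinates.
public class Lookahead {
    // Project a point distAlong meters along the segment from waypoint
    // A (ax, ay) toward waypoint B (bx, by), in absolute coordinates.
    public static double[] point(double ax, double ay,
                                 double bx, double by,
                                 double distAlong) {
        double segLen = Math.hypot(bx - ax, by - ay);
        double ux = (bx - ax) / segLen;   // unit vector along the segment
        double uy = (by - ay) / segLen;
        // The bug was returning just {ux*distAlong, uy*distAlong} (or adding
        // that offset to the robot's position), which walks the point away.
        return new double[] { ax + ux * distAlong, ay + uy * distAlong };
    }
}
```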

I got it fixed. Then I added vehicle heading as a line. I draw waypoints in red and the next waypoint in green. I was able to verify that the code switches to the next waypoint properly. Heading looks good. Now I can do some diagnosis on the path following.

There. That's better.

Point is, visualization is awesome. I think it will be a big time saver for troubleshooting.

Syndicated 2014-05-22 14:00:00 (Updated 2014-05-29 22:00:51) from Michael Shimniok

20 May 2014 (updated 21 May 2014 at 21:16 UTC) »

Sparkfun AVC Update

The last few months, within the free nooks and crannies of an incredibly hectic life, I've done my best to fit in work on my Sparkfun AVC entries, Data Bus and the still-top-secret SHARC entry.

The triumvirate of robot expos is over: Robots at the Hangar, Denver Mini-Maker Faire, and Deep Space Open House. Prepping displays for these shows was daunting. My efforts to bring motion to Trash Bot succeeded. The Data Bus GCS is in good shape. Mom's clock project is wrapped up. And I've been working on the OpenMV Camera as time permits, with a small-scale prototype run planned for the near future.

The challenges have been many. The family was hit with a string of illnesses, with my wife down for the count for a week straight in one case. I've been on travel almost 50% the last 6 weeks. My aging mom fell and went to the hospital and has been going through rehab. Numerous other regular life things have come along as well.

Several months ago, when I last tested Data Bus with the new path following algorithm, it made it to the second waypoint before its path started oscillating wildly.

In the interim I experimented with FreeRTOS and made a number of code changes. But I never faced up to the strange path following behavior. After a couple of months I had made a mess of things and reverted to a much earlier version, removing the RTOS. I haven't done a single test run in many months.

Data Bus is sitting next to me as I type, sporting a number of changes and hopefully getting close to another test run. I've added a map display to the Ground Control System. The robot now informs the GCS of its estimated position, waypoint locations, and the location of the pure pursuit lookahead point, the "rabbit" it chases around the course. The lookahead data is going to be key to troubleshooting the problems I saw before. It also logs this data to the onboard microSD card.

The deadline for a proof of concept video is end of May so it's time to step it up and get this robot working again.

Syndicated 2014-05-20 15:00:00 (Updated 2014-05-21 20:08:14) from Michael Shimniok

AVC: Ground Control Station

With less than two months left to get Data Bus working, why am I working on Ground Control Station (aka GCS) software?

One reason, honestly, was to have something fun to show at Denver Mini Maker Faire. Another reason is that I'm convinced it'll help me save time troubleshooting. I'm also going to need it for the SHARC AVC entry.

The GCS Software

I had already written the GCS software in Java Swing, complete with Google Maps. What wasn't working was telemetry. Also, it was written for ancient Java 6. I updated to less ancient Java 7, a newer IDE, and more.

Observer Pattern

The gauges (GaugePanel) are implemented as JLayeredPane objects, and each gauge needle (GaugeNeedle) is a JPanel added to the gauge. The needles listen for changes to associated Property objects (like voltage, current, or heading) that are set when reading data from the telemetry stream.

These Property objects implement an Observable interface, meaning that whenever a new value is set, the object invokes a callback on a ChangeListener. GaugeNeedle implements ChangeListener and its changed() callback method.

So what happens is: the TelemetryParser reads, say, a voltage from the robot and sets the voltage Property, which then calls voltageNeedle.changed(), and the needle updates its position.
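Stripped of the Swing plumbing, the pattern looks something like this; a hypothetical minimal version for illustration, not the actual GCS classes:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal observer-pattern sketch: a Property notifies listeners on set().
interface ChangeListener {
    void changed(Property p);
}

class Property {
    private final String name;
    private double value;
    private final List<ChangeListener> listeners = new ArrayList<>();

    Property(String name) { this.name = name; }
    String name() { return name; }
    double get() { return value; }

    void addChangeListener(ChangeListener l) { listeners.add(l); }

    // Setting a new value notifies every registered listener.
    void set(double v) {
        value = v;
        for (ChangeListener l : listeners) l.changed(this);
    }
}

class GaugeNeedle implements ChangeListener {
    double angle;   // in the real GCS, this would repaint a JPanel needle

    @Override
    public void changed(Property p) {
        // Illustrative mapping: 0-20 volts onto 0-360 degrees of needle sweep.
        angle = p.get() * 18.0;
    }
}
```

The payoff of this wiring is that the telemetry parser knows nothing about gauges; it just sets Property values, and any number of displays can subscribe.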


Probably the hardest part of all of this was getting RXTX to work. I'm going to abandon it in favor of other options at the earliest possible opportunity: it's always a nightmare to get working, and things are much worse now with 64-bit machines, 64- and 32-bit JVMs, a new Mac OS X version that appears to be unsupported, and more.

Anyway, for better or worse I have a SerialPanel object that handles UI aspects of connecting to a serial port and receiving data, and then calling a callback (a Parser) to do something with the data. The TelemetryParser is responsible for splitting the CSV and setting the associated Property objects.

Map Visualization

Now that I'm on Linux, I can't use the Google Earth Plugin; it's not supported. One of the key features I need is the ability to quickly visualize the robot's pose estimate and bearing/distance to waypoint. So I'll have to get something working soon.


For the moment I have a stupid-simple telemetry stream set up. It just spits out a header character, a CSV of the telemetry data, and a newline terminator. I was scrambling to get this all done before Maker Faire, so there's more to do, like error detection and correction.
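A parser for framing like that can be as simple as the following sketch; the header character, field order, and names are my assumptions for illustration:

```java
// Sketch of a parser for the simple telemetry framing described above:
// a header character, comma-separated values, and a newline terminator.
public class TelemetryParser {
    // Parse one frame like "$7.4,1.2,254.0\n" into numeric fields.
    // Returns null if the header character is missing (garbled frame).
    public static double[] parseLine(String line, char header) {
        line = line.trim();
        if (line.isEmpty() || line.charAt(0) != header) return null;
        String[] tokens = line.substring(1).split(",");
        double[] fields = new double[tokens.length];
        for (int i = 0; i < tokens.length; i++) {
            fields[i] = Double.parseDouble(tokens[i]);
        }
        return fields;
    }
}
```

Rejecting frames with a bad header character is the bare minimum of the error detection mentioned above; a checksum field would be the natural next step.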


The robot logs a lot of data as it runs. I'd like to implement log file download within the GCS. Presently, my Java serterm serial terminal program serves this purpose. I'd like to run it all in the same software, however.

Also, it'd be handy to be able to record telemetry data for immediate playback without having to wait for (giant) log file download over serial. So that's also part of my plans.


It'd be nice not to have to remove the bot's cover and plug in a USB cable to check data after each run. That burns up precious time of which I have little.

I figure it'll also be helpful to watch robot telemetry as the robot runs. So I picked up a pair of Xbee Pro S1 radios to play with. At 60mW, they claim a mile of range; I need a couple hundred yards, tops.

See that blue Xbee board on there? That's new...
So far so good, but I'll need to do some range testing and make sure there are no issues with the RC receiver and Xbee antennas sitting in close proximity. I realize lots of stuff shares the 2.4GHz band relatively successfully; exactly how, I don't quite know. Like I always say, "it should work"...

Syndicated 2014-05-15 17:00:00 (Updated 2014-05-15 17:00:01) from Michael Shimniok

Kids Like Taking Out Trash (With a Robot)

Where can you watch hundreds of kids volunteer to take out the trash? At the Denver Mini-Maker Faire, that's where!

Kids loved driving Trash Bot, grabbing, and dragging the recycling bin.

I spent most of last weekend manning the Bot Thoughts / SHARC combined booths where my remote control Trash Bot (TOTT Bot) was a huge hit with kids and adults alike! But that's not all.

This was my first Maker Faire of any sort, and it was awesome. I am interested in lots of disparate things from art to science and the Mini Maker Faire had all that and much more.

Our 3PI line followers were a big hit as well. We got a lot of questions about the line followers and a lot of interest. They are fun to watch. We probably blew through 4 packs of AAs, however.

Data Bus was there with my Java Ground Control Station (GCS) displaying telemetry data: voltage, current, speed, GPS satellite count, heading, and bearing to waypoint.

I was glad to meet several prospective AVC competitors and others interested in the Bus and talk tech with them.

My Raspberry Pi Rover made an appearance on Saturday but experienced technical problems at the end of the day taking it out of commission for the rest of the Faire.

I also showed off OpenMV Cam to several people. It ran face detection reliably for both days.

Next time the Faire is in town, you should go. It's absolutely awesome.

Syndicated 2014-05-08 16:00:00 from Michael Shimniok

29 Apr 2014 (updated 8 May 2014 at 16:16 UTC) »

Denver Mini-Maker Faire

Are you in Denver? Got some time this weekend for robots?

Saturday May 3rd and Sunday May 4th 9:00am - 5:00pm

Come visit Bot Thoughts and SHARC at the Denver Mini Maker Faire at the National Western Complex Saturday or Sunday. Tickets available here.

What I'll be showing...

Exclusive peek at the OpenMV Cam!

Sparkfun AVC 2014 Course Preview

Sparkfun posted the 2014 AVC Course Preview today. Two changes: on the ground, Micro/PBR classes can follow a line; in the air, 36" red balloons are added. Pop them for a bonus (an OpenMV Cam might be handy here; we're planning to build a small batch of prototypes soon).

Syndicated 2014-04-17 18:36:00 (Updated 2014-04-17 18:36:36) from Michael Shimniok

My Pixy arrived in the mail!

Well, can't wait to play with my just-delivered Pixy cam! Meanwhile I hope to finish OpenMV Camera assembly soon so I can demo at Robotics At The Hangar here in Denver on April 13.

Syndicated 2014-03-28 16:48:00 (Updated 2014-03-28 16:48:57) from Michael Shimniok

