Older blog entries for shimniok (starting at number 82)

2 Jun 2014 (updated 3 Jun 2014 at 06:16 UTC) »

AVC: Heading Errors, Path Following Woes

A little over two weeks to go as I write this. That's not much time.

Testing uncovered a couple of obvious problems. The robot is experiencing heading errors and the pure pursuit path following is still working poorly.

Heading Error

I'm seeing unpredictable heading deviations that suggest heading drift: on three test runs the robot's ending position was wrong, rotated counterclockwise from where it should have been.

The problem manifests most apparently in the first leg of the square course. In three autonomous runs plus one manual run, I could see in the GCS that heading was drifting off.

Time to look at logs and see what is going on. Looking at a plot of the log data you can see some problems right away.

The correct initial heading (from the starting point to the first waypoint) was around 254 degrees. This plot shows activity from about 1 second after logging starts.

The red line, GPS heading, won't report a correct heading until the robot hits about 2-3m/s, which occurs just after 2 seconds into the run. My code won't use GPS heading until then, relying only on the gyro, which is corrected for bias at the starting line.

The blue line shows the time-lagged heading estimate based on gyro and GPS heading. The green line is the current heading estimate. The gyro is used to compute a new heading, which is then error-corrected against the time-lagged estimate.
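To make that fusion scheme concrete, here's a rough Python sketch of the idea: integrate the gyro, remember past estimates, and when a (delayed) GPS heading arrives, compare it against the estimate from that moment in the past and nudge the current estimate. This is an illustration only, not the robot's actual code; the lag depth and gain are made-up values.

```python
from collections import deque

GPS_LAG = 20   # assumed: GPS heading lags the gyro estimate by 20 samples
GAIN = 0.2     # assumed: fraction of the lagged error corrected per GPS fix

class HeadingEstimator:
    """Gyro-integrated heading, error-corrected against time-lagged GPS heading."""

    def __init__(self, initial_heading):
        self.heading = initial_heading
        self.history = deque(maxlen=GPS_LAG)  # past estimates, for lag matching

    def gyro_update(self, rate_dps, dt):
        # Save the current estimate, then integrate gyro rate to propagate it.
        self.history.append(self.heading)
        self.heading = (self.heading + rate_dps * dt) % 360.0

    def gps_update(self, gps_heading):
        # Compare the delayed GPS heading with what we estimated back then.
        if len(self.history) < GPS_LAG:
            return
        lagged = self.history[0]
        # Wrap the error into [-180, 180) so 359 vs 1 reads as 2 degrees.
        error = (gps_heading - lagged + 180.0) % 360.0 - 180.0
        self.heading = (self.heading + GAIN * error) % 360.0
```

With a constant gyro bias and periodic GPS fixes, the correction holds the estimate near truth instead of letting it drift away linearly.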

Both blue and green lines drift off the initial heading. Whether this is actual vehicle motion or gyro drift I don't know yet and will have to investigate.

Crystal clear is that the gyro and Kalman heading estimates are almost 5 degrees in error versus the GPS heading in this first stretch of the course. That's pretty terrible, and may explain what I witnessed in field testing.

What's going on? Possibly the code is fusing in GPS data before it converges to the correct heading. Or I need to include bias as a state in my Kalman Filter. Also, I'm using a new uBlox GPS with 4Hz update rate instead of the previous Venus GPS' 10Hz rate, so GPS heading will have less influence over gyro bias.
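If I do add bias as a state, the filter would look something like this minimal two-state sketch (heading plus gyro bias, GPS heading as the measurement). The process and measurement noise values here are illustrative placeholders, not Data Bus tuning.

```python
# Minimal 2-state Kalman Filter: x = [heading (deg), gyro bias (dps)].
# Illustrative sketch only; noise values are assumptions.

class HeadingBiasKF:
    def __init__(self, heading=0.0):
        self.x = [heading, 0.0]
        self.P = [[1.0, 0.0], [0.0, 1.0]]  # state covariance
        self.q_heading = 0.01              # process noise (assumed)
        self.q_bias = 0.0001
        self.r_gps = 4.0                   # GPS heading noise (assumed)

    def predict(self, gyro_rate, dt):
        h, b = self.x
        # Heading integrates the bias-corrected gyro; bias is assumed constant.
        self.x = [(h + (gyro_rate - b) * dt) % 360.0, b]
        p = self.P
        # P = F P F' + Q, with F = [[1, -dt], [0, 1]]
        p00 = p[0][0] - dt * (p[1][0] + p[0][1]) + dt * dt * p[1][1] + self.q_heading
        p01 = p[0][1] - dt * p[1][1]
        p10 = p[1][0] - dt * p[1][1]
        p11 = p[1][1] + self.q_bias
        self.P = [[p00, p01], [p10, p11]]

    def update(self, gps_heading):
        # Measurement is heading directly: H = [1, 0]; wrap the innovation.
        y = (gps_heading - self.x[0] + 180.0) % 360.0 - 180.0
        s = self.P[0][0] + self.r_gps
        k0 = self.P[0][0] / s
        k1 = self.P[1][0] / s
        self.x = [(self.x[0] + k0 * y) % 360.0, self.x[1] + k1 * y]
        p = self.P
        self.P = [[(1 - k0) * p[0][0], (1 - k0) * p[0][1]],
                  [p[1][0] - k1 * p[0][0], p[1][1] - k1 * p[0][1]]]
```

The point of the extra state: a plain heading filter has to keep fighting the bias forever, while this one learns the bias from the GPS innovations and the drift stops.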

But what of that strange spike in the green and blue plots at just after 3 seconds? These spikes are all over the plots for my runs. It appears in the gyro estimate, current heading estimate, and time-lagged estimates.

The plot above shows the spikes pretty clearly. It also shows the Z-axis gyro values. I don't see any significant correlation between the estimate spikes and the gyro values. So the spikes aren't caused by shock, apparently.

There may simply be a bug in my computations. The gyro-only estimate is particularly bad with massive spikes appearing at seemingly random intervals.

That's the first place I'll start looking. The Kalman Filter smooths out this noise and error correction reduces the effect on the current heading estimate. The spikes don't appear to gravely affect heading estimate, in the end. But I don't see how a robot can hope to win with these kinds of errors present.

Path Following Instability

Path following is unstable. The robot has a hard time staying on the path. I tried a few different lookahead distances for the virtual 'rabbit' the robot chases around the course.

With a 2m lookahead distance, the robot experiences quite a bit of cross track error, particularly after the 2nd waypoint, with some fairly wide oscillation. This is what I saw several months ago and have procrastinated dealing with.

A longer lookahead distance should result in improved stability. I tried 3m which provided no appreciable change. With a 4m lookahead, the robot didn't turn fast enough to follow the path, but it was much more stable.

To verify the other extreme, I tried a short lookahead of 1m: the path following oscillated and was nearly unstable, as expected. I can try further tuning of the lookahead distance, sure.
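The lookahead behavior I'm seeing matches the geometry of pure pursuit. Here's a sketch of the steering computation (not the Data Bus code; the wheelbase value below is just a placeholder): curvature is 2y/Ld², so for the same lateral offset, doubling the lookahead quarters the commanded curvature. Gentler steering, more stability, wider turns.

```python
import math

def pure_pursuit_steer(robot_x, robot_y, heading_rad, goal_x, goal_y, wheelbase):
    """Pure pursuit: steer along an arc through the lookahead ('rabbit') point.
    Illustrative sketch; heading_rad is measured like math angles, x forward."""
    dx = goal_x - robot_x
    dy = goal_y - robot_y
    # Rotate the world-frame offset into the robot frame (x forward, y left).
    local_y = -math.sin(heading_rad) * dx + math.cos(heading_rad) * dy
    ld2 = dx * dx + dy * dy            # squared lookahead distance
    curvature = 2.0 * local_y / ld2    # kappa = 2*y / Ld^2
    return math.atan(wheelbase * curvature)  # bicycle-model steering angle
```

With a fixed 0.5m lateral offset, a 2m lookahead commands roughly four times the curvature of a 4m lookahead, which is exactly the stable-but-sluggish vs. twitchy-but-responsive tradeoff I saw on the course.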

Meanwhile, Josh Pieper of Savage Solder fame suggested lag as a possible problem. This paper [HEREDIA-2007-ADVANCED ROBOTICS.pdf] describes instability that arises when there's lag between sensor reading and control actuation, caused by computation delay and mechanical delay. I verified this awhile back with simulations in Processing.

I'm seeing about 1.4ms of lag between the gyro read and when the steering is updated. The robot updates the steering position every 50ms. The servo is an entry-level Futaba S3003 with a typically slow transit speed of 60° in 0.19s.
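Adding those numbers up as a back-of-envelope budget, the compute lag is negligible next to the update period and the servo's mechanical slew. This little calculation is just an illustration of the worst case, treating servo travel time as the mechanical delay:

```python
# Rough worst-case control latency from the numbers above (illustrative).
COMPUTE_LAG_S = 0.0014             # gyro read -> steering command
UPDATE_PERIOD_S = 0.050            # steering updated every 50 ms
SERVO_S_PER_DEG = 0.19 / 60.0      # Futaba S3003: 60 degrees in 0.19 s

def worst_case_lag(steering_change_deg):
    # Worst case: the command just missed an update slot, then the servo
    # still has to physically travel to the new angle.
    return COMPUTE_LAG_S + UPDATE_PERIOD_S + SERVO_S_PER_DEG * steering_change_deg
```

A 30° steering correction comes out to roughly 146ms end to end, which is the kind of lag the paper says can destabilize path following.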

I'm not yet sure what the answer is for path following. Until I get the heading issues figured out there's little point in working on this issue.

Syndicated 2014-06-02 19:00:00 (Updated 2014-06-03 05:18:07) from Michael Shimniok

2014 AVC: Entry Video

Here's my Data Bus proof of concept video for the 2014 Sparkfun AVC.

Syndicated 2014-05-29 23:00:00 from Michael Shimniok

22 May 2014 (updated 29 May 2014 at 22:14 UTC) »

AVC: Pose and Map Display

To fix the path following algorithm on Data Bus I have to know what the robot is thinking.

To know what it's thinking, the robot is now sending some new data in its telemetry stream to the GCS.

  • All waypoints
  • Index of the next waypoint
  • Lookahead position

The GCS now opens a map window which scales and displays the data above as well as vehicle pose (position and heading).

But before I got to that point, when I first started coding up the map window, it was immediately obvious that the lookahead point was wrong. Below, the green dots are waypoints, the tiny yellow dot in the upper right is the rover, and the blue dot is the lookahead point.

Lookahead point. You're doing it wrong.

I was computing the position of the lookahead point relative to the previous waypoint, but failing to add the coordinates of that waypoint.

In my attempt to fix the problem I mistakenly added the relative distance to the robot's position. The map display made it obvious what was wrong. The lookahead point kept getting farther and farther away.
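The bug and its fix boil down to which point the relative offset gets added to. Here's a sketch (function and variable names are hypothetical, not the robot's actual code):

```python
def lookahead_point(prev_wp, next_wp, dist_along_segment):
    """Return the lookahead ('rabbit') point in world coordinates.
    Sketch of the fix described above; names are hypothetical."""
    px, py = prev_wp
    nx, ny = next_wp
    seg_len = ((nx - px) ** 2 + (ny - py) ** 2) ** 0.5
    t = dist_along_segment / seg_len
    # Offset along the segment, relative to the previous waypoint...
    rel_x = t * (nx - px)
    rel_y = t * (ny - py)
    # ...must be added to the PREVIOUS WAYPOINT's coordinates -- adding it
    # to the robot's position instead makes the point walk away each update.
    return (px + rel_x, py + rel_y)
```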

I got it fixed. Then I added vehicle heading as a line. I draw waypoints in red and the next waypoint in green. I was able to verify that the code switches to the next waypoint properly. Heading looks good. Now I can do some diagnosis on the path following.

There. That's better.

Point is, visualization is awesome. I think it will be a big time saver for troubleshooting.

Syndicated 2014-05-22 14:00:00 (Updated 2014-05-29 22:00:51) from Michael Shimniok

20 May 2014 (updated 21 May 2014 at 21:16 UTC) »

Sparkfun AVC Update

The last few months, within the free nooks and crannies of an incredibly hectic life, I've done my best to fit in work on my Sparkfun AVC entries, Data Bus and the still-top-secret SHARC entry.

The triumvirate of robot expos is over: Robots at the Hangar, Denver Mini-Maker Faire, and Deep Space Open House. Prepping displays for these shows was daunting. My efforts to bring motion to Trash Bot succeeded. The Data Bus GCS is in good shape. Mom's clock project is wrapped up. And I've been working on OpenMV Camera as time permits, with a small-scale prototype run planned for the near future.

The challenges have been many. The family was hit with a string of illnesses, with my wife down for the count for a week straight in one case. I've been on travel almost 50% the last 6 weeks. My aging mom fell and went to the hospital and has been going through rehab. Numerous other regular life things have come along as well.

Several months ago, when I last tested Data Bus with the new path following algorithm, it made it to the second waypoint before its path started oscillating wildly.

In the interim I experimented with FreeRTOS and made a number of code changes. But I never faced up to the strange path following behavior. After a couple of months I had made a mess of things and reverted to a much earlier version, removing the RTOS. I haven't done a single test run in many months.

Data Bus is sitting next to me as I type, sporting a number of changes and hopefully getting close to another test run. I've added a map display to the Ground Control System. The robot now informs the GCS of its estimated position, waypoint locations, and the location of the pure pursuit lookahead point, the "rabbit" it chases around the course. The lookahead data is going to be key to troubleshooting the problems I saw before. It also logs this data to the onboard microSD card.

The deadline for a proof of concept video is end of May so it's time to step it up and get this robot working again.

Syndicated 2014-05-20 15:00:00 (Updated 2014-05-21 20:08:14) from Michael Shimniok

AVC: Ground Control Station

With less than two months left to get Data Bus working, why am I working on Ground Control Station (aka GCS) software?

One reason, honestly, was to have something fun to show at Denver Mini Maker Faire. Another reason is that I'm convinced it'll help me save time troubleshooting. I'm also going to need it for the SHARC AVC entry.

The GCS Software

I had already written the GCS software in Java Swing, complete with Google Maps. What wasn't working was telemetry. Also, it was written for the ancient Java 6. I updated to the less ancient Java 7, a newer IDE, and more.

Observer Pattern

The gauges (GaugePanel) are implemented as JLayeredPane objects and each gauge needle is a JPanel added to the gauge (GaugeNeedle). The needles listen for changes to associated Property objects (like voltage, current, heading) that are set when reading data from the telemetry stream.

These Property objects implement an Observable interface: whenever a new value is set, the object invokes a callback on each registered ChangeListener. GaugeNeedle implements ChangeListener and its changed() callback method.

So what happens is the TelemetryParser reads, say, a voltage from the robot, sets the voltage Property, which then calls voltageNeedle.changed() and the needle updates its position.
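The real GCS is Java Swing, but the Property/ChangeListener flow is compact enough to sketch in Python. These classes just mirror the idea described above; the value-to-angle mapping is a toy placeholder.

```python
# Observer pattern sketch mirroring the GCS's Property / ChangeListener /
# GaugeNeedle flow (illustrative Python; the actual GCS is Java Swing).

class Property:
    """Observable value: notifies all listeners whenever set() is called."""

    def __init__(self, value=None):
        self._value = value
        self._listeners = []

    def add_change_listener(self, listener):
        self._listeners.append(listener)

    def set(self, value):
        self._value = value
        for listener in self._listeners:
            listener.changed(self)  # like ChangeListener.changed() in Java

    def get(self):
        return self._value

class GaugeNeedle:
    """Listens to one Property and repositions itself on every change."""

    def __init__(self, prop):
        self.angle = 0.0
        prop.add_change_listener(self)

    def changed(self, prop):
        self.angle = prop.get() * 10.0  # toy mapping from value to needle angle
```

So the telemetry parser only ever touches Property objects; the needles update themselves, which keeps the parsing code and the UI nicely decoupled.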


Probably the hardest part of all of this was getting RXTX to work. I am going to abandon it in favor of other options at the earliest possible opportunity. It's always such a nightmare to get working, and things are much worse now with 64-bit machines, 64- and 32-bit JVMs, a new Mac OS X version that appears not to be supported, and more.

Anyway, for better or worse I have a SerialPanel object that handles UI aspects of connecting to a serial port and receiving data, and then calling a callback (a Parser) to do something with the data. The TelemetryParser is responsible for splitting the CSV and setting the associated Property objects.

Map Visualization

Now that I'm on Linux, I can't use the Google Earth Plugin; it's not supported. One of the key features I need is the ability to quickly visualize the robot's pose estimate and bearing/distance to waypoint. So I'll have to get something working soon.


For the moment I have a stupid simple telemetry stream set up. It just spits out a header character, a CSV of the telemetry data, and a newline termination. I was scrambling to get this all done before Maker Faire, so there's more to do, like error detection and correction.
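Parsing that stream is about as simple as it sounds. Here's a sketch of the receiving end; the header character and field ordering are assumptions for illustration, not the actual protocol.

```python
# Sketch of parsing the telemetry stream described above: a header
# character, CSV data, newline. Header char and field order are assumed.

FIELDS = ("voltage", "current", "speed", "heading")  # assumed ordering
HEADER = "T"                                         # assumed header char

def parse_telemetry(line):
    """Return a dict of telemetry values, or None if the line is malformed."""
    line = line.strip()
    if not line.startswith(HEADER):
        return None
    values = line[1:].split(",")
    if len(values) != len(FIELDS):
        return None  # truncated or corrupt line; no checksum in the stream yet
    try:
        return dict(zip(FIELDS, (float(v) for v in values)))
    except ValueError:
        return None
```

Rejecting short or non-numeric lines covers the common serial glitches; a checksum byte would be the natural next step for real error detection.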


The robot logs a lot of data as it runs. I'd like to implement log file download within the GCS. Presently, my Java serterm serial terminal program serves this purpose. I'd like to run it all in the same software, however.

Also, it'd be handy to be able to record telemetry data for immediate playback without having to wait for (giant) log file download over serial. So that's also part of my plans.


It'd be nice not to have to remove the bot's cover and plug in a USB cable to check data after each run. That burns up precious time of which I have little.

I figure it'll also be helpful to watch robot telemetry as it runs. So I picked up a pair of Xbee Pro S1 radios to play with. At 60mW of transmit power, they claim a mile of range. I need a couple hundred yards, tops.

See that blue Xbee board on there? That's new...

So far, so good, but I'll need to do some range testing and make sure there are no issues with the RC receiver and Xbee antennas sitting in close proximity. I realize lots of stuff shares the 2.4GHz band relatively successfully. Exactly how, I don't quite know. Like I always say, "it should work"...

Syndicated 2014-05-15 17:00:00 (Updated 2014-05-15 17:00:01) from Michael Shimniok

Kids Like Taking Out Trash (With a Robot)

Where can you watch hundreds of kids volunteer to take out the trash? At the Denver Mini-Maker Faire, that's where!

Kids loved driving Trash Bot, grabbing, and dragging the recycling bin.

I spent most of last weekend manning the Bot Thoughts / SHARC combined booths where my remote control Trash Bot (TOTT Bot) was a huge hit with kids and adults alike! But that's not all.

This was my first Maker Faire of any sort, and it was awesome. I am interested in lots of disparate things from art to science and the Mini Maker Faire had all that and much more.

Our 3PI line followers were a big hit as well. We got a lot of questions about the line followers and a lot of interest. They are fun to watch. We probably blew through 4 packs of AAs, however.

Data Bus was there with my Java Ground Control Station (GCS) displaying telemetry data: voltage, current, speed, GPS satellite count, heading, and bearing to waypoint.

I was glad to meet several prospective AVC competitors and others interested in the Bus and talk tech with them.

My Raspberry Pi Rover made an appearance on Saturday but experienced technical problems at the end of the day, taking it out of commission for the rest of the Faire.

I also showed off OpenMV Cam to several people. It ran face detection reliably for both days.

Next time the Faire is in town, you should go. It's absolutely awesome.

Syndicated 2014-05-08 16:00:00 from Michael Shimniok

29 Apr 2014 (updated 8 May 2014 at 16:16 UTC) »

Denver Mini-Maker Faire

Are you in Denver? Got some time this weekend for robots?

Saturday May 3rd and Sunday May 4th 9:00am - 5:00pm

Come visit Bot Thoughts and SHARC at the Denver Mini Maker Faire at the National Western Complex Saturday or Sunday. Tickets available here.

What I'll be showing...

Exclusive peek at the OpenMV Cam!

Sparkfun AVC 2014 Course Preview

Sparkfun posted the 2014 AVC Course Preview today. Two changes. For ground, Micro/PBR classes can follow a line. For air, 36" red balloons are added. Pop them for a bonus (an OpenMV Cam might be handy here; we're planning to build a small batch of prototypes soon).

Syndicated 2014-04-17 18:36:00 (Updated 2014-04-17 18:36:36) from Michael Shimniok

My Pixy arrived in the mail!

Well, can't wait to play with my just-delivered Pixy cam! Meanwhile I hope to finish OpenMV Camera assembly soon so I can demo at Robotics At The Hangar here in Denver on April 13.

Syndicated 2014-03-28 16:48:00 (Updated 2014-03-28 16:48:57) from Michael Shimniok

25 Mar 2014 (updated 14 Apr 2014 at 15:15 UTC) »

Clock for my Mom, Complete.

Mom's clock is complete. She has trouble remembering the day of the week and has impaired vision, so after a fruitless search for an affordable solution I made my own. Here's how.

The high-visibility desktop timepiece uses 8x8 LED matrices, is powered by a phone charger, and features a hand-rubbed Maple enclosure faced with smoked acrylic for a high-contrast display. It alternately displays the time and day of week and automatically corrects for daylight savings time.

I previously covered assembling the core electronics, all available on Tindie.com.


I thought about using one of my many microcontrollers but I wanted a quicker, simpler solution so I ordered an LED Matrix Master from FriedCircuits on tindie.com, made up a cable to connect it to the LED matrix, loaded up the software I was using on the Uno, and called it done.

Power Supply

The Matrix and Real-Time Clock/Calendar were put together but I needed a power supply for permanent use. After contemplating several options, for sake of expediency I decided to use one of my eeZee Power USB modules, run a USB connector into the box, and plug the other end into a common 5V phone charger.

Building the Enclosure

The enclosure is constructed of 1/4" maple boards sourced at the local home improvement store. I picked this particular board for its curly figure, which I found appealing.

My plan, after considering mitre cuts and other options, was to use my router to notch the top and bottom edges of each side, so that I could lay a plank of the Maple into the notches at the top and bottom, glue it, and call it done.

Finishing the Enclosure

I'm far more experienced with coding and electronics than wood, particularly with finishing wood. Not surprisingly, my first attempts to build and finish a box ended poorly.

Polyurethane is notorious for its tendency to retain bubbles and collect dust, and that's just what happened on a half-dozen separate attempts to add coats to the original box. I first tried using a pad, which is supposed to help. Better, but not great. Then I tried a spray can of urethane. Better still, but still not great, and it tended to develop sags. I worked on isolating the piece from dust. That helped. A little.

Finally I gave up, bought some tung oil, tested it, and declared Urethane the devil and vowed never to use it for anything serious again. I rebuilt the box from scratch.

I applied the tung oil with an old t-shirt rag, rubbing it in and letting it dry overnight after each coat. Between coats I sanded progressively finer, starting with 320-grit, then 400-grit, and buffed with #00 steel wool at the end. The finish isn't quite as glossy or hard as I'd hoped but it leaves a nice natural look to the wood. I'm quite pleased.

Attaching Acrylic Face

The acrylic face is mounted directly to the edge of the enclosure with cyanoacrylate (CA) glue. I experimented briefly with acrylic, wood, and CA and found the bond to be suitably strong, though not impossible to break free with significant force.

I purchased a large sheet of smoked gray acrylic from eBay, so I'd have plenty of opportunities to screw up.

After carefully measuring the piece needed, leaving excess that could be sanded away for a precision fit, I used a plastic scoring tool to notch the acrylic, then bent it over a flat edge until it snapped cleanly.

I sanded down the edges of the acrylic a little at a time until it nearly matched the size of the enclosure, then finished the edges with 320-grit and 400-grit sandpapers.

Then, I scuffed the mating surface of the acrylic with 400-grit sandpaper, applied a small amount of CA on the wood to avoid squeeze out, sprayed kicker (catalyst for CA) onto the CA, and placed the acrylic.

Almost perfectly.

The face is offset horizontally by about 1/32" which will haunt me for the rest of my days. Sigh. At least the top and bottom are perfectly flush. I'll use a jig next time I do something like this.

Building and Mounting the Frame

My first attempt to build the box involved adding a permanent back. Fortunately I screwed up the finish multiple times and abandoned that box for one with no front or back. This enabled me to permanently install the acrylic face and then, from the rear, slide in a frame holding the clock's guts.

I started by mounting the LED matrices to a piece of Poplar cut to fit inside the enclosure. I used the Eagle files to print out drill locations for the mounting holes. Actually attaching the bolts and nuts was rather tedious but I finally got it. I fixed the metal nuts with CA once they were tight.

Why not use four screws per matrix? I'm low on screws and didn't want to delay the project any longer. It's solid enough. Don't worry about it.

After thinking about and mocking up several options, I built a frame using craft dowels at the top and 3/4" square poplar dowels at the bottom. When I build things like this I think about it then I try stuff and make it up as I go. The point is that it works brilliantly even if I arrived at this solution rather erratically.

I made a second piece of poplar the same size as the first, then clamped them together and drilled two holes in the upper corners large enough to fit 3/8" craft dowels. I slid the two poplar pieces onto the dowels, inserted the frame into the enclosure, pushed the rear face in until it was just below flush, and then used CA glue to fix everything in place.

Then I discovered the dowel was sticking out in front. Crap. I should've glued that first. I used a hacksaw to trim the protruding dowels down. Next I cut two 3/4" square dowels down and sanded until they were the same size as the gap at the top of the frame. I glued these in place with wood glue using clamps.

Could I have screwed these in place from both sides? That'd work too and it would've been possible to reach the LED Matrix screws... which I can't currently do without a hacksaw. Drat. Oh well.

Finally I added the remaining electronics to the frame. The LED Matrix Master is attached to one of the frame posts with some screws I found. The real time clock and power supply are hanging loose for now. There's a notch at the rear bottom of the frame to allow the USB cable to fit through.

Lastly, I marked two spots on the bottom of the enclosure, then drilled and countersunk holes for screws to fasten the frame to the enclosure. I slid the frame in, marked the spots for the screws, drilled pilot holes, then fastened everything together. It's tight. It's solid. It looks pretty darned good.

Daylight Savings Time

I'm from Arizona where we don't do this silly daylight savings time thing and I'm still getting used to it 20 years later up here in Colorado.

I kind of forgot about DST and the clock/calendar module doesn't account for it. I decided to code up automatic daylight savings time detection. I didn't spend much time on it so I'm sure there's a more elegant solution but it was a fun puzzle. I mocked it up in Python (partially to hone my Python craft):
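A sketch of the kind of check involved, using the US rule (DST runs from 2 a.m. on the second Sunday in March to 2 a.m. on the first Sunday in November). This is one possible approach, not necessarily my actual mock-up:

```python
import datetime

def nth_sunday(year, month, n):
    """Date of the nth Sunday of a given month (weekday(): Mon=0 .. Sun=6)."""
    first = datetime.date(year, month, 1)
    offset = (6 - first.weekday()) % 7          # days until the first Sunday
    return first + datetime.timedelta(days=offset + 7 * (n - 1))

def is_dst(dt):
    """True if a naive local datetime falls within US daylight savings time."""
    start = datetime.datetime.combine(nth_sunday(dt.year, 3, 2),
                                      datetime.time(2))  # 2nd Sunday in March
    end = datetime.datetime.combine(nth_sunday(dt.year, 11, 1),
                                    datetime.time(2))    # 1st Sunday in November
    return start <= dt < end
```

For 2014 that puts the transitions at March 9 and November 2, so the clock can just check this once a minute and shift its displayed hour accordingly.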