Older blog entries for shimniok (starting at number 86)

20 Jun 2014 (updated 16 Sep 2014 at 03:16 UTC) »

AVC: SHARC FSV, "Troubled Child"

Troubled Child, 1st Place in Doping and Crowd Favorite

Update: The SHARC FSV team Jeep, "Troubled Child," won Doping Class! And we were given the Crowd Favorite award! Wow! Yeehaw! Everyone on the team is super happy! Read on for details and video.

Congrats to all the other teams. I mean all of them. It's no small feat just to have something you can show up with. Anything beyond that is icing.

Major thanks to Sparkfun for putting on an amazing event, better than ever. Thanks to the guys who serenaded us with a hilarious song about the robot Jeep. And thanks for all the help from everyone and special thanks to Atmel for letting us troubleshoot inside the trailer once we overheated ourselves sitting in the Jeep. So, let's talk about that Jeep, safety first.

Third turn, Photo by Alicia Gibb, CC-BY-SA


The SHARC Full Size Vehicle team knew from the beginning safety had to be the highest priority. The SHARC FSV has a number of safety features.
  • A human driver is always in control of throttle, ignition, gear shift, and brake, and watches for any issues.
  • The vehicle runs the course at very slow speeds.
  • A fail-safe brake actuator is engaged before and after the run, and whenever the emergency stop button is pressed.
  • Only steering is autonomously controlled during the run. When the steering servo is disabled, the driver has full steering control.
  • A human team member monitors system status, position and heading estimates, and more, watching for any problems.
  • An external human spotter maintains a voice radio link to the Jeep.
  • An emergency stop button is within easy reach of driver and passenger. It disables the steering servo, restoring steering control to the driver, and applies the brake.
  • The vehicle employs visual and auditory warning signals when operating.
  • The main controller hardware and software were proven in the 2012 AVC, when Data Bus precisely navigated the course to a 3rd place finish.
  • We use a high precision, industrial-grade GPS with 60cm accuracy.
  • Our team is composed of experienced roboticists, technologists, and successful AVC veterans.
  • We use pre-flight and post-flight checklists.
  • Testing showed very consistent results.

    Video Proof

    Here's footage of the 3rd and final run from inside the cockpit synchronized with a GoPro on the bumper. I'm in the driver's seat and George is in the passenger seat. Scott is taking the footage from the back seat. We all have our seat belts on.

    You'll notice at the start we go through a pre-flight checklist to ensure that we are consistent and safe, that all the safety features are enabled, and that the robot will actually go when we press the go button. :)

    Here's some footage Ted took of our first run.


      MCU, Power, Toggle, etc.
      Data Bus is my 1/10th scale rover robot and its code and hardware were race proven in 2012 when it took 3rd place in the AVC. The Jeep uses the same brains as Data Bus: an mbed microcontroller on a Bot Thoughts RoverBaseboard.

      The code base is the same as well, with just a few customizations specific to the Jeep. Everything else was user-configurable. For example, steering and path following worked as soon as I entered the track width, wheel base, and a few other parameters.

      You can find the code used in the Jeep right here on github.


      We're using a Hemisphere Crescent A100 high precision, industrial grade GPS with 60cm accuracy thanks to WAAS correction. This is the sort of GPS used in agriculture, shipping, and more. The GPS runs at 10Hz and is powered by a dedicated NiMH battery.


      Steering and brake
      We had to create our own steering servo from scratch. We selected a large, high-torque DC motor and a chain drive to turn the steering wheel. The Jeep has power steering, so the motor has no trouble turning the wheel even with the vehicle stopped.

      Position feedback from a potentiometer is taken directly from the Jeep's steering box shaft to maximize accuracy. An Arduino Pro Mini reads a servo signal to determine the target position, reads the steering potentiometer, and then drives the steering motor until it reaches the target point. We've set up a configurable control loop to ensure stability and position hold. Motor speed slows as the wheel gets closer to its target. Here's the code on github.
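The core of that loop is simple proportional control. Here's a rough sketch of the idea in Python (the real code is Arduino C; the names and constants below are made up for illustration):

```python
def steering_motor_cmd(target, pot, deadband=2, max_pwm=255, slow_zone=50):
    """One pass of the steering loop: drive toward the target pot reading,
    tapering motor speed as the wheel nears the target.
    Names and constants are illustrative, not from the real code."""
    error = target - pot                 # both in raw ADC counts
    if abs(error) <= deadband:
        return 0                         # close enough: stop the motor
    # proportional taper: full speed far from target, slower inside slow_zone
    magnitude = min(max_pwm, max_pwm * abs(error) // slow_zone)
    return magnitude if error > 0 else -magnitude
```

The taper means the motor runs full speed when the wheel is far from its target and slows over the last few counts, which helps avoid overshoot and hunting around the target position.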


      Article: Stop That Robot Jeep

      We use a fail-safe brake actuator that can also be controlled by the MCU when the robot is armed to begin its run, or after the run completes. When power to the robotic systems is cut by the emergency switch or main switch, the brake actuator depresses the brake.

      Onboard Air Compressor
      A heavy duty aluminum channel forms the frame, bolted to the driver seat. A long throw pneumatic piston is driven by a regulated on-board air compressor and reserve tank to depress the brake pedal and engage the vehicle's brakes. The system is designed to allow human brake application any time the brake actuator is disengaged.

      Auditory and Visual Warnings

      We use an amber flasher and back up beeper to indicate the robot computer is powered and ready to arm. A loud horn, controlled by the microcontroller, announces when the brake system is about to be deactivated.


      The SHARC Full Size Vehicle Team consists of experienced technologists and robotics hobbyists, including three successful, veteran AVC competitors.

      Steven Gentner is the founder of the RoboRealm machine vision software company and a machine vision expert, with a background at the University of Southern California, Los Angeles, participating in the groundbreaking Mercury Project and Robotic Telegarden systems. He holds an M.S. in Computer Science and went on to develop some of the first Internet-controlled mobile robots, which led to the development of RoboRealm.

      Dr. Scott Harris is a 2-time AVC winner, in 2010 and 2011; authors the Cheap Science blog; has a professional background in applied mathematics, electronics, and programming; and holds a Ph.D. and M.A. in Aerospace and Mechanical Engineering from Princeton, an M.S. in Aeronautics and Astronautics from Stanford, and a B.S. in Engineering and Applied Science from the California Institute of Technology. He currently runs his own research and development consultancy that provides expertise in areas such as electro-optics, laser beam control and propagation, atmospheric optics and aero optics, custom electronics design and development, and technical program management.

      Richard Howlett is the VP of Engineering for the Nilar battery company, a leading expert in bi-polar NiMH technology, and holds patents in his field. With over 14 years of experience in the advanced vehicle engineering industry, including alternative fuels, electric, hybrid electric, and fuel cell vehicle research, Richard chairs the Battery Testing Standards Committee under the Vehicle Battery Steering Committee for SAE International. He also spent 7 years with Hewlett Packard designing state of the art HP and Intel microprocessors and support chipsets, variants of which can be found in today's high performance computers. Richard received his BSEE and MSEE from Texas Tech University with a focus on controls and system modeling. He enjoys electronics and robotics hobbies, as you may have guessed.

      Ted Meyers, one of our AVC veterans, studied computer science at Washington State University and has worked as an engineer for Northrop Grumman since graduation. For the past three years he has entered his own autonomous vehicle, Daisy Rover, in the Sparkfun AVC. Daisy Rover can brag that it has successfully completed the AVC course!

      George Mitsuoka is the Founder and Organizer of SHARC: Greater Denver Area Robotics and member, organizer and Chief Irritant of the SHARC Full-Size Vehicle Team. George has a degree in Computer Science from MIT and more than 25 years of industry experience at companies including Apple, HP, Northrop, Panasonic, and Yahoo! He has been building robots for more than 10 years, has mentored 5 FIRST Robotics Competition Teams in the Denver Area, and is the Technical Editor of Robot Magazine. George has received a Certificate of Completion with Highest Distinction for Udacity CS373: Programming a Robotic Car taught by Sebastian Thrun. For his day job, George designs high-power energy storage systems for ground vehicles, ships, and electric grid applications.

      Mike Peel is a robotics hobbyist who has worked in the computer, electronics, and avionics industries for 38 years and is a veteran of the USAF. Projects he has supported include the USAF B-52 Offensive Avionics suite, the NASA Space Shuttle Flight Computer, and the NASA Shuttle Training Aircraft simulation avionics suite. Mike is currently working for Lockheed Martin as a Staff Electrical Engineer developing and operating the NASA Orion Multi-Purpose Crew Vehicle avionics test laboratory. His other hobbies include fishing and hiking.

      Michael Shimniok is the owner of "Troubled Child" and has turned almost every bolt on the vehicle while refitting it for four-wheeling. His Sparkfun AVC robot, Data Bus, placed 3rd in 2012 thanks in no small part to participation in Udacity CS373: Programming a Robotic Car. A former FIRST mentor, Michael is also author and creator of the Bot Thoughts blog. He enjoys mechanical, electronic and software disciplines and is a bit of a jack of all trades with too many hobbies. He has been a professional IT geek since 1993 and is currently employed by Lockheed Martin. He earned a B.S. in Computer Engineering from the University of Arizona and an M.S. in Systems Engineering from The George Washington University.

      Syndicated 2014-06-20 20:00:00 (Updated 2014-09-16 02:59:36) from Michael Shimniok

AVC: 5 Day Panic

The SHARC top secret AVC entry has been taking up a lot of time. It's come together very nicely but there are several problems to work through.

The heading estimate on Data Bus isn't working with the new technique I'm trying. So I have to get that fixed before I can look at path following. I tried to fix one thing and screwed up another.

I haven't even thought about testing it on the jump ramp or coding it for different starting line positions. Without the ramp I'm pretty certain not to medal in this thing. I'll be happy to get the robot around the track, though. Maybe that's asking too much. Next day or two should tell.

Good luck and safe travels to all the other entrants if I don't get a chance to post before Saturday.

Syndicated 2014-06-16 18:00:00 (Updated 2014-06-16 21:04:29) from Michael Shimniok

AVC: 10 Day Panic

Oh crap. The competition is heating up for the Sparkfun AVC. There's only 10 days left.

And I'm doomed. As usual.

Top Contenders

I'm in the Peloton class. So are some heavy hitters.

Minuteman is tuning his rover for over 20mph in the straights and well over 10mph in corners by carefully analyzing sensor plots and optimizing vehicle dynamics. How do you beat that?! He took 2nd place last year.

Tom Coyle, APM:Rover developer, working with Tridge, had his robot buttoned up so early and doing so well that he already shipped it to Colorado several days ago. Tom won 1st place last year. The code is much, much better now.

There are undoubtedly other strong competitors in Peloton this year, too. At this stage, it doesn't look like I will be one of them. I'll need another breakthrough.

Other Competitors

You can find a variety of videos on youtube for other AVC entries. SHARC, my robot club, is fielding several entries. The diyrovers list has lots of people rolling their own rovers this year, too.

As for me?

I'm still working on fixes for the heading drift and path following oscillation I saw before.

To address heading drift, I implemented bias as a third state variable in my Kalman Filter so that I can use that bias throughout the run. Testing and simulation look promising. I found several bugs along the way and changed the gyro full-scale range down to 250 degrees/sec.
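Here's a stripped-down sketch of the idea as a two-state filter, heading plus gyro bias (Data Bus's actual filter carries a third state and differs in detail, so treat this as illustration only):

```python
class HeadingBiasKF:
    """Minimal 2-state Kalman filter: x = [heading, gyro bias].
    Angles in degrees; angle wrapping omitted for brevity.
    A sketch of the approach, not the actual Data Bus filter."""
    def __init__(self, q_heading=0.01, q_bias=0.0001, r_gps=4.0):
        self.x = [0.0, 0.0]                  # [heading, bias]
        self.P = [[1.0, 0.0], [0.0, 1.0]]    # covariance
        self.Q = [q_heading, q_bias]         # process noise (diagonal)
        self.R = r_gps                       # GPS heading variance

    def predict(self, gyro, dt):
        # integrate the bias-corrected rate; bias is modeled as constant
        self.x[0] += (gyro - self.x[1]) * dt
        # P = F P F^T + Q with F = [[1, -dt], [0, 1]]
        p00, p01 = self.P[0]
        p10, p11 = self.P[1]
        self.P[0][0] = p00 - dt * (p10 + p01) + dt * dt * p11 + self.Q[0]
        self.P[0][1] = p01 - dt * p11
        self.P[1][0] = p10 - dt * p11
        self.P[1][1] = p11 + self.Q[1]

    def update(self, gps_heading):
        # H = [1, 0]: GPS observes heading only, but the correction
        # also adjusts the bias estimate through the covariance coupling
        y = gps_heading - self.x[0]
        s = self.P[0][0] + self.R
        k0 = self.P[0][0] / s
        k1 = self.P[1][0] / s
        self.x[0] += k0 * y
        self.x[1] += k1 * y
        p00, p01 = self.P[0]
        self.P[0][0] = (1.0 - k0) * p00
        self.P[0][1] = (1.0 - k0) * p01
        self.P[1][0] -= k1 * p00
        self.P[1][1] -= k1 * p01
```

Feed it a gyro that reads nothing but bias while GPS insists the heading isn't changing, and the bias state converges toward the true bias, which is exactly the behavior I need over a whole run instead of just at the starting line.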

But the latest test saw the robot doing crazy circles. I'm afraid to find out why. I haven't had a good path following run in many months.

Processing lag may account for instability in the pure pursuit path following I'm using. A simulation I ran, as well as a scholarly paper, point to this as a possible problem.

It's probably just a stupid bug, though.

With only 10 days left, let's hope I find it and all the other ones that matter.

Stay tuned!

Syndicated 2014-06-10 20:30:00 (Updated 2014-06-10 20:40:03) from Michael Shimniok

2 Jun 2014 (updated 3 Jun 2014 at 06:16 UTC) »

AVC: Heading Errors, Path Following Woes

A little over two weeks to go as I write this. That's not much time.

Testing uncovered a couple of obvious problems. The robot is experiencing heading errors and the pure pursuit path following is still working poorly.

Heading Error

I'm seeing unpredictable heading deviations possibly suggesting heading drift. The robot's ending position was incorrect, rotated counterclockwise, on three test runs.

The problem manifests most apparently in the first leg of the square course. In three autonomous runs plus one manual run, I could see in the GCS that heading was drifting off.

Time to look at logs and see what is going on. Looking at a plot of the log data you can see some problems right away.

The correct initial heading --from the starting point to the first waypoint-- was around 254 degrees. This plot shows activity from about 1 second after logging starts.

The red line, GPS heading, won't report a correct heading until it hits about 2-3m/s which occurs just after 2 seconds into the run. My code won't use GPS heading until then, relying only on gyro, which is corrected for bias at the starting line.

The blue line shows the time-lagged heading estimate based on gyro and GPS heading. The green line is the current heading estimate. The gyro is used to compute a new heading and error corrected with the time-lagged estimate.
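In other words, the lagging GPS heading gets compared against the estimate from when that measurement actually applied, not against the current one. A toy version of that scheme, with assumed names and gains, might look like:

```python
from collections import deque

class LaggedHeadingFuser:
    """Fuse fast gyro integration with a lagging GPS heading by comparing
    the GPS value against the estimate from lag_steps updates ago.
    A sketch of the general idea, not the actual Data Bus code."""
    def __init__(self, lag_steps, gain=0.2):
        self.history = deque(maxlen=lag_steps)  # past heading estimates
        self.heading = 0.0
        self.gain = gain

    def predict(self, gyro_rate, dt):
        self.history.append(self.heading)       # remember estimate pre-update
        self.heading = (self.heading + gyro_rate * dt) % 360.0
        return self.heading

    def correct(self, gps_heading):
        if len(self.history) < self.history.maxlen:
            return self.heading                 # not enough history yet
        lagged = self.history[0]                # estimate from lag_steps ago
        # wrap the innovation into [-180, 180) degrees
        err = (gps_heading - lagged + 180.0) % 360.0 - 180.0
        self.heading = (self.heading + self.gain * err) % 360.0
        return self.heading
```

If the delayed measurement agrees with what the estimate was back then, the innovation is zero and the current estimate is left alone, even though the vehicle has turned since.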

Both blue and green lines drift off the initial heading. Whether this is actual vehicle motion or gyro drift I don't know yet and will have to investigate.

Crystal clear is that the gyro and Kalman heading estimates are almost 5 degrees in error versus the GPS heading in this first stretch of the course. That's pretty terrible, and may explain what I witnessed in field testing.

What's going on? Possibly the code is fusing in GPS data before it converges to the correct heading. Or I need to include bias as a state in my Kalman Filter. Also, I'm using a new uBlox GPS with 4Hz update rate instead of the previous Venus GPS' 10Hz rate, so GPS heading will have less influence over gyro bias.

But what of that strange spike in the green and blue plots at just after 3 seconds? These spikes are all over the plots for my runs. It appears in the gyro estimate, current heading estimate, and time-lagged estimates.

The plot above shows the spikes pretty clearly. It also shows the Z-axis gyro values. I don't see any significant correlation between the estimate spikes and the gyro values. So the spikes aren't caused by shock, apparently.

There may simply be a bug in my computations. The gyro-only estimate is particularly bad with massive spikes appearing at seemingly random intervals.

That's the first place I'll start looking. The Kalman Filter smooths out this noise and error correction reduces the effect on the current heading estimate. The spikes don't appear to gravely affect heading estimate, in the end. But I don't see how a robot can hope to win with these kinds of errors present.

Path Following Instability

Path following is unstable. The robot has a hard time staying on the path. I tried a few different lookahead distances for the virtual 'rabbit' the robot chases around the course.

With a 2m lookahead distance, the robot experiences quite a bit of cross track error, particularly after the 2nd waypoint, with some fairly wide oscillation. This is what I saw several months ago and have procrastinated dealing with.

A longer lookahead distance should result in improved stability. I tried 3m which provided no appreciable change. With a 4m lookahead, the robot didn't turn fast enough to follow the path, but it was much more stable.

To verify the other extreme, I tried a short 1m lookahead; path following oscillated and was nearly unstable, as expected. I can try further tuning of the lookahead distance, sure.
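For reference, here's roughly how pure pursuit turns the lookahead point into a steering command, and why a longer lookahead damps the response (a sketch only, not the actual Data Bus code):

```python
import math

def pure_pursuit_steer(pose, lookahead_pt, wheelbase):
    """Pure pursuit: steer along the arc that passes through the
    lookahead point. pose = (x, y, heading_rad); returns a front-wheel
    steer angle in radians. Illustrative sketch."""
    x, y, th = pose
    dx = lookahead_pt[0] - x
    dy = lookahead_pt[1] - y
    # transform the goal point into the vehicle frame (x forward, y left)
    fx = math.cos(th) * dx + math.sin(th) * dy
    fy = -math.sin(th) * dx + math.cos(th) * dy
    dist_sq = fx * fx + fy * fy
    if dist_sq == 0.0:
        return 0.0
    curvature = 2.0 * fy / dist_sq   # curvature of the arc to the point
    return math.atan(wheelbase * curvature)
```

With the same lateral offset, a farther lookahead point yields a gentler steering angle. That's the stability-versus-tracking tradeoff in a nutshell: long lookahead smooths the oscillation but cuts corners.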

Meanwhile, Josh Pieper of Savage Solder fame suggested lag as a possible problem. This paper [HEREDIA-2007-ADVANCED ROBOTICS.pdf] suggests instability problems arise when there's a lag between sensor reading and control actuation, caused by computation delay and mechanical delay. I verified this a while back with simulations in Processing.

I'm seeing about 1.4ms of lag between the gyro read and when the steering is updated. The robot updates the steering position every 50ms. The servo is an entry-level Futaba S3003 with a relatively slow transit speed of 0.19s per 60°.

I'm not yet sure what the answer is for path following. Until I get the heading issues figured out there's little point in working on this issue.

Syndicated 2014-06-02 19:00:00 (Updated 2014-06-03 05:18:07) from Michael Shimniok

2014 AVC: Entry Video

Here's my Data Bus proof of concept video for the 2014 Sparkfun AVC.

Syndicated 2014-05-29 23:00:00 from Michael Shimniok

22 May 2014 (updated 29 May 2014 at 22:14 UTC) »

AVC: Pose and Map Display

To fix the path following algorithm on Data Bus I have to know what the robot is thinking.

To know what it's thinking, the robot is now sending some new data in its telemetry stream to the GCS.

  • All waypoints
  • Index of the next waypoint
  • Lookahead position

The GCS now opens a map window which scales and displays the data above as well as vehicle pose (position and heading).

But before I got to that point, when I first started coding up the map window, it was immediately obvious that the lookahead point was wrong. Below, the green dots are waypoints, the tiny yellow dot in the upper right is the rover, and the blue dot is the lookahead point.

Lookahead point. You're doing it wrong.

I was computing the position of the lookahead point relative to the previous waypoint, but failing to add the coordinates of that waypoint.

In my attempt to fix the problem I mistakenly added the relative distance to the robot's position. The map display made it obvious what was wrong. The lookahead point kept getting farther and farther away.
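In sketch form, the correct computation anchors the offset at the previous waypoint (a hypothetical helper, not the actual code):

```python
import math

def lookahead_point(prev_wp, next_wp, along):
    """Return the point `along` meters down the path segment, measured
    from the PREVIOUS WAYPOINT. The bug was returning the segment-relative
    offset without adding prev_wp back in; my first 'fix' added the
    robot's position instead, so the point receded forever.
    Hypothetical helper for illustration."""
    dx = next_wp[0] - prev_wp[0]
    dy = next_wp[1] - prev_wp[1]
    seg_len = math.hypot(dx, dy)
    return (prev_wp[0] + dx / seg_len * along,
            prev_wp[1] + dy / seg_len * along)
```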

I got it fixed. Then I added vehicle heading as a line. I draw waypoints in red and the next waypoint in green. I was able to verify that the code switches to the next waypoint properly. Heading looks good. Now I can do some diagnosis on the path following.

There. That's better.

Point is, visualization is awesome. I think it will be a big time saver for troubleshooting.

Syndicated 2014-05-22 14:00:00 (Updated 2014-05-29 22:00:51) from Michael Shimniok

20 May 2014 (updated 21 May 2014 at 21:16 UTC) »

Sparkfun AVC Update

The last few months, within the free nooks and crannies of an incredibly hectic life, I've done my best to fit in work on my Sparkfun AVC entries, Data Bus and the still-top-secret SHARC entry.

The triumvirate of robot expos are over: Robots at the Hangar, Denver Mini-Maker Faire, and Deep Space Open House. Prepping displays for these shows was daunting. My efforts to bring motion to Trash Bot succeeded. The Data Bus GCS is in good shape. Mom's clock project is wrapped up. And I've been working on OpenMV Camera as time permits with a small scale prototype run planned for the near future.

The challenges have been many. The family was hit with a string of illnesses, with my wife down for the count for a week straight in one case. I've been on travel almost 50% the last 6 weeks. My aging mom fell and went to the hospital and has been going through rehab. Numerous other regular life things have come along as well.

Several months ago, when I last tested Data Bus with the new path following algorithm, it made it to the second waypoint before its path started oscillating wildly.

In the interim I experimented with FreeRTOS and a number of code changes. But I never faced up to the strange path following behavior. After a couple of months I had made a mess of things and reverted to a much earlier version, removing the RTOS. I haven't done a single test run in many months.

Data Bus is sitting next to me as I type, sporting a number of changes and hopefully getting close to another test run. I've added a map display to the Ground Control System. The robot now informs the GCS of its estimated position, waypoint locations, and the location of the pure pursuit lookahead point, the "rabbit" it chases around the course. The lookahead data is going to be key to troubleshooting the problems I saw before. It also logs this data to the onboard microSD card.

The deadline for a proof of concept video is end of May so it's time to step it up and get this robot working again.

Syndicated 2014-05-20 15:00:00 (Updated 2014-05-21 20:08:14) from Michael Shimniok

AVC: Ground Control Station

With less than two months left to get Data Bus working, why am I working on Ground Control Station (aka GCS) software?

One reason, honestly, was to have something fun to show at Denver Mini Maker Faire. Another reason is that I'm convinced it'll help me save time troubleshooting. I'm also going to need it for the SHARC AVC entry.

The GCS Software

I had already written the GCS software in Java Swing, complete with Google Maps. What wasn't working was telemetry. Also, it was written for ancient Java 6. I updated to less ancient Java 7, a newer IDE, and more.

Observer Pattern

The gauges (GaugePanel) are implemented as JLayeredPane objects and each gauge needle is a JPanel added to the gauge (GaugeNeedle). The needles listen for changes to associated Property objects (like voltage, current, heading) that are set when reading data from the telemetry stream.

These Property objects implement an Observable interface, which means that whenever a new value is set, the object invokes a callback on a ChangeListener. The GaugeNeedle implements ChangeListener and its changed() callback method.

So what happens is the TelemetryParser reads, say, a voltage from the robot, sets the voltage Property, which then calls voltageNeedle.changed() and the needle updates its position.
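The GCS is Java Swing, but the pattern is easy to sketch in Python for anyone unfamiliar with it (class names borrowed from my code; details heavily simplified):

```python
class Property:
    """Observable value: setting it notifies every registered listener.
    (The real GCS is Java Swing; this is the same pattern in Python.)"""
    def __init__(self, value=None):
        self._value = value
        self._listeners = []

    def add_listener(self, callback):
        self._listeners.append(callback)

    def set(self, value):
        self._value = value
        for callback in self._listeners:
            callback(value)      # the "changed" notification

    def get(self):
        return self._value


class GaugeNeedle:
    """A needle that repositions itself whenever its Property changes."""
    def __init__(self, prop):
        self.position = 0.0
        prop.add_listener(self.changed)

    def changed(self, value):
        self.position = value    # the real needle would also repaint


# TelemetryParser reads a voltage, sets the Property, the needle follows:
voltage = Property()
needle = GaugeNeedle(voltage)
voltage.set(7.4)
```

The nice part is that the parser knows nothing about gauges; it just sets Properties, and whatever happens to be listening updates itself.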


Probably the hardest part of all of this was getting RXTX to work. I am going to abandon it in favor of other options at the earliest possible opportunity because it's always such a nightmare to get working and things are much worse now with 64-bit machines, 64- and 32-bit JVMs, the new Mac OS X version that appears not to be supported, and more.

Anyway, for better or worse I have a SerialPanel object that handles UI aspects of connecting to a serial port and receiving data, and then calling a callback (a Parser) to do something with the data. The TelemetryParser is responsible for splitting the CSV and setting the associated Property objects.

Map Visualization

Now that I'm on Linux, I can't use the Google Earth Plugin; it's not supported. One of the key features I need is the ability to quickly visualize the robot's pose estimate and bearing/distance to waypoint. So I'll have to get something working soon.


For the moment I have a stupid simple telemetry stream set up. It just spits out a header character, a CSV of the telemetry data, and a newline termination. I was scrambling to get this all done before Maker Faire, so there's more to do, like error detection and correction.
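A parser for a stream like that is only a few lines. Here's a sketch in Python rather than the GCS's Java, with a made-up header character and field order:

```python
def parse_telemetry(line, header='!'):
    """Parse one telemetry frame: header char, CSV payload, newline.
    The header character and field order here are assumptions for
    illustration; the real stream defines its own."""
    line = line.strip()
    if not line.startswith(header):
        return None              # not a telemetry frame
    fields = line[len(header):].split(',')
    names = ('voltage', 'current', 'speed', 'heading')  # assumed order
    try:
        return dict(zip(names, (float(f) for f in fields)))
    except ValueError:
        return None              # corrupt frame; no checksum yet
```

Dropping corrupt frames on the floor is fine for gauges that update several times a second; error detection and correction can come later.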


The robot logs a lot of data as it runs. I'd like to implement log file download within the GCS. Presently, my Java serterm serial terminal program serves this purpose. I'd like to run it all in the same software, however.

Also, it'd be handy to be able to record telemetry data for immediate playback without having to wait for (giant) log file download over serial. So that's also part of my plans.


It'd be nice not to have to remove the bot's cover and plug in a USB cable to check data after each run. That burns up precious time of which I have little.

I figure it'll also be helpful to watch robot telemetry as it runs. So I picked up a pair of Xbee Pro S1 radios to play with. Their 60mW output claims a mile of range. I need a couple hundred yards, tops.

See that blue Xbee board on there? That's new...
So far so good but I'll need to do some range testing and make sure there are no issues with the RC receiver and Xbee antennas sitting in close proximity. I realize lots of stuff shares the 2.4GHz band relatively successfully. Exactly how I don't quite know. Like I always say, "it should work"...

Syndicated 2014-05-15 17:00:00 (Updated 2014-05-15 17:00:01) from Michael Shimniok

Kids Like Taking Out Trash (With a Robot)

Where can you watch hundreds of kids volunteer to take out the trash? At the Denver Mini-Maker Faire, that's where!

Kids loved driving Trash Bot, grabbing, and dragging the recycling bin.

I spent most of last weekend manning the Bot Thoughts / SHARC combined booths where my remote control Trash Bot (TOTT Bot) was a huge hit with kids and adults alike! But that's not all.

This was my first Maker Faire of any sort, and it was awesome. I am interested in lots of disparate things from art to science and the Mini Maker Faire had all that and much more.

Our 3PI line followers were a big hit as well. We got a lot of questions about the line followers and a lot of interest. They are fun to watch. We probably blew through 4 packs of AAs, however.

Data Bus was there with my Java Ground Control Station (GCS) displaying telemetry data: voltage, current, speed, GPS satellite count, heading, and bearing to waypoint.

I was glad to meet several prospective AVC competitors and others interested in the Bus and talk tech with them.

My Raspberry Pi Rover made an appearance on Saturday but experienced technical problems at the end of the day taking it out of commission for the rest of the Faire.

I also showed off OpenMV Cam to several people. It ran face detection reliably for both days.

Next time the Faire is in town, you should go. It's absolutely awesome.

Syndicated 2014-05-08 16:00:00 from Michael Shimniok
