Sensors

Vision Sensors Through the Looking Glass

Posted 9 Mar 2007 at 17:49 UTC by steve

Nelson Bridwell writes, "One of the many excuses for not using stereo vision on mobile robots is the complexity and expense of arranging two synchronized cameras that can simultaneously capture image frames from different positions. However, in many cases it is possible to capture simultaneous images from two different positions, using only a single camera and a single element mirror arranged in a very simple geometry." For more details on this idea plus diagrams and photos, see Real Time Stereo from a Single Camera
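As a rough illustration (my own sketch, not from Nelson's write-up): assuming the mirror arrangement places the two views side by side within a single frame, one capture from the single camera can be split down the middle to give a perfectly synchronized stereo pair, for example:

import cv2

cap = cv2.VideoCapture(0)                    # the single physical camera
ok, frame = cap.read()
if ok:
    h, w = frame.shape[:2]
    left = frame[:, : w // 2]                # direct view (hypothetical layout)
    right = cv2.flip(frame[:, w // 2 :], 1)  # mirrored view, flipped back to match
    # Both halves come from the same exposure, so they are exactly simultaneous.
    # They can then go through the usual calibration/rectification and a standard
    # stereo matcher such as cv2.StereoBM_create().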

Using mirrors, posted 10 Mar 2007 at 09:59 UTC by motters » (Master)

I have seen similar methods being used years ago, but with a more complicated binocular periscope type of arrangement. I might try this technique myself at some point.

The only disadvantage to the mirror approach is that you lose half the resolution per image, and in stereo vision it's the horizontal resolution that counts (higher resolution means higher ranging accuracy), but some of the modern webcams have quite a high resolution anyway, so this may no longer be an issue.
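To put a rough number on the resolution point (a back-of-the-envelope sketch with assumed figures, not measurements): with depth Z = f*B/d, a one-pixel disparity error produces a depth error of roughly Z^2/(f*B), so halving the horizontal resolution halves the focal length in pixels and doubles the ranging error:

# f_px is the focal length in pixels, B the baseline in metres (both assumed values)
def depth_error(Z, f_px, B, disparity_err_px=1.0):
    return (Z ** 2) * disparity_err_px / (f_px * B)

B = 0.1                          # assumed 10 cm baseline
for f_px in (600.0, 300.0):      # roughly a 640-wide vs a 320-wide image
    print(f_px, depth_error(3.0, f_px, B))
# At 3 m range the error goes from about 0.15 m to about 0.30 m when the width is halved.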

Getting stereo vision systems to work is undoubtedly a complicated business, involving multiple kinds of algorithm. My own system, still in development, can be found here: http://code.google.com/p/sentience/

Is it possible to synchronise two USB or FireWire cameras?, posted 10 Mar 2007 at 10:59 UTC by dafyddwalters » (Master)

The mirror system suggested by Nelson Bridwell certainly seems to be an interesting solution to simultaneously snapping a pair of images.

I have done some experimentation myself with two FireWire webcams, and the problem I run into when the robot is moving at speed is that the images are not taken at exactly the same time, which introduces errors.

I wonder, does anyone happen to know if there's a programmatic way of persuading a pair of cheap consumer webcams (either USB or FireWire) to start taking an image at the same instant (i.e. within a millisecond or two)?
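One partial workaround, sketched below under the assumption that both cameras are driven through OpenCV, is to latch the two frames with back-to-back grab() calls and only decode them afterwards with retrieve(). That at least issues the two capture requests within a few milliseconds of each other, though it is not true synchronization:

import cv2

cam_left = cv2.VideoCapture(0)
cam_right = cv2.VideoCapture(1)

# Fire both "latch a frame now" requests as close together as possible...
cam_left.grab()
cam_right.grab()
# ...then do the comparatively slow decoding afterwards.
ok_l, img_left = cam_left.retrieve()
ok_r, img_right = cam_right.retrieve()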

Firewire Cameras, posted 10 Mar 2007 at 21:11 UTC by Nelson » (Journeyer)

This approach is so simple that I assumed that it must already be in use, but when I looked around on the web all I could find was a stereo photography parallel that was developed by Steve Hines several years ago: http://www.hineslab.com/MirrorStereo.html

Most of the inexpensive cameras are VGA (640w x 480h). If you need slightly more depth resolution you can rotate the camera to 640h x 480w. And if you want to try this out in non-real-time at ultra high resolution, try taking a photograph with your 7 MP digital camera! The camera-mirror alignment process is trivial compared to the Leica rangefinder approach.

For most stereo matching algorithms the Firewire cameras produce higher quality uncompressed images that do not wreak havoc on sensitive feature detectors. I use the Unibrain Fire-I board camera http://www.unibrain.com/index.html with the CMU 1394 Digital Camera API (Windows), which gives you very complete control and works with just about any Firewire camera, because they all use the same standard interface. http://www.cs.cmu.edu/~iwan/1394/

When I read the technical reports for the 2005 DARPA Grand Challenge, almost every report showed pictures of vehicles equipped with stereo pairs of cameras, but at the race just about all of them had been removed, presumably because of basic issues such as camera synchronization.

Stereo, posted 10 Mar 2007 at 22:50 UTC by motters » (Master)

I've been experimenting with stereo vision for a long time, and I don't know of any way to synchronise two cameras using software (I don't think it's possible). The best you can get away with is either to have the robot stop before it takes pictures, or to have it travel at a sufficiently low speed that the timing errors are not large. I'm gambling that the companies which manufacture webcams will eventually begin producing dual units and solve this problem for me.
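As a rough illustration of what "sufficiently low speed" means (my own assumed numbers, not measurements): with a capture offset of dt between the two cameras, forward motion adds about v * dt of error to the effective camera geometry:

v = 0.5        # robot speed in m/s (assumed)
dt = 0.033     # about one frame of desynchronization at 30 fps (assumed)
print(v * dt)  # ~0.017 m, i.e. nearly 2 cm of positional error between the two views,
               # which is large compared with a typical 10 cm webcam baseline.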

Of course you can buy synchronised and pre-calibrated stereo cameras off the shelf, but these are always very pricey items typically intended for industrial vision tasks.

I think the reason why stereo was not used more extensively in the Grand Challenge is firstly that stereo vision involves a lot of uncertainty, which ultimately arises from limited pixel resolution and ambiguous correspondences, and secondly that for instantaneous stereo pairs the effective range is quite limited. The way to deal with this, and get a much longer effective range, is either to track features over time or to use an occupancy grid based SLAM method, such as DP-SLAM. Both these approaches remain somewhat experimental at present.
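To illustrate the range limitation (assumed numbers again): the disparity available to measure falls off as f*B/Z, so at longer ranges the whole depth signal is only a few pixels and matching noise swamps it:

f_px = 600.0   # focal length in pixels (assumed, ~640-wide image)
B = 0.1        # 10 cm baseline (assumed)
for Z in (2.0, 5.0, 10.0, 20.0):
    print(Z, f_px * B / Z)   # disparities of 30, 12, 6 and 3 pixels respectively
# With only ~3 pixels of disparity at 20 m, sub-pixel matching errors dominate,
# hence the short practical range of a single instantaneous pair.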

In contrast, using laser rangers is much easier from a programming point of view, and the uncertainties are far smaller; each return is usually treated as a single point in space. Ultimately, though, I think multi-camera based ranging will trump lasers, since cameras will provide a very cheap solid state solution capable of grabbing far more data at each time step than a laser can.

Firefly MV, posted 11 Mar 2007 at 19:09 UTC by Nelson » (Journeyer)

It looks like the closest contender for an inexpensive FireWire camera that you can externally trigger for synchronization is the Point Grey Firefly MV. When Point Grey says inexpensive, I tend to be skeptical, but they were so cheap that they were giving them away free at their booth at the Machine Vision Show in Boston. They are supposed to cost less than $200.

Yes, the SICK sensors were the obvious choice for so many time-strapped DARPA teams because they had a serial interface and would generate fairly reliable range values, although they were good for only about 60 feet. (The high school team did not bother starting to work on their stereo until 2 weeks before the qualifying competition. They said to their technical advisor: "OK, we have 2 cameras. Now what do we have to do?")

In case you don't believe me..., posted 11 Mar 2007 at 19:11 UTC by Nelson » (Journeyer)

http://www.machinevisiononline.org/public/calendar/details.cfm?id=42
