Older blog entries for rudybrian (starting at number 35)

I was able to finish the cable harness repair job on Zaza on Saturday 8/2. It took a few more hours than expected, but it gave me the opportunity to clean up a bunch of the other wire management without fear of missing the following weekend's scheduled demo.

The demo on the 9th went as planned, but due to time restrictions we weren't able to do the regularly scheduled Phase IV test. Last weekend we were able to do both the demo and the Phase IV test, and things went quite well. Prior to starting the test, I upgraded the JRE on Zaza2 from Sun's 1.4.1 to the recently released 1.4.2. Improvements in the networking, Java2D, and audio code yielded noticeable reliability and performance gains in the face applet. The map and webcam applets running through the control interface on a remote Red Hat 9 laptop also worked quite a bit better. Since ALSA is now supported, it is now possible to run a Sphinx audio capture client onboard the robot at the same time as the face applet.

Since migrating from the original Breezecom 802.11 (FH) WLAN hardware to Wi-Fi, there has been a noticeable improvement in the performance of the localizer/planner during Phase IV tests. I attribute this to fewer laser scans being lost on their way to the localizer due to bandwidth saturation. Zaza no longer inexplicably 'gets lost' or 'gives up' searching for a route to the next target after being blocked by visitors for an extended period of time.

Things with work have settled down a bit, so I was able to spend yesterday working on a few of the outstanding issues with the Phase IV code. After adding the button management and destination announcement code a while back, we noticed a need to better integrate the two. I made some modifications to poslib and poslibtcx, adding handshaking that allows the robot to announce the list of planned destinations when it begins moving, as well as when a user presses a button while the robot is in motion. After stopping at a target destination, the button management behavior changes; it will eventually be used to speak additional information about the exhibit the robot is nearest to.
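The state-dependent button behavior works roughly like this. Here's a minimal Python sketch with invented names and messages, just to illustrate the idea; it is not the actual poslib code:

```python
# Illustrative sketch of state-dependent button handling: while the robot is
# moving, any button repeats the planned-destination announcement; while
# stopped at an exhibit, buttons select additional information about it.
# All names and strings here are made up for the example.

class ButtonManager:
    """Maps button presses to announcements based on the robot's motion state."""

    def __init__(self, planned_destinations, exhibit_info):
        self.planned = list(planned_destinations)
        self.exhibit_info = exhibit_info      # exhibit name -> {button: extra text}
        self.moving = False
        self.current_exhibit = None

    def start_moving(self):
        self.moving = True
        # Announce the tour plan as soon as the robot sets off.
        return self.announce_plan()

    def arrive(self, exhibit):
        self.moving = False
        self.current_exhibit = exhibit

    def announce_plan(self):
        return "Next stops: " + ", ".join(self.planned)

    def on_button(self, button):
        if self.moving:
            # In motion: any button repeats the planned-destination list.
            return self.announce_plan()
        # Stopped at an exhibit: buttons select additional information.
        info = self.exhibit_info.get(self.current_exhibit, {})
        return info.get(button, "No more information available.")
```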

Since the basic operation of the tourguide code is now stable, it is time to start improving the web-based control interface to support dynamic map and start-position selection, so the robot can operate in this mode in other areas of the museum without substantial manual tweaking.

Things have been rather busy with work over the last few months, so I haven't had much time to work on Zaza. To compound the issue, the robot's +5V booster DC-DC converter died on the 28th of last month and nearly vaporized some of the cable harness connectors that weren't rated for the load when running off of only one converter. Fortunately, a local surplus shop had a cheap drop-in replacement for the booster module. I was able to get some of the repair work done yesterday, but will need to spend some time on an upcoming weekend to wrap things up.

There are a couple items of note since the last diary entry, so heck, why not write another one :)

We finally got the bridge hardware necessary to port Zaza over to the Tech's 'new' 802.11b wireless network two weeks ago. The extra 8 Mbps noticeably improves the response time of the face applet and the framerate of the webcam. The museum's IT folks finished upgrading the firmware on all of the APs last Friday, and on Monday resolved a cabling issue that was causing problems with an AP in the Exploration gallery. By Friday it should be possible to wander the robot through the whole museum again (Huzzah!).

I spent some time over the last month re-writing the map applet so that it would be easier to make UI changes using NetBeans' Form Editor. The earliest versions of the applet had been hand-written, then eventually imported into NetBeans. Since the import tool has no facility for auto-generating forms, UI changes were a bit more involved. I was also able to fix a few more bugs in this release. Getting position updates now happens on a background thread independent of the Swing GUI, so there is a noticeable performance increase. I also fixed a bug that prevented the scroll bars around a map from working properly in auto-tracking mode.
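For anyone curious, moving the position updates off the GUI thread is the classic producer-consumer split: a worker thread does the (potentially blocking) network reads and hands results to the UI through a queue. A rough Python sketch of the pattern follows; the real applet is Java/Swing, and all the names here are made up:

```python
# Sketch of fetching position updates on a background thread so slow reads
# never block the GUI. The "network read" is faked with an iterator; in the
# real applet it would be a blocking socket read.

import queue
import threading

def fetch_positions(source, updates, stop):
    """Background thread: poll the position source and enqueue each update."""
    while not stop.is_set():
        try:
            pos = source()          # stands in for a blocking network read
        except StopIteration:
            break                   # source exhausted; end the thread
        updates.put(pos)

# A fake position source standing in for the robot's network connection.
_positions = iter([(1.0, 2.0), (1.5, 2.1), (2.0, 2.3)])
def fake_source():
    return next(_positions)

updates = queue.Queue()
stop = threading.Event()
worker = threading.Thread(target=fetch_positions, args=(fake_source, updates, stop))
worker.start()
worker.join()

# "GUI thread": drain whatever has arrived without ever blocking.
received = []
while not updates.empty():
    received.append(updates.get_nowait())
```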

A bunch of new pictures have been added to Zaza's Hardware page, taken by Robbie, one of the project volunteers, during the major repair of Zaza's base last September. If you have never seen the inside of a synchro-drive system, have a look; it's quite interesting.

4 Feb 2003 (updated 4 Feb 2003 at 20:00 UTC) »

It's been a while, so I guess it's time for an update ;)

Back in November I fixed a bunch of bugs and added some substantial features to Zaza's voiceServer. The module now has full support for Sable markup in input voice cues. This allowed us to reintroduce the sound effects we used with the old VB version of Zaza's face.

I finally managed to squash a longstanding bug a few weeks ago in Zaza's tourguide code that would occasionally cause the robot to not notice that it had arrived at an exhibit destination. It's not an elegant solution, but it works consistently.

Last week I added initial back-end support to the poslib module for button interactivity during a tour. This will (eventually) allow visitors to press Zaza's three buttons when she is stopped at an exhibit to hear additional information about it (beyond the standard monolog), select a language, etc. It also announces the destinations the robot plans to visit when a button is pressed while in motion.

We have managed to max out our offboard server during the Phase IV test runs. During previous tests the robot would inexplicably get lost from time to time. We had attributed this to a network or hardware glitch, but it appears that the machine would become I/O-bound when running the web-based GUI locally, causing the localizer to miss critical updates from the base and laser servers. Running the web interface on an external laptop seems to improve reliability. Ideally we would have dedicated machines for the web interface, video server, and localizer/planner, but a single-processor PIII 800 is all we have available at the moment.

Zaza is back in action!

The encoder adapter cable arrived from iRobot on 10/17 allowing the installation of the last replacement motor on 10/18. Everything with the new motors and belts checked out during tests in the afternoon.

The following week the Museum's shop was able to mill some cooling/speaker grille slots in the upper-enclosure panels. They did a fantastic job, and the slots do everything we had hoped they would. The internal temperature never gets above 94 degrees even under high load, and the audio fidelity of the synthesized speech is much better.

Robbie Stone, one of the other Zaza project volunteers, discovered what Sebastian has been up to over the past year or so. After the talk he gave at PARC in September of last year, he hinted that an updated and open-sourced (yea!) version of BeeSoft was in the works. The CARMEN toolkit is the fruit of their labor, and it looks like it could be a very good thing. Included in the standard distribution are a Monte Carlo Localization (MCL) application (localize) and a path planner (Navigator), as well as map creation and editing tools. The CARMEN toolkit uses Reid Simmons's IPC communication framework, obsoleting BeeSoft's TCX. The downside is that using it with Zaza will require re-writing all of our software :(
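For the curious, the core idea behind MCL fits in a few lines: keep a cloud of candidate poses ("particles"), shift them by the commanded motion, weight them by how well each explains the sensor reading, and resample. This toy 1-D particle filter is only an illustration of that loop, not CARMEN's actual localize code; the landmark map, trajectory, and noise values are all invented:

```python
# Toy 1-D Monte Carlo Localization: a robot drives down a corridor with
# known landmarks, sensing the distance to the nearest one. Particles that
# explain the measurements survive resampling; the cloud converges on the
# true position. Parameters are invented for the example.

import math
import random

random.seed(42)

LANDMARKS = [20.0, 45.0, 90.0]      # known feature positions along the corridor

def nearest_landmark_dist(x):
    return min(abs(x - lm) for lm in LANDMARKS)

def mcl(true_start=30.0, step=5.0, n_steps=12, n_particles=1000,
        motion_sigma=1.0, sensor_sigma=2.0):
    particles = [random.uniform(0.0, 120.0) for _ in range(n_particles)]
    truth = true_start
    for _ in range(n_steps):
        truth += step
        z = nearest_landmark_dist(truth) + random.gauss(0.0, sensor_sigma)
        # Motion update: move every particle by the commanded step, plus noise.
        particles = [p + step + random.gauss(0.0, motion_sigma) for p in particles]
        # Measurement update: weight each particle by observation likelihood.
        weights = [math.exp(-(z - nearest_landmark_dist(p)) ** 2 /
                            (2.0 * sensor_sigma ** 2)) for p in particles]
        # Resample in proportion to weight.
        particles = random.choices(particles, weights=weights, k=n_particles)
    estimate = sum(particles) / n_particles
    return estimate, truth

estimate, truth = mcl()
```

Real MCL works over (x, y, theta) against a laser scan and an occupancy-grid map, but the predict/weight/resample loop is the same.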

Zaza's replacement belts and motors arrived on the 3rd, but there is a problem with the encoder on one of the motors. The new motor uses the current HP encoder technology and a different cable from the old one. Unfortunately, iRobot didn't ship the needed adapter cable with the order, so we have to wait yet another week before it arrives and we can do the hardware checkout and qualification before returning Zaza to service.

On a positive note, I spent some time over the last few weeks improving the client-server communication scheme for the voiceServer and face applet. The old method used a CGI-based polling technique that was rather slow and inefficient even with mod_perl. The new method uses a Perl POE-based server-side interface (POEfaceClient) to manage each of the connected face applets. This greatly reduces the amount of handshaking needed to keep current with the voiceServer's cue stack. Another advantage of using shared memory is that backward compatibility is retained, so clients connecting through a firewall can still use the old interface method if needed.

The new client handshaking technique required re-writing several key areas of the face applet, so I renamed it zazaface2 so there won't be any caching problems with older browsers. The 2.01 release supports auto-reconnect on socket errors and can now handle a server disconnect gracefully. I'll probably add an auto-fallback-to-CGI handshaking option in 2.02, and make a few of the options applet parameters instead of hard-coding them.

The Tech received a rather sizable donation of Intel PRO/Wireless 5000 (802.11a and 802.11b) wireless LAN gear and is getting ready to install it. This opens up all kinds of options for Zaza, including real-time high-rate video and audio streaming. I have started the search for an Ethernet-to-802.11a bridge that will allow use of both of the onboard computers, or alternatively a PCI card with Linux drivers available and an external antenna that can be positioned somewhere in the acrylic hood...

It turns out that Zaza's base troubles are a bit worse than I had originally estimated. After pulling each of the motors and running them through a few tests, I found that all three drive (translation) motors are bad. Fortunately, the Tech is closed for renovation for the entire month of September, so we should be able to get her back up and running by the time the museum re-opens.

During the month-long downtime we should be able to make a few other enhancements, like adding cooling and speaker vents to the robot's upper enclosure, similar to what the newer B21rs have. This should allow us to keep the enclosure doors closed all the time without the potential for overheating. The speaker audio vents should help improve speech intelligibility too.

I recently added support for Sable markup to the voiceServer. There are some quirks, but it's now possible to re-introduce support for the sound effects we used with the original VB-based face application. Sable also allows support for multiple voices and languages, as well as specifying pronunciation and inflection to improve the quality of the speech.

It's been a while since the last post, so I guess it's time ;)

We have had to push back the Zaza Phase IV deployment plans until we can resolve a voice intelligibility problem introduced with the new speech system. I had originally been using the MBROLA 'us1' voice with Festival, but both the quality and pitch of the voice were too low. I made several attempts to improve recognition accuracy by pitch-shifting the waveform and applying high-pass filters to block out some of the resonant frequencies of the enclosure, but they seemed to have no effect on intelligibility. I recently switched to the OGI CSLU 'tll' voice, which isn't quite as good as the M$ SAPI4/5 'Mary' voice, but is a marked improvement over the MBROLA voice. Initial tests this past weekend showed a remarkable improvement in recognition accuracy. Some of the informational text monologs for the exhibit locations were taken directly from the museum's webpage and should probably be re-worded to improve pronunciation with the TTS engine. It might also be beneficial to introduce support for Sable in the near future.
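For reference, the filtering experiments amount to something like this simple first-order high-pass filter, which passes speech content but bleeds off the low frequencies where an enclosure tends to resonate. The cutoff and sample rate below are illustrative only, not the values actually tried on Zaza:

```python
# Minimal first-order (one-pole) high-pass filter:
#   y[n] = a * (y[n-1] + x[n] - x[n-1]),  a = RC / (RC + dt)
# Low-frequency (and DC) content decays toward zero; fast changes pass through.

import math

def high_pass(samples, cutoff_hz, sample_rate_hz):
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    dt = 1.0 / sample_rate_hz
    a = rc / (rc + dt)
    out = [samples[0]]
    for i in range(1, len(samples)):
        out.append(a * (out[-1] + samples[i] - samples[i - 1]))
    return out

# A constant (DC) input decays toward zero, which is exactly the behavior
# a high-pass filter should show on low-frequency rumble.
dc_response = high_pass([1.0] * 1000, 200.0, 8000.0)
```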

Zaza had another major hardware failure on Saturday. Since her hardware has seen quite a bit of use, the belts that link all of the wheels to the motors are worn and due for replacement. It also looks, or rather sounds, like one of the motors is in desperate need of new brushes. The amount of odometric error has been growing progressively worse and has begun to negatively influence the localizer. The 'MCP' has begun 'limping' the drive motors after most motion commands with a 'TERR' (translate error). This needs to be fixed before we can run the robot again. Hopefully RWI will be forthcoming with the info we need to service the robot...

17 Jun 2002 (updated 17 Jun 2002 at 19:01 UTC) »

Whooboy... The Zaza Phase IV deployment target is edging ever closer, and quite a bit has been done to wrap things up, but there is more to do. I have made major updates to reaction, poslibtcx and poslib and a bunch of minor updates to the map applet, face, and voice system since the last post.

Last weekend's test run went very well. All of the software modules performed flawlessly. The only problem observed was an ACCESS.bus lockup at the end of the run.

Three weeks ago the DC/DC converter that supplies the -12V reference voltage for both of the motherboards on the robot failed and needed to be replaced. Unfortunately, the RocketPort serial board needs this voltage to drive the serial interfaces for three of the robot's subsystems. A drop-in replacement part wasn't available, so a TI PT4224 had to be installed in its place. The new part is substantially more efficient, but it is larger and has a different form factor, so an adapter board had to be made. Removing the old part was a real exercise: the power distribution board needed to be removed from the robot, which took about an hour and a half, and then the old part had to be delicately removed with a Dremel tool and a large pair of Vise-Grips ;) The new part was installed and tested operational less than a week after the old part failed, but I might have stressed the ACCESS.bus cables a bit while re-installing the power distribution board, making the overly sensitive bus even more flaky. I'll need to fix this before next weekend's run.

I'll be announcing a limited public run of the new web-based tourguide functionality (Phase IV) for TRCY club members this week for next Saturday's run. If you aren't already a member, join and get in on the fun!

Last Saturday's Phase IV test went pretty much as planned. We didn't have any collisions or close calls during the run. During the test, another demo activity converged in the 'atrium area' on the lower level of the museum, blocking Zaza's only route to her next goal. The large number of people, together with the unmapped obstacles currently in the area from the installation of the 'Play' exhibit in the temporary exhibit space, prevented Zaza's localizer from getting a revised position for an extended period of time. She began searching for an obstacle that looked familiar, but the visitors were so engaged in the demo that they would not let her pass. The 'reaction' module was running, and the robot began verbalizing her dislike of being blocked to the demo audience, to the dismay of the presenter ;) To keep Zaza from further interfering with the demo, we manually joysticked her out of the area. The remainder of the test was fairly uneventful. I was finally able to get the high-level people detection code working about half-way through the run, and we used it for the remainder of the test. If it proves to do a better job of detecting people, I'll write a new version of reaction to support it in the Phase III and IV operation modes.

I made a few architectural improvements to the voice/face system over the last few days. The voiceServer now maintains a 'stack' of the last n cues in shared memory to give slower asynchronous clients a loss-free way to get speech cue data. This should eliminate the possibility of losing cues that are sent too quickly to be spoken in real time. The change required updating the clients and applet, but it was worth it.
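The cue 'stack' amounts to a bounded buffer with monotonically increasing sequence numbers: a slow client just asks for everything after the last cue it saw. A minimal Python sketch of the data structure follows; the real voiceServer keeps this in shared memory, and all the names here are invented:

```python
# Sketch of a bounded cue buffer with sequence numbers. Old cues eventually
# fall off the back, but any client that is at most max_cues behind can
# catch up without losing anything.

from collections import deque

class CueStack:
    def __init__(self, max_cues=32):
        self.cues = deque(maxlen=max_cues)   # (seq, text) pairs, oldest first
        self.next_seq = 0

    def push(self, text):
        """Server side: record a new speech cue as it is spoken."""
        self.cues.append((self.next_seq, text))
        self.next_seq += 1

    def since(self, last_seen):
        """Client side: fetch every cue newer than the last one consumed."""
        return [(seq, text) for seq, text in self.cues if seq > last_seen]
```

A client only needs to remember one integer (its last-seen sequence number) between polls, which keeps the handshaking cheap.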

Just for grins, I ran the code I have written since acquiring the robot through SLOCCount; quite amusing this obsession is ;)

