Older blog entries for cschur (starting at number 43)

Well, I've concluded that the RIS software that comes with the Mindstorms kit is truly wimpy! You can hardly do anything with it. But if I keep in mind that I'm only trying to prove the validity of new mechanical designs, then it's OK. It's hard to get serious about months of hard work on a robot you'll have to tear apart to get the kit back when it's over. Enough said on Lego.

I'm working on two interesting robots now. The first is a precision navigation robot, to experiment with pinpoint navigation to a given point and back. I'll need this experience to start the next huge robot project, the GeoBot, a geologic explorer robot that will autonomously go to a specific location (my yard first!), collect interesting rocks, and THEN bring them back to me all by itself. It will be awesome. But first, I am using my PICbot to work out the navigation procedures and get them down.
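The core of that kind of point-and-return navigation is dead reckoning from the wheel encoders. Here's a minimal sketch of the standard differential-drive odometry update; the wheel diameter, tick count, and wheelbase below are placeholder numbers I made up, not PICbot's actual geometry:

```python
import math

# Hypothetical robot geometry -- substitute your own measurements.
WHEEL_DIAMETER_IN = 2.5      # drive wheel diameter, inches
TICKS_PER_REV = 128          # encoder ticks per wheel revolution
WHEEL_BASE_IN = 6.0          # distance between the two drive wheels

IN_PER_TICK = math.pi * WHEEL_DIAMETER_IN / TICKS_PER_REV

def update_pose(x, y, theta, left_ticks, right_ticks):
    """Dead-reckon a new (x, y, heading) from one sampling interval's
    encoder counts, using the usual differential-drive approximation."""
    dl = left_ticks * IN_PER_TICK
    dr = right_ticks * IN_PER_TICK
    d = (dl + dr) / 2.0                  # distance moved by robot center
    dtheta = (dr - dl) / WHEEL_BASE_IN   # heading change, radians
    # advance along the average heading over the interval
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    return x, y, theta + dtheta

# Drive straight for 10 intervals (equal tick counts on both wheels):
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = update_pose(*pose, 64, 64)
print(pose)
```

Running this repeatedly between sensor samples gives the robot a running estimate of where it is, which is what "go there and come back" ultimately rests on; the weakness is that encoder error accumulates, so short trips work much better than long ones.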

The other robotic device will make my Canon 10D automatically take long exposures of the night sky on a tracking platform, much like my Aurora Cam robot. A plunger will push the remote button to start, and then it will time all the exposures all night long, unattended. This will also be a cool project.

Well, that's what I'm working on for now. Write me if you have any great ideas to include in my upcoming GeoBot - the rocks are waiting to be discovered!



Hi all,

I am now working on two separate robotic projects. I have to temper this work with my astrophotography (www.schursastrophotography.com). On one front, I've spent the $200 and got one of those Mindstorms kits. I always wanted to know why there was such a cult madness about that system. I built my first robot with it, a simple wall-avoiding, light-sensing rolling bot I called XBot. You can see it here (don't laugh):


Now that you're done laughing, we'll see where I head with this. I expect to test new mechanical designs primarily, because the RCX code is so, well, wimpy.

Next, I'm starting a new multiprocessor PIC robot project, NAVbot. My plan is to get a small tabletop robot to navigate very accurately to precise coordinates, and to experiment with the CMUcam. The goal is to get the routines and math down to incorporate into the much larger tracked GeoBot, an outdoor bot that uses a Stuart M5 tank base stripped to the bone.
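The basic math for "navigate to precise coordinates" is computing how far to turn and how far to drive from the current pose to the goal. A small sketch of that calculation (function name and the turn-then-drive approach are my own illustration, not NAVbot's code):

```python
import math

def bearing_and_range(x, y, heading, gx, gy):
    """Return (turn_angle, distance) needed to face and reach goal
    (gx, gy) from pose (x, y, heading). Angles are in radians, and the
    turn is normalized to [-pi, pi) so the robot takes the short way."""
    dx, dy = gx - x, gy - y
    distance = math.hypot(dx, dy)
    turn = math.atan2(dy, dx) - heading
    turn = (turn + math.pi) % (2 * math.pi) - math.pi
    return turn, distance

# From the origin, facing along +x, to the point (3, 4):
turn, dist = bearing_and_range(0, 0, 0, 3, 4)
print(turn, dist)
```

A simple controller then rotates in place by `turn` and drives `dist` forward, re-checking odometry along the way; fancier schemes blend the two, but turn-then-drive is the easiest to debug on a tabletop bot.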


Hi all,

Well, we finally did it. We finished our PAAMI project, and the last goal was realized this weekend: to locate, collect, and deposit an empty soda can in the charger bin. I've put together some movies of this final project; you can find them here:


Thanks for looking! Chris

Made more progress on the final challenge for PAAMI. The topmost processor in the hierarchy is the one that will search for, identify, and grab a soda can, and then deposit it into a bin on top of the charger. I've programmed all the pieces separately, and they all work; now it's a matter of assembling them into a complex finite state machine. Nonetheless, as of last night, when the "can search" switch is thrown, the robot goes into a can-search routine. The robot goes forward or avoids obstacles; then every 5 seconds the new processor subsumes, stops, and rotates 360 degrees in 55 steps while scanning for a can with the new can sensor. If a can is seen, it stops, makes the "happy sound", and for now stays put. Eventually it will approach the can to put it in the grab zone, pick it up, and go off to the charger.
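The roam/scan behavior above maps naturally onto a small finite state machine. Here's a rough Python sketch of that structure; the 55-step rotation is from the description above, but the state names, tick rate, and action strings are my own stand-ins, not PAAMI's actual code:

```python
# States of a hypothetical can-search FSM (names are illustrative)
ROAM, SCAN, FOUND = range(3)

SCAN_STEPS = 55     # one full 360-degree rotation in 55 steps
ROAM_TICKS = 50     # ticks of roaming between scans (~5 s, assumed rate)

def step_fsm(state, step, ticks, can_seen):
    """Advance the FSM one tick; returns (state, step, ticks, action)."""
    if state == ROAM:
        ticks += 1
        if ticks >= ROAM_TICKS:
            return SCAN, 0, 0, "stop_and_scan"      # subsume lower levels
        return ROAM, step, ticks, "drive_forward"
    if state == SCAN:
        if can_seen:
            return FOUND, step, ticks, "happy_sound"
        step += 1
        if step >= SCAN_STEPS:
            return ROAM, 0, 0, "resume_roaming"     # full circle, no can
        return SCAN, step, ticks, "rotate_one_step"
    return FOUND, step, ticks, "halt"               # stay put, for now

# Quick simulation: a can comes into view at scan step 10.
state, step, ticks = ROAM, 0, 0
log = []
for _ in range(120):
    can_in_view = (state == SCAN and step == 10)
    state, step, ticks, action = step_fsm(state, step, ticks, can_in_view)
    log.append(action)
print(log[-1], state)
```

Keeping each tick's transition in one pure function like this makes the FSM easy to extend later with approach-and-grab states without touching the existing ones.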

One issue: if I don't leave at least 4 to 6 seconds between can-find processor subsumptions, the robot has some difficulty escaping objects, since it takes a few seconds to move away from impacts. This leaves gaps in the coverage of the zone around the robot when searching for a can. Right now it can reliably spot a can up to 16 inches away. Pretty cool, ay?


Hi All,

Haven't posted a message in a while; I've been busy getting my act together for the upcoming Riverside Telescope Makers Conference in California. I'm giving a talk there.

On the bot front, I've been struggling to get the A/D converters working right in PicBasic with the 16F73 device. It works fine for one A/D on any channel, but if I tried to take measurements from two channels consecutively, it would muck up. Finally, I had to use the ADCIN command, and it works perfectly. I now have a can sensor that, on the bench, gives a go/no-go lamp when a soda can is within its beams. I'm adding the sensor array to the front of PAAMI, and have now wired it in to the last, level 8 processor. Now on to programming the FSMs for this level!
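A common cause of exactly this symptom (my guess at what was happening here): after switching the A/D multiplexer to a new channel, the sample-and-hold capacitor still carries charge from the previous channel, so converting immediately blends the old and new voltages. ADCIN can insert the needed acquisition delay for you. A toy model of the effect, with made-up voltages and settling behavior:

```python
# Toy model of why back-to-back A/D reads on different channels misbehave.
# The numbers and 90%-per-step settling rate are invented for illustration.

class ToyADC:
    def __init__(self):
        self.cap = 0.0           # voltage on the sample/hold capacitor

    def select(self, channel_voltage, settle_steps):
        """Switch the mux to a channel and let the cap settle N steps."""
        for _ in range(settle_steps):
            self.cap += 0.9 * (channel_voltage - self.cap)

    def convert(self):
        return round(self.cap * 255 / 5.0)   # 8-bit result, 5 V reference

adc = ToyADC()
adc.select(5.0, settle_steps=10)   # channel A at 5.0 V, fully settled
a = adc.convert()
adc.select(1.0, settle_steps=0)    # switch to channel B, no settling!
b_bad = adc.convert()              # still reads mostly channel A
adc.select(1.0, settle_steps=10)   # allow acquisition time first
b_good = adc.convert()
print(a, b_bad, b_good)
```

The "bad" read of the 1.0 V channel comes back looking like the 5.0 V channel, which matches the two-channels-consecutively muck-up described above.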

Write me!

Chris Schur comets133@yahoo.com

Hi all,

Well, here it is - our first operational images and movies of our new robot arm/gripper on PAAMI. Right now I'm just showing you how it works and how it will be used. The arm motor is slow right now to keep the arm from jerking when it lifts, and will be replaced with a standard servo, which is on order. This is a great example of priority arbitration architecture: the highest level is now the Grab module, which subsumes the other levels when triggered.
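The priority arbitration idea itself fits in a few lines: each behavior either asserts a motor command or passes, and the highest-priority behavior that asserts wins. A minimal sketch (behavior names and command strings are illustrative, not PAAMI's wiring):

```python
def arbitrate(behaviors):
    """behaviors: list of (name, command_or_None), highest priority first.
    The first behavior asserting a command wins; the rest are subsumed."""
    for name, command in behaviors:
        if command is not None:
            return name, command
    return "idle", "stop"

# Grab is silent, so cruise drives the motors:
print(arbitrate([("grab", None), ("avoid", None), ("cruise", "forward")]))
# Grab triggers and subsumes everything below it:
print(arbitrate([("grab", "close_gripper"),
                 ("avoid", None),
                 ("cruise", "forward")]))
```

In PAAMI's hardware version the "list" is the wiring between parallel PIC processors rather than a loop, but the winner-takes-the-motors logic is the same.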



Chris comets133@yahoo.com

Hi all,

Another update on PAAMI. We are on the last processor module we intend to implement: the can seek-and-grab module. I'm using a PIC16F73, which has 22 I/O pins and a nice four A/D inputs at 10 bits. It's also great because it will hold 8K of program memory. This will be the 12th parallel processor working in her. This weekend I wired in the servos on the arm, including the lift and grab motors, and got the new processor up and running. I was able, FOR THE FIRST TIME, to have the robot reach out, grab a soda can, lift it, and store it on its back. Pretty cool, ay?

Now here's the freaky part. I removed one of the two boards in the bot and had it on the bench to add the new processor. But the other parallel processors on that board were also powered up, and were alive and thinking! While I tested power and voltages on the socket for the new processor, the dozen or so lights on the board were flashing, showing status and trying to drive the bot around! Oh man, that's kind of freaky, working on a living AI....

Chris comets133@yahoo.com

Hi all,

Exciting news - our last endeavor on PAAMI is to get the can-lifting arm and gripper working. This weekend I added the Lexan gripper, driven by one tiny servo, and the arm, a pivoting horseshoe-shaped bar that swings over the top of the robot, over the beacon dome, and lowers the gripper to the position the can will eventually be in. I'm controlling it manually right now with a servo driver box, but after I refine the mechanism, which works darn well already, I'll wire it to the motherboard and start the final level of subsumption: the can grab-and-deposit level. Videos to come soon!

Chris Schur

Hi all,

Here are some more IR images of the robot in the dark. You may be interested in the patterns projected onto the ground by the GP2D120 and GP2D12 sensors:



We finished the drop-off sensor array for our priority arbitration bot, PAAMI, this weekend. The final touch was to add two GP2D12 sensors on each side of the rear caster wheel, with a small 12F675 PIC to convert the analog to digital and drive the IMPACT processor. The robot is programmed to go forward a short distance ballistically when a rear drop-off sensor swings over the stairwell. You can either rotate your back wheel over the stairs and fall off, or back up over it - either way, it's doom. The other sensors were tilted out a bit on both sides of the main drive wheels to add some additional buffering against drop-offs.

Finally, I've done some IR imaging of the GP2D120 and GP2D12 sensors' light beams. You may find this interesting, and I'll post some images later this week, but in essence, the beam diameter is pencil-eraser sized in the first few inches, with a sharp edge. At 1 foot it's up to half an inch, and by 2 feet up to an inch and becoming very diffuse.
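The rear drop-off rule is a nice example of a ballistic escape. A GP2D12 aimed at the floor returns a strong reading when the floor is there and a weak one when it isn't, so the decision is a simple threshold; the constant and action names below are placeholders, not PAAMI's actual values:

```python
# Sketch of the rear drop-off behavior described above. A floor-pointed
# GP2D12's output drops when the floor disappears, so "no floor" shows
# up as a low A/D reading. Threshold is a made-up placeholder.

FLOOR_THRESHOLD = 80   # A/D counts; below this = no floor under sensor

def rear_dropoff_action(rear_reading):
    """When the rear caster sensor loses the floor, the only safe
    ballistic move is straight forward: backing up or pivoting would
    swing the caster out over the edge."""
    if rear_reading < FLOOR_THRESHOLD:
        return "forward_ballistic"   # drive ahead a fixed short distance
    return "no_override"             # let lower levels keep control

print(rear_dropoff_action(40))    # stairwell under the caster
print(rear_dropoff_action(150))   # solid floor
```

"Ballistic" here means the forward move runs to completion without consulting the sensors again, which is exactly what you want when any other reaction leads off the edge.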


