Category Archives: Robots
Robo-Magellan Robot Project – “Odyssey”
After many months of effort, here is Odyssey, our SRS/Servo Robo-Magellan contest entry. What is Robo-Magellan, you ask? It is a contest devised by the Seattle Robotics Society, inspired by the problems to be solved for the DARPA Grand Challenge. It is an outdoor, autonomous, robotic navigation contest. A Robo-Magellan robot must be able to autonomously drive around a 500 ft. square area of the Seattle Center, avoiding trees, rocks, park benches, garbage cans, light posts, railings, sculptures, and many other obstacles, and find orange road cones placed at specific waypoints. The GPS coordinates of the cones are given out 30 minutes in advance of the contest, and the robot that navigates autonomously from the start to the finish cone in the shortest amount of time wins the contest. The only remote control allowed is a safety shut-off switch, should the robot run into trouble. Each robot gets three tries to navigate to the final cone, and the best of the three times is used for its final score.
Our robot is a monster. The contest rules state that the robot cannot weigh more than 50 pounds; Odyssey weighs in at 48 pounds even. Bob and I have worked hard and long on this robot, and inside, it contains a culmination of all our electronics efforts to date. Of course, this robot would never have come together if it weren't for our sponsor, NPC Robotics (plug-plug). They provided the excellent motors, and a motor controller that can move this beast over almost any terrain with ease, without any shortage of power or torque.
Bob shot the initial snapshot above during the final assembly phase of the robot. Notice once again Bob's excellent CAD design, and how closely the final product resembles the drawing. I think the only major changes made were the tail-wheel assembly and the location of the GPS dome. Since weight was such an issue, Bob literally weighed every nut and bolt on the robot and had the CAD program calculate the final weight, so we knew that if we built the design, we would be under the 50 pound weight limit.
Here is a laundry list of technologies we have on-board:
Almost every board in our BotStack robot bus: main, sensor, CPU, navigation, camera, radio, and motor control
Environmental Grade Sonar Array
GPS System with WAAS correction
6-degree of freedom inertial guidance system (The grey box on top of the robot)
2-axis Magneto-Inductive Compass
20fps Camera & Computer Vision System
R/C Receiver for Remote Safety Fail-Safe
12v Permanent Magnet Motors with optical-encoder feedback for closed-loop control and odometry
100amp Motor Controller
Here is another photo of it from the front, and the inside:
Navigating the Robo-Magellan course could be accomplished in a number of different ways. We chose to use GPS, combined with a 2-axis compass and a home-brew inertial navigation system. For GPS we chose a WAAS-enabled OEM unit made by Garmin, the GPS-18 LVC. Since it is an OEM unit, it needs a mechanism for reading and storing waypoints away from the robot. We built this nifty handheld unit to walk the course and store the waypoints, and then download them from the handheld device into the robot just before the competition.
Here is our home-made inertial navigation unit. It has three ceramic gyros and two accelerometers to yield 6 degrees of freedom of inertial measurement. The compass board mounts just below the gyro board, with the same footprint, and all of this is housed in the square box on the top of the robot. All of these measurement instruments, combined with the GPS, should give us fairly accurate position and heading computations as the robot moves through the course, with or without a consistent GPS signal.
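The core idea behind combining the compass and odometry when GPS drops out is plain dead reckoning: advance the position estimate along the current heading by the distance traveled. Here is a minimal sketch of that update in Python; the function name and units are my own for illustration, not anything from the robot's actual PIC firmware.

```python
import math

def dead_reckon(x, y, heading_rad, distance_m):
    """Advance an (x, y) position estimate by a distance traveled
    along the current compass heading (radians, 0 = +x axis)."""
    x += distance_m * math.cos(heading_rad)
    y += distance_m * math.sin(heading_rad)
    return x, y

# Example: drive 2 m along heading 0, then 1 m at a 90 degree heading.
x, y = dead_reckon(0.0, 0.0, 0.0, 2.0)
x, y = dead_reckon(x, y, math.pi / 2, 1.0)
print(round(x, 3), round(y, 3))  # → 2.0 1.0
```

In practice the heading would come from the compass (corrected by the gyros), and the distance from the wheel encoders, with GPS fixes used to re-anchor the estimate whenever a good signal is available.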
Finding the Cone:
Tracking a red or orange target and driving a robot to it is a problem we had already solved on two other robots. The tabletop challenge robot finds a red box on a table and pushes it into a shoebox, and the balancing robot tracks someone wearing red and follows them around. Operating outdoors is the only additional challenge: the lighting conditions can vary dramatically. Here is the view through the lens of the camera looking at the cone outside.
[Images: the cone in direct sunlight, without and with the filter; and in shadow, without and with the filter]
I spent a considerable amount of time looking at the cone under various scenarios, such as the robot being in the bright sun with the cone in the shadows, or vice versa. Notice that the cone becomes quite washed out in direct sunlight, and the reflection makes it look almost white. The addition of a neutral density filter seems to cut back on this problem, but creates additional issues when the cone is in shadow: it has a tendency to make the cone blend in with the background and not stand out. Currently we are using a CMUcam-2 as our vision system; however, we are working on our own FPGA-based camera system to assist with multi-target tracking and high-speed vision processing of the terrain the robot is traveling over and around.
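One common trick for making an orange cone stand out under changing light is to classify pixels by hue rather than raw RGB, since hue stays fairly stable while brightness swings between sun and shadow. Here is a sketch of that idea using Python's standard colorsys module; the hue, saturation, and brightness thresholds are illustrative guesses, not the values tuned on the robot.

```python
import colorsys

def is_cone_pixel(r, g, b):
    """Classify an RGB pixel (0-255 per channel) as 'cone orange'.
    Requires a minimum saturation and brightness, so washed-out
    highlights and deep-shadow pixels are rejected."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    hue_deg = h * 360
    return 5 <= hue_deg <= 40 and s > 0.4 and v > 0.25

print(is_cone_pixel(255, 100, 0))    # bright orange → True
print(is_cone_pixel(120, 120, 120))  # grey shadow   → False
```

A real vision pipeline would run a test like this over every pixel (or let hardware like the CMUcam do the thresholding), then find the largest connected blob of matching pixels and steer toward its centroid.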
The Robo-Magellan Contest 2004:
The contest was tremendously fun! I think out of the 12 robots that entered, 10 of them showed up for the contest. The course was much more difficult than I anticipated. In hindsight I should have shot more photos of the course, and of where some of the cones were located. It was tricky, to say the least. The starting point was on a small asphalt path back in a grove of trees, right up next to a building. Most of the contestants' GPS units did not work properly in that location, so it made for an interesting start. I don't think any of the contestants emerged from the wooded area on their first try. They all ran into rocks or trees, or got stuck along the way somehow. There was a bit of time between trials, so I think most of them did some minor tweaking to get out of the woods by their second try.
As you can see from the photos, it was fun for the spectators too. It was like watching a golf tournament. The robot would move along its course on the way to the cones, and this big crowd, of what seemed like over 100 people, would follow along to see how the robot would fare. Usually there was a big cheer or gasp from the crowd every time one of the robots cleared a difficult obstacle, ran into one, or got stuck. It was really exciting to watch.
Dave Hylands shot lots of video, and some excellent still photos of some of the competitors. He needed a website to host all this great stuff, so I volunteered. Here are all of Dave’s photos and videos from Robothon 2004.
There was lots of stress, and last-minute preparation to get our entry going, but it finally all came together. Here are some photos and video clips of Odyssey during its second and third tries to reach the final cone. The photo above is the 2nd try, and the photo to the left is the 3rd try. We ran into a rock about 20 feet from the starting line on our first try. Once again, although you see Bob holding a remote control unit in his hand, it is only used as a safety shutoff. This robot is totally autonomous.
It was a difficult contest. For the first year the contest was held, I thought it was excellently orchestrated, and I was impressed with how many robots showed up at the starting line on Saturday morning. Out of all the robots, and all their three runs, nobody touched a cone. It was that difficult. Since nobody reached the final destination, the judges made a subjective decision on who the winners were. How did we do in the contest? We placed 2nd. Here we are with the robots, and the awards, at the Robothon. Flexo, the latest incarnation of the balancing robot with the camera on top, took the "Best Engineered Robot" award in the open category for the show.
RoboMagellan at the SF RoboGames 2005:
Although we had several months to make improvements after Seattle, of course we waited until the last minute to really get cranking on the lessons learned from Seattle. Here is Odyssey making his third and final run during the RoboGames.
Here is how our runs went:
On our first run, we had major software problems and ended up unable to make the first important turn in the course, which sent us off into some trees in the wrong direction from the destination.
We made a few hastily implemented software changes for the 2nd run, which prevented Odyssey from even getting off the starting line, so we had to forfeit our second run.
Our third run was the best of all. We navigated away from the starting line correctly, made the correct turns to navigate down the starting ramp, and out onto the course. We navigated correctly about halfway to the destination cone, got near the grove of trees, and lost the GPS signal. The dead-reckoning code that was supposed to take over to keep Odyssey on track had a bug in it (of course). So he ended up navigating a path perpendicular to the destination. Our obstacle avoidance was working flawlessly. We successfully circumnavigated a flower bed, and many plants, without driving into them. Odyssey finally got a good GPS signal again, turned around, and started heading back to the destination, but by then it was too late. Another bug in the code caused him to stop dead in his tracks about halfway to the destination. The run was good enough to take 2nd place (the silver medal) in the competition.
Lots of excitement! We are definitely looking forward to doing this contest again in the future. I guess several other clubs are talking about holding Robo-Magellan contests, so it will be nice to have more opportunities to compete with our platform. Either way, we will be there next year in Seattle for the 2005 Robo-Magellan.
(03/27/05) – Flameout takes the Silver medal in firefighting at the 2005 RoboGames
(03/21/04) – Flameout takes the Bronze medal in the firefighting event at the Robolympics!
(10/29/03) – Flameout takes 2nd place at the Seattle Robothon
I built the first version of this fire-fighting robot by taking Bob's well-designed base for the tabletop challenge and bolting on a top platform with all the necessary fire-fighting gear. Its debut was at the 2003 Seattle Robothon, sponsored by the Seattle Robotics Society. All the additional sensors were a challenge to get debugged in time for the event.
Bob and I got the hardware all debugged the night before we had to leave for Seattle, and I literally wrote most of the software on the airplane. The first time the robot set its wheels in an actual house was the morning of the contest. All the software was written by looking at a printout of the map of the house on the airplane, and guessing at what the right thing to do would be. Anyway, amazingly, we took 2nd place in the competition! WOW! It was a lot of last-minute stress to get everything working properly, but well worth it to participate in a fire-fighting competition for the first time.
Here is "FlameOut" looking for the fire at the event. I figured it was either going to put the flame out... or flame out itself. I have to mention that only one robot, the one that got first place, actually put the candle out. There was a good-sized crowd of spectators on hand to see the fire-fighting robots in Seattle.
Both fire-fighting robots use the same board stack Bob and I developed for the tabletop, with one additional board specific to reading the fire-fighting sensors. The fire board reads the Hamamatsu flame sensor board as well as the Eltec pyroelectric sensor, drives the servo sweep that the pyro sensor is mounted on, and then distills all this info down to a digestible form for the main PIC to process. The motion control code is almost identical to the tabletop challenge, except that there are walls to avoid and, obviously, a flame to try and find. The onboard relay circuitry is hooked up on this robot, so I can power up a fan motor and blow out the flames when the robot finds the fire.
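The servo-sweep-and-distill step can be sketched as a simple search for the bearing with the strongest flame reading. This is an illustrative Python sketch, not the PIC firmware: the servo and sensor I/O are passed in as callables, and the angle range, step, and noise floor are all assumptions.

```python
def find_flame_bearing(set_servo_angle, read_flame_sensor,
                       start_deg=-90, end_deg=90, step_deg=5):
    """Sweep the flame sensor across the servo's range and return
    the angle with the strongest reading, or None if nothing rises
    above an assumed noise floor of 50 counts."""
    best_angle, best_reading = None, 50
    for angle in range(start_deg, end_deg + 1, step_deg):
        set_servo_angle(angle)          # point the pyro sensor
        reading = read_flame_sensor()   # sample at this bearing
        if reading > best_reading:
            best_angle, best_reading = angle, reading
    return best_angle
```

On the robot, the fire board would run a loop like this and hand the main PIC just the winning bearing, which is the "digestible form" the motion code needs to turn toward the flame.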
Bob has been dreaming of doing robot fire-fighting for years. He has built a fire-fighting house to practice strategy. Hopefully we can find a place to set it up where the winter weather won’t wreck it, and then we can practice more fire-fighting. After the Seattle Robothon, I pulled the firefighting gear off the robot, in order to re-use the base for the tabletop challenge. I designed a different base for doing cooperative swarm work, that I decided would make an excellent firefighting base. I bolted all the firefighting hardware onto this new base.
This 2nd revision of the base, is what I entered into the firefighting competition at the 2004 Robolympics, where it won the Bronze medal!
I learned many new lessons about how to make a fast, efficient firefighter at the Robolympics. I hope to leave this robot set up in its current configuration so I can just improve upon the software for the next robot firefighting event.
RoboGames 2005 Firefighting Competition:
Here are video clips of FlameOut's first two runs:
[Videos: FlameOut – First Run; FlameOut – Second Run]
Here I am on the left at the 2005 RoboGames accepting the Silver medal for FlameOut. Yes, the Gold medal was won by another member of the Home Brew Robotics Club, Tony Pratkanis. His robot Solenopsis Invicta, ran the house flawlessly, and with good time. Oh, and did I mention he is 13 years old? He put us all to shame with his well designed firefighter.
Here is my Phase 3 tabletop challenge entry. It has come a long way from Phase 1. I designed and built a pan/tilt head for the robot, and mounted the CMUcam on it. The robot head can now track the box while it moves, automatically.
I chose the CMUcam vision sensor as a low-cost way of getting a camera onto the robot. It seemed like a perfect application for the camera. It took quite a bit of programming to get the camera working well under various lighting conditions. It is still not completely debugged yet, but I am confident I can get it to perform at the level necessary to make the robot do the challenge well. Here is a close-up of the mechanics of the pan-tilt head:
Here are some images captured through the eyes of the robot, looking for his boxes to play with. He has several different boxes of different colors to experiment with:
The camera is not very high resolution, but the images are good enough for the task of tracking the box, especially if the tabletop does not have a lot of noisy background. For example, here is a shot of his friend, "Mr. Bricks," as seen through the camera. Notice it is not a very good photo; much of the detail is lost. Compare that with the up-close view of his box... wow... looks pretty good.
Good lighting is clearly an issue. The lighting in the room where I did these experiments is not very good, so it is an excellent test environment for less-than-ideal conditions. Color tracking seems to work best when the color is something rare in the room. I started out using a white box, which was clearly a mistake, because any hot lighting source looks like the box to the camera. I had to limit the range of travel of the camera, so it could not look up. Otherwise, if it caught a glimpse of the room lighting, it would just look up and stare at the light….NOOOO….Not the light!! Get out of the light!! Agggghh!
Here are some video clips of my initial testing of the pan-tilt head doing box tracking:
I finally have the robot chasing the box! Here he is playing with a little white box.
I have some video clips of the robot chasing a red box around his little play area on my office floor:
Box Chase #1 - He is programmed to drive up and stop when the box is snug inside his front scoop. If he loses track of the box, i.e. I move the box away from him too quickly, he does a quick head scan to look and see where it might have gone. Once he is locked on again, he tracks down the box.
Box Chase #2 - As I move the box around, the head tracks the box independently of the body. Once it sees it isn’t moving much, he repositions his body to point at the box, and drive up to it.
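The behavior in those two clips boils down to a small state machine: scan until the box is seen, approach while it stays in view, fall back to scanning if it is lost, and stop once the box is snug in the scoop. Here is a toy sketch of that logic; the state names and inputs are my own, chosen to mirror the description above rather than the robot's actual PIC code.

```python
# States for the box-chasing behavior
SCAN, APPROACH, DONE = "scan", "approach", "done"

def step(state, box_visible, box_in_scoop):
    """One tick of a simplified box-chasing state machine."""
    if state == SCAN:
        # Head sweep until the camera locks onto the box again.
        return APPROACH if box_visible else SCAN
    if state == APPROACH:
        if box_in_scoop:
            return DONE                 # box captured, stop driving
        return APPROACH if box_visible else SCAN  # lost it: rescan
    return DONE
```

Running this once per camera frame, with the drive motors keyed off the current state, reproduces the behavior in the clips: lose the box and the head scans; reacquire it and the chase resumes.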
Finally! The tabletop challenge is complete! He can now find the goal and put the box into it. Unfortunately, I did not shoot any video of it doing its thing at the last club meeting, when it was performing at its best. However, I did set up a little play area on the dining room table to shoot some clips to share online.
Box Chase #3 - Here is Tracker chasing the red box around the goal setup. He seems to be tracking nicely, although he is periodically distracted by a mirror on the wall nearby.
Goal Attempt #1 - In this first attempt, he finds the box, and starts to head for the goal with it, and the power connection to the camera gets flakey. Once I get in there and wiggle it, he finds the goal, and puts the box in.
Goal Attempt #2 - The box is a bit off-center, he finds it and puts it straight in.
Goal Attempt #3 - He finds the box, and heads for it, and he almost puts it in, but then gets hung up on the edge of the goal. He corrects himself, re-centers the box, and puts it in.
(06/01/2005) – Tracker gets a new Head! I recently outfitted him with an AVRcam. This is the camera recently featured in the April 2005 issue of Circuit Cellar, and is the creation of John Orlando. You can get them from www.jrobot.net. It is quite similar to the CMUcam, except it has a higher frame rate, is based on an AVR chip, and has an open-source core. Yes, I have already been in there fiddling with the firmware.
The tabletop challenge at the HBRC is coming up again here, so I am planning on running him in the challenge this year with his new camera.
What’s left to be done now?!? He still needs a few more tweaks to handle various lighting conditions and he needs more sensors to go head-to-head with another robot on the table trying to steal the box away from him.
Yes, I finally put the finishing touches on my mini balancing robot that does color tracking. His name is Flexo. Yes... after Bender's smarter brother. I used Flexo extensively for testing cone tracking techniques for our RoboMagellan contest entry.
Here is Bob showing Flexo’s cone tracking technique:
Flexo has sonar onboard, so we can send him into a crowd, and he will find someone wearing red and follow them around without running into them. We took Flexo with us to Seattle for the 2004 SRS Robothon, and he was a big hit. Every time we powered him up, he drew a good-sized crowd, and almost always he found someone wearing red to make friends with. Sure enough, birds of a feather flock together: he ran into his big, balancing, distant cousin, a Segway, and its owner was kind enough to let me get a photo of the Segway and Flexo balancing together.
Flexo took the “Best Engineered Robot” award at the 2004 Robothon. Here is a snapshot of Flexo with his award ribbons, and Odyssey, our Robo-Magellan Robot, which took 2nd place in the Robo-Magellan competition.
Next steps for Flexo, are better navigation techniques for roaming around, and a better camera/vision system for doing more than just following the color red.
Bob worked up a new tabletop mechanical design that has downward-looking IR sensors. It is better suited to the challenge goal, which is to find a 2×2 box and push it off the table. Bob built two robot chassis, and all I needed to do was add my own wheels and electronics. This provided me with an opportunity to turn up some all-aluminum wheels on my 9×20 lathe. Here is the robot with my home-made aluminum wheels mounted to it.
It took a day or two to mount up some electronics and get it driving around the table about as well as the Phase 1 robot, Mr. Bricks. Here is a photo of both of our robots fitted with electronics. It still needs some work, but it is a good start toward a better overall design.
This robot is turning into a reusable platform that could be used for more than the tabletop challenge... perhaps even a Trinity fire-fighter! Here is a shot of the board stack that Bob and I designed for this robot. Lots of MCU processing horsepower in there: in total there are 10 PICs working in parallel to solve the computing problems.
It has a main board for motor driving, with add-on modules for reading sensors, a main brain module with lots of memory, and finally, a fire-fighting board for dealing specifically with fire-fighting sensors when we get that far along.
I have actually used the firefighting board by bolting the appropriate sensors to the top of the robot and rewriting most of the software to focus on a fire-fighting house.
Here are the pair of robots with all the electronics mounted on-board. Yes, you guessed it, it even has a CMUcam interface for chasing that tabletop soccer block. I have the CMUcam mounted on my robot in this photo of the pair, together at a recent robot club meeting. I still have yet to begin figuring out how I am going to read the camera and deal with the sensor fusion. I will probably wait until Phase 3 to figure it all out.
I have recently finished debugging all the motion control for these robots. Heh... it's not a coincidence that there is a huge control theory book in the background of that photo.
Anyway, I shot some video clips of the two robots first steps around the kitchen using full dual-differential drive PID control, using all this new motor driver hardware, and a self-designed control algorithm derived from reading a bunch of books on control theory.
In these clips, my goal was to drive a few feet in a straight line, make a perfect 180 degree turn, and drive back. I am using wheel odometry to measure the distance traveled, and to work on that perfect 180 degree turnaround.
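The odometry side of a drive-and-turn test like this comes from the standard differential-drive kinematic model: each tick interval, convert left and right encoder ticks to distances, and update the pose from their average (forward travel) and their difference (rotation). Here is a sketch in Python; the ticks-per-meter and wheel-base constants are made-up placeholder values, not measurements from these robots.

```python
import math

TICKS_PER_METER = 2000.0   # assumed encoder resolution
WHEEL_BASE_M = 0.25        # assumed distance between the wheels

def odometry_update(x, y, theta, left_ticks, right_ticks):
    """Update pose (x, y, theta) from one interval of encoder ticks
    using the differential-drive kinematic model."""
    dl = left_ticks / TICKS_PER_METER    # left wheel travel (m)
    dr = right_ticks / TICKS_PER_METER   # right wheel travel (m)
    d = (dl + dr) / 2.0                  # forward travel of center
    dtheta = (dr - dl) / WHEEL_BASE_M    # change in heading (rad)
    # Advance along the midpoint heading for better accuracy on arcs.
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    return x, y, theta + dtheta
```

For the 180 degree turn, the wheels spin in opposite directions, so d is near zero and dtheta accumulates; the PID loops' job is to keep the two wheels' tick rates matched on the straight legs and symmetric during the turn.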