Meet BobBot Mk2, the intellectually challenged robot. This subsumption-based robot is probably the most challenging robot I’ve ever had to program. It has quite a number of sensors:
- Mindsensors NXTCamV3
- Mindsensors Magic Wand
- Mindsensors DIST-Nx
- HiTechnic Sensor MUX
- HiTechnic EOPD
- LEGO Touch Sensor x 2
- LEGO Ultrasonic (US) Sensor
The robot runs a couple of tasks. One constantly polls all the sensors and stores their values, which saves the individual behaviours from having to access those resources themselves. Another task displays those values in an easy-to-understand format: the HUD (Heads-Up Display).

The HUD is quite a nifty piece of the program. It displays the coordinates of the object currently being tracked, with a cut-down view of what the camera is seeing on the right. “D” is the distance reported by the DIST-Nx: in tracking mode this is the distance to the can, otherwise it’s the distance to the nearest wall. “E” is the value of the EOPD, one of the side-wall sensors. “S” is the current state of the subsumption engine, which is also shown using the Magic Wand’s LEDs. “T1” and “T2” are the two touch sensors that help control the gripper arm. The US sensor has not been programmed into the HUD yet. The small circle and the number next to it show the current heading, which is calculated by another thread using odometry.
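The polling-task idea can be sketched roughly like this in Python. This is not BobBot’s actual code (which runs on the NXT), and the sensor names and reader functions are invented for illustration: one background thread talks to the hardware and caches the latest readings, while behaviours only ever read the cache.

```python
# Illustrative sketch of a single sensor-polling task feeding a shared
# cache. All names here are hypothetical, not BobBot's real API.
import threading
import time

class SensorCache:
    def __init__(self, readers, interval=0.01):
        # readers: dict mapping sensor name -> zero-argument read function
        self._readers = readers
        self._interval = interval
        self._values = {}
        self._lock = threading.Lock()
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._poll, daemon=True)

    def start(self):
        self._thread.start()

    def stop(self):
        self._stop.set()
        self._thread.join()

    def _poll(self):
        # The only place the hardware is touched: poll every sensor in
        # turn and store the result under the lock.
        while not self._stop.is_set():
            for name, read in self._readers.items():
                value = read()
                with self._lock:
                    self._values[name] = value
            time.sleep(self._interval)

    def get(self, name):
        # Behaviours call this instead of reading the sensor themselves,
        # so they never contend for the hardware.
        with self._lock:
            return self._values.get(name)
```

The payoff of this pattern is exactly what the post describes: the behaviours (and the HUD task) can all read sensor values freely without worrying about who owns the physical sensor at any given moment.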
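For the heading shown on the HUD, the post doesn’t say exactly how the odometry is done, but a standard differential-drive calculation from the two motors’ cumulative rotations looks like this (the parameter names and units are my own, for illustration):

```python
# Hypothetical sketch of heading-from-odometry for a differential-drive
# robot: heading (radians) = (right distance - left distance) / track width.
import math

def heading_from_odometry(left_deg, right_deg, wheel_diameter, track_width):
    """Estimate heading in degrees [0, 360) from cumulative motor rotations.

    left_deg, right_deg: total motor rotation in degrees since start.
    wheel_diameter, track_width: any consistent length unit (e.g. mm).
    """
    wheel_circumference = math.pi * wheel_diameter
    left_dist = left_deg / 360.0 * wheel_circumference
    right_dist = right_deg / 360.0 * wheel_circumference
    heading_rad = (right_dist - left_dist) / track_width
    return math.degrees(heading_rad) % 360.0
```

In practice a dedicated thread would call this repeatedly with the latest tacho counts and feed the result to the HUD’s little circle-and-number display.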
Here’s a small video of BobBot Mk2 doing its thing.
As you can see, much work remains to be done. I’ll keep you posted.