Although this robot is built from the ground up to be modular and reusable, the first revision aims to compete at the Expert level of the annual Trinity College Firefighting Robot Competition.
In short, the goal of the competition is to have a fully autonomous robot explore a house, locate and extinguish fires (candles), and potentially locate and mark a baby's location for rescue. At the Expert level the robot starts at an unknown location in an unknown maze, which increases the complexity quite a bit.

Progress Status

Have new lexan base built and new Pittman motors installed.
Design and Build new suspension system.
Design and build a new motor driver board.
Port and test old motion control code to new encoders
Build a power grid (to reduce noise/interference and facilitate recharging).
Prototype and build a scanning laser ranger or IR-based equivalent.
Port high level AI code to ARM linux based SBC.
Build two NiMH battery packs and a charger to replace the single SLA.
Replace the Linux software-based I2C interface with a USB-to-I2C hub.
Find a way to build/make a good omnidirectional mirror.
Develop Machine vision library.
Replace the two NiMH battery packs and charger with a smaller LiPo setup.
Lower SIR's vertical profile and embed the camera in it.
Develop a bumper skirt (with a PCF8574?) for worst-case scenarios.
Prototype and build Start Sound Signal Detector board.
Assemble and Mount Fan/CO2 Assembly
Develop an I2C-based I/O module to handle extra sensors/actuators (e.g. fan, compass, UV-Tron, sonars, tilt, etc.).
Add Sonar to help deal with possible reflective surfaces.
Add Digital compass
Make wireless dongle an integral part of the system to facilitate testing.
Finish implementing the Voronoi-diagram-based exploration algorithm. (In progress)
Implement a mapping/navigation algorithm to more efficiently navigate the topological map.
Soften suspension to prevent robot from getting stuck when hitting a ramp dead-on.
Adjust vision hardware/software to prevent it from seeing stuff above the mirror line (bigger shade disk?)
Build a better bumper system. Preferably a full bumper skirt that covers all around.
Design & test an algorithm to detect/track baby (Burnie) for rescue mode.
Develop an inclination sensor module (using ADXL202).
Design & build Rescue Beacon delivery mechanism for rescue mode.

System Overview

Main Processor

Not that I didn't have enough juice with my old AVR mega128 processor, but I felt the need to learn new stuff, so when I heard of a cheap commercial product being hacked online I totally jumped on it.

The SBC in question is a hacked Linksys NSLU2, a box normally used to share USB hard drives on a network. It has two USB 2.0 ports and Ethernet, and sports a 32-bit Intel XScale (ARM) processor running at 266 MHz with 32 MB of RAM.
All this for a low $100 Canadian! The only possible problem is that it doesn't have a hardware I2C module. But that won't stop me, as the original product actually uses a software-emulated I2C adapter built into the Linux kernel to access the on-board RTC chip. It supports I2C standard speed (100 kHz), which should be plenty to start with. If that's not enough, I can always use one of those USB interface chips and make my own USB-to-I2C module.

The NSLU2 comes with a Linksys-customized (more like crippled!) Linux kernel, so it was necessary to create a totally custom kernel and root disk to circumvent the manufacturer's limitations and get access to the board's full potential, adding support for things such as wireless USB dongles or even a camera!

Luckily, I wasn't the only one working on this, and a small community of hackers quickly put up a Yahoo! group to discuss it. After barely three months they now have a full wiki documenting everything, from adding Bluetooth wireless modules and upgrading the onboard RAM to overclocking the processor!

The community spawned quite a few variations of the custom software, but the one I'm interested in is called OpenSlug: a 2.6.16 kernel running on a fully configurable root filesystem with none of the original limitations. To make things easier and more flexible, we started using the excellent "distro maker" OpenEmbedded, which lets us easily add software from a wide selection of more than 1700 packages: stuff like an embedded web server, the Bluetooth utilities, even Mozilla! A few of the PDAs running Linux also use OpenEmbedded, so it's fully featured.


Power System

I use separate power sources for the motors and the electronics to prevent electrical interference.
After a short visit to the world of NiMH batteries, I now use lithium-polymer (LiPo) batteries to power my robot. I won't go over the pros and cons versus NiMH, but suffice it to say that they pack a lot of punch and are much smaller/lighter, which is a requirement for me.
I ordered them from Zebra Hobby, located in Calgary, Canada.

Because I need more power for my electronics than for my motors, I decided to go with a 3S1P configuration for an effective nominal voltage of 11.1 V and a capacity of 2.5 Ah. And all this in a very slim package!
For the motors, I'm using two 2S1P 1.7 Ah packs in series for an effective 14.4 V pack. These batteries are able to sustain peaks as high as 12C, so I'm not worried.
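The series/parallel arithmetic behind these pack configurations is easy to sketch. The per-cell voltage is left as a parameter below because quoted LiPo nominals vary between 3.6 V and 3.7 V per cell:

```c
/* Series cells add voltage; parallel strings add capacity. */
static double pack_nominal_volts(int series_cells, double cell_volts)
{
    return series_cells * cell_volts;
}

static double pack_capacity_ah(int parallel_strings, double cell_ah)
{
    return parallel_strings * cell_ah;
}
```

So a 3S1P pack of 2.5 Ah cells gives about 11.1 V at 2.5 Ah, and two 2S packs wired in series behave like a single 4S pack.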

For those unfamiliar with LiPo chemistry, each cell outputs a nominal 3.6-3.7 V.
One of their few disadvantages is that these batteries are very sensitive to over-discharge (NEVER let them go below 3 V/cell!!!). They are also very volatile and risk exploding if not used properly!
Because of the high risk of permanent damage on over-discharge, I had to design a small low-voltage warning circuit to alert me when the packs approach their discharge limit. It's a simple voltage comparator circuit that turns on a piezo buzzer. Not as elegant a solution as one which pulls the plug, but it works.
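For illustration, here is how the comparator's input divider could be sized, assuming a 3S cutoff of 9.0 V (3.0 V/cell) and a 2.5 V comparator reference; the reference and resistor values are assumptions for this sketch, not the actual circuit:

```c
/* Divider output for a given pack voltage (illustrative values only). */
static double divider_out(double v_pack, double r_top, double r_bottom)
{
    return v_pack * r_bottom / (r_top + r_bottom);
}

/* Choose r_bottom for a given r_top so the divider output equals the
 * comparator reference exactly at the chosen cutoff voltage. */
static double pick_r_bottom(double r_top, double v_cutoff, double v_ref)
{
    return r_top * v_ref / (v_cutoff - v_ref);
}
```

With a 10 k top resistor this works out to roughly 3.8 k on the bottom; the comparator then trips as the pack sags below 9 V.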

The power grid is arranged in a star topology (single ground point) to prevent ground loops and keep electrical noise from spreading. Each long run of wire going from platter to platter is further protected with decoupling caps.

Each layer/sub-system has a dedicated switching regulator (DC/DC) to increase power efficiency. I mostly use National's 1 A LM2825 Simple Switcher(TM) for small loads (see SRX1 for an example).

For the NSLU2 board, which draws in excess of 2-3 A with the camera and other USB accessories plugged in, I use a dedicated PowerTrend PT6653D device, which can source up to 5 A.


Drive Base

The base is made of 3-4 lexan disks, with 4 threaded rods holding the whole thing together.
It uses the differential-drive method.
The wheels are custom-made out of lexan and driven by a timing belt via a pulley.


Motors

The two motors are Pittman GM9236s and came equipped with 512 CPR HP encoders attached, which is one of the main reasons I got them.
They are very powerful, draw little current, and are nominally rated at 30 V, although most people run them at 14.4 V with plenty of torque/speed.
The motors came with a timing-belt pulley permanently attached to the shaft, so I decided to use it to drive the wheels.
Thanks to my good buddy Lazlo for getting me the matching belts and pulleys!
TODO: add link to data sheet.
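As a sanity check on the drivetrain numbers, here is how encoder counts translate into distance traveled. Only the 512 CPR figure comes from the motors above; the 4x quadrature decode, the gearhead ratio and the wheel diameter in the example are illustrative assumptions, not measured values:

```c
/* Millimeters of travel represented by one encoder tick. */
static double mm_per_tick(double wheel_diam_mm, int counts_per_rev,
                          double gear_ratio, int quadrature_decode)
{
    const double pi = 3.14159265358979;
    double ticks_per_wheel_rev =
        (double)counts_per_rev * quadrature_decode * gear_ratio;
    return (pi * wheel_diam_mm) / ticks_per_wheel_rev;
}
```

With a hypothetical 75 mm wheel and a 5.9:1 gearhead, one tick comes out to roughly 0.02 mm, far more resolution than the odometry needs.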

Peripheral Access Module (PAM)

The original revision of the robot used the bit-banged I2C interface already present in the NSLU2 to communicate with the various sensors/motors, but after "accidentally" burning out (don't ask!) two of the NSLU2 processor's very scarce I/O pins, I decided to switch over to a USB bus interface (which I wanted to do initially anyway!).

Being cheap, I decided to try one of those AVR-based USB software implementations. There is the original one by Igor Cesko (mostly in assembly) and a more recent, cleaner one (mostly ANSI C) from Objective Development. I ended up using the AVRUSB stack from Objective Development, which is open source and well documented.

It was pretty easy to add I2C functionality, as they provide sample projects complete with schematics and source.
I implemented it as a generic USB-to-I2C hub to make it more scalable. This way the module is reusable as-is later (I could replace my ARM board with practically no changes to the rest of the robot).

To recover some of the time lost working on this, I used libusb on the host side instead of writing my own kernel driver module. It works fine and is very easy to use.

I implemented all this on an Atmel ATmega8 micro to keep the footprint to a minimum.
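To give an idea of what the host and the hub exchange, here are two helpers such a protocol needs: forming the on-the-wire I2C address byte and checksumming a message header. The checksum scheme is hypothetical, since the AVRUSB-based firmware defines its own vendor requests:

```c
#include <stdint.h>
#include <stddef.h>

/* 7-bit I2C address plus R/W bit, as the byte appears on the I2C bus. */
static uint8_t i2c_addr_byte(uint8_t addr7, int read)
{
    return (uint8_t)((addr7 << 1) | (read ? 1 : 0));
}

/* Simple XOR checksum over a message header (illustrative only). */
static uint8_t xor_checksum(const uint8_t *buf, size_t len)
{
    uint8_t c = 0;
    for (size_t i = 0; i < len; i++)
        c ^= buf[i];
    return c;
}
```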

Smart Motor Controller (SMC)

The SMC module is basically an L298 motor driver coupled with an Atmel AVR ATmega8. The micro reads the dual quadrature encoders (512 segments) on the motors, does the PID motion-control calculations, and passes the resulting commands to the L298 driver.
The controller keeps precise odometry and provides status and command facilities to the host via the I2C bus. This greatly helps offload those chores from the host processor.
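A minimal sketch of the kind of PID loop the SMC's mega8 might run each control tick; the floating-point form, gains and scaling are placeholders for readability (real AVR firmware would typically use fixed-point), not the robot's actual tuning:

```c
typedef struct {
    double kp, ki, kd;    /* gains (placeholder values in the example) */
    double integral;      /* accumulated error */
    double prev_error;    /* for the derivative term */
} pid_state;

/* One control tick: returns the motor command for this period. */
static double pid_step(pid_state *s, double setpoint, double measured, double dt)
{
    double error = setpoint - measured;
    s->integral += error * dt;
    double deriv = (error - s->prev_error) / dt;
    s->prev_error = error;
    return s->kp * error + s->ki * s->integral + s->kd * deriv;
}
```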

Other features include:

Smart Infrared Ranger (SIR)

The current sensor array consists of 12 Sharp GP2D12 IR rangers in a circular arrangement (evenly spaced at 30 degrees to cover a full 360 degrees). An embedded mega8 AVR micro reads a 20-channel high-speed ADC chip and does signal processing on the data.
The host can then obtain, via I2C, a 12-byte status vector of distances in centimeters to the objects perceived by the rangers (or 255 when no obstacle is seen).
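The GP2D12's output is quite nonlinear with distance, so a small calibration table plus linear interpolation is a common way to produce those centimeter values. The knot values below are illustrative, not the SIR's actual calibration:

```c
#include <stdint.h>

#define NO_OBSTACLE 255

/* ADC reading (descending) -> distance in cm (ascending); made-up knots. */
static const uint16_t adc_knots[] = { 600, 420, 300, 210, 150, 110 };
static const uint8_t  cm_knots[]  = {  10,  20,  30,  40,  60,  80 };
#define N_KNOTS 6

static uint8_t gp2d12_cm(uint16_t adc)
{
    if (adc >= adc_knots[0])
        return cm_knots[0];                 /* closer than the first knot */
    if (adc <= adc_knots[N_KNOTS - 1])
        return NO_OBSTACLE;                 /* signal too weak: nothing seen */
    for (int i = 1; i < N_KNOTS; i++) {
        if (adc >= adc_knots[i]) {          /* between knots i-1 and i */
            double t = (double)(adc_knots[i - 1] - adc) /
                       (double)(adc_knots[i - 1] - adc_knots[i]);
            return (uint8_t)(cm_knots[i - 1] +
                             t * (cm_knots[i] - cm_knots[i - 1]) + 0.5);
        }
    }
    return NO_OBSTACLE;
}
```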

This "brute-force" solution could later be replaced by a laser-ranging solution, which would dramatically cut down on power consumption.

Input/Output Module (IOM)

Although the PAM (Peripheral Access Module) already allows the master processor to read/write I2C devices via USB, I decided to lower my part count and space footprint by creating a single I2C slave device that interfaces a multitude of sensors/actuators, instead of having each sensor employ a dedicated micro. Thus the IOM was born, handling the UV-Trons, tone detector, CO2-extinguisher servo and SRF04 sonar. It's designed around a mega8 and runs the AvrX RTOS to simplify the design, using interrupts extensively to allow efficient multitasking (definitely no polling here!!).
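As an example of what the IOM offloads, the SRF04 reports distance as an echo-pulse width: sound covers a round trip of roughly 58 us per centimeter, so once a hardware timer has measured the pulse, the conversion is just:

```c
#include <stdint.h>

/* Echo pulse width in microseconds -> distance in centimeters. */
static uint16_t srf04_cm(uint32_t echo_us)
{
    return (uint16_t)(echo_us / 58);
}
```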

Below is a list of sensors/actuators currently controlled by the IOM:

CO2 Extinguisher

I've been wanting to replace my trusty old hairdryer fan (as used in my SRX1 firefighter) with a much cooler CO2 solution, but I must admit it looked pretty complex at first, so I decided to keep an eye open for solutions without committing myself. During my online research for a high-pressure electronic valve (those little CO2 bottles used in pellet guns hold very high pressure), I stumbled on a mountain-bike equipment site selling cool little portable CO2 tire pumps! That looked perfect for me: small, and it uses the standard little disposable CO2 cans. I found a cheap one in a local store and bought it, figuring I could use it on my next motorcycle trip if I didn't end up hacking it.

The question was whether a normal-size servo would have enough torque to press the trigger, as the lever is very stiff. To my surprise, my old servo did it! So within a few days I figured out a way to mount it on my robot, although it took a lot of effort considering the very limited amount of space I have left. The output nozzle is simply a piece of rubber tubing pierced with a drill bit and aimed at the proper angle in front of the robot.

Here are some pictures of the setup:

Tone Detector

My SRX1 robot used a lousy way of detecting the start tone (I ran out of time and had to breadboard something fast). As an anecdote, at Trinity 2003, during my second trial, my robot was in the maze, powered up and waiting for the judge to position the candle in a room, when suddenly the crowd at the other end of the gym went wild and cheered so loudly that my rudimentary tone detector triggered and the robot started!! Needless to say, I wanted to avoid that this time, so I created a nicer one based on the LM567 tone-decoder chip, as described on Alex Brown's page. A local copy of the schematic is available here.

UV Sensors

My first firefighter relied on its primitive camera system both to figure out whether a candle was in range and to approach it. This time around I've added two UV-Tron modules to increase reliability and detection range. Although I'm pretty certain my vision approach would work fine at long range, I like the idea of being able to detect flames around corners and behind obstacles. And there's never enough redundancy when it comes to sensors! The IOM takes care of counting and debouncing the pulses received from each UV board and sets a flag when the total count per interval passes a threshold.
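That counting/debouncing logic boils down to a tiny per-interval state machine. The threshold here is an assumed value, and the real firmware drives this from interrupts under AvrX:

```c
#include <stdint.h>

#define FLAME_THRESHOLD 3   /* pulses per interval (assumed tuning value) */

typedef struct {
    uint8_t count;          /* pulses seen during the current interval */
    uint8_t flame_seen;     /* flag read by the host */
} uvtron_state;

/* Called on each pulse from a UV-Tron board. */
static void uvtron_pulse(uvtron_state *s)
{
    if (s->count < 255)
        s->count++;
}

/* Called at the end of each sampling interval. */
static void uvtron_interval(uvtron_state *s)
{
    s->flame_seen = (uint8_t)(s->count >= FLAME_THRESHOLD);
    s->count = 0;
}
```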

Digital Compass

The compass is a CMPS03 from Devantech.
It is read by the IOM using a software-based I2C master (my main I2C bus runs at 400 kHz, which causes trouble for the CMPS03).
Although the compass is known to be unreliable in the gym where the Trinity competition is held (because of the electrical wiring and steel structure), I've decided to add one just for fun. I'll try my hand at averaging its readings and integrating them with the odometry. The compass is known to swing as much as 90 degrees near magnetic disturbances but gives correct readings most of the time. The odometry, on the other hand, tends to drift slowly over time, so by slowly folding the compass heading into the odometric one every few centimeters traveled, I'm hoping to get a reasonably correct heading.
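One simple way to do that blending is a complementary filter applied every few centimeters of travel, taking the short way around the circle; the blend factor is an assumed tuning value, not something I've settled on:

```c
/* Wrap an angle in degrees into [-180, 180). */
static double wrap_deg(double a)
{
    while (a >= 180.0) a -= 360.0;
    while (a < -180.0) a += 360.0;
    return a;
}

/* Nudge the odometric heading a small fraction (alpha) of the way
 * toward the compass reading. */
static double fuse_heading(double odom_deg, double compass_deg, double alpha)
{
    return wrap_deg(odom_deg + alpha * wrap_deg(compass_deg - odom_deg));
}
```

A small alpha means the compass only corrects slow odometric drift, while its 90-degree swings near disturbances get mostly averaged out.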


Camera

To replace the awfully slow, old Intel Me2Cam USB 1.0 camera I had originally, I got a USB 2.0 Trust SpaceCam 380 camera from eBay. Both are supported by the OV511 Linux driver project, although I ended up using someone's prototype driver, which is much faster and leaner.
I had to hack it to add YUV422 support, as it only did Bayer RGB, which was too inefficient for my board.

With the new driver, using USB 2.0, I can capture 320x240 images at up to 60 fps, and 640x480 images at up to 30 fps.
I've decided to use 320x240 at 30 fps, as 640x480 would be way too slow for any kind of image processing.
The actual frame-capture call takes as little as ~8 ms, which leaves lots of time for processing compared to my old setup's 160 ms!
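The per-frame processing budget follows directly from those numbers: a frame slot at 30 fps is about 33 ms, and capture eats ~8 ms of it:

```c
/* Time left for image processing in each frame slot, in milliseconds. */
static double processing_budget_ms(double fps, double capture_ms)
{
    return 1000.0 / fps - capture_ms;
}
```

That leaves roughly 25 ms per frame for the vision code.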



Software Architecture

The software is changing on a daily basis, but it basically consists of a three-layer architecture: a motor-schema-based reactive lower level, a simple schema sequencer in the middle, and a static planner (merely an FSM for now) at the top.
The current prototype uses LinuxThreads to separate the various modules into threads to maximize performance. A machine-vision thread provides the perceptual schemas with vision data they can use to guide their decisions.
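For illustration, here is the general shape of motor-schema blending: each schema emits a velocity vector and the robot follows their weighted sum. The two schemas and their gains below are made-up examples, not the robot's actual behaviors (real schemas usually also scale repulsion by obstacle proximity):

```c
typedef struct { double x, y; } vec2;

static vec2 vec_add(vec2 a, vec2 b)
{
    vec2 r = { a.x + b.x, a.y + b.y };
    return r;
}

static vec2 vec_scale(vec2 v, double s)
{
    vec2 r = { v.x * s, v.y * s };
    return r;
}

/* move-to-goal schema: attract toward the goal position. */
static vec2 move_to_goal(vec2 pos, vec2 goal)
{
    vec2 r = { goal.x - pos.x, goal.y - pos.y };
    return r;
}

/* avoid-obstacle schema: repel away from a perceived obstacle. */
static vec2 avoid_obstacle(vec2 pos, vec2 obst)
{
    vec2 r = { pos.x - obst.x, pos.y - obst.y };
    return r;
}

/* Weighted sum of the active schemas gives the commanded velocity. */
static vec2 blend(vec2 pos, vec2 goal, vec2 obst, double g_goal, double g_avoid)
{
    return vec_add(vec_scale(move_to_goal(pos, goal), g_goal),
                   vec_scale(avoid_obstacle(pos, obst), g_avoid));
}
```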

There is an executive thread that runs a TCP server, which responds to queries from a Tcl/Tk GUI via Ethernet/WiFi.

More descriptions to come...


To help with debugging, I designed a TCP/IP-based executive server thread which responds to requests for data/commands from various Tcl/Tk-based graphical interfaces. Here are a few pictures of the working prototypes:

motor schema manager GUI | vision GUI manager | vision GUI in zoom mode

Computer vision

Although using my new vision library to its full extent would have been nice (especially after "wasting" a whole year on it instead of this robot!), I'll only be using it to locate, align with, and approach the blob formed by the candle as seen by the camera. I was thinking of using it to have the AI avoid yellow blobs (furniture), but I doubt this will work well enough considering how low the lights are during the competition.


As the robot is still under construction, I won't be updating this page until later, but here's a link to my picture archive in the meantime.

Last updated on: November 8th 2006