SRX1 (Stephane's Robotic eXperiment 1)

This is a picture of my final prototype. Click here for older pictures


Project Description

This is my newest robotic project. It's going to be the basis of all my future robotics endeavours. I'm ready to devote a lot of time to this so that I end up with a solid mobile base for experimenting with more advanced concepts like camera-based vision, automatic map-building, etc.

But for now I've decided to use it to build a fire-fighter to participate in the Trinity College Fire-Fighting Robot Competition in Hartford. I'm currently aiming to enter the contest in the non-dead-reckoning mode with 'alarm start'.

Here's the floor plan.

This is a very good exercise as it mimics a house, which is what the future robot will have to tackle anyway. The first step is to develop a robust and flexible mobile base, then implement a very precise PID controller for it. Add some basic sensors to avoid obstacles and follow walls, a bit of programming, and the rest is all fun!

Source code, schematics and all documentation will be added here as I progress.


Progress Status

Build Mobile Base w/ decent power/speed motors and wheels.
Get decent motors and wheels.
Build decent cheap motor controller board.
Add incremental encoders to the motors.
Mount and wire up IR distance sensors.
Prototype Sensors/Motors/Power circuitry interconnection boards.
Port basic RTOS and Executive from last project.
Implement basic PID based Motion Control.
Implement basic Wall Following behaviour.
Upgrade encoders to full quadrature w/ higher resolution.
Upgrade to AVR ATmega128 on STK-300 dev. board (from mega323 prototype).
Build I/O interfacing daughterboard.
Assemble and Mount Fan Assembly.
Redesign the Power grid to reduce noise/interference and facilitate recharging.
Prototype basic Motion Sensor circuitry.
Implement basic I2C slave (for camera module).
Assemble and Mount White Line Detectors
Implement PLANNER layer.
Transition current Subsumption Architecture to a Three Layer Architecture.
Implement remaining behaviours.
Implement behaviour based navigation algorithm.
Make operating constants changeable at run-time (defaults stored in EEPROM).
Design and test a better suspension system to tackle those evil new floor items.
Build new bigger wheels to increase speed and clearance height.
Complete Camera Module Software (make more reliable under varying light conditions).
Implement a way to return to Home position after successfully extinguishing candle.
Transfer SCM's hardware to a final PCB.
Build a power supply board for the upper deck
Test, Test and then Test some more!!
Prototype and build Start Sound Signal Detector board.
Profile motor & processor's power consumption & Optimize.



This is a picture of the base with the gearboxes and the two 12V 2.6Ah batteries installed (A very special thanks to Laslo 'The Man' Roska, for providing this nice base and batteries!):

Here's a picture of one of the two Tamiya ball-bearing casters mounted under the base. They're cheap (~$10 CAD for a pack of 2) and have 4 different adjustable heights:

Here I added 4 aluminium threaded rods so I can add platters to hold the electronics & sensors. The platter concept, when used with wing-nuts, permits quick disassembly and design flexibility.

So here's the final base with all the final PCBs in place (without the motors/encoders):


Here are a few pictures of the ball-bearing caster I used to replace the Tamiya one to provide better handling of the new Trinity floor ramps. Because of those new floor items/ramps (or should I say BRICK WALLS! aarrrgghhh!) I was forced to come up with a suspension system where the rear caster takes up the slack as the front caster is climbing the ramp. Check it out:

UPDATE: I redid the whole suspension and it works quite well! I haven't yet tried it with an actual pizza plate like the one they use, but I tried it with a normal plate upside down and it worked OK! (and believe me, my plate has major dips/grooves in it and is really slippery!)

To get there: I quickly realised that whatever suspension I could come up with, it would never work because of the very limited ground clearance I had (around 1-1/4"). So I asked Chris, a fellow member of the Ottawa Robotics Enthusiasts club and mechanical guru, to help me redesign my wheels so I could:

  1. Increase my ground clearance to about 2"
  2. Increase the overall speed of the robot
So my wheel diameter jumped from 12cm to 16cm! (Which results in a roughly 33% speed increase if my calculations are good)

Here's a picture of the new Lexan wheels:

I now had plenty of space (too much actually!) to fit in 2 ball-bearing casters and hinges. I had to buy a whole lot of various-size springs to try and figure out which one would give the best firmness without bouncing too much. I ended up cutting one of the larger and softer springs for the final design. Here's the final suspension:


The two DC motors used in the drivetrain are Mabuchi RS-380SH-12300.
They are rated at 30V but run at peak efficiency at 24V, producing a fast 6400 RPM, and consuming a mere 310mA.
Click Here for the Specification Sheet (PDF format. 57Kbytes).


Optical Encoders

Because I'm so cheap (and for the challenge), I decided to build my own optical encoders. Although the smaller Hamamatsu P5587 sensors would have been way nicer, I went with the Fairchild QRD1114s I already had from Digikey.

Here's a picture of the partially assembled optical encoder assembly: (sorry for the blurry pictures but my digital camera lacks a macro mode)
(Click on pictures to see at full resolution)

I used two small slotted plastic brackets to hold the optical sensor's PCB. The slots make the calibration and adjustment of the sensors a lot easier. The PCBs were slid up and down the brackets until the desired sensitivity was obtained, at which point they were glued in place. The encoder disks were glued onto a little piece of cardboard and then epoxied on the rear end of the motor's shaft. It's fragile, but with the proper glue and cardboard/paper combination it's solid enough to withstand continuous rotation.
Here's a shot of the final assembly:

These encoder disks have 8 segments (4 black, 4 white), which gives a total resolution of 8 encoder "clicks" per motor revolution. With a gearbox ratio of 150:1, this means I get 150 x 8 = 1200 clicks per wheel revolution. Using a PID frequency of 20Hz gives me around 19 clicks per PID cycle (at 100% PWM with 12V), which is not that much resolution. This is especially true at lower speeds, when you start approaching 3-5 clicks per PID period. I could make the PID period longer, but this would increase position errors and make the PID choppy, so I'll stick with a 20Hz period for now. (Note that the above results were obtained using a 12V battery; the final robot will be using two 12V batteries to maximize the motors' dynamic range.)

After adjusting the position of the encoder assembly to a satisfactory level I realised the output of the QRD1114 was too noisy and its amplitude too low to be interpreted properly by a processor's input. To correct this problem I came up with a simple comparator design based on an LM324 quad op-amp IC with an adjustable voltage threshold. After a bit of tweaking I had a nice and uniform square wave!

Here's the schematic for the encoder system.

Update: I recently decided to completely redo my optical encoders using smaller sensors (Sharp GP2S40) for increased resolution and by using two sensors per motor to generate quadrature output so the processor can determine the exact direction of rotation at all times. Although this robot is to compete in the non-dead-reckoning mode of the fire-fighting competition, I just felt that having higher resolution encoders would be useful for the future and also because I observed some weird readings with the old ones.

Following are pictures of the new and final encoders:

Planning stage
Precision work at its best! :)
Motors were drilled to screw in support rods.
Finished product (top)
Finished product (bottom)
Test encoder disk superglued to shaft.
Calibrated and assembled.
Final installed encoder with cover.

Here are the updated calculations for the new encoders:

  1. No-load motor speed @ 24V: 106 RPS (6400 RPM)
  2. Number of encoder segments: 10
  3. Interrupts per motor revolution (full quadrature decoding): [2] X 2 = 10 X 2 = 20 CPR
  4. Interrupts per second (full quadrature decoding): [1] X [3] = 106 X 20 = 2120 IPS
  5. PID controller sample rate: 50 Hz
  6. Maximum tick count per PID sample: [4] / [5] = 2120 / 50 = ~42 counts/PID sample
  7. Gearbox ratio: 150:1
  8. Ticks per wheel revolution: [3] X [7] = 20 X 150 = 3000 ticks
  9. Wheel diameter: 16 cm
  10. Wheel circumference: 2 X PI X R = 2 X 3.1416 X (16/2) = 50.27 cm
  11. Maximum theoretical robot speed: [4] / [8] X [10] = 2120 / 3000 X 50.27 = 35.53 cm/s
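The calculation chain above is easy to sanity-check in code. Here's a small C sketch (just the table's arithmetic with the table's constants plugged in, not robot firmware):

```c
/* Encoder/speed helpers mirroring the numbered rows above.
   The constants come straight from the table; this is a
   back-of-the-envelope check, not the robot's firmware. */

static double interrupts_per_sec(double motor_rps, int cpr)
{
    return motor_rps * cpr;                  /* row [4]: 106 x 20 = 2120 */
}

static double ticks_per_pid_sample(double ips, double pid_hz)
{
    return ips / pid_hz;                     /* row [6]: 2120 / 50 = ~42 */
}

static double max_speed_cm_s(double ips, int ticks_per_wheel_rev,
                             double wheel_circ_cm)
{
    return ips / ticks_per_wheel_rev * wheel_circ_cm;   /* row [11] */
}
```

Plugging in the table's values (2120 IPS, 3000 ticks per wheel revolution, 50.27 cm circumference) reproduces the ~42 counts per PID sample and the ~35.5 cm/s top speed.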

I finally got around to transferring all the electronics for the encoders to a small PCB:

Infrared Distance Measurement

Although sonars are cooler, I've decided to use cheaper Sharp infrared sensors for distance sensing. I went with the GP2D12 model, which can measure distances from 10cm to 80cm and outputs an analog voltage representing the distance to the object.

Below is a diagram showing the placement of the Sharp sensors (Note that the side sensors are installed vertically to maximize their resolution):

This arrangement should permit the processor to easily follow a wall by calculating the difference between the two distance readings taken from sensors on the desired side. The result can then be fed into the main PID controller so the robot will try to maintain its target velocity while constantly realigning with the wall.

Here is what the finished upper platter looks like:

Now, the only problem with those Sharp sensors is that their output is inverted and non-linear. This means a little more work is required to extract a distance (in cm, for example) from them. I've searched quite a bit on the net and found quite a few equations that people have come up with, and the one that seemed to match my sensors the best was:

Distance (cm) = 3402.5 * analog_reading^(-1.0427)

For this to work, make sure the ADC is set up so that the sensor's 0 - 2.5V output maps onto the full 0 - 255 range returned by the ADC; this maximizes the ADC's dynamic range.

It's possible to have the processor calculate the result of the equation after every ADC sample, but with 5 sensors and a decent sensor sample rate of 20Hz it would be awfully SLOW! (The ATmega128 I'm using does have a hardware multiplier, but the AVR flavor of the GNU GCC compiler doesn't fully support it yet.) Because I have plenty of FLASH (128KBytes!!) I decided to simply use a lookup table for speed. I wrote a simple little C program to generate the lookup table as a C array (see the source code for the actual table).

Here's the final Sharp IR range-finder connection PCB I made:


Note the addition of a 10uF capacitor between the Gnd and Vcc lines to reduce the noise created by the 5 sensors switching and drawing current.

Another thing I've read about and later experienced for myself is the amount of noise present on the analog output signal of the Sharp sensors: it can fluctuate by +/-200mV and sometimes spikes way higher! I've seen a few people on the web suggest connecting a 4.7uF (I used 10uF) electrolytic capacitor between the signal and ground to smooth it out. Watching the signal on an oscilloscope, it proved to make a huge difference! Keep in mind that some people say it can affect the way the sensor works (the cap acts as a low-pass filter?) but I didn't run into anything odd during my testing.
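For what it's worth, a bit of software filtering can complement the capacitor. A 3-sample median filter is a classic trick for killing the kind of single-sample spikes described above (this is a hypothetical helper, not something from my source):

```c
/* Return the median of the last three ADC readings. A single
   spiked sample ends up first or last after sorting, so it is
   discarded; a 10uF cap plus this covers both slow and fast noise. */
static int median3(int a, int b, int c)
{
    if (a > b) { int t = a; a = b; b = t; }
    if (b > c) { int t = b; b = c; c = t; }
    if (a > b) { int t = a; a = b; b = t; }
    return b;                     /* middle value of the three */
}
```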

Motion Sensor

I was planning on using one or two cheap outdoor motion detectors (the ones that turn on the lights when somebody is detected) I got for $10 apiece at Canadian Tire a while ago (I knew I'd use them sooner or later!). I had to hack them and do a bit of reverse-engineering to find out where I could tap out a useable signal. This great site came in very handy as it provides a sample schematic which most cheap motion detectors seem to use as a reference design. Here's the schematic in question.

The whole design is based on an LM324 quad op-amp IC: the two leftmost op-amps (IC1A & IC1B) are configured as a two-stage amplifier. The other two op-amps (IC1C & IC1D) are arranged as a window comparator which asserts one of two outputs whenever the amplified signal crosses a preset upper or lower voltage threshold. This greatly simplifies my job as I can just have the CPU monitor these outputs and determine the detected motion's direction. I decided to further simplify things (for the Trinity contest) and tap out the signal where the two op-amps' outputs are OR'ed together using the D3 and D4 diodes. Note that the rightmost part is unused in my case, as the CD4538 IC (a different one is used in my actual hardware) is only there to keep the AC lamp on for a specified number of minutes once motion is detected.

To summarize, I feed the processor the signal normally going to pin 4 of IC2 as a way to tell it there was "some" motion detected. Keep in mind that for the Trinity contest I will always know which way I'll be turning, and because the candle is static the direction of motion is irrelevant here. Future robots will have to decode the full direction to figure out which way the human/cat/dog was moving so they can follow them.

Hacking the El-cheapo motion sensors
A finished hack
temporarily mounted on robot with hood

Line Sensors

These are used to detect the "door frames" of the various rooms in the house. I've decided to go with the ubiquitous QRB1114 opto device like most other competitors, mostly because I happened to have two unused ones.

What a bad soldering job!
Alignment tests
Final assembly

Update: I decided to move the two line sensors closer to the middle of the robot to prevent them from hitting the new ramps. Technically, the optimal place would be right under the motor axis, as it would greatly simplify the "Align with Door frame" behaviour. Because of the lack of space to drill holes (batteries and motors/gearbox), I had to reuse the gearboxes' L-bracket fasteners to hold the two line sensors around 5cm in front of the motor axis. See picture below:

Update: I finally decided to move the two line sensors right under the motor axle as I should have done right from the start.

Update: OK! I know! These updates are getting ridiculous, but this time it was different. :) After the wheel redesign I had no choice but to move the line sensors again, as the ground clearance was so much larger that the L-bracket holding them couldn't cut it. I think this final design looks perfect! I made some nice brackets using old PC case expansion slot covers (you know, the ones you have to break out on new cases). The nice thing is that the sensors are real close to the wheels to prevent any mishap where a sensor would hit the floor items and bend or break.

Check it out:


I was planning on using an old high-speed 3" computer fan, but after trying a really old hair dryer motor/fan assembly I had lying around, I was pleasantly surprised to see how much it kicked ass! I'm not even sure what the voltage rating is on it, but I know that at 12V it's REALLY FAST and NOISY! The funny thing is, I only have 24V available on the top deck and, being lazy, I decided to run the fan at that voltage! It's like having the Concorde landing in your eardrum! :) It might not last long at that voltage though. It's so good that it blows out a candle in less than a second at more than 50cm away! hihihi

Here's a picture of the finished assembly:

Here's the schematic of the fan control electronics:

SCM (Smart Camera Module)

This is the I2C based module that interfaces with the B/W Gameboy camera. Currently working in prototype form.

Here's a shot of the prototype board and the hacked Gameboy camera with the IR filter:

Here's the final SCM PCB:

And this is the final camera hood:

SCM's schematic coming up...

Power Grid

After running into noise/grounding problems I decided to go back, do my homework and research the subject better. Sure, it would be nice to completely avoid the problem by using two separate batteries for logic and motors, but I'm already forced to cram in two heavy batteries just for the motors, so I'm not about to add a third one. Instead I'll redesign the power system so it reduces the noise as much as possible. Another feature I've been thinking about is a battery charging mode switch. I didn't feel like tackling this stuff right now as it's kind of boring compared with testing AI, but I knew I would feel like doing it even less in a few months, so I implemented it anyways. It's a really simple little switch that just disconnects the two batteries when I want to charge them.

Here's a diagram of a good power system I based my design on:

Other noise reduction tips (see Links section for more details):

Here's the base's power block PCB which handles the power for the encoders, PWM and GP2D12 boards:

I decided to use a Power Trends PT78ST105H DC-DC converter as it has a nice 85%+ efficiency and can handle a hefty 1.5A.

Currently, all the upper electronics use the Vcc from the STK-300 board's built-in LM317 regulator. I might have to add a separate switching regulator later to give the LM317 a break, as it's already getting pretty hot.

Update: Here's a picture of the top deck's power board I made to supply the SCM and any future additions. It uses a real nice LM2825 DC-DC switching regulator which is capable of 1A.

The fan relay has its own 7812 regulator to bring down the raw 24V to 12V.


AI Overview

For my last two robots, I have been using the Subsumption Architecture developed by Professor Rodney A. Brooks at MIT (R. Brooks, '85, '86, '89, '90). This form of behaviour control makes it easy to build and modify very powerful task-oriented robots.

This time though, I'll be using the more modern Three Layer Architecture (TLA). In short, it's a hybrid architecture which uses both schools of thought: the good old model-based or SPA (Sense-Plan-Act) approach AND the more recent behaviour/reactive-based approach derived from Mr. Brooks' controversial Subsumption Architecture. As the picture below depicts, this is accomplished by combining a high-level model-based PLANNER layer and a behaviour-based SUBSUMPTION layer with a low-level motion-control CONTROLLER layer.

This provides major benefits that the Subsumption architecture lacked, such as complexity management and taskability. The Subsumption approach used a very rigid set of priorities and rules which required a lot of tweaking and was thus really hard to maintain. Most importantly, it made it next to impossible to have the robot perform a different task short of a total redesign. With the TLA approach, the PLANNER can selectively enable/disable and configure each behaviour at run-time, effectively modifying the task of the robot.
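To make the idea concrete, here's a tiny sketch of what that run-time tasking could look like in C (the behaviour names and the `planner_set` function are made up for illustration; they're not the robot's actual API):

```c
#include <string.h>

/* Each behaviour the SUBSUMPTION layer can run, with a flag the
   PLANNER flips at run-time and a priority used by the arbiter. */
typedef struct {
    const char *name;
    int         enabled;
    int         priority;            /* higher wins in the arbiter */
} behaviour_t;

static behaviour_t behaviours[] = {
    { "avoid",       1, 3 },
    { "wall_follow", 1, 2 },
    { "cruise",      1, 1 },
    { "goto_candle", 0, 2 },
};

/* The PLANNER re-tasks the robot by enabling/disabling behaviours
   instead of rewiring a fixed subsumption priority network. */
static void planner_set(const char *name, int on)
{
    size_t i;
    for (i = 0; i < sizeof behaviours / sizeof behaviours[0]; i++)
        if (strcmp(behaviours[i].name, name) == 0)
            behaviours[i].enabled = on;
}
```

The key point is that changing the robot's task becomes a matter of flipping flags from the PLANNER, which is exactly the taskability Subsumption alone lacked.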

Development Tools & Source

To make this even more flexible I'll, again, be implementing all this on Larry Barello's excellent AvrX RTOS (Real-Time Operating System).

For development tools, I use the latest and greatest version (3.3) of the excellent GNU AVR-GCC compiler under Windows. (Thanks to Eric Weddington for his great WinAVR Windows distribution!)

Although the robot is far from complete I'll try to keep a more-or-less updated version of my source code available online for your viewing pleasure.


Motion Control

I've been reading a lot about closed-loop motion control algorithms and I definitely want to implement one for this project. Although a friend of mine suggested using fuzzy logic for this purpose, I think I'll go with a classic PID controller.

My whole motion control code is based on Larry Barello's excellent paper on Motion Control.

I've been toying with a few odometry implementations and I seem to be having problems with GCC's floating point implementation or something. During my research for the PID stuff I came across a really nice fuzzy logic tutorial, and after serious consideration I might ditch the PID controller in favor of a more flexible fuzzy controller. I was blown away by the simplicity of the whole thing. PID controllers are known to be hard to tune, and once they are, any change in load or system characteristics requires retuning. I read a research paper which compared a PID and a fuzzy controller, and fuzzy logic came out on top. It was even able to handle load and other changes without retuning!

Update: I will be sticking with a classic PID implementation for this first version of the robot as I have already enough work ahead of me.
I finally got the odometry code to work thanks to the newest version of AVRGCC! (3.3)
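For reference, a bare-bones fixed-point PID step looks something like this (the gain scaling and clamp values here are illustrative placeholders, not the tuned values from my source):

```c
/* Minimal fixed-point PID velocity step. Gains are scaled by 256
   so the whole loop stays in integer math, which matters on an AVR. */
typedef struct {
    long kp, ki, kd;          /* gains, scaled by 256 */
    long integral;            /* running sum of errors */
    long prev_err;            /* for the derivative term */
} pid_ctrl_t;

static int pid_step(pid_ctrl_t *p, long setpoint, long measured)
{
    long err = setpoint - measured;
    long out;

    p->integral += err;
    out = (p->kp * err
         + p->ki * p->integral
         + p->kd * (err - p->prev_err)) / 256;  /* undo the x256 scale */
    p->prev_err = err;

    if (out >  255) out =  255;                 /* clamp to 8-bit PWM */
    if (out < -255) out = -255;
    return (int)out;
}
```

Called once per 50Hz sample with the setpoint and the encoder tick count, its output feeds the PWM duty cycle.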

It's not very precise, but I only plan to use it for finding my way back to the door after entering and searching a room (and possibly extinguishing the candle!). To keep the errors to a minimum, the code will reset the odometry variables after the robot senses and aligns itself with the door-frame markers.

Wall Following Behaviour

This is a very important part of any firefighter robot as an efficient wall follower can negotiate the model house plan faster and without errors!

My algorithm is simple and, strangely enough, it worked better than many more complicated ones I've seen around the net. It uses a simple PD (Proportional-Derivative) controller which keeps track of the approximate distance to the wall (equation #1) and the current angle to the wall (equation #2). These two results are then mixed (equation #3) to produce an error signal which is then fed into a PD equation (refer to srx1.c in the source code folder):

  1. current_distance = (d_front + d_back) / 2
  2. current_attitude = (d_back - d_front)
  3. error = (DISTANCE_SETPOINT - current_distance) + current_attitude
The F_P and F_D constants need to be tweaked to adjust the aggressiveness and/or smoothness of the PD controller.
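Put together, the three equations and the PD step look something like this in C (the F_P, F_D and setpoint values below are placeholders; the tuned ones are in srx1.c):

```c
/* Wall following from two side-mounted IR sensors, one reading
   taken towards the front and one towards the back of the robot. */
#define DISTANCE_SETPOINT 20   /* desired wall distance in cm (placeholder) */
#define F_P 4                  /* proportional gain (placeholder) */
#define F_D 8                  /* derivative gain (placeholder)   */

static int wall_prev_err;

/* Returns a steering correction to add/subtract from the two
   wheel-speed setpoints. */
static int wall_follow_step(int d_front, int d_back)
{
    int dist  = (d_front + d_back) / 2;               /* equation #1 */
    int att   = d_back - d_front;                     /* equation #2 */
    int err   = (DISTANCE_SETPOINT - dist) + att;     /* equation #3 */
    int steer = F_P * err + F_D * (err - wall_prev_err);

    wall_prev_err = err;
    return steer;
}
```

Mixing the attitude term into the error is what lets a single PD loop hold both the distance and the angle to the wall at once.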

Last Updated on: June 10th 2003