NavBot



Description

This is my first attempt at a GPS navigation robot (even though I started more than 3 years ago!)... It's meant to participate in the local ORE's Magellan competition.

Progress Status

Adapt SMC's constants for the new encoders/drive train: Done.
Migrate source code from ARM to x86 so I can use a normal portable (mostly endianness issues): Done.
Find a way to add encoders to the output shaft of the Lazslo motors: Done (kludgy, but works).
Replace the kludgy low-count encoders: Done (stole the old home-made ones from SRX1!).
Integrate the gpsd daemon with the robot code: Done (created a pthread-based module and integrated it into the existing code).
Mount SRX2's camera at the front of the robot so it can easily spot the cone: Done (made a wooden platform to elevate the compass, GPS and camera).
Fix the broken sonar sensor: Done (replaced a faulty wiring harness).
Test compass input and the associated motor schemas: Done (the MOVEAHEAD schema worked without tweaking!).
Rewrite the planner's high-level FSM for basic waypoint navigation: In progress...
Integrate the AVOID schema to avoid obstacles: TODO (should be a simple one-line change, but let's get basic GPS navigation working first).
Test GPS-based driving outside (for the first time): TODO (just not practical at home).
Build a good bumper system, preferably a full bumper skirt that covers all around: Optional.
Use the camera tilt/pan base as a rotating turret to decouple camera tracking from the base: Optional.
Replace the SLA battery with the lighter LiPo from SRX2: Optional.
Replace the capricious belt drive with a direct or chain drive to reduce slack: Optional.
Integrate basic speech synthesis (eSpeak, Flite, Festival?) to signal that the cone was reached: Done (used eSpeak, as it's provided by Fedora's repos and the API was nice and simple).
Integrate libconfig so most variables/definitions (behaviours, MS, PS, etc.) can migrate to config files: Optional.
Replace those old loose wheels with something softer and more precise: Optional.

Processing

To avoid performance-related issues I ditched my trusty old NSLU2 ARM (ixp425) board and simply popped my Dell portable in there. This way I get LOTS of processing power for vision algorithms. It's currently running plain Fedora 13 but could theoretically use a custom Linux to speed up booting, etc...

Power

Using one of those gigantic 17 Ah SLAs. Heavy but reliable, although I could probably replace it with that nice set of LiPos I've got gathering dust...

Base

I'm reusing the old Xtreme Overkill Splatbot base to allow me to navigate over rougher terrains.

Motors

Because I'm reusing the Splatbot base, I use a set of those very powerful "Lazslo" 12V motors. The power is transmitted to the lawnmower wheels using a cogged belt.

My first attempt at adding encoders was less than satisfactory so I ended up "stealing" the home-made ones I made for SRX1! Who knew those would still be in use after so many years! And because they are glued to the motors' output shaft I get a decent max_tick_per_pid_period of 38. More than enough for an outdoor robot, especially considering how much slack I have in those belt-driven wheels.
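With that max_tick_per_pid_period of 38, tick counts map onto speeds as sketched below. Only the 38 comes from the text; the PID period, ticks per revolution and wheel circumference are assumptions picked for illustration.

```c
/* Only MAX_TICKS_PER_PID_PERIOD (38) is a real figure from the build
   log; the other constants are illustrative assumptions. */
#define MAX_TICKS_PER_PID_PERIOD 38
#define PID_PERIOD_S   0.125    /* assumption: 8 Hz PID loop            */
#define TICKS_PER_REV  100      /* assumption: encoder counts per rev   */
#define WHEEL_CIRC_M   0.5      /* assumption: lawnmower wheel          */

/* Measured speed in m/s from the tick delta of one PID period. */
double ticks_to_speed(int ticks)
{
    double revs = (double)ticks / TICKS_PER_REV;
    return revs * WHEEL_CIRC_M / PID_PERIOD_S;
}

/* Normalized speed (0..1) relative to the drivetrain's maximum,
   handy for scaling PID output. */
double ticks_to_fraction(int ticks)
{
    return (double)ticks / MAX_TICKS_PER_PID_PERIOD;
}
```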

Peripheral Access Module (PAM)

I'm reusing SRX2's PAM module to allow the portable to communicate with the I2C devices via USB.

Smart Motor Controller (SMC)

Why change what isn't broken, right? I'm using a slightly modified SMC codebase (accounting for the different drivetrain).

Input/Output Module (IOM)

The IOM is reused to simplify access to the Sonar and compass.

Digital Compass

The compass is a CMPS03 from Devantech.
It is read by the IOM using a software-based I2C master (my main I2C bus runs at 400 kHz, which causes trouble for the CMPS03).
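Per the CMPS03's published register map, register 1 holds the bearing as a single byte (0-255 over a full circle) and registers 2-3 hold it as a word in tenths of a degree (0-3599). A sketch of just the conversion step (the actual bus access goes through the IOM and is omitted):

```c
#include <stdint.h>

/* Convert the CMPS03's one-byte bearing (register 1, 0-255) to degrees. */
double cmps03_byte_to_degrees(uint8_t raw)
{
    return raw * 360.0 / 256.0;
}

/* Convert the 16-bit bearing word (registers 2-3, tenths of a degree,
   0-3599) to degrees. */
double cmps03_word_to_degrees(uint8_t hi, uint8_t lo)
{
    uint16_t tenths = (uint16_t)((hi << 8) | lo);
    return tenths / 10.0;
}
```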

Sonar

An old Devantech SRF04. A more recent one would be nicer, as I'm forced to read this one using software I2C (on the IOM).
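The SRF04 reports range as an echo pulse width rather than a register value; the usual rule of thumb is about 58 µs per centimetre of range (round-trip speed of sound). The conversion, assuming the IOM hands back the pulse width in microseconds:

```c
/* SRF04 echo pulse width to range: roughly 58 us per cm. */
unsigned int srf04_us_to_cm(unsigned int echo_us)
{
    return echo_us / 58;
}
```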

Might consider using a second one later to better avoid obstacles.

Camera

I've changed cameras once more, this time opting for a Logitech QuickCam 9000 with high-quality Zeiss glass lenses and a UVC interface to simplify driver support under Linux.
It's quite good, and although I tried hard to counter the automatic exposure adjustment, it ended up being a blessing: I noticed that once my colour thresholds are set, they remain almost dead on from indoors to outdoors!
The only side effect is that the frame rate varies according to the amount of light available, so indoors the frame rate can get as low as 5 fps!
To counter this effect I've tried to decouple the vision thread as much as possible from the faster-running sensor thread. In brighter lighting conditions, or outdoors, it runs at 15 fps+.
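One way to get that decoupling (a sketch with made-up names, not the actual NavBot code) is a mutex-protected "latest result" mailbox with a sequence number: the vision thread publishes at whatever frame rate the camera manages, while the fast sensor loop polls without blocking and can tell fresh data from stale.

```c
#include <pthread.h>

/* Hypothetical blob-tracking result published by the vision thread. */
typedef struct {
    int      cone_x, cone_y;  /* blob centroid in image coordinates */
    unsigned seq;             /* incremented on every new frame     */
} vision_result_t;

static vision_result_t vis;
static pthread_mutex_t vis_lock = PTHREAD_MUTEX_INITIALIZER;

/* Vision thread: publish the latest frame's result. */
void vision_publish(int x, int y)
{
    pthread_mutex_lock(&vis_lock);
    vis.cone_x = x;
    vis.cone_y = y;
    vis.seq++;
    pthread_mutex_unlock(&vis_lock);
}

/* Sensor loop: non-blocking read.  Returns 1 only if a frame newer
   than *last_seq has arrived, so a 5 fps camera never stalls the
   control loop and stale data is never mistaken for fresh. */
int vision_poll(vision_result_t *out, unsigned *last_seq)
{
    int fresh;
    pthread_mutex_lock(&vis_lock);
    fresh = (vis.seq != *last_seq);
    *out = vis;
    *last_seq = vis.seq;
    pthread_mutex_unlock(&vis_lock);
    return fresh;
}
```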

Software

AI

The software is mostly reused from SRX2's unfinished codebase. In short, it consists of a three-layer architecture composed of a motor-schema-based reactive lower level, a simple schema sequencer middle layer, and a static planner (merely an FSM for now) at the top.
The current prototype uses NPTL threading to separate the various modules into threads to maximize performance.
There is a machine vision thread that provides the perceptual schemas with vision data they can use to guide their decisions.

There is an executive thread that runs a TCP server, which responds to queries from a Tcl/Tk GUI over Ethernet/Wifi.
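The protocol between the executive thread and the GUI isn't described here, so the command set below is purely hypothetical; the sketch shows only the query-dispatch step, with all socket handling omitted.

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical query dispatcher for the executive's TCP server:
   parse a text query, write a text reply into the caller's buffer.
   Returns 0 on a recognized query, -1 otherwise. */
int handle_query(const char *query, char *reply, size_t len)
{
    if (strcmp(query, "PING") == 0) {
        snprintf(reply, len, "PONG");
    } else if (strcmp(query, "GET_STATE") == 0) {
        /* In a real build this would report the planner FSM's state. */
        snprintf(reply, len, "STATE WAYPOINT_NAV");
    } else {
        snprintf(reply, len, "ERR unknown query");
        return -1;
    }
    return 0;
}
```

Keeping the protocol line-oriented text makes the Tcl/Tk side trivial to write.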





Last updated on: July 4th 2010