Monte Carlo Localization (OTL)
Our robot can now localize itself using the sonar and the IR sensors. The localization is displayed in a nice format through the wonders of MapTool.
Cool Extra Things:
Real Synchronization Code!
We have replaced the previous busy-waiting loops with real synchronization constructs. Among the available choices we decided to use critical sections, which protect a region of code from being entered by more than one thread at a time.
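Our synchronization lives in the C++ side of the project; as a language-neutral illustration of the idea, here is a minimal Python sketch. The names (`particles_lock`, `shared_particles`, `add_particle`) are illustrative, not from our codebase; a `threading.Lock` plays the role of the critical section.

```python
import threading

# Hypothetical shared state that multiple threads touch.
particles_lock = threading.Lock()
shared_particles = []

def add_particle(p):
    # Only one thread at a time may execute this section;
    # everyone else blocks at the lock instead of busy-waiting.
    with particles_lock:
        shared_particles.append(p)

threads = [threading.Thread(target=add_particle, args=(i,)) for i in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The `with` block here is the critical section: entering it acquires the lock, leaving it (even via an exception) releases it.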
Further Organizing MapTool (esp. Robot)
Originally the Robot class had quite a few functions that, from an object-oriented view, were somewhat non-intuitive. This was remedied by separating some of the Robot class's functionality out into a new Instrument class (an abstraction over the sonars, IRs, and the camera) and the old Particle class (newly liberated from subclass status).
It is our team's belief that these objects' functions are now more intuitive. For example, you can ask each instrument for its real-world reading (.reading) and for the reading it would get from the map (.distToWallOnMap()). Additionally, distToWallOnMap() also works on particles, so you can get the sonar, IR, and camera readings for any particle with a single function call. Gone are the days of sitting down and thinking hard about when you need to convert to global coordinates or rotate the axes.
Each of the Instrument and Particle classes now has its own Draw() routine, which greatly simplifies the Robot's own drawing routine.
We have measured the distances from the center of rotation of our robot to each of the sensors (IR, sonar, and camera). As a result, our robot now uses the correct positions of the sensors relative to itself, eliminating a cause of large systematic disparity between real-world readings and the MapTool calculations.
Additionally, because of the Instrument abstraction, you only need to tell the instruments their offset once and they keep track of calculating all the translations and readings.
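The actual Instrument class lives in C++ (see Robot.cpp); the sketch below shows in Python what the abstraction might look like. Only `.reading` and `distToWallOnMap()` come from the description above; the toy `WorldMap`, the offset math, and all other names are assumptions for illustration.

```python
import math

class WorldMap:
    # Toy map: a single wall at x = 10; the ray-cast only handles rays
    # with a positive x component (purely illustrative).
    def raycast(self, x, y, theta):
        return (10.0 - x) / math.cos(theta) if math.cos(theta) > 0 else float('inf')

class Instrument:
    """Hypothetical sketch of the Instrument abstraction (sonar, IR, camera)."""
    def __init__(self, offset_x, offset_y, offset_theta):
        # Offset from the robot's center of rotation, set once;
        # the instrument handles all translations from here on.
        self.offset = (offset_x, offset_y, offset_theta)
        self.reading = None  # latest real-world sensor reading

    def dist_to_wall_on_map(self, pose, world_map):
        # Rotate the mounting offset into global coordinates for the given
        # pose (robot or particle), then ray-cast on the map from there.
        x, y, theta = pose
        ox, oy, otheta = self.offset
        gx = x + ox * math.cos(theta) - oy * math.sin(theta)
        gy = y + ox * math.sin(theta) + oy * math.cos(theta)
        return world_map.raycast(gx, gy, theta + otheta)

sonar = Instrument(0.1, 0.0, 0.0)  # mounted 0.1 units ahead of center
d = sonar.dist_to_wall_on_map((0.0, 0.0, 0.0), WorldMap())
```

Because `dist_to_wall_on_map` takes any pose, the same call works for the robot and for every particle, which is exactly the point of the abstraction.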
The calibrations can be found at the top of the Robot.cpp file.
Using IR to Help Localize
Yup, all three of them. They are used in much the same way as the sonar (just less accurately).
Culling Particles Based on a Heuristic
Each time ResampleParticles is called, particles are culled as follows. First, the 25% (tweakable) of particles with the highest probability are kept. Then a random 5% (tweakable) are also saved. All remaining particles are removed.
When spawning new particles to replace the ones that were removed, 10% (tweakable) are scattered randomly across the map; the rest are spawned near previously surviving particles.
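The culling and respawning steps above can be sketched as follows. The percentages are the tweakable values from the text; the particle representation (a probability plus an (x, y) position on a 10 x 10 map) and the helpers `random_particle` and `spawn_near` are assumptions for illustration.

```python
import random

def random_particle():
    # Hypothetical: uniform scatter over a 10 x 10 map, default probability 1.0.
    return (1.0, (random.uniform(0, 10), random.uniform(0, 10)))

def spawn_near(parent):
    # Hypothetical: jitter a surviving particle's position slightly.
    _, (x, y) = parent
    return (1.0, (x + random.gauss(0, 0.1), y + random.gauss(0, 0.1)))

def resample_particles(particles, keep_top=0.25, keep_rand=0.05, scatter=0.10):
    n = len(particles)
    ranked = sorted(particles, key=lambda p: p[0], reverse=True)
    cut = int(keep_top * n)
    # Keep the top 25%, plus a random 5% of the rest; cull everything else.
    survivors = ranked[:cut] + random.sample(ranked[cut:], int(keep_rand * n))
    # Respawn the culled particles: 10% scattered randomly,
    # the rest near previously surviving particles.
    n_new = n - len(survivors)
    n_scatter = int(scatter * n_new)
    fresh = [random_particle() for _ in range(n_scatter)]
    fresh += [spawn_near(random.choice(survivors)) for _ in range(n_new - n_scatter)]
    return survivors + fresh

pf = [(random.random(), (random.uniform(0, 10), random.uniform(0, 10)))
      for _ in range(100)]
pf = resample_particles(pf)
```

The random 5% guards against the filter collapsing onto a wrong hypothesis, and the random 10% scatter lets the filter recover if the robot is kidnapped.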
Using Overly Complex Math
The UpdateParticleProbabilities function uses Gaussian distributions to compute each particle's probability from the difference between the actual instrument readings and that particle's simulated instrument readings. See evalGauss() in Robot.cpp.
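A Python sketch of this style of weighting: an unnormalized Gaussian of the reading discrepancy, multiplied across sensors. The function names echo evalGauss()/UpdateParticleProbabilities but the implementation details, including the sigma value, are assumptions rather than what Robot.cpp actually does.

```python
import math

def eval_gauss(diff, sigma):
    # Unnormalized Gaussian of the discrepancy between a real and a
    # simulated reading; sigma models the sensor's noise (assumed value).
    return math.exp(-(diff * diff) / (2.0 * sigma * sigma))

def update_particle_probability(real_readings, simulated_readings, sigma=5.0):
    # One particle's probability: the product of the likelihoods of all
    # its sonar/IR readings, treated as independent.
    p = 1.0
    for real, sim in zip(real_readings, simulated_readings):
        p *= eval_gauss(real - sim, sigma)
    return p

w = update_particle_probability([30.0, 12.0], [31.0, 12.5])
```

A particle whose simulated readings match the real ones gets weight near 1; the weight falls off smoothly as the discrepancy grows, which is gentler on sensor noise than a hard threshold.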
New Particle Color Scheme
Particles that have just spawned are colored blue. Among the surviving particles the average probability is calculated; particles below the average are colored red, the rest green.
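The coloring rule can be stated compactly in Python. This is a sketch under assumed data structures (a dict of probabilities and a set of freshly spawned particles), not code from MapTool.

```python
def color_particles(probabilities, spawned):
    # probabilities: particle id -> probability; spawned: ids of new particles.
    survivors = {k: v for k, v in probabilities.items() if k not in spawned}
    avg = sum(survivors.values()) / len(survivors)
    colors = {}
    for k, v in probabilities.items():
        if k in spawned:
            colors[k] = 'blue'               # just spawned
        else:
            colors[k] = 'red' if v < avg else 'green'  # below/above average
    return colors

colors = color_particles({'a': 0.9, 'b': 0.1, 'c': 0.5}, {'c'})
```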
Really Cool Animated Gifs
How to Use the Code:
An Evolution object can then be created with Evolution('127.0.0.1', 5001, 5010, 5005) and controlled with the provided Python functions, or the MCL algorithm itself can be tested from within MapTool. When using evolution.py you must manually update the MapTool with updateMapTool().
Although only evolution.py and the updated MapTool are included below, base_drive_test.exe and Sonar.class are also necessary. All four must be running for the robot to correctly wander and localize using the IR and the sonar. Eventually the camera will be included as well, at which point you will also need to run the vision tools.
Note: Sonar must be run from JCreator
The Evolution class: evolution.py
The MapTool code MapTool.rar