Our idea for this project was to create a vision-based system that could track an object in a room with the robotic camera and then aim a laser at it. We planned to use the available servo motors and a Handyboard to manipulate the laser; the Handyboard would be controlled via a serial connection to the computer operating and interpreting the camera. Due to the possible ballistic applications of this device and our recent viewing of a particular Schwarzenegger movie, we dubbed this project the Predator.
The robotic camera is a Sony EVI-D30. It has four degrees of freedom: pan, tilt, focus, and zoom, all fully controllable over a serial connection. We used pre-existing code to control the camera. To obtain the actual frames of video, the computer uses a video capture card. The image manipulation tools comprise the Video for Linux kernel module, the Simple DirectMedia Layer libraries, and interface code provided by Ross Luengen and Chuck Schied.
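The EVI-D30's serial interface speaks Sony's VISCA protocol. As a hedged illustration only (this is not the pre-existing control code we used), a pan/tilt drive packet might be assembled like this, following the documented layout of header byte, command bytes, speeds, direction codes, and terminator:

```cpp
#include <cstdint>
#include <vector>

// Illustrative sketch of a VISCA "Pan-tiltDrive" packet for the EVI-D30:
// 8x 01 06 01 VV WW PP TT FF, where VV/WW are pan/tilt speeds and PP/TT
// are direction codes (e.g. 0x01 = left/up, 0x02 = right/down, 0x03 = stop).
std::vector<uint8_t> panTiltDrive(uint8_t panSpeed, uint8_t tiltSpeed,
                                  uint8_t panDir, uint8_t tiltDir) {
    return {0x81, 0x01, 0x06, 0x01, panSpeed, tiltSpeed, panDir, tiltDir, 0xFF};
}
```

The packet would then be written to the camera's serial port byte by byte.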
To track objects in the room, we implemented a vision algorithm. Our original hope was to have the capability to detect new objects (i.e., motion) in the room and then track and laser them. However, this proved rather difficult in our time frame, so we restricted the problem to detecting and tracking a blue Lego plate. Through trial and error, we isolated the approximate hue and saturation values that the plate reflected. We then implemented a vision algorithm that looks for pixels that are in the specified color range and have at least five of their eight neighboring pixels also in that range. The program then changes the color of these pixels in the video frame so that they can be easily identified in the output feed. After doing this, the program draws a rectangle around the pixels closest to each edge of the image. Finally, the predator program sends a directive to the camera in an attempt to center its view on the rectangle and thus keep the target in the frame.

Camera Code
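The five-of-eight neighbor test described above can be sketched in C++. This is a minimal illustration, not the project's actual code; `inRange` stands in for a precomputed per-pixel flag marking pixels whose hue and saturation fall in the tuned range:

```cpp
#include <cstdint>
#include <vector>

// Keep an in-range pixel only if at least 5 of its 8 neighbors are also
// in range; this suppresses isolated noise pixels while preserving the
// dense cluster that the blue plate produces.
std::vector<uint8_t> filterTarget(const std::vector<uint8_t>& inRange,
                                  int width, int height) {
    std::vector<uint8_t> out(inRange.size(), 0);
    for (int y = 1; y < height - 1; ++y) {
        for (int x = 1; x < width - 1; ++x) {
            if (!inRange[y * width + x]) continue;
            int neighbors = 0;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx)
                    if ((dy || dx) && inRange[(y + dy) * width + (x + dx)])
                        ++neighbors;
            if (neighbors >= 5) out[y * width + x] = 1;
        }
    }
    return out;
}
```

The surviving pixels are the ones the program recolors for the output feed and bounds with the target rectangle.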
The laser portion of the project consists of a Handyboard which controls two servos, on which a laser pointer is supposed to be mounted. The servos are mounted such that they can pan and tilt the laser, which matches up nicely with the range of motion of the camera.
The Handyboard provides easy control over the servos. To extend this control to the computer, we wrote an interrupt-driven serial handler for the Handyboard. We also crafted a rudimentary protocol between the server and Handyboard, allowing for a few basic commands such as absolute and relative movement and resetting the servos to their centered positions. Code to implement the protocol is written in Interactive C on the Handyboard side and C++ on the server side.
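As an illustration of the server side of such a protocol, the command set above could be framed as short ASCII messages. The single-letter opcodes and whitespace framing here are hypothetical, chosen only to show the shape of the encoding, not the actual byte format we used:

```cpp
#include <string>

// Hypothetical framing for the server-to-Handyboard protocol: one line
// per command, an opcode letter followed by pan/tilt arguments.
enum class Cmd { AbsoluteMove, RelativeMove, Reset };

std::string encode(Cmd cmd, int pan, int tilt) {
    switch (cmd) {
        case Cmd::AbsoluteMove:
            return "A " + std::to_string(pan) + " " + std::to_string(tilt) + "\n";
        case Cmd::RelativeMove:
            return "R " + std::to_string(pan) + " " + std::to_string(tilt) + "\n";
        case Cmd::Reset:
            return "Z\n"; // arguments ignored; recenter both servos
    }
    return "";
}
```

On the Handyboard side, the interrupt-driven serial handler would accumulate bytes until the newline and then dispatch on the opcode.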
We have not yet accomplished our goal of integrating the laser itself into the system. Our main hangup was the sub-par construction of the laser pointer we purchased. It functioned only sporadically, apparently having problems with the electrical contacts inside. Also, the batteries did not allow for extended periods of operation. We attempted to correct these drawbacks by gutting the laser like a trout. However, we were unable to get the laser to run from a power supply, and further attempts at modification resulted in ultimate failure of the unit, along with some slight burns and electrical shocks to one author's left thumb. Oh well, it was satisfying to rip that thing apart...

Laser/Servo Code
At this point in time, the camera follows the blue plate quite well. There are some serious slowdown issues somewhere between the camera and the frame-grabbing functions, which make the effective frame rate quite low. As such, the camera can easily lose quick-moving objects at short range. The camera also adjusts its position at maximum motor speed, which gives its movement a very creepy aspect.
The vision code is able to distinguish the blue plate from the rest of the room very well after we tuned it to a narrow range of hues and saturations. There is still some occasional noise from the other blue Legos, the recycling can, computer monitors, and various articles of clothing. Another problem, which remains unresolved, is a bizarre refresh behavior in the video frames. Somehow the upper portion of the video frames periodically seems to skip processing by our code and go straight to the output. This confuses the centering code into thinking the plate is that much shorter, and it tilts down to compensate. This is one contributing factor to the creepiness of the camera's movement. Despite these issues, the camera is able to pick up the plate due to its large, dense cluster of target pixels and the intermittent nature of the noise pixels. Most distractions are momentary. The way the target box is drawn makes the tracking forgiving of glare and other lighting effects on the plate. Even at substantial range, the camera is able to center on the plate.
Although we did not fully integrate the laser sub-system, the functionality that we did implement appeared to work smoothly. We successfully sent commands from the server to the Handyboard, and the control of the servos was responsive. The only real challenge remaining in this project is determining how best to have the camera detect the actual laser dot so that we can keep the laser centered on whatever the camera is tracking. Initial testing seemed to indicate that differentiating the laser from the surroundings could be accomplished without too much trouble based on the laser's high intensity.
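The intensity-based detection suggested above might be sketched as a simple brightest-pixel search over a grayscale frame. This is a speculative illustration of the approach, not implemented code, and the threshold value is an assumption:

```cpp
#include <cstdint>
#include <utility>
#include <vector>

// Find the brightest pixel in a grayscale frame, reporting it only if it
// clears a high threshold (assumed value; a laser dot typically saturates
// the sensor). Returns (-1, -1) when no pixel qualifies.
std::pair<int, int> findLaserDot(const std::vector<uint8_t>& gray,
                                 int width, int height,
                                 uint8_t threshold = 240) {
    int bestX = -1, bestY = -1;
    int best = threshold;
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
            if (gray[y * width + x] > best) {
                best = gray[y * width + x];
                bestX = x;
                bestY = y;
            }
    return {bestX, bestY};
}
```

The offset between the detected dot and the target rectangle's center would then drive the relative-movement commands to the servos.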