Final Report 2
History of the Project
Our robot and strategy for Botball changed quite a bit as the semester progressed. We originally focused on a two-bot strategy: the main bot, built around the Handyboard, would pick up ping-pong balls and deposit them in the nest, while a secondary RCX bot handled the foam ball. We designed a vacuum system using LEGO treads to scoop up the balls and put them in a hopper on the bot. The hopper would then be raised to drop the balls into the nest or a green basket. Meanwhile, the secondary RCX bot would manipulate the central nest, pulling it into our end-zone to earn us points for the foam ball.
However, we ran into problems with this strategy fairly quickly. The nest is heavy, so the motors on the RCX bot were neither powerful nor fast enough to move it. We also had trouble getting the main bot to knock over the toilet paper rolls in such a way that the balls were easily accessible without the TP rolls also getting pulled in. Size was another worry, as the Handyboard robot took up most of the allowed space, leaving very little room for the RCX bot.
We decided to redesign the bot and adopt a new strategy. Since the foam ball was worth a lot of points and was readily identifiable by color, we chose to concentrate on it. Putting the foam ball in a green basket on our side would earn substantial points, especially if we could move the basket to help keep the other team from taking it. A secondary goal was to pick up TP rolls, dump the ping-pong balls into a hopper on the robot, then take the opponent's green basket and put the balls in that.
To support this, we scrapped the idea of a second robot and decided to use both boards on a single bot. A claw was designed that could pick up the foam ball, manipulate the green basket (pushing and pulling), and pick up TP rolls. A hopper on the back of the bot was built to hold the ping-pong balls, with a gate to release them into the green basket. The camera was to be used for detecting both the orange foam ball and the green baskets. Because of the camera's dual-function role, a swivel arm was designed to rotate the camera so that it pointed either in front of or behind the bot. Sonar would be used to find the range to targets, and various other sensors would enhance the bot's ability to locate itself.
The current bot accomplishes the primary goal of picking up the orange foam ball and depositing it in the green basket. The camera and sonar are used to find the targets, and the claw picks up and deposits the foam ball. Support for picking up TP rolls was not added to the current claw design, so while the hopper can hold and deposit ping-pong balls, the bot cannot actually collect them. Still, we feel that putting the orange foam ball into the green basket is a significant accomplishment that should help us be competitive at the Botball competition.
Several design modifications occurred during development of the current bot. Because of the difficulty of acquiring and manipulating the foam ball, and the time constraints of the project, the last three weeks focused on developing a consistent algorithm to acquire and deposit the foam ball. As a result, the camera swivel was no longer necessary. To give the CMUcam a consistent angle and placement, the swivel was removed and the camera was mounted statically on the side of the bot, angled inwards. This improved the robot's performance consistency immensely.
Figure: A closeup of the current camera and sonar mountings.
Besides the sonar, CMUcam, and bump sensors, the original (second) bot design also called for odometry and an IR-tophat sensor, both intended to aid in localizing the bot. In practice, odometry proved too inconsistent to be useful: although the odometry calculations could in principle provide accurate orientation and position, the raw readings varied drastically between runs. The IR-tophat sensor, however, could reliably detect the electrical tape lines, and its readings were used to judge roughly where the bot was in the arena; this served as timing for the robot's initial rush to the nest.
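The tape check itself reduces to a threshold on the tophat's analog reading. The sketch below is only an illustration of that idea, not the robot's actual code; the threshold value and the assumed polarity (higher reading over the dark tape) would both need to be verified on the real sensor and arena.

```c
/* Hypothetical tape-detection sketch. On the Handyboard the reading
   would come from an analog port; here it is passed in directly so
   the logic stands alone. The black electrical tape reflects less IR
   than the white board, which we assume produces a HIGHER analog
   value over tape -- check the polarity on the actual hardware. */

#define TAPE_THRESHOLD 200  /* assumed cutoff; tune on the arena */

/* Returns 1 when the reading indicates the sensor is over tape. */
int over_tape(int ir_reading) {
    return ir_reading > TAPE_THRESHOLD;
}
```

Counting transitions of `over_tape` from 0 to 1 gives the number of tape lines crossed, which is what paces the initial rush.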
The program requires the following files from the CMUcam library: cmucamlib.ic, cmucam3.ic, and cmcsci2b.icb (the last is a binary assembled from cmcsci2b.asm). The robot's current program consists of the following script:
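As a rough illustration only, the top-level flow the script implements resembles the outline below. Every function name here is a hypothetical placeholder, not the actual code, and the ordering simply mirrors the behavior described elsewhere in this report.

```c
/* Hypothetical outline -- not the real script. */
void hmc_hammer_main(void) {
    init_camera();        /* cmucamlib.ic initialization        */
    rush_to_nest();       /* timed dash, paced by tape crossings */
    find_blob(ORANGE);    /* CMUcam tracks the orange foam ball  */
    approach_target();    /* sonar gives range to the target     */
    grab_ball();          /* lower and close the claw            */
    find_blob(GREEN);     /* locate a green basket               */
    approach_target();
    drop_ball();          /* lower claw and open over the basket */
}
```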
Performance Analysis
We bought a piece of semi-reflective white dry-erase board from Lowe's to finish our Botball arena. It is a very accurate reconstruction of the official arena.
Figure: The testing grounds for our robot
Here the green baskets are placed in one of the 4 possible configurations.
To analyze the performance of our robot we had the robot start in the starting area (the small taped rectangle) and execute the code listed above.
The robot was always able to find the orange blob and the green basket; however, the claw would not reliably lower and open, nor would it consistently drop the ball into the green basket. One in three trials resulted in a successful acquisition of the orange ball, and one in five in a successful drop into the green basket.
These appear to be systematic errors: the parameters governing the robot's distance from the green basket and the speed at which the claw is lowered need further tuning.
The difficulty finding the green basket can be ascribed to the sonar giving misleading readings when its beam axis does not pass through the center of the basket. Because the beam then bounces off the basket's curved side off-center, the measured range is larger than the true distance to the basket. These errors should be accounted for.
There are still a considerable number of improvements to be made before the robot is ready for competition. The most immediate work is on the code currently running on the robot: it needs to be more robust so the robot can recover if it misses putting the ball in the basket, and the depositing routine still requires more careful calibration so the bot does not accidentally throw the ball out of the arena. We also need to decide exactly how the robot should respond to the bump sensors while performing each of its tasks. One thing that will definitely need to change is the distance the robot currently backs up after hitting a front bump sensor. Eventually, code will also need to be added to start the robot automatically when a light is detected and to stop all of the robot's subroutines after 90 seconds.
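The planned start/stop logic can be sketched as two small predicates. This is an assumption-laden illustration, not existing robot code: on the Handyboard the light reading and clock would come from the Interactive C library, but here they are passed in as parameters so the logic stands alone, and the light threshold is a placeholder to be tuned against the actual starting bulb.

```c
/* Hypothetical start/stop sketch; all names and values are
   assumptions, not taken from the robot's actual code. */

#define LIGHT_THRESHOLD 100  /* assumed: lower analog reading = brighter */
#define MATCH_LENGTH 90.0    /* Botball matches last 90 seconds */

/* Returns 1 once the starting lightbulb is detected. */
int start_signal(int light_reading) {
    return light_reading < LIGHT_THRESHOLD;
}

/* Returns 1 while the robot should keep running its subroutines. */
int match_running(double start_time, double now) {
    return (now - start_time) < MATCH_LENGTH;
}
```

The main loop would spin on `start_signal` before moving, record the start time, and check `match_running` each iteration so every subroutine halts at the 90-second mark.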
Because the current static mounting of the CMUcam worked out well for the foam ball, we are reconsidering the front design of the robot to provide enough clearance for the camera to be centered below the arm. This would greatly simplify the code for our entire search routine and increase the consistency with which the robot aligns itself on the ball and/or basket. Another structural change that needs to be made is the addition of a light sensor to detect the starting-signal lightbulb. Also, during our many trials the rear bump sensors have proven a bit fickle, so they need to be redesigned into a sturdier, more reliable configuration like that of the front bump sensors.
If all of this is completed sufficiently before the conference date, we will reconsider the strategy of going after toilet paper rolls and ping-pong balls once the ball is in the basket, putting the LEGO RCX back into play. Otherwise, the RCX is simply a weight that serves to balance the robot, and we may remove it completely in favor of a smaller and thus more agile bot.
Figure: Perhaps without the RCX and hopper, the robot would be more agile?
Video Clips
HMCHammer in action