# Mallard Bane: Progress to Date

Here, we chronicle our epic struggles and ultimate success in the field of robotic duck hunting, on a week-by-week basis. We have broken the project down into semi-manageable chunks, which we can tackle in an orderly fashion. Of course, we assume that things will go horribly and hilariously wrong at times, and we will detail those events here as well. First, let's determine what it is, exactly, that we need to do.

# Problems To Solve

The problems we have to deal with in designing Bane fall into two areas: computer vision, identifying the duck and determining its position, and motion, moving the arm to fire at that duck. Here is a breakdown of the basic problems we have to deal with (some of these have yet to be done):
• Vision
  • Hardware
    • Camera
      • Verify that the camera will effectively capture the images.
      • Test against television sets to determine the level of artifacts present in the video signal due to the scan lines of the CRT.
      • Attach camera to chassis.
    • Capture Card
      • Test performance with camera.
  • Software
    • Interface with capture card to get pixel data.
    • Process pixel data.
      • Determine position of screen in view. (We cheated on this for the time being, and hard-coded the position of the screen. This doesn't really affect our process; it just makes computations faster, since we don't process pixels outside of our screen window.)
      • Determine position of duck on screen.
      • Determine position of duck in environment.
• Motion
  • Hardware
    • Determine if power supply is faulty. If so, get another one and test it.
    • Design trigger-pulling mechanism.
    • Make trigger-pulling mechanism function correctly.
    • Attach gun to tilt-pan device.
    • Attach laser pointer to gun.
    • Test speed of movement.
    • Test speed of firing.
  • Software
    • Given coordinates (x,y,z), determine angles to fire at.
    • Given relative positions of TV, camera, and gun, determine coordinates to fire at.
We have made the following progress:

# Week 1

• We determined that the power supply for the tilt-pan arm seems to be malfunctioning.
• We attached the good camera to the computer and successfully got the image to the screen.
• We attached the bad camera to the computer and got something to come on the screen.
• We looked at code on the box that previously was used to get data from the camera and control the arm. We are confident that we can do the same, given some time.
• We came up with a name for our robot (Bane, for short).
• We decided on a reactive architecture for our robot.
• We developed algorithms for locating the screen, the duck, and tracking.
• We made this snazzy website.

# Week 2

• We determined that our computer would no longer boot.
• We procured a new computer, and established a development environment for ourselves.
• We got a new fuse for our pan/tilt unit, and successfully reassembled it. It actually works now!
• We compiled and installed the SPU Toolbox, which gives us a nice C++ wrapper for controlling our hardware.
• We set a Nintendo and TV up in the Robotics lab, so now we can play Duck Hunt and Mario at any time.

# Week 3

• We can now directly manipulate the incoming image buffer. This will allow us to write duck-finding code.
• We now have programmatic control of the pan/tilt unit. Unfortunately, it is alarmingly slow.
• Simple image differencing code is complete. It allows us to detect motion effectively, which should make finding the duck much easier.
• Our driver program now allows various video operations to be performed, with a stylish text overlay that tells you which mode you are currently in.
• We tried a threshold mode, which differences the images and eliminates all pixels which are less bright than some threshold, but the duck is not really much brighter than some of the image noise.
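To make the differencing-and-threshold idea concrete, here's a minimal sketch in Python (the function name, default threshold, and plain-list frame representation are our own illustration, not the actual driver code):

```python
def diff_threshold(prev_frame, cur_frame, threshold=40):
    """Per-pixel absolute difference between two grayscale frames,
    zeroing anything dimmer than the threshold.

    Moving objects survive the differencing; static background (and,
    ideally, most noise) does not. Frames are lists of rows of 0-255
    intensities.
    """
    return [
        [abs(c - p) if abs(c - p) >= threshold else 0
         for p, c in zip(prev_row, cur_row)]
        for prev_row, cur_row in zip(prev_frame, cur_frame)
    ]
```

As the last bullet notes, picking the threshold is the hard part: the duck isn't much brighter than some of the image noise.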

# Week 4

• We determined the actual size and position of the playing field as it looks to our camera. This allows us to ignore pixels that can't possibly contain the duck, which makes everything run faster. We sort of cheated on this, since we coded in the position and size of the screen by hand. Eventually, Bane should self-detect the screen. But this is a problem for another day.
• We now have effective duck localization. We use a sliding window that is approximately duck-sized, and check the total intensity of pixels within the window. We guess that the duck is located in the window with the greatest intensity. Remember that we use image differencing before we do this, so that moving objects should be brightest.

A few notes on this technique:

It seems to work very well, at least up to level 9 or so. Maybe it works well past that too, but we got bored of playing and decided level 9 was good enough for now. Whenever the duck is flying around on screen, our localizer keeps it within the best-guess box. We have some pictures that show the box in red, as the image-differenced duck flies in the background.

It has no idea of whether a duck is actually on screen or not. The dog is usually selected when it's around, and when nothing is there, it seems to like the tree pretty well. We're not sure why the tree is even showing up, since typically trees don't move, but no firm conclusions on this yet.
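The sliding-window localizer described above can be sketched in a few lines of Python (window size and names are illustrative; as noted, the real code runs on the differenced frame):

```python
def locate_duck(diff_frame, win_w, win_h):
    """Slide a roughly duck-sized window over the differenced frame and
    return the (x, y) top-left corner of the window with the greatest
    total intensity: our best guess at the duck's position.
    """
    rows, cols = len(diff_frame), len(diff_frame[0])
    best_sum, best_pos = -1, (0, 0)
    for y in range(rows - win_h + 1):
        for x in range(cols - win_w + 1):
            total = sum(diff_frame[y + dy][x + dx]
                        for dy in range(win_h) for dx in range(win_w))
            if total > best_sum:
                best_sum, best_pos = total, (x, y)
    return best_pos
```

A brute-force scan like this is O(width × height × window area); a running-sum (integral image) trick would be faster, but at NES resolutions brute force is plenty. Note also the caveat above: this always returns some window, duck or no duck.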

# Week 5

• We got extra guns, so that if we break one, it won't be catastrophic. We'll be poking the insides of unknown electronics, and we'd hate to break Ian's gun. Besides, they only cost $6 each.
• We opened up a gun and identified the parts (check out the pictures page). We figured out how the trigger operates, and we now know that the gun will fire whenever we close a particular circuit. The plan is to accomplish this programmatically through our computer's parallel port. If you're interested in the internal workings of these guns, I recommend howstuffworks for a nice description.
• We made our pan/tilt unit move really fast. Previously, we were getting sluggish results, speed-wise. We did some hacking on the SPU-Toolbox (see files) to allow us to increase the panning and tilting speed. It now pans and tilts like you wouldn't believe. We think it might be fast enough that we won't need to consider the duck's movement that occurs during pan/tilt operations.
• We did some calculations for translating estimated duck locations (in (x,y,z) coordinates, which include distance to the television) into angular movement for our pan/tilt unit. We haven't coded the algorithm yet, but it looks good on paper.
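The paper math works out to ordinary spherical-coordinate angles. A Python sketch, assuming the pan/tilt unit sits at the origin with x to the right, y up, and z pointing at the television (the axis convention here is our own, for illustration):

```python
import math

def xyz_to_pan_tilt(x, y, z):
    """Convert a target point to (pan, tilt) angles in degrees.

    Pan rotates about the vertical axis; tilt is elevation above the
    horizontal plane, measured against the distance in that plane.
    """
    pan = math.degrees(math.atan2(x, z))
    tilt = math.degrees(math.atan2(y, math.hypot(x, z)))
    return pan, tilt
```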

# Weeks 6-7

• Spring break. Oh yeah.

# Week 8

• We got the parallel cable, plugged it in, and set some pins with nifty information pilfered from this page and this one, which contain more than you will ever need to know about parallel port programming, but not quite what we needed to know. Alas, our first technique did not work. We were unable to get the gun to fire by setting one wire as hot as we had hoped.
• We coded up the (x,y,z) targeting. Now, given coordinates, the pan-tilt unit will move to point at them.
• We noticed that the speed of the pan-tilt unit is erratic. We cannot count on it to go as fast as we thought.

# Week 9

• We made attempt the second at getting the parallel port to do our bidding, this time attempting to run it through a transistor that would connect the two wires from the trigger in the gun. Alas, for we were unsuccessful. The parallel port control isn't quite working as expected, and we can't seem to actually register voltage changes in the data pins. We suspect that once we figure out how to make this work, the transistor switch will work just fine. We're not exactly experts on circuits and so forth, but actual engineers have assured us that this should work, given the right control over the input voltage.
• We improved tracking of the duck. One problem that we've had is that, on certain ducks, our system tends to track the tree rather than the duck. Since this was sub-optimal behavior, we instituted a number of changes to the vision algorithm, including removing the red channel and establishing a threshold for duckness that the tree cannot meet. Now, although we don't always have a known target duck, we never have a target non-duck.
• Also, our camera has some weird color output when we hook it up to the computer. When you hold up the orange pliers in front of the camera, they show up as distinctly blue on the screen. This isn't a huge problem as long as the camera continues to do this. Bad things might happen if it all-of-a-sudden decided to give us the real colors.
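The "never target a non-duck" filter boils down to a per-pixel test. A Python sketch (the channel weighting and the threshold value are placeholders; the real numbers were tuned by hand):

```python
def is_duck_pixel(r, g, b, threshold=180):
    """Duckness test for one RGB pixel.

    The red channel is ignored entirely, and the remaining intensity
    must clear a threshold that the tree never reaches. A pixel (and,
    by extension, a window full of pixels) that fails this test is
    never treated as the duck.
    """
    return (g + b) / 2 >= threshold
```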

# Final Week

We did a lot this week, so we'll break it up into individual days.

### Day 1

We finally got the parallel port working! It's not clear whether we were addressing the wrong port or had the circuit wired incorrectly, but it works now. Unfortunately, we now know that the transistor circuit doesn't work the way we'd hoped. If you're wondering, the lp0 port address in Linux is 0x372. We need to get the circuit going, but we don't really know what's wrong.

After consulting Phil Vegdahl, who is an electrical engineering major, we determined that there is a 35 volt difference between the parallel port circuit and the photodiode circuit. This is apparently bad. We tried sticking a capacitor in there and sacrificed a duck to the god of electricity, but our circuit still didn't work and the duck left a big mess in the lab.

### Days 2-4

We did nothing. Surprisingly, the robot still doesn't work.

### Day 5

We're pretty desperate to get the gun to start firing, so we begged Professor David Harris, an electrical engineering super-guru, to help us. He kindly pointed out that we had no idea what we were doing, and that our problem could be trivially solved with a $2 relay switch from Radio Shack. Kevin ran out and got one, and lo and behold, the gun fired! Kevin was pretty happy, but he didn't realize that I was taking a movie and not just a picture. Notice how the screen is flashing--that means the gun is firing!
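For the record, the software side of pulling the trigger is simple once the relay handles the voltage side. One way to poke the parallel port's data register from Linux user space is through /dev/port (root required); the pin choice, pulse length, and the standard 0x378 base address here are illustrative, since our own box answered at 0x372:

```python
import os
import time

DATA_REG = 0x378  # typical lp0 data-register address (ours showed up at 0x372)

def pin_mask(pin):
    """Byte value that drives data pin D0..D7 high (one pin at a time)."""
    return 1 << pin

def pulse_pin(pin, seconds=0.1, base=DATA_REG):
    """Raise one data pin, hold it, then drop all pins low.

    Writes straight to the I/O port through /dev/port, so this needs
    root. With the relay wired across the trigger contacts, one pulse
    is one shot.
    """
    fd = os.open("/dev/port", os.O_WRONLY)
    try:
        os.pwrite(fd, bytes([pin_mask(pin)]), base)
        time.sleep(seconds)
        os.pwrite(fd, bytes([0]), base)
    finally:
        os.close(fd)
```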

### Day 6

We soldered our circuit together and attached it to the gun. We put the gun back together after making holes to mount it to the pan/tilt unit and to allow parallel port inputs. We had pictures, but someone deleted them from the camera. Just imagine the normal insides, but with a lot more tape and wires everywhere. We then went down to the wood shop and mounted the gun on the pan/tilt unit, then mounted the pan/tilt unit on a wooden base so that it couldn't flail around as it aimed.

We dragged the unit back to the lab and hooked everything up to our driver program. Firing worked correctly, but the tracking code needed a lot of calibration. After fiddling around with it for a while, we hard-coded in the perfect magic numbers and Mallard Bane got all the way to level four.

Now, level four is a perfectly respectable level, if you're blind and are having a seizure. From Mallard Bane, we expected a little bit more. We went to work on the tracking code, and added a few new heuristics and upgraded a few old ones.
• When we think we know where the duck is, we search only a small region proximal to that point.
• We were losing a lot of bullets when Bane fired at the dog (we hate that dog!), so we stopped it from shooting at the three places where the dog can pop up.
• We ensured that the gun fires no more than every 0.5 seconds, so that we never waste all of our bullets immediately and never fire at the flash from a previous shot.
• We check that the gun is close to where we think the duck is before we fire.
• We added prediction code, which helps account for the lag between the frame data we get from the camera and the actual game state. The code extrapolates the duck's movement over the last frame three frames into the future. This can fail if the duck changes direction, but works pretty well otherwise. The lag we experienced is due to the image capture software that we used; the overhead added by our tracking code is not noticeable.
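Two of these heuristics are easy to show concretely: the rate limit and the linear prediction. A Python sketch (the names and the frame-coordinate convention are our own illustration):

```python
def may_fire(now, last_shot_time, cooldown=0.5):
    """Rate limiter: allow a shot only if the cooldown has elapsed, so
    we don't empty the clip instantly or shoot at our own muzzle flash."""
    return now - last_shot_time >= cooldown

def predict_position(cur, prev, frames_ahead=3):
    """Linear prediction: assume the duck keeps the velocity it showed
    over the last frame, and extrapolate a few frames into the future
    to compensate for capture lag. Positions are (x, y) in pixels."""
    dx, dy = cur[0] - prev[0], cur[1] - prev[1]
    return (cur[0] + frames_ahead * dx, cur[1] + frames_ahead * dy)
```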

### Day 7

We were scheduled to give a demo of our robot to the class in the afternoon, so we spent the morning tweaking our code. By the time class rolled around, we were getting to level 10 consistently and level 11 on occasion. Not quite as good as we'd hoped, but still quite good considering the lag incurred by the image capture.

# Future Work

We're pretty happy with Mallard Bane the way it is now, but it could be even cooler. Here's the extra stuff we'd do, if we had the time and motivation:
• Automatic detection of the screen. Right now, we have hard-coded the position and size of the television. It would be cooler if Bane could automatically find the TV and determine its position and size within the program.
• Improve the vision algorithm. Ducks come in several colors, and we have a hard time seeing blue ones.
• Play the two duck game. Right now, Bane only plays with one duck. It can't play with two because it tends to shoot the falling corpse of the first duck it kills.
• Eliminate the lag between the screen and the image buffer. There are a few ways to go about this. We could try a different library; it's possible that ours is slower than it needs to be. We could use our current library, but strip it down so that we don't spend any time on operations that we don't need, such as drawing the image on the screen. We could even use an emulator on the computer to drive the on-screen game and take the frame data directly from the emulator, but this feels like cheating to me (Kevin likes it, though).
• We could add clay pigeon mode. But who the hell cares about clay pigeon mode? It's not called Clay Pigeon Hunt, after all.
• Play more Duck Hunt.