Interfacing with a webcam to interpret movement and drive a remote control car.
HOW WE DID IT:
Our project used the OpenCV computer vision library with VC++ to interpret movement in a live image from a webcam. Scanning the image for the color black (it can search for any color), the library determines the placement of the user's hand. The program then sends commands to the RC controller, via an Arduino Uno, to move the car accordingly. The image viewed by the camera is broken into 7 blocks: the top-left block tells the controller to move forward and left, top center means forward, and so on. The middle section is considered neutral and isn't scanned at all, to speed up processing; it is where the user's hand should rest at startup or whenever they don't plan on moving. OpenCV checks the amount of the target color it finds in each of the other 6 sections and determines a winner for what to output to the remote control. To make driving the car feel more natural and intuitive, we pointed the camera straight up at the ceiling so that moving the hand forward or backward moves the car the same way.
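The zone-scanning and winner logic described above can be sketched in plain C++ (no OpenCV needed). The grid layout, threshold values, and names below are our own illustration of the idea, not the exact code we ran; we assume a top row of three blocks, an unscanned neutral middle band, and a bottom row of three blocks, for seven regions in total:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Six active zones plus the neutral middle band (never scanned).
enum Zone { FwdLeft, Fwd, FwdRight, BackLeft, Back, BackRight, Neutral };

// image: grayscale pixels, row-major. Pixels darker than `threshold`
// count as "hand". Returns the active zone holding the most dark
// pixels, or Neutral if no active zone has any.
Zone pickZone(const std::vector<unsigned char>& image,
              std::size_t width, std::size_t height,
              unsigned char threshold = 64) {
    std::size_t counts[6] = {0};
    std::size_t bandTop = height / 3, bandBottom = 2 * height / 3;
    for (std::size_t y = 0; y < height; ++y) {
        if (y >= bandTop && y < bandBottom) continue; // skip neutral band
        for (std::size_t x = 0; x < width; ++x) {
            if (image[y * width + x] >= threshold) continue; // not dark
            std::size_t col = std::min<std::size_t>(x * 3 / width, 2);
            std::size_t row = (y < bandTop) ? 0 : 1;
            ++counts[row * 3 + col];
        }
    }
    std::size_t best = 0;
    for (std::size_t i = 1; i < 6; ++i)
        if (counts[i] > counts[best]) best = i;
    return counts[best] == 0 ? Neutral : static_cast<Zone>(best);
}
```

Skipping the middle band entirely, as in the real program, saves a third of the per-frame pixel work.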
The computer constantly sent commands to an external controller (the Arduino), which triggered the car's original remote control inputs to transmit each command to the car. To get acceptable range from the remote we had to supply it with a 9-volt power supply, which in turn required logic-level buffering between the Arduino and the controller. We used simple NPN transistors as digital switches to buffer between the Arduino and the RC controller (5 V to 9 V).
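The PC-to-Arduino command path might look like the following sketch. The one-byte serial protocol, the command letters, and the bit assignments are all hypothetical, since the write-up doesn't record the actual format; on the Arduino, each set bit would become a digitalWrite(pin, HIGH) on the line driving one NPN base, which in turn closes one button on the 9 V remote:

```cpp
#include <cstdint>

// Hypothetical mapping from a one-byte command (sent over serial by
// the PC) to the four output lines: bit 0 = forward, bit 1 = reverse,
// bit 2 = left, bit 3 = right.
std::uint8_t commandToLines(char cmd) {
    switch (cmd) {
        case 'F': return 0b0001;          // forward
        case 'B': return 0b0010;          // reverse
        case 'Q': return 0b0001 | 0b0100; // forward + left
        case 'E': return 0b0001 | 0b1000; // forward + right
        case 'Z': return 0b0010 | 0b0100; // reverse + left
        case 'C': return 0b0010 | 0b1000; // reverse + right
        default:  return 0;               // neutral: all lines low
    }
}
```

Keeping the mapping in one table-like function makes it easy to guarantee that contradictory lines (forward and reverse together) can never be raised at once.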
We quickly found that driving the car was complicated because it was hard to keep an eye on both where the car was and where your hand was within the webcam's view. To fix this, we mounted a camera on the car and projected its streaming video onto a wall. This gave a feeling of first-person driving and, while still a bit tricky, added a level of fun and entertainment for the user. Because the car was now driven by watching its video feed, we realized there was no need to even have the car in the same room as the driver, so for demo purposes we put the car in the hallway. The hallway allowed a longer stretch of open area to drive in (i.e., no one would step on the car accidentally). Driving the car feels similar to controlling military unmanned vehicles, which isn't something people get to do every day (bonus).
There are many uses for unmanned vehicles (EOD bomb searches, military aerial surveillance), but the average person doesn't usually get a chance to operate one. We could have simply strapped a camera on the car and controlled the remote by hand, but where is the challenge in that? Interfacing through a computer also opened up the possibility of more advanced modes of remote control, such as Bluetooth or satellite communication. For our purposes, this setup makes a rather neat game because of its entertainment value and challenge. It was plain to see at demo night who had a lot of gaming experience and who didn't, because it showed in how well they drove the car.
This project has many more possibilities that, due to time constraints, we were not able to implement. The setup could easily be modified to control two cars at the same time; a track could then be created so two players could race each other. This was the original intent, but when we found out that the two remote control cars we bought were on the same RF frequency, we decided to focus on getting just one car working. Another idea would be finer control of the vehicle's speed, speeding up based on how far into a block the driver's hand is. We tried to slow down forward movement by pulsing the forward trigger, but modifying the car itself and controlling the voltage to the motor with a DAC or PWM would allow smoother driving and more speed settings. In the program, the color-tracking thresholds should be tightened further around the desired color (black, in our case), as the tracker would sometimes pick up things like shadows or the user's sleeves. Finally, we could overlay the grid onto the video feed from the camera mounted on the car, and on that same video the computer could indicate where the user's hand is located on the grid. This would make the first-person remote driving much more intuitive and simpler to use. We could also have installed a clear screen to act like a track pad for the user while still letting the webcam see the threshold color. This setup would allow users to feel their way across the screen while still paying attention to the video feed.
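The pulsed-forward workaround mentioned above amounts to a coarse software duty cycle on the forward line. A minimal sketch of the idea, with illustrative (not our actual) period and on-time values:

```cpp
// With no DAC or motor PWM available, speed is reduced by toggling
// the forward trigger on a coarse software duty cycle. Here the line
// is held high for the first `onMs` milliseconds of every `periodMs`
// window; both values are made up for illustration.
bool forwardLineOn(unsigned long millisNow,
                   unsigned long periodMs = 100, unsigned long onMs = 40) {
    return (millisNow % periodMs) < onMs;
}
```

A real DAC or PWM drive on the motor itself would avoid the jerkiness this software pulsing produces, which is why we list it as the better fix.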