First, the camera detects all the color blobs within the camera frame. Each blob's color, along with its horizontal and vertical position, is encoded into a 7-character string, which is then sent to the Feather board over the I2C protocol. The Feather board iterates through the list of strings; if a color code matches the color the Feather is currently seeking (in this case red), it maps the blob to the quadrant of the frame it occupies. This functionality can be extended to track a blob without accounting for its depth. For example, if the camera sees a green blob, it can send that information to the Feather, which will then keep the blob centered in the frame as the whole robot moves toward it.
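
As a rough illustration of the Feather-side parsing step, here is a minimal Python sketch. The 7-character layout (one color-code letter followed by three-digit x and y centroids, e.g. "R160120"), the QVGA frame size, and the `light_quadrant` helper are assumptions for illustration, since the exact encoding and LED-panel driver are not spelled out here.

```python
# Minimal sketch of the Feather-side logic, assuming a 7-character message
# layout of one color letter plus 3-digit x and 3-digit y centroids,
# a 320x240 (QVGA) camera frame, and a 2x2 quadrant layout on the LED panel.

FRAME_W, FRAME_H = 320, 240   # assumed OpenMV frame size
SEEKING = "R"                 # color the Feather is currently seeking (red)

def parse_blob(msg):
    """Split one 7-character message into (color, x, y)."""
    return msg[0], int(msg[1:4]), int(msg[4:7])

def quadrant(x, y):
    """Map a blob centroid to one of four quadrants of the frame."""
    col = 0 if x < FRAME_W // 2 else 1
    row = 0 if y < FRAME_H // 2 else 1
    return row * 2 + col      # 0=top-left, 1=top-right, 2=bottom-left, 3=bottom-right

def handle_payload(payload):
    """Iterate over the concatenated 7-character strings read over I2C."""
    for i in range(0, len(payload) - 6, 7):
        color, x, y = parse_blob(payload[i:i + 7])
        if color == SEEKING:
            light_quadrant(quadrant(x, y))   # hypothetical LED-panel helper

def light_quadrant(q):
    print("lighting quadrant", q)            # placeholder for the panel driver

# Example: one red blob left of center and below the midline.
handle_payload("R100150")                    # -> lighting quadrant 2
```

In this sketch the I2C read itself is left out, since it depends on the specific Feather firmware; the payload is simply the concatenation of the 7-character strings the camera sends for each detected blob.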

In the video above, the case of the phone recording the video is red. As the phone moves around to record, the OpenMV camera detects the phone's motion and maps it to the LED panel.
