Successful Primitive Blimp Automation
Shahrul Kamil bin Hassan (Kamil)

blimp autonomy, visual processing, i2c

Last week, the engineering effort focused on setting up the hardware and building the blimp and the other required electronic components. Initially, we could not set it up using the balloon we bought from Amazon, so we had to reuse the blimp from the previous tournament. Because it was only delivered back after that event, the blimp was in terrible condition. It took us a day to re-iron every visible leak to make it somewhat usable. Even after that, there seem to be some smaller, non-visible leaks, which limit how long we can test the blimp's functionality.
The video above shows the blimp successfully tracking a green balloon in the air. There are, however, some limitations that we now need to consider. Camera detection is not very reliable against the reflective green balloon we used: the camera detects only a portion of the balloon rather than the whole body, so the "center" of the blob the camera detects may not be the center of the balloon itself. This would be a problem when we want the green balloon to go into our capturing mechanism as the blimp moves toward it. This week we want to focus on this issue, in addition to adding a calibration mechanism to the blimp.
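The off-center blob problem can be reproduced in a few lines of NumPy. The mask below is a synthetic stand-in for a real color-threshold output; the frame size, balloon position, and the "reflections wash out the right half" scenario are all assumptions chosen for illustration:

```python
import numpy as np

def blob_centroid(mask):
    """Centroid (x, y) of the foreground pixels of a binary mask."""
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())

# Full balloon: a disc centred at (32, 32) in a synthetic 64x64 frame.
yy, xx = np.mgrid[:64, :64]
balloon = (xx - 32) ** 2 + (yy - 32) ** 2 <= 15 ** 2

# Reflections wash out the right half, so the colour threshold
# only keeps the left half of the balloon.
detected = balloon & (xx < 32)

print(blob_centroid(balloon))   # (32.0, 32.0) -- the true centre
print(blob_centroid(detected))  # x shifted well to the left of 32
```

The tracker would steer toward the second centroid, which sits several pixels off the true center, so the balloon could slide past the capture mechanism even though tracking "looks" correct.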
This video shows the blimp seeking the green ball, but there are still some issues in the control and visual processing:
- The direction of the propeller that controls altitude is reversed
- The camera currently uses only color information as input, which is almost certainly not robust enough for the competition
Some suggestions for these issues:
- Write code to easily reverse the direction of each propeller (e.g. from the Dash website)
- Use not only color information but also shape, and possibly a machine learning algorithm, to discriminate green balls
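For the reversed propeller, one lightweight option is a per-motor sign multiplier that a Dash callback can flip at runtime. This is only a sketch under assumptions: the motor names and the `motor_command` helper are hypothetical, and the actual ESC/driver call is omitted:

```python
# Hypothetical per-motor polarity table; flipping an entry to -1
# reverses that propeller in software, without rewiring.
# A Dash UI callback could toggle these values live.
MOTOR_DIRECTION = {"left": 1, "right": 1, "vertical": -1}

def motor_command(name: str, throttle: float) -> float:
    # Signed throttle to hand to the (omitted) motor driver.
    return MOTOR_DIRECTION[name] * throttle

print(motor_command("vertical", 0.5))  # -0.5: the reversed altitude axis
print(motor_command("left", 0.5))      # 0.5: unchanged
```

Keeping the polarity in one table also means the fix survives a rebuild of the blimp, where motors may end up mounted the other way again.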
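As a first step beyond pure color, a cheap shape cue is circularity, 4πA/P², which is close to 1 for a disc and much lower for elongated or irregular blobs. The sketch below estimates it from a binary mask with plain NumPy; the boundary extraction is a crude 4-neighbour approximation, and any accept/reject threshold would be an assumption to tune on real footage:

```python
import numpy as np

def circularity(mask: np.ndarray) -> float:
    """Rough 4*pi*area / perimeter**2 for the foreground of a binary mask."""
    mask = mask.astype(bool)
    area = int(mask.sum())
    if area == 0:
        return 0.0
    # Boundary pixels: foreground pixels with at least one background
    # 4-neighbour (pad with background so frame edges count as boundary).
    p = np.pad(mask, 1)
    interior = p[:-2, 1:-1] & p[2:, 1:-1] & p[1:-1, :-2] & p[1:-1, 2:]
    perimeter = int((mask & ~interior).sum())
    return 4 * np.pi * area / perimeter ** 2

yy, xx = np.mgrid[:64, :64]
disc = (xx - 32) ** 2 + (yy - 32) ** 2 <= 20 ** 2  # balloon-like blob
bar = np.zeros((64, 64), dtype=bool)
bar[30:34, 10:50] = True  # a long thin green object, e.g. a sleeve

print(circularity(disc) > circularity(bar))  # True: the disc scores higher
```

This filter would run after the color threshold, rejecting green blobs that are not round before the centroid is computed, and could later be replaced or augmented by a learned classifier.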