14 Aug

During SURP 2021’s Week 8, Bhavik Joshi (Team Arnhold), Marisa Duran (Team Arnhold), and I managed to integrate the vision processing and feedback-controlled cars epics. Using Marisa’s OpenMV code and my code for controlling the cars, we were able to put an OpenMV camera in one of my four-wheeled o...

14 Aug

During SURP 2021’s Week 7, Bhavik Joshi (Team Arnhold) and I created a swarm of car robots using his Mesh Networking code and my cars. The cars were controlled by a single joystick through Sudarshian Seshadri’s (Team Arnhold) React-app dashboard. A gif of the car swarm is attached below. ...

11 Aug

The video in the last post depicted very slow detection of the ball. I also tested a CPU-trained model, but, as expected, it was quite slow. To improve the performance of the algorithm, I replaced the original FasterRCNN model (torchvision.models.detection.fasterrcnn_resnet50_fpn()) with a mo...
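
A minimal sketch of this kind of swap, using torchvision's MobileNetV3-FPN FasterRCNN variant as the lighter model; this is shown only as one plausible candidate, not necessarily the replacement actually used:

```python
import torch
import torchvision

# Original, heavier detector:
# model = torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True)

# Lighter alternative with a MobileNetV3 backbone (an assumption, see note above)
model = torchvision.models.detection.fasterrcnn_mobilenet_v3_large_fpn(pretrained=True)
model.eval()

with torch.no_grad():
    frame = torch.rand(3, 480, 640)    # one RGB frame, CHW, values in [0, 1]
    predictions = model([frame])       # list of dicts with boxes, labels, scores
print(predictions[0]["boxes"].shape)
```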

11 Aug

Yesterday, I successfully integrated the camera with the dashboard (and the ESP32 featherboard). In the video demo below, I have the featherboard calling functions on the OpenMV camera, asking it to detect an AprilTag and return rotation information (x, y, and z Euler angles from the AprilTags). The...
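
For reference, a minimal OpenMV-side (MicroPython) sketch of AprilTag detection with an Euler-angle readout; this is an illustration rather than the exact camera script from the demo, and the RPC plumbing back to the featherboard is omitted:

```python
import sensor, math

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QQVGA)   # small frames keep AprilTag detection fast
sensor.skip_frames(time=2000)

while True:
    img = sensor.snapshot()
    for tag in img.find_apriltags():
        # Rotations are returned in radians; convert to degrees for readability
        rx = math.degrees(tag.x_rotation())
        ry = math.degrees(tag.y_rotation())
        rz = math.degrees(tag.z_rotation())
        print("tag", tag.id(), rx, ry, rz)
```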

11 Aug

I merged the ball detection code with the distance estimation code and added a camera component to make it live. The following video shows the code running live on CUDA. Model prediction is very slow on the CPU; I am currently checking out techniques such as quantization to optimize the model. A...
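
As a concrete example of the quantization idea, here is a rough PyTorch sketch of dynamic int8 quantization applied to the detector's Linear layers; this is one candidate technique for CPU speedup, not something already adopted:

```python
import torch
import torchvision

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True)
model.eval()

# Replace nn.Linear modules (e.g. in the box head) with dynamically quantized
# int8 versions; weights are stored as int8 and activations quantized on the fly.
quantized = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)

with torch.no_grad():
    out = quantized([torch.rand(3, 480, 640)])   # CPU inference on one frame
print(out[0].keys())
```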

09 Aug

We simulated the BOEM algorithm against two other offline algorithms. Since these two algorithms require a long time to run, we simulated them with sliding windows. In the first scenario, the block size of BOEM started at 9 seconds and grew geometrically. The sliding windows for both EM and optimizati...
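
As a rough illustration of the geometric block growth (the growth ratio and total horizon below are placeholders; only the 9-second initial block is fixed above):

```python
def boem_block_sizes(first_block_s=9.0, ratio=1.5, total_s=600.0):
    """Generate geometrically growing block lengths until the horizon is filled."""
    sizes, elapsed, size = [], 0.0, first_block_s
    while elapsed + size <= total_s:
        sizes.append(size)
        elapsed += size
        size *= ratio  # each block is `ratio` times longer than the previous one
    return sizes

print(boem_block_sizes())
```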

07 Aug

I'm currently designing letters for our swarm of robots. I used Cartesian coordinates to represent the points of the letters and made new components for L, E, and M. The SVG screenshots for L and M are below. Both appear to be upside down or "rotated" because, according to Prof. Mehta, the y coordinates incr...
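
A hypothetical sketch of how a letter might be stored as Cartesian waypoints and mirrored vertically so it renders right side up; the points and helper here are illustrative, not the actual letter components:

```python
# Waypoints for an "L", drawn with y increasing upward: down the stem, then across the base.
L_POINTS = [(0, 3), (0, 0), (2, 0)]

def flip_y(points, height=3):
    """Mirror points vertically for a screen/SVG frame where y grows downward."""
    return [(x, height - y) for (x, y) in points]

print(flip_y(L_POINTS))
```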

07 Aug

Here is the link to the slides for von Mises Fisher Consensus as presented on Thursday during the lab meeting: https://docs.google.com/presentation/d/1O-YE76NR5W3VspuGz8c7o3kw252khtkzwVUPmlhnfqY/edit?usp=sharing

06 Aug

For SURP 2021’s Week 6, I incorporated IMU and Lidar sensors into the car with a tank-drive system. Currently, the Lidar provides obstacle detection at the front of the car: the car stops when it detects an object closer than 130 mm. For the IMU, it only shows the yaw values in degrees using the...
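
A hypothetical sketch of the obstacle-stop logic described above; the sensor and motor functions are placeholders, not the car's actual firmware API:

```python
STOP_DISTANCE_MM = 130  # stop when an object is closer than this

def read_lidar_mm():
    """Placeholder for the front-facing Lidar driver; returns distance in mm."""
    return 500

def set_tank_drive(left, right):
    """Placeholder for the tank-drive motor command (-1.0 to 1.0 per side)."""
    print("drive", left, right)

def step():
    """One control iteration: halt if an obstacle is within the stop distance."""
    if read_lidar_mm() < STOP_DISTANCE_MM:
        set_tank_drive(0.0, 0.0)   # obstacle too close: stop
    else:
        set_tank_drive(0.5, 0.5)   # clear path: keep driving forward

step()
```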

05 Aug

In the past few days, I have been working on integrating my visual processing work with what the rest of the Arnhold team has done so far. AprilTags will be integral to our demonstration next week, so I have been trying to determine the ideal resolution where the most area on...