I worked with the team of undergrads to develop this initial list of requirements for the dashboard. It's based on the kinds of experiments we plan to run and the kinds of data that will be relevant. This is a preliminary list, of course, but I've specified a modular design so that adding new components later will be easy; in particular, we'll build the dashboard in React. Sud has experience with React, so I've made him Actor for this epic.
Vision goals: fast, accurate perception in support of situational awareness and comms
- Raw video feed (OpenMV, ESP32-CAM, whatever cameras we’re using/testing)
- Processed video feed (with labeled boxes around identified objects such as colored blobs, other autonomous blimps, AprilTags, LED displays)
- Show different stages throughout the vision processing pipeline
- Perceived 6-DoF robot pose (calculated using the LED display or AprilTags)
- Show the output using different algorithms for pose estimation
- A section showing the current vision-processing settings (ex. HSV thresholds, the probability threshold for declaring an object detected, other hard-coded variables)
Development variables: vision processing filter thresholds, pose estimation weights/variables
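The HSV-threshold "development variable" above is the kind of knob the dashboard should expose for live tuning. A minimal pure-Python sketch of the idea (function names and threshold values are hypothetical; a real pipeline would apply this to camera frames with OpenCV):

```python
# Sketch of HSV-threshold blob masking (hypothetical names; a real pipeline
# would use OpenCV on camera frames). The lo/hi thresholds are exactly the
# kind of variable we'd want adjustable from the dashboard mid-experiment.

def in_hsv_range(pixel, lo, hi):
    """True if an (h, s, v) pixel falls inside the [lo, hi] threshold box."""
    return all(l <= p <= h for p, l, h in zip(pixel, lo, hi))

def mask_frame(frame, lo, hi):
    """Binary mask for a frame of HSV pixels, given tunable thresholds."""
    return [[1 if in_hsv_range(px, lo, hi) else 0 for px in row] for row in frame]

# Example: a green-ish threshold box (values hypothetical)
GREEN_LO, GREEN_HI = (35, 80, 80), (85, 255, 255)
frame = [[(60, 200, 200), (0, 0, 0)],
         [(50, 100, 100), (120, 255, 255)]]
mask = mask_frame(frame, GREEN_LO, GREEN_HI)
print(mask)  # [[1, 0], [1, 0]]
```

Showing this intermediate mask alongside the raw feed is what the "different stages of the pipeline" bullet is asking for.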
Motion goals: fast, accurate motion
- Raw readings from the IMU, LIDAR, other sensors (and ‘processed’ readings from all those sensors)
- Show the joystick + throttle inputs for tank drive
- Amount of control error in each relevant dimension
- Path planning and localization?
- Command output we’re sending to each actuator (ex. energy consumption of the motors)
- Position, velocity, acceleration of the cars
Development variables: feedback control gains, different feedback control algorithms
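For the "feedback control gains" development variables, a minimal PID sketch shows what the dashboard would expose: the gains kp/ki/kd for live tuning and the per-dimension error term for plotting. The gain values below are hypothetical, not our tuned values:

```python
# Minimal PID controller sketch (gain values hypothetical). The dashboard
# would let us adjust kp/ki/kd in real time and plot the error stream.

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        """One control step: returns the actuator command for this error."""
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

pid = PID(kp=1.0, ki=0.1, kd=0.05)
out = pid.update(error=2.0, dt=0.1)
print(round(out, 3))  # 2.02
```

Swapping in "different feedback control algorithms" would just mean swapping this class behind the same dashboard interface.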
Comms goals: robustness, security
- Signal strength from each blimp to each other blimp
- Packet degradation (% packet loss)
- Map out deadzones
- Which frequency channel is currently being used to transmit data
- Which blimp we're using as the root node
- Whether a frequency channel is being jammed
- Connectivity topology (if using ESP-MESH)
Development variables: frequency channel being used, type of data being transmitted
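One way the packet-loss panel could compute its number (the sequence-number scheme here is an assumption, not something we've implemented): the sender tags each packet with an incrementing sequence number, and the receiver infers loss from gaps.

```python
# Sketch of the packet-loss metric for the comms panel (sequence-number
# scheme hypothetical). The receiver infers loss from gaps in the
# sequence numbers it actually saw.

def packet_loss_pct(received_seq_nums):
    """Percent of packets lost, inferred from gaps in sequence numbers."""
    if not received_seq_nums:
        return 100.0
    expected = max(received_seq_nums) - min(received_seq_nums) + 1
    lost = expected - len(set(received_seq_nums))
    return 100.0 * lost / expected

print(packet_loss_pct([0, 1, 2, 4, 5, 7]))  # 25.0 — 2 of 8 packets missing
```

A rolling version of this per blimp pair would feed both the % packet-loss readout and the anomaly flag for unusually high loss.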
UI & data goals: intuitive user interface; gather and store all relevant data
- Current stage in the autonomous sequence (if we're running the get_ball sequence, are we in the go_to_ball stage of that sequence?)
- Testing and development
- Buttons/dials to adjust the subsystems' "Development variables" in real time (so that we don’t have to stop the experiment, manually change the code and upload it, then run the experiment again)
- Store all data streams in an organized way
- Show statuses of subsystems (ex. is the camera feed working?)
- Show potential causes of error?
- Flag any anomalies in operation (unusually high packet loss, cameras don't see anything)
- A simple plain-text input field for any otherwise uncategorized description of the experiment (ex. who was piloting, or which shape of balloon we were using at the time)
- All data is timestamped
- Any other competition-specific info (@Kamil)
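The storage bullets above (timestamp everything, organize by stream, keep the free-text notes) could look something like this; the field names and schema are hypothetical, just to make the requirement concrete:

```python
# Sketch of a timestamped log record for the data-storage requirement
# (schema and field names hypothetical). Every entry carries a UTC
# timestamp, the stream it came from, and the free-text experiment notes.

import json
from datetime import datetime, timezone

def log_entry(stream, payload, notes=""):
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "stream": stream,      # ex. "imu", "vision", "comms"
        "payload": payload,    # the raw or processed data sample
        "notes": notes,        # free-text field, ex. who was piloting
    }

entry = log_entry("comms", {"packet_loss_pct": 1.5}, notes="Sud piloting")
print(json.dumps(entry, indent=2))
```

Writing one of these per sample into a per-experiment file (or database table keyed by stream) would satisfy both the "organized" and the "all data is timestamped" requirements.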
I worked on getting the skeleton code that implements our custom server library running on the Adafruit Feather boards. I worked with Bhavik to get two-way communication working over the ESP-NOW protocol. We also researched other communication methods, including ESP-MESH and 433 MHz packet radio. We've defined our goals for a robust and secure communication system, as well as the challenges we anticipate.
I took on this epic today. So far, I have gained a high-level understanding of what Roco does and how it works. I'm currently working on installing it on my local machine so that I can use it to make new designs next week.