Over the past week, I've been working on measuring the state of the simulated segbot that we have built in Gazebo. Gazebo offers two main methods of state acquisition. First, we can attach sensors such as lidar or laser range finders to the seg_bot model and use a plugin to read those sensor values, which would give us the distance to the nearest object. Noise can also be added to these sensor readings. The second option is to read values from the robot's joints. In Gazebo, each model is built from various components that are connected by joints. Motion is achieved either by applying a force to a component or by setting the translational or rotational velocity of a joint to a given value. Gazebo's plugin system can then read back the velocity each joint is set to. This functionality is analogous to reading values from an encoder attached to the wheels of a physical robot.

We decided to use the second method to gather training data for our Proximal Policy Optimization (PPO) agent, since we can compute both position and heading from the joint readings with some calculation. If we instead used a range sensor of some kind to recover heading, we would need prior knowledge of the world's topology.
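The "some calculation" above is essentially dead reckoning: integrating differential-drive kinematics from the wheel joint velocities. Here is a minimal sketch of that idea, assuming a two-wheeled differential-drive base; the wheel radius and wheel separation values are placeholders, not our robot's actual geometry.

```python
import math

def dead_reckon(pose, w_left, w_right, dt, wheel_radius=0.1, wheel_sep=0.5):
    """Integrate one timestep of differential-drive kinematics.

    pose: (x, y, theta); w_left / w_right: wheel angular velocities in
    rad/s, as would be read from the wheel joints. wheel_radius and
    wheel_sep are placeholder values for the robot's geometry.
    """
    x, y, theta = pose
    v_l = w_left * wheel_radius       # left wheel rim speed (m/s)
    v_r = w_right * wheel_radius      # right wheel rim speed (m/s)
    v = (v_l + v_r) / 2.0             # forward speed of the base
    omega = (v_r - v_l) / wheel_sep   # turn rate of the base
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return (x, y, theta)

# Drive straight for 1 s at equal wheel speeds (2 rad/s each):
# with a 0.1 m wheel radius this moves 0.2 m along x, heading unchanged.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = dead_reckon(pose, 2.0, 2.0, 0.01)
```

Repeating this update at each simulation step turns the encoder-like joint readings into the position and heading estimates we feed to the learner.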