This past week, I made multiple attempts at manipulating and exploring the utilities and interactions between three libraries: ARCore, used to track Augmented Reality environments; ManoMotion, used to track gestures and hand motion in 3D space using the RGB camera; and a pseudo-version of the Google Cardboard SDK that renders the camera feed from ARCore onto a compatible device, which can then be mounted in a Google Cardboard (or any compatible VR headset). Along the way I ran into several hurdles. Though many of my explanations for these challenges are hypothetical, I have taken time to explore and discuss them on developer forums, particularly with regard to the compatibility of these libraries with one another.

Firstly, by default the Google Cardboard SDK applies a filter on top of the camera that renders it with a "fish-eye" distortion, complementing the physical lenses inside the Google Cardboard. From the way Unity compiles and builds its projects, it appears that these rendering effects are applied at the end of the rendering pipeline, which means the same filter may also distort the camera frames that ARCore subsequently extracts for tracking. An alternative approach is to write a customized shader, which requires a deeper understanding of Unity's rendering process; however, given the issues surrounding ManoMotion, further investigation is needed to determine whether this is feasible within the time we have for the research and our ultimate deliverables for the project.
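
To illustrate the custom-shader idea, here is a minimal sketch of what I have in mind, assuming the distortion is applied as a full-screen image effect on the camera (the class and material names are hypothetical, and this is not the Cardboard SDK's actual implementation). Owning this pass ourselves would let us control exactly when, and whether, the fish-eye warp is applied:

```csharp
using UnityEngine;

// Hypothetical sketch: a replaceable image effect standing in for the
// Cardboard SDK's built-in lens distortion. With the warp under our
// control, ARCore could in principle consume the undistorted frames
// before the distortion is applied for the headset.
[RequireComponent(typeof(Camera))]
public class CustomLensDistortion : MonoBehaviour
{
    // Material built from a custom distortion shader (assumed to exist).
    public Material distortionMaterial;

    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        if (distortionMaterial != null)
            Graphics.Blit(source, destination, distortionMaterial); // apply our own warp
        else
            Graphics.Blit(source, destination); // pass-through: no distortion
    }
}
```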

In relation to ManoMotion, the documentation provided by the developers is limited and outdated, with little to no mention of how ManoMotion can be integrated with ARCore. I have therefore taken on the task of discussing the issues directly with the developers on the ManoMotion forum to seek help and guidance; as they themselves have admitted the documentation is outdated, going to them directly is the best option. In the meantime, to make efficient use of the time remaining for this research, I will temporarily implement the gestures and transformations using the touch screen of the mobile device, as sketched below.
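
The following is a rough sketch of that temporary touch-screen stand-in, using Unity's standard touch input rather than ManoMotion's hand tracking (the class name, tuning values, and the assumption that a target object has already been placed in the scene are all mine):

```csharp
using UnityEngine;

// Temporary stand-in for ManoMotion gestures: a one-finger drag rotates
// the selected object, and a two-finger pinch scales it.
public class TouchTransformController : MonoBehaviour
{
    public Transform target;          // object to manipulate (assumed already placed)
    public float rotateSpeed = 0.2f;  // degrees of rotation per pixel of drag
    public float scaleSpeed = 0.001f; // scale change per pixel of pinch

    void Update()
    {
        if (target == null) return;

        if (Input.touchCount == 1)
        {
            // One-finger drag: rotate around the vertical axis.
            Touch t = Input.GetTouch(0);
            if (t.phase == TouchPhase.Moved)
                target.Rotate(Vector3.up, -t.deltaPosition.x * rotateSpeed, Space.World);
        }
        else if (Input.touchCount == 2)
        {
            // Two-finger pinch: scale by the change in finger separation.
            Touch a = Input.GetTouch(0);
            Touch b = Input.GetTouch(1);
            float prevDist = Vector2.Distance(a.position - a.deltaPosition,
                                              b.position - b.deltaPosition);
            float currDist = Vector2.Distance(a.position, b.position);
            target.localScale *= 1f + (currDist - prevDist) * scaleSpeed;
        }
    }
}
```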

On a lighter note, however, I have successfully prototyped a working version of the Measuring Utility, which allows users to extract dimensions from the physical world into the mobile AR device. I have not yet implemented the GUI for the mobile device, but users can already accurately obtain distances and (although not depicted in this video) angles between points and vectors, given the starting and ending points of the measurement. The quality and accuracy of a measurement depends on the complexity of the surface itself, the motion of the camera, and the resolution of the detected plane (which can be affected by factors such as noise and lighting).
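
For the curious, the core math behind the utility is straightforward; the sketch below assumes the start and end points come from anchors the user has placed via ARCore hit tests (the class and method names are my own):

```csharp
using UnityEngine;

// Core math of the Measuring Utility: distances between measured points
// and angles between measured vectors. ARCore world units are meters.
public static class MeasureMath
{
    // Straight-line distance between two measured points, in meters.
    public static float Distance(Vector3 start, Vector3 end)
    {
        return Vector3.Distance(start, end);
    }

    // Angle in degrees between two measurements that share a corner point.
    public static float Angle(Vector3 corner, Vector3 endA, Vector3 endB)
    {
        return Vector3.Angle(endA - corner, endB - corner);
    }
}
```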

Video of Measuring Utilities
