Table of Contents

Yu Hang He Tesla 2019 Journal

Author: Yu Hang He, Email: hey6@unlv.nevada.edu
Date Last Modified: 11/04/2019

Week 14

What I Learned About Myself

This week I had the opportunity to explore the potential of HoloLens 1, an AR solution offered by Microsoft for workplace applications. Just from working with it for a short time, I realized that it has enormous potential to assist engineers and other professionals in their daily work. In addition, HoloLens 2 is much more capable than its predecessor. I am excited to see where AR technology will advance in the future.

Project Status

I have been focusing on HoloLens development this week. First, I learned how to develop with Unity and C#, since the HoloLens SDK has the strongest support for the Unity engine. The Unity engine is also used by Dylan in his third-person view of DRC-Hubo project. Therefore, I think it is important for me to learn how to use and program in Unity and C#.

Afterward, I familiarized myself with the HoloLens SDK, which Microsoft calls the Mixed Reality Toolkit (MRTK). The MRTK is an SDK that contains resources for building applications for HoloLens 1 and 2. Since the BIM team has multiple HoloLens headsets, I had the opportunity to borrow one and experiment with it. I was able to work through the various tutorials and create some sample applications on the HoloLens.

Finally, I started working on integrating Spot and the HoloLens. The API provided by BD is written in Python, while the HoloLens SDK is based on C#. Therefore, controlling Spot directly from the HoloLens would require me to rewrite BD's API in C#, which is unrealistic given the time frame. The approach I decided on is to use the Desktop GUI that I developed for Spot as an intermediary server that receives inputs from the HoloLens, sends them to Spot, and vice versa. Basically, the Desktop GUI sets up a websocket server to communicate with the HoloLens, and the messages are then relayed to Spot through BD's API.
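The relay idea can be sketched in Python. This is only a minimal illustration of the architecture: plain asyncio streams stand in for the actual websocket layer, and `send_to_spot` is a hypothetical placeholder for the real BD API calls made by the GUI.

```python
import asyncio

# Hypothetical stand-in for the Boston Dynamics API wrapper in the GUI;
# in the real application this would invoke the Python bosdyn client.
def send_to_spot(command: str) -> str:
    print(f"relaying to Spot: {command}")
    return "ack:" + command

async def handle_hololens(reader: asyncio.StreamReader,
                          writer: asyncio.StreamWriter) -> None:
    """Relay newline-delimited commands from the HoloLens client to Spot
    and send Spot's reply back to the HoloLens."""
    while line := (await reader.readline()).decode().strip():
        reply = send_to_spot(line)          # forward to Spot via BD API
        writer.write((reply + "\n").encode())  # echo result to HoloLens
        await writer.drain()
    writer.close()

async def main() -> None:
    # The Desktop GUI plays the role of this server process.
    server = await asyncio.start_server(handle_hololens, "127.0.0.1", 8765)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
```

The key design choice is that the HoloLens never speaks to Spot directly; it only exchanges small text messages with the server, which keeps all BD API code in Python.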

With this plan in mind, I began integrating Spot and the HoloLens. I implemented the websocket server in my GUI and the client in the HoloLens app. I created simple interfaces (buttons) in the HoloLens app to send commands to Spot. During my exploration, I came across the HoloLens voice recognizer, and I decided to include voice commands as part of the control scheme for Spot, which could ease users' adjustment to the HoloLens. The HoloLens app worked as expected when tested in simulation. However, when I deployed it directly on Spot, the connection crashed my GUI after successfully sending only one command.

Project Agenda

I believe that I am quite close to the first step of integrating Spot and the HoloLens, although I still need to debug my code to discover why my application crashed. I believe the crash may be the result of a conflict in asynchronous communication. After this, I will need to start working on the use case of directly commanding Spot to move to certain locations using the HoloLens.
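One common cause of this kind of crash, assuming the conflict really is in the asynchronous communication, is calling the blocking BD API directly from inside the server's event loop, which stalls the connection handler. A sketch of one possible fix, offloading the blocking call to a worker thread, is below; `blocking_spot_command` is a hypothetical stand-in for a real BD API call.

```python
import asyncio
import time

# Hypothetical stand-in for a blocking BD API call; real bosdyn client
# calls block while the command is sent to Spot over the network.
def blocking_spot_command(command: str) -> str:
    time.sleep(0.1)  # simulate the network round-trip to Spot
    return "done:" + command

async def safe_dispatch(command: str) -> str:
    """Run the blocking Spot call in a thread pool so the websocket
    event loop keeps servicing the HoloLens connection meanwhile."""
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, blocking_spot_command, command)

async def main() -> None:
    # Two commands dispatched without freezing the event loop.
    results = await asyncio.gather(safe_dispatch("stand"),
                                   safe_dispatch("sit"))
    print(results)

if __name__ == "__main__":
    asyncio.run(main())
```

If the crash instead comes from the BD API and the websocket server sharing state across threads, the same pattern helps by confining all Spot calls to the executor.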