LineCircle

Experimental Augmented Reality Team for the Microsoft® HoloLens 2

NASA & Bradley University SUITS Case Study

I am part of the Bradley University interdisciplinary team designing an augmented reality experience for NASA with the Microsoft HoloLens 2. The project implements a voice-controlled user interface that allows for a hands-free astronaut space simulation. I designed the user interface for the onboarding, vitals, navigation, and science sampling screens, and I created the new design style, iconography, and task-oriented navigation system.

Figure 1 • Onboarding

Overview

Bradley University brings together an adept interdisciplinary team whose members study in some of the Midwest’s most innovative technology departments. Together, we seek to make the best use of cutting-edge and emerging technologies, creating an augmented reality experience that guides the user through space, missions, and procedures through the collaboration and resources of our Interactive Media, Engineering, and Computer Science departments.

Problem Statement

The Bradley University team has begun iterating on our previous build based on our past versions’ feedback and the new additions that this year’s challenge brings. Our primary focus is to implement navigation and science gathering functions into our build while maintaining our experience’s ease and efficiency.

Our build will keep the design fundamentals of previous years while making substantial iterative improvements. We hope to receive a HoloLens 2 from Microsoft and NASA so that we can implement eye-gaze functionality. Additionally, our build will continue to acquire procedures and data from a dedicated web server. Our vision is for the user to interact with our interface through voice commands and/or eye-gaze, leaving their hands free during the experience. We plan to carry these philosophies into the navigation and science sampling solutions.


Figure 2 • Vitals | Figure 3 • Navigation | Figure 4 • Science Sampling

Design Analysis

Our NASA SUITS project will be improved through new features and functionality as well as refinements to previous interactions. Our team aims to alleviate stress and cognitive overload by using object recognition and an adaptive, intuitively designed heads-up display (HUD). These elements include visual cues, adaptive layouts, transparency for optimizing on-screen focus and field-of-view, headset networking for efficiently sharing information, eye-gaze controls for easy interfacing, and navigation features such as a mini-map of the user’s location. Throughout the HUD, high-contrast bright colors are placed against dark backgrounds so that the display remains legible regardless of the user’s lighting or surroundings.

As shown in Figure 1, the application opens with the onboarding screen. This screen is primarily used for user testing and new user training, since astronauts will have received prior training before using the HUD on their missions.

A smart and practical vitals display, shown in Figure 2, will present important information in a discreet and customizable way. All vitals will be continuously tracked and can be viewed with a voice command. The astronaut will be able to choose which elements they want to “pin” so that those elements are always visible. Another customizable option for the vitals screen is a minimized view, which allows the astronaut to prioritize the environment around them instead of the user interface details. For greater ease of use, astronauts see dynamic icons that change in step with their vitals. Color is also used to efficiently convey meaningful information: green for full or almost full, yellow for the middle range, and red for when vitals are low. These colors help the astronaut detect important information, such as when oxygen levels are low.
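As a rough illustration of this color mapping, the Python sketch below shows how a vital reading might be translated to a display color. The threshold values are placeholders of my own, not the tuned cutoffs used in the build.

```python
def vital_color(level_pct: float) -> str:
    """Map a vital reading (0-100%) to a HUD display color.

    The thresholds below are illustrative placeholders; the actual
    cutoffs are tuned per vital (oxygen, battery, water, etc.).
    """
    if level_pct >= 70:    # full or almost full
        return "green"
    elif level_pct >= 30:  # middle range
        return "yellow"
    else:                  # critically low
        return "red"

# Example: an oxygen reading of 22% would render red, prompting attention.
print(vital_color(22))  # -> "red"
```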

We will use our server to get data pertaining to navigation. This information includes heading, elevation, bearing, ETA, points of interest, and PET (the counter for which starts on user command). The acquired data will be fed into our interface and presented through intuitive displays, guiding the user to their destination with access to all the information they need.
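To make that data flow concrete, a single navigation update pulled from the server might look something like the sketch below. The field names, units, and values are hypothetical, since the actual telemetry schema is defined by the challenge server.

```python
# Hypothetical shape of one navigation update from the web server.
# Field names and units are illustrative, not the real SUITS schema.
nav_update = {
    "heading_deg": 214.5,   # current heading in degrees
    "elevation_m": 12.3,    # elevation relative to the lander
    "bearing_deg": 198.0,   # bearing to the selected destination
    "eta_s": 540,           # estimated time of arrival in seconds
    "points_of_interest": [
        {"id": "sample_site_A", "distance_m": 86.0},
        {"id": "lander", "distance_m": 412.0},
    ],
    "pet_s": 0,             # phase elapsed time; starts counting on user command
}
```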

As shown in Figure 3, the design elements are minimal yet informative about the tasks to be completed. Green arrows will be displayed to show the quickest route for each objective, and clear icons will show the astronauts which tools are necessary for each job.

Like the navigation solution, the science sampling task will obtain the necessary instructions from our web server, and its display will be similar to any other procedure the user interacts with. We will also add the commands “Take Photo,” “Take Video,” and “Record Voice” to allow the user to capture photos, videos, and field notes with the HoloLens during the sampling process.

As shown in Figure 4, objective markers are used to display the nearest and most important objectives. The shorter the stem of the objective, the closer it is; the larger the circle around the icon, the more critical the task. Each objective is given an icon representing the tools needed for the job. Instructions will be fed to the user automatically, with more detailed information on the astronaut’s objective.
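A minimal sketch of how marker geometry could be derived from distance and criticality is shown below. The scaling constants are assumptions for illustration; the build’s actual values come from design iteration in Unity.

```python
def marker_geometry(distance_m: float, criticality: int) -> tuple[float, float]:
    """Compute (stem_length, ring_radius) for an objective marker.

    Closer objectives get shorter stems; more critical tasks get larger
    rings. The constants are illustrative placeholders, not tuned values.
    """
    stem_length = min(1.0, distance_m / 100.0)  # shorter stem = closer objective
    ring_radius = 0.05 * criticality            # larger ring = more critical task
    return stem_length, ring_radius

# Example: a critical task (criticality 5) only 20 m away gets a short stem
# and a large ring.
print(marker_geometry(20.0, 5))  # -> (0.2, 0.25)
```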

By utilizing each of these elements, astronauts will be safer and will have an easier time completing tasks. Our mission is to design a HUD with the well-being of the astronauts as its focus, and we plan to accomplish it through object recognition and adaptive, customizable UI layouts with immediate access to critical information.

User & Audience

While on a Lunar Mission, astronauts have many different goals that demand their attention. Our design takes a minimalistic approach, so astronauts are not overstimulated when using our interface. 

Our display is organized so that the astronaut has access to the most critical information, such as vitals data, from a single command. More detailed information, such as step-by-step instructions for tasks and navigation, can be shown by navigating through intuitive HUD menus. Also, any panel can be pinned to the astronaut’s environment or field-of-view. Voice commands and/or eye-gaze can be used to navigate our interface, giving the astronaut total control even if their hands are busy.

We will be using the Unity game engine as our development platform. We chose Unity for its native integration with the HoloLens, its wide variety of libraries, and our team’s experience with the engine.

Navigation between sites will be started with a voice command. If the astronaut knows where they are going, they can immediately begin to navigate. If not, a map will be pulled up where the astronaut will select their destination. Navigation will be done via dead reckoning and will use the provided heading and distance to lander data for corrections. After the navigation has been started, the astronaut will be shown a combination of heading and approximate distance to destination. Navigation can be exited at any time and reentered later without losing information. 

The current heading and distance-to-lander data will be streamed into the application via REST API calls to a central server. The server will be polled twelve times per minute to avoid causing bandwidth problems while still keeping the navigation accurate. Between polls, we will dead reckon based on a combination of the previously streamed data and readings from the HoloLens’ built-in IMU.
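A rough sketch of this poll-then-dead-reckon pattern is shown below in Python. The endpoint URL, field names, and the constant IMU velocity are hypothetical, and the distance update assumes motion directly toward the lander; the production build runs this logic inside Unity against the actual challenge server and live IMU readings.

```python
import time
import requests

TELEMETRY_URL = "https://example.edu/suits/telemetry"  # hypothetical endpoint
POLL_INTERVAL_S = 5.0  # twelve polls per minute

def poll_server() -> dict:
    """Fetch the latest heading and distance-to-lander fix from the server."""
    response = requests.get(TELEMETRY_URL, timeout=2.0)
    response.raise_for_status()
    return response.json()  # e.g. {"heading_deg": 214.5, "distance_to_lander_m": 412.0}

def dead_reckon(fix: dict, imu_velocity_mps: float, dt_s: float) -> dict:
    """Advance the last server fix between polls.

    Simplification: assumes the astronaut moves straight toward the lander
    at the IMU-derived speed.
    """
    estimate = dict(fix)
    estimate["distance_to_lander_m"] -= imu_velocity_mps * dt_s
    return estimate

def navigation_loop():
    """Poll every five seconds; dead reckon once per second in between."""
    while True:
        fix = poll_server()
        for _ in range(int(POLL_INTERVAL_S)):
            time.sleep(1.0)
            imu_velocity = 1.2  # placeholder; read from the HoloLens IMU in practice
            estimate = dead_reckon(fix, imu_velocity, 1.0)
            # render_navigation(estimate)  # hand off to the HUD layer
```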

Once at their destination, the astronaut can pull up an instruction screen for their task. The task screens can be navigated step-by-step using voice commands. Also, the steps can be read aloud to the astronaut. A table of contents will be available if the astronauts need to skip to a specific stage in the process. The instructions panel can be closed at any time and reopened later without the current step being lost.
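The step-tracking behavior can be sketched as below; the class name and methods are hypothetical stand-ins for the Unity panel, but they show how the current step survives closing and reopening the instructions.

```python
class TaskInstructions:
    """Tracks the astronaut's place in a step-by-step procedure.

    Closing the panel only hides it; the current step index is kept,
    so reopening resumes exactly where the astronaut left off.
    """
    def __init__(self, steps: list[str]):
        self.steps = steps
        self.current = 0
        self.visible = True

    def next_step(self):
        self.current = min(self.current + 1, len(self.steps) - 1)

    def jump_to(self, index: int):
        """Table-of-contents style jump to a specific stage."""
        self.current = max(0, min(index, len(self.steps) - 1))

    def close(self):
        self.visible = False  # state is retained

    def reopen(self):
        self.visible = True   # resumes at self.current

task = TaskInstructions(["Collect scoop", "Photograph site", "Bag sample"])
task.next_step()
task.close()
task.reopen()
print(task.steps[task.current])  # -> "Photograph site"
```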

Roles & Responsibilities

I am part of the design team, creating the UI. I designed the user interface for the onboarding, vitals, navigation, and science sampling screens. I also created the design style, iconography, and a task-oriented navigation system. The team comprises ten members, including designers, developers, public relations staff, and engineers.

Scope & Constraints

The biggest constraint for the initial concepts was not having a Microsoft HoloLens 2 in person, along with the fact that the entire project has to be developed and completed remotely via Zoom calls. At the moment, our team meets as a group using Discord, and members then meet individually outside of group time to work on their parts of the project.

Process

Human-in-the-loop testing is an essential part of our design process. Our team plans to do HITL testing multiple times during the design process to improve our UI and software. We will be testing the experience and interface designs in a virtual setting due to these unprecedented times. If restrictions due to COVID-19 are lifted, we also plan to incorporate in-person usability testing. Testing and outreach events will allow us to iterate on the software using feedback and provide our audience with STEM and NASA exposure. Our team will gather both qualitative and quantitative metrics by using a proctor. We want to focus on the time it takes users to reach certain checkpoints and their ability to run through the process we create for them without the proctor’s help. We will also gather information about the overall experience the user had. To gather these metrics, our HITL tests will follow the general protocol described below:
  1. Inform the user that data will be recorded and that they will be asked questions regarding their experience after the test is done.
  2. The proctor will help immediately if the user is having trouble with the hardware (e.g., keeping the HoloLens on, or if the app accidentally closes).
  3. The proctor will offer minimal help with the software so we can see where our design falls short of supporting user autonomy and iterate accordingly.
  4. The proctor will mark down the time it takes for the user to get to certain checkpoints and when the user finishes.

User Testing Results

Results have not been posted yet, as this is an ongoing project. We expect to finish this project in May 2021. The users testing our AR experience will be astronauts from NASA Johnson Space Center in Houston, Texas.

Outcomes & Results

Results have not been posted yet, as this is an ongoing project. We were selected third out of the 10 universities allowed to submit a fully developed project and receive a Microsoft HoloLens 2.
