
NASA ProtoSpace -- Optimizing an innovative AR product for engineering collaboration






How might we optimize groundbreaking technology to fit into existing workflows and mental models? This was the driving force behind our project - helping mechanical engineers at NASA use ProtoSpace, a 3D visualization tool used in collaborative design discussions. Our solution was a new input method that paired with the HoloLens to help engineers navigate and manipulate objects within the tool.

Following a user-centered process, I conducted research, worked with product stakeholders to define project requirements, prototyped and tested our solution, and presented outcomes to faculty and students at our master's final project showcase.

Some details are excluded for confidentiality reasons.


Problem & Opportunity

NASA found that HoloLens gesture controls caused friction for its engineers using ProtoSpace. In particular, feedback from mechanical engineers revealed that gestures are painstaking to use in collaborative group settings. They wanted a more precise and less strenuous way to manipulate 3D objects in augmented reality.

Design Challenge

How might we help mechanical engineers overcome the challenges of using ProtoSpace with gesture controls?


We introduced a controller input that pairs with the HoloLens to provide a more reliable and precise way to manipulate 3D objects. Our solution reduces the friction of using ProtoSpace and helps optimize the application for both internal and external clients.

My Role

UX Designer, Project Manager

Tools: Sketch, After Effects

Timeline: Nov 2017 - Mar 2018 (5 months)

Team: Alexa Alejandria, Trang Dinh, Rigo Ordaz

Deliverables: Interaction schema, concept video, competitive analysis, research findings, usability report

Sponsors: NASA JPL, University of Washington

Awards: HCDE Excellence Award Funding



I first connected with Lillian when she reached out to propose a project sponsorship during Fall of 2017. Immediately, I was impressed by her confidence, clarity in communication and go-getter attitude. 

She and her team helped explore input methods/interaction models for a 3D collaborative tool at JPL – spanning the whole UX process from research, design and testing. Lillian showed an aptitude to take on complex, fuzzy problems and distill them into actionable insights. Given plenty of real-world constraints – I was delighted to find creativity and rigor in her process and deliverables.

-- Lauren Wheelwright (Senior UX Designer, NASA JPL)


Controller Input Concept Video


Controller Input Details


Command shortcuts

HoloLens fatigue sets in quickly, and engineers needed an efficient way to navigate ProtoSpace. We designed controller shortcuts to help them access commands while circumventing the need to use gaze or gesture controls.


Precise Object Manipulation

It was important for engineers to manipulate objects in predictable increments - functionality that gesture controls don't afford. We designed the controller input to accommodate finer control of 3D objects.


Versatility & Scalability

Engineers wanted the freedom to switch between gestures and controller, depending on task-related needs. A versatile controller enables us to scale our solution to other tasks and scenarios.


The Approach


Product Audit

At the start of the project, we faced the challenge of understanding how ProtoSpace works. An on-site product demo revealed that it serves primarily as a CAD visualization system, often used in collaborative settings.

My interaction with ProtoSpace revealed that navigating the system using a combination of gaze (which requires extensive head movement due to the limited field of view) and gesture controls quickly led to exhaustion.

An on-site demo of ProtoSpace


What's wrong with gesture & voice controls?

Initial research with AR experts revealed that HoloLens gesture controls were considered unintuitive and difficult to learn, and quickly led to arm fatigue. I found the voice controls fairly easy to use; however, their strengths were lost in a group discussion setting.

3 gestures make up the HoloLens interface


User Research

I observed a team of engineers using ProtoSpace and noticed that the system would falsely identify hand movements as gesture controls, causing a workflow disruption. As a team, we realized that we needed to solve specifically for challenges that arose in a group discussion context.

Lastly, interviews with mechanical engineers revealed the common input methods and commands they used for 3D object manipulation. From this, I gained a better understanding of the mental models and expectations engineers had for AR CAD tools.

Interview with mechanical engineering grad student


Defining a User-Centered Perspective

I worked with my team to distill our research findings into a user persona. Prior to this, we met with project stakeholders to ensure that our problem framing was aligned with product goals.

Stakeholder meeting to present our research, gather requirements, and ask for early feedback



As a team, we explored over 20 different input methods to help engineers overcome the challenges of using ProtoSpace with gesture controls. I suggested we evaluate the controllers based on impact vs. implementation effort to determine the top solutions to explore.


Ultimately, we decided upon two input methods to pilot test with engineers. Both were off-the-shelf controllers that met requirements for low-cost implementation.

The Nintendo Joy-Con controller was chosen for its versatility (it can be used in 2 different orientations) and its moderate level of complexity, and the Google Daydream controller was chosen for its simple, straightforward interface. After pilot testing, we decided to abandon the Daydream controller since its functionality was too limited for our use case.

From there, we focused our efforts on mapping the Joy-Con controller inputs to the ProtoSpace system architecture.

Ideating upon controller mappings for the Joy-Con controller


User Testing

After refining our testing template, we tested two versions of our solution with 12 engineers and designers: four used the vertical controller orientation, four used the horizontal orientation, and four used baseline gesture controls.

User testing was a true team effort. We decided upon a Wizard-of-Oz approach, which required members of our team to become actors within the AR system. This decision enabled us to rapidly prototype and iterate upon our solution.

We found that providing immediate system feedback was the most compelling factor for helping users feel immersed in our system.

The evolution of our Wizard-of-Oz system


Next Steps

While user testing helped us gather initial feedback, the next step is to work with product stakeholders and developers to implement the solution and to test its impact within a group scenario.


Lessons Learned

  1. "So what?" During our feedback session with project stakeholders, we were advised to dig deeper to understand the "So what?" of our project. To make a compelling case for our design, we needed to think carefully about the why in conjunction with the use cases we were designing for.
  2. Work with specificity, but understand how your design will scale. For this project, it was important to focus on a very specific user group and to narrow down the tasks we wanted to support. However, I learned it's important to understand how the solution would scale to other user groups and use cases.
  3. Paper prototypes communicate assumptions for instant feedback. We decided to borrow a common technique from 2D prototyping to create a 3D prototype of our AR system. Paper - combined with Wizard-of-Oz - allowed us to flexibly communicate our assumptions, and to easily adapt the interface and system behaviors as we learned more about our user group.

Our project sponsor was instrumental in providing us with the resources we needed during this project. I’m truly grateful for her support and the opportunity to work on this project with the team at JPL.