Autodesk Maya - UX Research for Gamified Onboarding
Defining new onboarding experiences with gamification
Role: UX Researcher
Responsibilities: Usability testing, moderation, facilitation, interviewing, research planning, survey design, results analysis
Other team members: 1 UX designer, 1 product manager, 1 scrum master, 1 front-end engineer, 5 back-end engineers, 2 QA, 1 content designer
Context
Autodesk Maya is 3D animation and visual effects software used widely in the media and entertainment industry for movies, TV, and games. While Maya empowers artists to create stunning characters and worlds through powerful and technical tools, in 2020, Maya presented first-time users with a steep learning curve and no onboarding experience.
Maya 2020’s default UI on start-up
My Goal
When I joined the Maya team for my 8-month internship, I worked on the Maya First Experience (MFE) Project. Our team wanted to redesign how first-time users engaged with Maya during their first hour using it.
My goal as the UX Researcher was to create and execute a way to remotely evaluate 3 prototypes related to the MFE Project and understand how they could be improved based on user feedback.
The Maya First Experience Project’s Main Goals
Maya’s start-up feels fast and users know when it’s ready for interaction.
Within the first 3 minutes of use, all users feel welcomed, engaged, and inspired.
Within the first 10 minutes of use, new users feel oriented in the UI, learn how to move around the viewport, and learn to use essential tools.
Within the first hour, new users can create, render, and share content that they’re proud of.
Part 1: My First Experience with Maya - 2 Birds, 1 Stone
I was brand new to Maya, so the product manager and I decided that having me become acquainted with the software by running a self-test on my own first experience could serve two purposes:
To understand first-time users’ pain points by documenting my first hour in Maya with the task of animating a bouncing ball (a common first animation project for beginners, a “Hello World” of sorts).
A dry run for creating a usability test plan. This test plan later served as a template which I adapted to the MFE Project.
Task Scenario
You’re new to animation and want to apply basic principles by creating a ‘bouncing ball’ animation in 3D.
A breakdown of my 60-minute self-test:
In the first 10 minutes, I could not access any sort of help and documentation - whether in-app or using external resources
This was done to gauge how intuitive Maya’s default UI is at helping a user get started
In the last 50 minutes, I could access any help and documentation needed to complete the task
To examine reliance on internal and/or external help and documentation
Test Objectives
Based on the overall goals of MFE, my self-test aimed to evaluate how well Maya:
Lets users start a task quickly
Creates an engaging and positive experience within the first 10 minutes
Helps users overcome errors
Improves users’ abilities as they get familiar with Maya
Supports users’ progress through the task with external assistance
Participant Profile
Even though it was a self-test, I outlined a participant profile of myself. This later served as a foundation for the screener and pre-session questions used in the MFE Project’s usability test sessions.
Age and gender
To inform how our team creates inclusive learning materials and assets
Language
To facilitate communication during usability test sessions
Internet use
To assess general digital literacy
Software use
To gauge familiarity with other 3D and/or 2D software
Goals
To understand participant motivation to use Maya
Must-have features
To learn about what participants are looking for in 3D software
Methodology - Setting up & Conducting the Usability Test
Recording Data
Since I was conducting a remote self-test in my home office, I decided to capture data for review by screen-, audio-, and video-recording my session.
During the 60 minutes I had with Maya, I would follow the think aloud protocol so that my process, comments, and questions could be recorded.
Evaluation Methods and Data Analysis
Questionnaires
For evaluation, I prepared a questionnaire with:
5-point Likert scales: A quick way to quantify and capture attitudes and opinions
Open-ended questions: Give participants room for elaboration and can lead to deeper insights
System Usability Scale (SUS): A tool that measures the general usability of a system
It has 10 questions that use the same Likert scale - from strongly disagree to strongly agree. SUS is valid, quick and easy to use, and has been shown to have reliable results even with small sample sizes. At the time of research (2020), it had been referenced in over 1300 articles and publications.
Scores are on a scale of 0-100 but they aren’t percentages. They need to be normalized to produce a percentile ranking. A SUS score above 68 is considered above average.
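The raw SUS arithmetic is simple enough to sketch in a few lines of Python. Odd-numbered items are positively worded and contribute (response − 1); even-numbered items are negatively worded and contribute (5 − response); the sum is scaled by 2.5 onto the 0–100 range. The response pattern below is illustrative only, not my actual questionnaire data:

```python
def sus_score(responses):
    """Compute a raw SUS score from ten 1-5 Likert responses.

    Odd-numbered items (1, 3, 5, 7, 9) are positively worded:
    contribution = response - 1.
    Even-numbered items (2, 4, 6, 8, 10) are negatively worded:
    contribution = 5 - response.
    The summed contributions (0-40) are scaled by 2.5 to give 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS uses exactly 10 items")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # index 0 is item 1 (odd)
        for i, r in enumerate(responses)
    )
    return total * 2.5

# A strongly negative (hypothetical) response pattern yields a very low score:
print(sus_score([2, 5, 2, 5, 2, 5, 2, 5, 1, 5]))  # 10.0
```

Note that this is the raw 0–100 score; turning it into a percentile ranking requires comparison against published normative data.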
Categorization
To analyze and categorize the data, I would use:
Jakob Nielsen’s 10 usability heuristics as guiding principles
Severity ratings to assign to usability problems
Since ratings take into account the frequency, impact, and persistence of each problem, the ratings would be useful during prioritization planning.
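To illustrate how heuristic tags and severity ratings feed prioritization, here is a minimal sketch using Nielsen’s 0–4 severity scale. The observations and their ratings are hypothetical examples, not my actual findings:

```python
# Nielsen's 0-4 severity scale for usability problems
SEVERITY = {
    0: "not a problem",
    1: "cosmetic",
    2: "minor",
    3: "major",
    4: "catastrophe",
}

# Hypothetical observation log: (description, violated heuristic, severity)
observations = [
    ("No visible entry point for first-time users", "Help and documentation", 4),
    ("Viewport navigation is undiscoverable", "Recognition rather than recall", 3),
    ("Dense toolbars with unlabeled icons", "Aesthetic and minimalist design", 2),
]

# Sort descending by severity so the worst problems lead prioritization planning
for desc, heuristic, sev in sorted(observations, key=lambda o: -o[2]):
    print(f"[{SEVERITY[sev]}] {desc} (heuristic: {heuristic})")
```

Sorting by severity is only a starting point; frequency and persistence observed across sessions would still shift the final priority order.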
Outcomes of My First Experience
My bouncing ball animated in Maya!
I did it! I animated a bouncing ball … but only because I could rely on help and documentation in the last 50 minutes.
In my first 10 minutes, I made 27 unique observations which can be broken down into:
8 positive features
5 of which are great
3 of which have room for improvement
19 usability issues
1 accessibility problem
2 unusable problems
10 severe problems
6 moderate problems
The 8 positive features were relevant to 5 different heuristics while the 19 usability issues touched on 8 heuristics.
Unsurprisingly, using Maya was very difficult for me. I didn’t know where or how to start because of the size of the user interface and the overwhelming number of features displayed in it.
After completing the System Usability Scale (SUS), my ratings were:
SUS score: 10
Letter grade: F
Adjective rating: “worst imaginable”
SUS Usability Score scale for reference. Graph prepared by 10up.com
Unfortunately, my experience isn’t unique.
However, I was only “somewhat dissatisfied” with Maya because, with the help of external resources, getting closer and closer to a bouncing ball felt exciting and rewarding.
My main takeaway from this experience, and something that is reflected in the work that we did with the MFE Project, is that there is a real need to make Maya friendlier and easier to use for first-time users.
Part 2: Iteration - Planning MFE Usability Tests
Prototypes Overview
The usability tests had two parts:
1: Application Home Page
A Figma prototype created by the team’s UX Designer.
App Home included tabs for users to explore their recent files, learning content, and Maya community forums
Participants were asked to freely explore the interface so we could observe which sections people gravitated towards
We wanted to see if participants were naturally drawn to the tutorial prototypes
Figma prototype for a new Application Home page for Maya
2: Tutorials
Two tutorial prototypes, Maya files designed and developed by the team’s Content Designer and UX Designer.
Each tutorial was a form of interactive gamified learning.
Users learn how to use different parts of Maya by following a mythical character around town and helping her complete tasks.
Each tutorial had a learning portion and a final challenge. In the learning portion, users were given instructions, tips, and an on-screen guide for learning the basics. Each tutorial then ended with a final challenge that required users to apply what they had just learned without guidance.
Character prototype for the tutorials
Tutorial 1: Maya Basics
Introduces users to parts of Maya’s UI like the Outliner and Attribute Editor
Teaches users how to navigate the viewport (pan, dolly, and tumble)
Shows users how to use essential tools (select, move, rotate, and scale)
Maya’s tool box includes tools for selecting, moving, rotating, and scaling objects. Images by Autodesk Maya documentation.
Tutorial 2: Modeling Basics
Teaches users about constructing poly shapes and builds familiarity with modelling terms like ‘subdivisions height and cap’
Shows users how to use the contextual menu for quick access to common actions (extrude and duplicate)
Exposes users to the Content Browser where they can explore and play with content such as pre-made objects, animations, and scenes
Maya’s Content Browser has pre-made examples for users to play with
Research Toolkit
Qualtrics for the pre-session questionnaire which acted as a screener and a way to gather preliminary information from participants
Microsoft Bookings – a GDPR-compliant method to manage scheduling with external participants
Excel to track my communication with interested participants and external contacts
Zoom to facilitate communication and as a channel to observe and record participants’ use of the prototypes
Moderator Guide and Script that I created and shared with my team to follow each usability session
Target Participants
With the MFE project, we were aiming to help people who had never used Maya before and who had an interest in 3D, so I created two profile categories for recruiting participants based on familiarity with 3D tools:
Beginner/intermediate: users with little to no experience in 3D space, based on other software use (e.g. Blender, Houdini, 3ds Max)
Advanced: users who are comfortable working in 3D space, based on other software use
Participant Recruitment and Incentive
I used two non-probability sampling methods, convenience sampling and snowball sampling, to connect with a total of 15 participants.
6 participants were internal to Autodesk:
3 fellow interns whom I recruited through Slack
3 full-time Autodeskers: a Senior Manager of Experience Design from 3ds Max, a Maya Content Designer, and an Associate Vice President of Product
9 participants were recruited from various external communities, including:
‘Where are the black designers?’ Slack group, a platform for Black creatives that I was invited to by a fellow intern
Art schools and creative communities in the United States that participants I talked with suggested I reach out to
3D graphics, game development, and arts programs in Ontario colleges and universities that I connected with by emailing program coordinators
From my recruitment, our team got a good mix of beginner/intermediate and advanced users from around the world: a few Canadian students and a recent grad from related programs such as Game Design, plus professionals and amateurs from the United States and South Korea.
To compensate participants for their time, I followed Autodesk’s internal guide to dispense appropriate rewards based on the length of their usability session.
Evaluation Methods
Similar to my self-test, I used a combination of the following methods:
The think aloud protocol in which I asked participants to express any thought they had as they were going through the prototypes so that we could follow their thoughts and feelings throughout their experience.
Time to completion – I measured how long it took users to complete a task. I used this for Tutorial 1’s final challenge to gauge how quickly participants applied basic knowledge such as using the transform tools and the Outliner to find objects. For this part, I asked them not to think aloud so that it wouldn’t affect the timing.
Likert scale questions – asked after each prototype and again at the very end, where users rated their opinions on a scale of 1 to 5.
Open-ended questions helped me capture more insights and feedback from users that may have been missed by the other evaluative methods.
UX Testing
Testing period: 2.5 months
Participants: 15 (6 internal, 9 external)
Session length: 1.5-2.0 hours
Outline of Each Usability Testing Session
Welcome, introductions, and an overview briefing
Reading the Privacy and Disclaimer statements, including important details like letting participants know that they could stop at any time for any reason – they didn’t need to disclose it
Asking for participant's consent to record the session
Verbally going through a pre-session questionnaire that built on the screener questions they were asked prior to the session so that our team was aligned on the participant’s background and experiences
Testing and observing period for us to see how users interacted with the prototypes. Observers and I were there to guide participants if necessary and to ask non-leading questions to clarify observations.
Ending with a round of prototype-specific questions and then wrapping up with general questions on their entire experience.
Challenges
Of the 9 external participants we met with, only 7 successfully completed the sessions. The 2 who weren’t able to participate ran into technical constraints: their machines lacked the graphics card and RAM needed to process the graphics in the tutorials.
Although it was unfortunate that two participants couldn’t complete the tutorials, this challenge pointed out a need to work within these technical limitations so that all users who can run Maya (which has its own minimum technical requirements) can also run the tutorials.
This challenged us to adapt how the remote usability tests were conducted – Zoom’s remote-control feature came in handy whenever we ran into the issue again. Participants took control of my computer to go through the prototypes so they could fully participate in the usability session.
Results
Final Demographics
Our team ended up working with many different perspectives from the 15 participants. The 6 internal to Autodesk were mostly new to Maya, and all 9 external participants were completely new to it.
The 7 external participants who completed the prototypes varied in personas and demographics as well:
Personas: 3 students, 1 recent graduate, 1 amateur, and 2 working creative professionals in target industries (media and entertainment).
Gender: 4 identified as male, 2 identified as female, and 1 identified as transgender male
Age: 4 were in the 18-29 category (1 male, 1 transgender male, 2 female) and 3 were in the 30-39 category (all male)
Country of residence: All participants were in North America – 4 in Canada and 3 in the United States
In hindsight, I should have also collected data on race, something I would recommend for future testing to understand trends such as which groups are underrepresented in the industry and to continue conversations about diversity.
Outcomes for Iteration
Based on the usability test, I identified 10 themes which helped our team iterate on both the app home and tutorial designs.
Themes included:
Recognizing the need for a quick start tutorial for advanced users
Insights into varying learning styles such as learners who like to explore outside of the tutorial path by looking at other areas of the interface and doing their own process of trial and error
Participants needing more system feedback once they’ve completed a step in the tutorial so that they feel confident moving on to the next step
Identifying opportunities to adjust designs to reinforce learning
… which helped us adjust the learning pathway in areas such as (but not limited to):
The order and language of instructions for more clarity
Including iconography in the instructions for users to easily identify components of the UI
Adjusting limitations at checkpoints to increase freedom of users to play
Adding lesson summaries for users to recall what they learned
Learning for New and Experienced Users
When users open Maya for the very first time, they can now choose their experience level for customized learning content
Improving Clarity
Based on user feedback, the instructions evolved for clarity: certain words were emphasized through colour and bold formatting, and icons and images (e.g. alt/option + left mouse button) connected instructions to user actions
User Reactions
“I feel like this is the most fun kind of tutorial that tries to explain how to use a navigation system ... it's really cool."
— Participant 01
“Main draw for me now is that I can model characters, I can at least embark on that process … I didn't know how to do that before."
— Participant 04
Overall Satisfaction Rating
4.57 / 5
— 7 external participants who completed the usability sessions
While I didn’t ask participants to complete the System Usability Scale, their average satisfaction rating is a vast improvement over how I found Maya’s usability at the start of my term.