Live Onboarding Evaluation
ABOUT - Crucible was positioned to be Amazon's AAA competitive shooter for PC. It had been fully released for a week and was not meeting D1 retention expectations. I had 4 weeks to come up with solutions and write an executive report.
MY CONTRIBUTION - Due to my expertise, I was brought over from my main team to help improve D1 retention. I quickly built rapport so that the new team would be open to collaborating on solutions and implementing feedback from an outsider. My work consisted of researching and designing specifically for the Onboarding Player Experience, with a high-level report delivered at the end for executives and the team.
DESIGN CHALLENGES - The team had many different goals it was trying to accomplish while working on the critical issue of losing a significant portion of their First Time Players. Many people were trying to solve this problem from multiple angles, so we had to stay in constant communication and keep adjusting based on others' findings and feedback. I was also new to this genre of game, so I had a lot of research to do first to get up to speed.
PROJECT
Crucible - Amazon Games
TIMELINE
4 Weeks
TEAM SIZE
~140 people - Worked mostly solo while consistently syncing with 5 Product Owners
MY ROLE
Research, User Experience Design, Executive Summary, Action Plan
DISCOVERY
- The first step was to listen to the development team's POV, look through Tableau dashboards and review past research to understand the current challenges they were facing.
- We noted where the largest drops occurred during the Onboarding process, and I watched playthroughs from playtests to see why these particular areas might be pain points.
- Next, I needed to learn about customer expectations. I played and recorded competitors' onboarding processes and watched YouTube and Twitch videos of real-life players going through those experiences. I read through the comments to get a larger sample of quotes and direction.
- Since the team did not have personas, I built Player motivations based on marketing's research and my competitor research to help guide the solutions.
SETTING GOALS
1. The team's PM (product manager) and I created metric-driven goals to help us focus on priorities and set success criteria.
- Improve retention rate by 15 percentage points between the events "Downloaded Game" and "Played First Match"
- Improve retention rate by 30 percentage points between the events "Played First Match" and "Played Second Match" (a brief sketch of how these funnel rates can be computed follows this section)
2. I created design goals that were focused on Player needs.
- Better fulfill Player motivations
- Focus on teaching the metagame, so Players feel their time is going toward meaningful progression.
- Make the Player power curve more apparent during the tutorial.
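For reference, here is a minimal sketch of how a retention rate between two funnel events can be computed. The event counts below are hypothetical placeholders, not actual Crucible data; the real metrics on the project came from the team's analytics dashboards.

```python
# Minimal sketch: conversion between two onboarding funnel events.
# Counts are hypothetical placeholders, not real Crucible data.
funnel_counts = {
    "Downloaded Game": 100_000,
    "Played First Match": 62_000,
    "Played Second Match": 31_000,
}

def conversion_rate(counts: dict, from_event: str, to_event: str) -> float:
    """Share of Players who reached to_event out of those who hit from_event."""
    return counts[to_event] / counts[from_event]

baseline = conversion_rate(funnel_counts, "Downloaded Game", "Played First Match")
target = baseline + 0.15  # the 15 percentage point improvement goal
print(f"Baseline: {baseline:.0%} -> Target: {target:.0%}")
```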
ORGANIZING FINDINGS
- I storyboarded the competitors' Onboarding and created an insights section for each game, pointing out what was working and why based on YouTube and Twitch videos of real Players of each game. I then took all the insights and created a tailored chart showing where they overlapped or stood out as MUST HAVES, SHOULD HAVES, and NICE TO HAVES.
- With a clearer picture of what good looks like, I next storyboarded my recorded playthrough of the game we were making, Crucible. I broke the experience down into beats to highlight what was working, suggest improvements, and raise questions. This allowed the team and me to have conversations spanning high-level issues like flow down to granular ones like inconsistent coloring and icons that caused confusion. The team could now see their game from a holistic view and more deeply understand the experience.
CREATING AN ACTION PLAN
- The Product Owners and I reviewed tech restrictions, gathered time estimates, and collaborated on the final solutions. We looked not only at Player-facing improvements, but also at backend work to improve overall quality and at analytics that would help us in the coming months.
- I created a final storyboard that laid out the individual tasks needed to execute the full plan, broken up by discipline and priority.
- I wrote an executive summary that covered the Goals, the top Challenges, the Player Motivations, the Hypotheses we had, the Low-Hanging Fruit we were building right away, and more Strategic Suggestions that needed higher-level buy-in. I ended it with the next set of questions we should investigate to stay proactive in our approach.
OUTCOMES
The game was unfortunately cancelled 2 weeks after this work was complete, so we did not get to finish building the plan or measure its impact. Still, the work highlighted the importance of leveraging UX methods earlier and in different ways, which was recognized by leaders across the team. Walking the team through the process had them thinking this way on the projects they moved on to next. It is imperative to earn team buy-in by being collaborative and building trust early on, so that everyone feels able to contribute and healthy working relationships form. I consistently use this versatile process for evaluating longer, more complex experiences; for example, you can apply it to a D7 Player or review by persona.