The Evolution of Miro’s User Onboarding: Why Big Investments Didn’t Stick, and Smart Iterations Won
I joined Miro in 2017 and became the first Product Designer for the Growth team. Over six-plus years, we experienced exponential growth:
- The company grew from 50 to nearly 1,800 people
- The user base scaled from 1 million to more than 50 million
- My team scaled from 1 to 11 Product and Content designers
In this post, I will share the evolution of Miro’s approach to user onboarding and activation, divided into three phases: Startup, Hyper-growth, and the current Growth-at-scale phase. This won’t be dedicated solely to the success stories we are used to hearing. I intentionally also want to share the failures and learnings that eventually helped us build a mature, iterative culture.
I interviewed a PM, Data Analyst, and Product Designer who worked on user onboarding and activation over time. Based on those learnings, I will share how we approached experimentation, resolved challenges, and changed design patterns and activation metrics during pandemic-era hyper-growth. Then, I’ll reflect on where we are today and what’s next.
I hope these insights will help product teams of different sizes and scales take a fresh look at their user onboarding and activation to unlock new perspectives to grow faster.
Phase 1: Startup [50 people, 1 million users]
When I joined Miro (formerly RealtimeBoard) in 2017, we started working on user onboarding right away. I would like to emphasize that onboarding is the foundation for success across the rest of the funnel (retention, monetization), and we were seeing the biggest drop in week 1 (W1) retention. At that time, we were iterating on the sign-up journey and the user’s first experience with Miro. The direction was inspired by best practices:
- Segmented by role
- Guided, step-by-step experience
As we were rebranding in parallel, we decided to update our sign-up journey and make it more interactive, visual, and value-focused. As a designer, I was very satisfied with the outcome, and even more pleased with the qualitative user feedback.
Positive user feedback on the interactive sign-up flow
We got the results. They were negative.
- Fewer users sent email invites from the new sign-up flow.
- Fewer users completed role selection in the new flow.
- Fewer users started their work from a template in the new flow.
We quickly learned that the interactive beautifications of the flow distracted users from completing the main tasks. As a designer, I was shaken out of my traditional UI/UX-oriented perspective. We had to iterate on that, but in hyper-growth, other bigger priorities shifted our team’s focus.
Looking back, I see that “smart iteration” on that experience could help us find the right trade-off without sacrificing the whole visual part of the experience that was created to show the value. I asked my then-colleague PM in growth to reflect on that together, and she shared some valuable advice for early-stage businesses focused on onboarding:
“I’d highlight two main learnings as a Product Manager. First: every time you run an experiment, you need to run user interviews afterward, because data says what is happening, but only users can say WHY it is happening. You won’t succeed on the first try. It’s valuable to learn from users why something doesn’t work and run one, two, or even three iterations, so you can get the most out of your hypothesis. Launch, learn from users, iterate — that matters a lot for achieving business impact.”
Let’s see how we applied these learnings at Miro moving forward and started iterating smart on the onboarding flow. But before that, I’ll tell you what we learned from big investments in a difficult and intense period of hyper-growth.
Phase 2: Covid Hyper-Growth [150 → 1000 people, 3 → 10 million users]
In 2019, right after our freemium business model and Miro rebranding launched, another big external factor hit the world and the company — the pandemic. In six months, the company grew from 150 to 1,000 people, and our user base scaled from 3 to 10 million users. These were less tech-savvy customers who really needed Miro as a tool to keep doing their work efficiently and to stay connected with their colleagues. We learned that this new audience needed to onboard quickly, yet both the onboarding flow and the tool itself were complex. We needed to simplify the first user experience.
With that refreshed activation perspective in mind, we started tackling the user onboarding problems to find impactful solutions. From the Product and Design perspective, we started thinking about how to innovate the onboarding experience while taking into account the refreshed activation definition. It was a good moment to run data research and a regression analysis to synthesize our existing knowledge about activation and define the Setup, Aha, and Habit moments. (Learn more about Miro activation in the bonus part at the end.)
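To make the idea concrete, here is a toy sketch of that kind of analysis: ranking candidate first-session actions by how strongly they associate with W1 retention, to shortlist Setup/Aha/Habit milestone candidates. All data, action names, and the simple lift metric are illustrative assumptions, not Miro’s actual analysis (which used regression).

```python
# Toy sketch (not Miro's actual analysis): rank candidate first-session
# actions by their association with week-1 retention. Synthetic data only.

# Each synthetic user: the set of actions taken in session 1, and W1 retention.
users = [
    ({"create_board", "add_sticky"}, True),
    ({"create_board"}, False),
    ({"create_board", "add_sticky", "invite"}, True),
    ({"open_template"}, False),
    ({"add_sticky", "invite"}, True),
    ({"open_template", "add_sticky"}, True),
    ({"create_board"}, False),
    ({"invite"}, True),
]

baseline = sum(retained for _, retained in users) / len(users)

def retention_lift(action):
    """W1 retention of users who did `action`, minus the overall baseline."""
    cohort = [retained for actions, retained in users if action in actions]
    return (sum(cohort) / len(cohort)) - baseline if cohort else 0.0

all_actions = sorted({a for actions, _ in users for a in actions})
ranked = sorted(all_actions, key=retention_lift, reverse=True)
for action in ranked:
    print(f"{action:>15}: {retention_lift(action):+.2f}")
```

In practice a regression (or any multivariate model) is preferable to a per-action lift table, because correlated actions (e.g. inviting and collaborating) can otherwise each claim the same credit.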
Rethinking Miro Onboarding with a focus on the aha moment
We always try to run qualitative user research in parallel with data analysis. We got additional insights from the user interviews: people didn’t know what they or their teams could do in Miro. We were positioning Miro as a “team-centric” product, which meant that collaboration was the core value on which we needed to deliver. This was the main user problem to solve with a new onboarding experience, and the team came up with a really innovative solution for that: a so-called “robo-collaboration” experience that guides users through the first experience in a human way. The team, leadership, and users were in love with that experience. (Fun fact: We user-tested this flow first with AI-generated videos using Synthesia two years ago, before AI became the new norm!)
Prototype for “robo-collaboration” with AI-generated videos
Final “robo-collaboration” experience for user onboarding
It turned out to be a big bet for the team, with more than a quarter invested in the research, design, and validation phases. As you can see from the images above, our Miro Academy manager Matt grew a beard over the course of recording the human video walkthrough, which happened in parallel with the simulation of the collaborative session.
Simulation of collaboration session in a new user onboarding
When the team got buy-in from leadership and positive reactions from several user validation sessions, we started building. The team managed to scope it into an MVP and launch a test in 1.5 months. The results were mixed:
- A quarter of users started the tutorial. The number wasn’t higher for several reasons: users didn’t have time, had a specific task to do, already knew how to use Miro, couldn’t watch with audio, or found the information irrelevant.
- We saw a meaningful uplift in users who created content in their first session, because the tutorial prompted users to take easy actions.
- Still, no improvement on the aha moment.
The team realized that we needed to double down on personalization and localization. The second iteration showed traction:
- Twice as many users started the tutorial compared to the first MVP.
- However, we had a big drop from the second tutorial step.
The team did a great job moving the needle for users who create content, but it didn’t lead to a higher number of collaborative sessions. We needed to zoom out and find the root cause, so we stopped iterating on this investment and started brainstorming other hypotheses. In the next section, I will reveal how the team moved the needle on the aha moment through smart iterations.
Phase 3: Growth at scale and the power of smart iterations [~1,800 people, ~50 million users]
As the product grew, it became more and more complex to iterate on different parts of it. The team had to become more mature and thoughtful about experimentation and master the art of connecting the dots between qualitative and quantitative learnings.
To uncover more in-depth insights, our UXR team ran “Diary studies” to explore the behavior of prospective users as they trialed Miro for 28 days. We wanted to uncover the main blocker for early churn and activation.
The team analyzed anonymized user behaviors around initial use and early collaboration patterns to help drive deeper understanding of what to emphasize during the first experience.
That established the foundation from which we started exploring hypotheses for new activation experiments and looked deeper into the experiences of two main segments — Creators and Joiners.
The team started focusing on the experience for Joiners as one of the audiences on which to double down. Our hypothesis was: If we break the ice for new board joiners, nudging them to perform an easy, simple, and delightful collaborative action that removes their fear of engaging with a new tool, we’ll increase the aha moment.
Next, we needed to define that easy action that could drive delight and stickiness to the product. It wasn’t that difficult to uncover that Miro reactions provide this simple and delightful effect.
Based on that insight, the team created the solution where new joiners were triggered to “Say Hi” on a board using a reaction. Another collaborator would receive a notification about their reaction, thus encouraging them to start collaborating on the board.
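A minimal sketch of what that nudge logic could look like is below. The class and function names, the eligibility rule, and the notification format are all hypothetical illustrations of the described behavior, not Miro’s implementation.

```python
# Hypothetical sketch of "Say Hi" nudge logic; names and rules are
# illustrative assumptions, not Miro's actual code.
from dataclasses import dataclass, field

@dataclass
class BoardSession:
    user_id: str
    is_first_board_visit: bool
    collaborator_ids: set = field(default_factory=set)

def should_show_say_hi_tip(session: BoardSession) -> bool:
    """Show the tip only to new joiners on boards that have someone to greet."""
    return session.is_first_board_visit and len(session.collaborator_ids) > 0

def on_reaction(session: BoardSession, emoji: str) -> list:
    """Notify existing collaborators so they can respond and start collaborating."""
    return [
        f"notify:{uid}:{session.user_id} reacted {emoji}"
        for uid in sorted(session.collaborator_ids)
    ]

session = BoardSession("joiner_1", True, {"creator_9"})
if should_show_say_hi_tip(session):
    print(on_reaction(session, "👋"))
```

The point of the eligibility check is to target only the Joiner segment at the moment of fear (first board visit) while guaranteeing the reaction has an audience to notify.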
“Say Hi experiment” — first iteration
However, the first iteration didn’t yield any stat-sig results at the aha moment. The team had to dive deeper into what didn’t work and how to iterate further.
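For readers unfamiliar with the term, “stat-sig” means the observed uplift is unlikely to be random noise. A two-proportion z-test is one common way to check this for conversion-style metrics; the sketch below uses made-up numbers, not Miro’s experiment data.

```python
# Standard two-proportion z-test with made-up numbers; this illustrates
# the significance check, not Miro's actual experiment results.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p) for conversions in control (a) vs variant (b)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail
    return z, p

# Hypothetical: 12.0% vs 12.4% aha-moment rate with 10k users per arm.
z, p = two_proportion_z(1200, 10000, 1240, 10000)
print(f"z={z:.2f}, p={p:.3f}")  # small uplifts need large samples to clear p < 0.05
```

This is also why a flat first iteration is not proof the idea is dead: the uplift may simply be too small for the sample, which is exactly where post-analysis and a second iteration come in.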
To do so, the team ran an in-depth post-analysis study that helped them uncover six theories and prioritize the main improvements for the next iteration, instead of doing a bigger pivot. Here are the main smart iterations the team uncovered:
- Improve discoverability: increase the tip’s visibility by making its background dark.
- Decrease cognitive load: trigger reactions right from the tip button, and show a new tip indicating where they can find more reactions later.
After applying the two prioritized theories, the team iterated on that experience and added several improvements — making the tip more noticeable, simplifying reactions, and adding a reminder of where to find more reactions.
“Say Hi experiment” — second iteration
Even without investing in post-validation user research, these small changes achieved an uplift in the aha moment, matching the prediction the team had made.
Key tips on how to “iterate smart”
Having spent over six years on Growth teams at Miro, I think the most important and lasting learning I’ve gained is that we need to keep learning and iterate smart. On our Growth Design team, we reflect on each experiment and unpack the WHY behind the result. The “quiz workshop” format helps us uncover further iterations collaboratively — we look at the experiment and try to guess which variation was a winner and the “WHY” behind it, which helps us co-create future solutions with speed.
Quiz workshop — uncover smart iterations
Here are five key tips to help your team iterate on your product onboarding flow with intention:
- Run regular usability tests (2-3 per week) of your onboarding experience with both users and non-users to build your product-sense muscle.
- Try to validate your big investments with smaller first iterations by decomposing them.
- Once you get the first results, dive into the behavioral data post-analysis to map the theories, evidence, and solution ideas for iterations.
- Reflect on your first iteration — try the “quiz workshop” format for your retrospective.
- Always run a second iteration.
I hope this case study helps you improve your own user onboarding. Don’t give up after the first iteration, always ask why, and keep listening to your users.
To get even more details on the evolution of Miro’s user onboarding, read the full story on Growth Unhinged, and subscribe to get this story along with bonus content delivered directly to your inbox.