Design Operations: how to improve visibility and predictability of design work
There comes a time in every design org when the choice to go from good to great presents itself. If design is a series of decisions that create elegant, usable patterns, then a healthy design org needs to run on clarity and understanding. This means that at a certain size and capacity, it is time to bring in Design Ops, aka the design of how to operate a design org.
As our design team grew rapidly over the past two years, it became clear that stakeholders didn’t know what their designers were working on at any given moment, or what the status of each task was. It was time to create better visibility and predictability for design work across the organization.
We brought on the company’s first Design Ops manager, who had a TPM background and was familiar with leveraging Agile Scrum methodologies as a tool for solving these kinds of workflow problems. The larger product design team became a pilot group, with a Scrum board set up just for them.
Using Story Points to measure design tasks
Before our first sprint planning, we walked through an overview of relative-complexity story point estimation. Following in the footsteps of engineering teams, who have been setting up and running Agile processes for years with measurable gains in velocity, we quickly realized the approach would fit design teams as well.
When using time estimates, people tend to think only of “ideal engineering time”—the time they spend writing code—and have trouble accounting for all the other time required to complete a task such as talking to stakeholders, waiting on other teams, etc.
Designers estimate their work in the same manner. When giving an estimate, they typically aim for an “ideal design time”—working in Figma/Sketch—and often forget about all the back-and-forth of feedback loops. Relative-complexity story point estimation addresses the ambiguity of evaluating product design tasks and creates manageable, bite-sized stories that are feasible to complete in one sprint.
How to do this estimation:
- Use the Fibonacci numbers (1, 2, 3, 5, 8, 13) as your measuring scale. These numbers correlate to t-shirt sizes: 1 = XS, 2 = S, 3 = M, 5 = L, 8 = XL. If a task scores 13, consider making it an Epic or breaking it down into smaller tasks.
- Choose a sample task to represent a 3 (medium). It’s best if it’s based on something completed in the past, but it can also be a contrived description of a typical task.
- When giving a score, think about the complexity and consider these questions:
  - Can I do this in isolation, or will I need to consult with others?
  - Are there dependencies that will need to be resolved before I can complete this task?
  - Is this in an area that I’ve worked in before?
- All participants in the sprint planning give each task a score. If the scores diverge, the team discusses why some people think the task has more or less complexity.
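As a rough sketch of that flow (the helper names and the divergence rule here are illustrative, not part of our actual tooling), one planning-poker round might look like this:

```python
# Illustrative sketch of a planning-poker round: every participant
# scores a task on the Fibonacci scale, diverging scores trigger a
# team discussion, and the agreed score maps to a t-shirt size.

FIBONACCI_SCALE = (1, 2, 3, 5, 8, 13)
TSHIRT = {1: "XS", 2: "S", 3: "M", 5: "L", 8: "XL"}

def needs_discussion(scores):
    """Flag a task when estimates span more than one step on the scale."""
    steps = [FIBONACCI_SCALE.index(s) for s in scores]
    return max(steps) - min(steps) > 1

def resolve(scores):
    """After any discussion, converge on a single agreed score."""
    if 13 in scores:
        return "split into smaller tasks or promote to an Epic"
    agreed = max(set(scores), key=scores.count)  # most common vote
    return f"{agreed} ({TSHIRT[agreed]})"

votes = [3, 3, 5]                    # one designer sees extra complexity
print(needs_discussion(votes))       # adjacent scores: no discussion needed
print(resolve(votes))                # → 3 (M)
```

The divergence threshold (more than one step apart) is a made-up heuristic; teams tune this to taste.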
It takes about three sprints to reach a stable velocity and get a sense of how many story points the entire team can complete in one sprint. Two-week sprints felt right to our design teams, who synchronized their sprints with the corresponding engineering teams for better efficiency.
We also experimented with having designers participate in the engineering teams’ sprint planning. It was not ideal: designers could not track their own velocity because their story points were mixed in with the engineering team’s. The epics used to track the overall lifecycle of a project have an engineering-focused workflow that does not accurately represent the design team’s state. And, most often, the designers are a minority in the sprint planning meeting, with the engineers taking up most of the time and attention.
What is the best way to go about planning larger projects?
Once the team has reached a stable rhythm, you can move on to planning more complex projects. If a program manager is asking for a due date for design completion as part of the overall project plan, you can:
- Break down all anticipated design work into tasks
- Size all the tasks
- Plot out sprints into the future with tasks that add up to the team’s velocity
This will help predict how many sprints the design team will need to complete their work.
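The arithmetic behind this forecast is straightforward. As a minimal sketch (the backlog sizes and velocity below are invented for illustration):

```python
import math

def sprints_needed(task_points, velocity):
    """Forecast how many sprints a backlog of sized tasks will take,
    given the team's stable per-sprint velocity in story points."""
    total = sum(task_points)
    return math.ceil(total / velocity)

# Hypothetical backlog: each number is one task's Fibonacci estimate.
backlog = [3, 5, 2, 8, 3, 5, 1]   # 27 points in total
team_velocity = 12                 # points the team completes per sprint

print(sprints_needed(backlog, team_velocity))  # → 3
```

In practice you would also pack tasks into specific sprints (no sprint exceeding the velocity), but the ceiling division above gives the due-date answer a program manager is usually after.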
To help designers section their work into discrete tasks, we developed a task breakdown guide. To start, we interviewed our product design teams and asked them to describe their existing process. Next, we identified common patterns in the designers’ workflow:
- Discover: Stakeholder alignment, where designers ensure they understand the requirements and everyone agrees on the project goals. The activities for this phase include: brainstorming, whiteboarding, UX research, and competitive analysis. This often results in updates to the product spec.
- Explore: An opportunity for divergent creative output where designers explore as many ideas as possible to solve the project’s goals. The activities and deliverables for this phase include: wireframing, low-fi mocks, and socializing of ideas for feedback from stakeholders.
- Refine: Design convergence where designers choose the best ideas from the previous phase and refine them. The activities and deliverables for this phase include: prototyping, hi-fi mocks, and formal presentations to stakeholders for final feedback before building begins.
- Build: While the engineering team is developing the project, the designers are on call to answer questions, update mocks as needed, and perform visual QA on the engineering deliverables.
By prefacing ticket titles with one of the above phases, it’s easy to quickly understand how far along the design process the project is.
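As an illustrative sketch (the phase names come from the guide above; the ticket titles are invented), this prefix convention also makes the board trivially machine-readable, e.g. for counting how much of the sprint sits in each phase:

```python
from collections import Counter

PHASES = ("Discover", "Explore", "Refine", "Build")

def phase_of(ticket_title):
    """Return the design phase encoded in a ticket title's prefix,
    or None if the title does not follow the convention."""
    prefix = ticket_title.split(":", 1)[0].strip()
    return prefix if prefix in PHASES else None

# Hypothetical tickets from a design sprint board.
tickets = [
    "Explore: low-fi mocks for onboarding flow",
    "Refine: hi-fi checkout prototype",
    "Explore: wireframes for settings page",
    "Build: visual QA on search results",
]

print(Counter(phase_of(t) for t in tickets))
# Counter({'Explore': 2, 'Refine': 1, 'Build': 1})
```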
We implemented this process across the entire design team over 12 months. By the beginning of the next year, perceptions had changed: stakeholders finally felt they had insight into the design process and schedules, and designers reported feeling much better day-to-day, with clear priorities and expectations for their work. They also no longer felt like they were being pulled in a million directions or constantly having to switch context. Overall, feedback on the process has been positive from both the design and product teams.