Author: admin

  • Unveiling the Challenges in Discovering KPIs for Designers

    Introduction:
    Designers play a pivotal role in shaping successful products and experiences that resonate with customers. As organizations strive to optimize their strategies and drive business growth, the identification and measurement of Key Performance Indicators (KPIs) become paramount. However, designers often encounter unique challenges when it comes to discovering and defining KPIs that align with their design goals and contribute to overall business success. This article explores the hurdles faced by designers in this critical endeavor and highlights the importance of a strategic approach to KPI discovery.

    1. Aligning Design Goals with Business Objectives:
    Designers face the fundamental challenge of connecting their design goals with broader business objectives. While design principles such as usability and aesthetics are essential, they must also align with key business drivers such as revenue growth, customer satisfaction, and market share. Striking the right balance between design-centric KPIs and business-oriented KPIs requires a deep understanding of the organization’s strategic priorities and the ability to articulate the value of design in driving tangible outcomes.

    2. Defining Meaningful and Actionable Metrics:
    Designers encounter the complex task of identifying metrics that accurately capture the impact of their design efforts. Unlike quantifiable business metrics, design-related outcomes can be challenging to measure. Metrics such as user engagement, satisfaction, and usability require careful consideration to ensure they provide meaningful insights. Additionally, designers must distinguish between vanity metrics that merely provide surface-level information and actionable metrics that drive meaningful design improvements and inform decision-making processes.

    3. Balancing Quantitative and Qualitative Data:
    Designers rely on a blend of quantitative and qualitative data to inform their design decisions. However, identifying KPIs that effectively capture both aspects can be a formidable challenge. While quantitative metrics offer precise measurements, qualitative insights provide rich contextual information about user behaviors, needs, and preferences. Combining these two data streams to derive comprehensive KPIs necessitates a robust research methodology and a deep understanding of the interplay between data-driven insights and human-centered design principles.

    4. Adapting to Evolving Design Processes:
    The field of design is dynamic, constantly evolving with emerging technologies and changing user expectations. Designers must navigate the challenge of aligning KPIs with evolving design processes. As agile and iterative design methodologies gain prominence, designers must reassess their KPIs throughout the product lifecycle and adapt them to reflect iterative design improvements. This calls for flexibility, agility, and a proactive approach to monitoring and measuring design performance.

    Conclusion:
    Designers face unique challenges when it comes to discovering and defining KPIs for their products. By aligning design goals with business objectives, defining meaningful metrics, balancing quantitative and qualitative data, and adapting to evolving design processes, designers can overcome these challenges and drive measurable value for their organizations. Embracing a strategic approach to KPI discovery empowers designers to demonstrate the impact of their work, enhance user experiences, and contribute to the overall success of their products and businesses.

  • How a UX Designer Should Talk

    As a UX designer, it is essential to understand the business objectives and outcomes that drive success. When discussing and writing about increasing revenues, decreasing costs, increasing new business and market share, increasing revenue from existing customers, and increasing shareholder value, here are some approaches to consider:

    1. User-Centric Language: While focusing on business objectives, it’s important to communicate in a user-centric manner. Instead of solely emphasizing financial goals, highlight how user-centered design can enhance the overall customer experience, satisfaction, and loyalty, which ultimately contribute to business growth.

    2. Emphasize Value Proposition: Discuss how UX design can align with business goals by delivering products or services that provide unique value to customers. Showcase how intuitive, engaging, and user-friendly experiences can attract new customers, retain existing ones, and differentiate the company in the market.

    3. Conversion and Engagement: Highlight how optimized user experiences can lead to increased conversion rates, higher customer engagement, and improved customer retention. By improving usability, reducing friction points, and addressing pain points, UX design can enhance customer journeys and drive business growth.

    4. Data-Driven Design: Emphasize the importance of data in UX design. Discuss how user research, usability testing, and data analysis can uncover insights that inform design decisions, optimize user flows, and drive positive business outcomes. Present case studies or examples where data-driven design led to increased revenues, decreased costs, or improved customer satisfaction.

    5. Business Impact Metrics: When discussing business objectives, use relevant metrics that align with UX design. For example, instead of directly talking about increasing revenues, frame it as improving conversion rates, average order value, or customer lifetime value. Similarly, discuss how decreasing costs can be achieved through reducing user errors, support tickets, or training requirements.

    6. Collaborative Approach: Emphasize the importance of collaboration between UX designers and other stakeholders, such as product managers, marketers, and business leaders. Showcase how involving UX expertise in strategic discussions and decision-making can lead to better business outcomes and value creation.

    Remember, as a UX designer, your role is to bridge the gap between user needs and business goals. By effectively communicating how user-centric design contributes to increasing revenues, decreasing costs, and driving overall business success, you can create a compelling case for the value of UX design in achieving these objectives.

  • (Value ÷ Effort) x Confidence = Priority

    When faced with a collection of potential features, design ideas, or research projects to prioritize, how do you move forward? Choosing which design effort the team will work on is especially difficult when every stakeholder, executive, and team member has a different opinion or favorite item they’d like to see tackled first.

    I’ve seen a lot of prioritization schemes in my career. The one I favor most was taught to me by product manager extraordinaire Bruce McCarthy.

    What makes Bruce’s formula so great isn’t only that it delivers a straightforward approach to identifying top priorities. It’s the collaborative way we arrive at those top priorities. Implementing the formula rewards teams that use their UX research strategically.

    I love this. It gives me tingles just thinking about it.

    Bruce’s simple formula looks like this:

    (Value ÷ Effort) x Confidence = Priority

    What Bruce’s formula says is this: we want to prioritize work that offers large value while taking a small amount of effort, assuming we’re confident in both our estimates of value and effort. With this formula, we can put what might otherwise seem like apples and oranges (or worse, apples and orangutans) on a single scale, where the highest results are what we should work on first.

    How might we calculate Value?
    To fill out Bruce’s formula, we need to arrive at a number for the first variable, Value. This number represents how much value we’ll produce when this item is delivered. As UX design leaders, we, of course, want to start with value to our users. Will this item provide a great solution to a challenging problem our users are facing? Does it get us closer to our vision of the ideal user experience? Or, will it be something they don’t really care about?

    The beauty of Bruce’s formula is we can make this as simple or detailed as we’d like. A simple way to represent Value is 1, 2, or 3, for low, medium, or high. If we think the item is a critical solution to a big problem, we give it a 3. If it’s something users won’t care too much about, we give it a 1.

    If we want to get more rigorous, we could estimate cost savings or how much revenue we might generate from implementing this idea. We could add what we believe is the value to the business.

    Whatever we arrive at will be fine, as long as we arrive at every item’s Value using the same process. The rule is simple: the higher the number, the more valuable this item is.

    How might we calculate Effort?
    Next up, how much Effort might this take? Here, we can start with the effort to implement.

    We can use a similar 1, 2, or 3 scale, representing whether it will be easy, medium-difficulty, or really hard to implement. Alternatively, we could use a more rigorous calculation such as the number of people multiplied by the number of weeks to complete the project. We could even use the dollars the organization will spend on it.

    For more detail, we can add in other costs, such as how much effort it will take for our users to switch over. (This is especially important in products where new functionality is disruptive to habits our users have already formed.)

    If we’re implementing this item to attract new customers, we can add in the effort to sell. Plus, we shouldn’t forget the effort to support the feature once it is released.

    Like Value, we can adapt the amount of detail we consider for Effort any way we want, as long as the higher the number, the more effort we believe this will take.

    Dividing Value by Effort gives us a quick look at how the items rank. A design idea that provides a great solution (Value = 3) and will have an easy implementation (Effort = 1) resolves to 3÷1 or 3. Meanwhile, another idea that has a medium value (2) and medium effort (2) will resolve to 2÷2 or 1. The design idea with a 3 is a higher priority than the one that came out a 1.
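    The arithmetic above can be checked with a tiny sketch (a minimal illustration; the Python function name is mine, not Bruce's):

```python
def value_over_effort(value: float, effort: float) -> float:
    """Divide Value by Effort; higher results rank higher."""
    return value / effort

# Great solution (Value = 3), easy implementation (Effort = 1): 3 / 1 = 3.0
idea_a = value_over_effort(3, 1)

# Medium value (2), medium effort (2): 2 / 2 = 1.0
idea_b = value_over_effort(2, 2)
```

    Whatever scale we pick, the comparison works the same way: the idea with the larger ratio ranks first.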

    How might we calculate Confidence?
    (Value ÷ Effort) is a great start, but where are all these numbers coming from? That’s where Confidence comes in.

    Bruce wisely puts this on a scale from zero to one. If the Value and Effort numbers are a complete guess, we’d give them a zero. If we’re absolutely sure we know the Value and Effort are correct, then we’ll give them a 1.

    We might use this scale for each variable:

    Complete guess = 0.0
    We’ve found a little evidence = 0.25
    We’re fairly sure = 0.5
    We’ve done a ton of research and are very sure we’re right = 0.75
    We’re absolutely, incontrovertibly sure = 1.0

    We can rate our confidence in both Value and Effort separately. If we’ve talked with a ton of customers who all told us basically the same thing, we can give Value-Confidence a 0.75. If we’ve worked with developers on a technical proof of concept that showed it can be done but didn’t explore edge cases, we can give Effort-Confidence a 0.5. By averaging them, we get (0.75 + 0.5) ÷ 2 or a Confidence of 0.625.

    Using Bruce’s whole formula, we can see how this plays out.

    (Value ÷ Effort) x Confidence = Priority

    (3 ÷ 1) x 0.625 = 1.875

    1.875 is our calculated priority for this design item. By itself, it’s a meaningless number. However, when we calculate other design item priorities the same way, we can see what we should work on first.
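    The full calculation is easy to express in code. This is a sketch under the assumptions above; the helper name and the second item's scores are hypothetical, not values from the article:

```python
def priority(value: float, effort: float,
             value_conf: float, effort_conf: float) -> float:
    """(Value / Effort) x Confidence, where Confidence averages our
    confidence in the Value and Effort estimates (each 0.0 to 1.0)."""
    confidence = (value_conf + effort_conf) / 2
    return (value / effort) * confidence

# The worked example: Value = 3, Effort = 1,
# Value-Confidence = 0.75, Effort-Confidence = 0.5.
p = priority(3, 1, 0.75, 0.5)  # (3 / 1) x 0.625 = 1.875

# Scoring several items the same way lets us rank them,
# highest calculated priority first.
items = {
    "design idea A": priority(3, 1, 0.75, 0.5),
    "design idea B": priority(2, 2, 0.5, 0.5),
}
ranked = sorted(items, key=items.get, reverse=True)
```

    The individual numbers stay meaningless in isolation; only the ranking across items scored the same way tells us what to tackle first.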

    The biggest benefit? The discussion while rating.
    It’s great that Bruce’s formula gives us a clear calculation to determine our highest priority items. However, what I love most is how it gets everyone talking about what should go into calculating Value, Effort, and Confidence.

    We involve stakeholders, executives, and other key team members in coming up with each number. The numbers can mean whatever we want them to mean. Yet, to fill out the formula, we have to have an essential discussion about what these numbers mean.

    We discuss the evidence we’ve collected for Value, hopefully from research we’ve conducted with our users. We discuss the evidence we’ve collected for Effort, hopefully from experimentation, iteration, and prototyping projects, that give us real insight into what it will take to deliver. We discuss our Confidence from how much evidence we have, versus how much we’ve had to guess.

    Bonus: The reward from solid research.
    The Confidence number penalizes us when we’ve wrongly guessed the value or effort. It rewards teams that invest in research to collect data that becomes the basis of our confidence. We’re not happy with a low Confidence number? Ok, let’s do more research and boost it.

    Any process that pushes our teams to appreciate the contribution of research is a winner in my book. Bruce’s formula does just that.

    This is how we make research a strategic ingredient in delivering better-designed products and services. (Ooh! There are those tingles again.)