Case Study: Designing a Hybrid XD Benchmarking Score

Role: UX Researcher

Team: XD Research & Strategy

Timeline: April 2025

Skills: Benchmarking Strategy, Mixed Methods Research, Quantitative UX, Usability Testing, Metric Design, Internal Alignment

The Challenge

Our team lacked a consistent and repeatable method to assess the quality of our digital experiences. Although we had previously implemented a benchmarking score through Projekt202, the heavy lift it required made it difficult to scale. Without a robust framework, it was hard to measure improvements, align stakeholders, or prioritize UX investments across journeys.

The Opportunity

To address this, I led the development of a Hybrid XD Score — a new UX benchmarking framework combining behavioral performance, user attitudes, and alignment with brand values. This score was designed to be:

Holistic (covering usability, perception, and brand alignment),

Actionable (easy to interpret and apply),

Repeatable (low effort for ongoing benchmarking).

The Approach

We combined elements from:

Projekt202 XD Score (experience-focused anchor measures),

UserTesting’s QX Score (behavioral + attitudinal),

These were blended into a single composite metric scored out of 100.

Scoring Breakdown:

Task Success (40%) — Can users complete tasks?

Attitudinal Measures (30%) — How do users rate usability elements (ease, readability, effort)?

Anchor Measures (30%) — Do users feel the experience reflects our brand goals (trust, confidence, purpose)?

This mix ensured that objective task performance carried the most weight while the score still reflected subjective user sentiment and brand integrity.

Evaluation Criteria

Behavior Score (2+ task minimum):

NS = number of successful outcomes

NT = total number of tasks

Task Success Score = NS / NT * 100

Ex: (1/2)*100 = 50
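
As a minimal sketch (illustrative Python; the function name is mine), the behavior component follows directly from these counts:

    def task_success_score(ns, nt):
        # Task Success Score = NS / NT * 100
        return ns / nt * 100

    print(task_success_score(1, 2))  # 50.0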

Attitude Score Metrics (1–5 scale):

Statement agreement:

  • Navigability

  • Attractiveness

  • Ease of Use

  • Readability

  • Cognitive Load

  • Level of Effort

Anchor Score Metrics (1–5 scale):

Statement agreement:

  • Supports User Goal

  • Facilitates Purposeful Movement

  • Builds Confidence

  • Inspires Trust
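
The framework does not spell out how the 1–5 agreement ratings roll up into the attitude and anchor sub-scores; one assumed normalization (a sketch of mine, not part of the original method) is to average the ratings and rescale them to 0–100:

    def likert_to_100(ratings, scale_max=5):
        # Assumed roll-up: mean of the 1-5 ratings, rescaled to a 0-100 sub-score.
        return sum(ratings) / len(ratings) / scale_max * 100

    print(likert_to_100([5, 4, 5, 4]))  # 90.0 (mean rating of 4.5)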

Performance Bands:

  • 90–100: Great (A)

  • 80–89: Good (B)

  • 70–79: Needs Improvement (C)

  • <70: Change Needed (F)
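
These bands can be expressed as a simple lookup (illustrative Python sketch):

    def performance_band(score):
        # Map a 0-100 Hybrid XD Score onto the performance bands above.
        if score >= 90:
            return "Great (A)"
        if score >= 80:
            return "Good (B)"
        if score >= 70:
            return "Needs Improvement (C)"
        return "Change Needed (F)"

    print(performance_band(74))  # Needs Improvement (C)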

For one user flow (attitude and anchor sub-scores expressed on a 0–100 scale):

• Task Success = 50%

• Attitude = 90

• Anchor = 90

Hybrid XD Score = (0.4 × 50) + (0.3 × 90) + (0.3 × 90) = 74
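
Putting the pieces together, a short sketch of the 40/30/30 weighting reproduces this example:

    def hybrid_xd_score(task_success, attitude, anchor):
        # Weighted composite: 40% task success, 30% attitude, 30% anchor (all 0-100).
        return 0.4 * task_success + 0.3 * attitude + 0.3 * anchor

    print(hybrid_xd_score(50, 90, 90))  # 74.0 -> "Needs Improvement (C)"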

Validation & Credibility

We established both construct and face validity through internal stakeholder reviews:

• Evaluated whether each metric accurately captured the intended UX dimensions (relevance, clarity, reliability).

• Asked stakeholders to rate user journeys (without seeing the scores) and compared their ratings to our calculated benchmarks from the pilot.

This alignment boosted confidence in the score’s representativeness and strategic utility.

Impact

  • Created a repeatable UX benchmarking methodology

  • Enabled clearer prioritization of design improvements

  • Facilitated alignment between UX, Product, and Design

  • Empowered quarterly tracking of experience quality over time

  • Paved the way for post-benchmark qualitative deep dives
