Oct 29, 2025
Raising the Quality Bar for Experience at Scale
I established an org-wide UX review process, adopted by 15+ teams, that improved design quality, identified UX risks, and prevented broken experiences across 40+ quarterly experiments and launches.
Role: Director of User Experience
Background
While designers were conducting design reviews within their product teams, our organization lacked a consistent way to evaluate quality across the broader employer journey. This gap resulted in customers encountering disjointed and broken experiences, with teams reacting to quality issues rather than proactively addressing them. The problem was significant enough to become a top OKR for a company-wide quality initiative.
Approach
I led the initiative to establish a scalable quality review process. My goal was to reduce broken user experiences in our current product while building toward a more cohesive, usable product experience over time.
My approach included discovery research to understand existing problems, developing a heuristics framework to consistently evaluate designs, and introducing structured checks at key stages of development. I designed the program with three types of checks layered into our existing product development cycle:
UX Critique to provide early feedback on designs
UX Review to evaluate detailed design solutions before development, documenting issues and confirming quality standards
UX QA to review coded experiences before launch
I developed a heuristics framework to create shared standards and a common language for discussing experience quality consistently, organized around three themes:
Usability: Does it work? Is it easy to use and understand?
Value: Does it offer value? Is it clear what the user receives for time or money spent?
Trust: Does it instill trust and confidence? Does the user have choices and control?
Issues were assigned a severity rating to assess UX risk and inform prioritization decisions. Lead designers owned the review process, and all team functions shared accountability for quality before release. To streamline adoption, I designed a single worksheet that captured the heuristics, guidance, and risk documentation.
I launched the program within my own organization first and, after running it for a full quarter, expanded it to other teams.
Outcome
The program delivered measurable improvements in its first year:
Improved experience quality across 40+ experiments and launches each quarter, resulting in fewer broken experiences, as measured by team OKRs and ongoing user research
Scaled standardization across 15+ teams, with product teams taking ownership of the process and designers organically extending it to additional teams and quality checks
Accelerated decision-making by giving leaders consistent review methods and documented risk assessments to inform executive discussions
