UX Research & Insights
Qualitative Research | User Research
Leveraging my background in cognitive and human factors psychology, I lead and contribute to research efforts — from SUS scoring and concept testing to feedback loops — that bring clarity to complex internal tools.
Contextual Inquiry & Stakeholder Interviews
Context:
For the Pre‑Onboarding experience, we needed to understand why so many new hires were struggling to complete tasks before day one — especially around IT setup, identity provisioning, and benefits enrollment.
My role:
I conducted interviews with recent hires and HR partners, mapping the end‑to‑end workflow to uncover communication gaps between recruiting, HR, and IT. These insights helped prioritize which friction points to tackle first and informed our task flow redesign.
Impact:
The resulting changes boosted self‑service success from 15% to 40% and significantly reduced IT tickets, improving the overall pre‑day‑one experience for new employees and reducing support burden.
Seen in: Pre-Onboarding
Concept Testing
Context:
Across several projects, we needed to test and align on experience structure — especially when there were competing perspectives on workflow organization. One key debate: program‑first vs. people‑first models for manager‑driven reward and promotion flows.
My role:
I designed and helped run early concept testing with low‑fidelity A/B mocks and flow comparisons. When research resources were limited, I took more ownership of the testing end‑to‑end — scoping lightweight studies, facilitating feedback sessions, and synthesizing insights to align stakeholders.
Impact:
Feedback revealed friction points early and helped us land on a hybrid structure balancing business needs with manager usability. Later concepts scored 3.7+ in UX evaluations (usability, clarity, confidence), unblocking teams that had been stuck in directional gridlock.
Seen in: Manager Rewards Tool
System Usability Scale (SUS) Scoring
Context:
As part of a broader redesign for a manager‑facing HR tool, we needed to validate whether changes to the information architecture and justification flow improved usability.
My role:
I created and ran SUS surveys using before‑and‑after walkthroughs of the experience. Without a dedicated researcher, I analyzed quantitative scores to surface problem areas and confirm improvements post‑iteration.
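The standard SUS calculation behind this analysis can be sketched in Python. This is an illustrative helper (the function name is mine, not from a specific tool): each survey response is ten Likert items scored 1–5, odd items are scored as (response − 1), even items as (5 − response), and the sum is scaled by 2.5 to yield a 0–100 score.

```python
def sus_score(responses: list[int]) -> float:
    """Compute a System Usability Scale score from ten 1-5 Likert responses."""
    if len(responses) != 10 or any(r < 1 or r > 5 for r in responses):
        raise ValueError("SUS requires exactly ten responses in the range 1-5")
    # Odd-numbered items (indices 0, 2, ...) are positively worded: score = response - 1
    odd = sum(r - 1 for r in responses[0::2])
    # Even-numbered items (indices 1, 3, ...) are negatively worded: score = 5 - response
    even = sum(5 - r for r in responses[1::2])
    # Scale the 0-40 raw total to the familiar 0-100 range
    return (odd + even) * 2.5


def mean_sus(all_responses: list[list[int]]) -> float:
    """Average SUS score across all respondents."""
    return sum(sus_score(r) for r in all_responses) / len(all_responses)
```

Averaging individual scores across respondents (as in `mean_sus`) is how the before-and-after comparison above was produced: one mean per walkthrough condition, compared post-iteration.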
Impact:
The redesign raised SUS scores by 54.5 points (24.5 → 79.0), a significant lift in usability and user confidence that helped secure stakeholder buy‑in for rollout.
Seen in: Manager Rewards Tool
Heuristic Evaluation
Context:
At multiple points in the design process, we needed quick ways to identify usability issues and guide iteration — especially when formal research support wasn’t available.
My role:
I ran informal heuristic reviews using UX scorecards focused on clarity, navigation, and confidence. These reviews informed early design adjustments and were often paired with lightweight testing later in the process.
Impact:
Several key issues were caught and resolved before engineering handoff, saving time and reducing rework. In one case, findings shaped the structure of a new workflow that later tested well and met target UX score thresholds.
Seen in: Manager Rewards Tool
Quick UX Reviews (Tenets & Traps)
Context:
When teams needed fast alignment on early concepts, I used Microsoft’s “Tenets & Traps” cards — a lightweight framework for spotting common UX pitfalls — to facilitate grounded discussions with PMs, engineers, and business partners.
My role:
I regularly led or supported these sessions, walking partners through flows to identify traps that could undermine understanding, trust, or task completion.
Impact:
This approach helped raise the UX bar early without formal studies. It influenced key efforts like rewards workflows, pre‑onboarding, and AVA, and introduced many partners to foundational UX principles for the first time.
View on UITraps