2026 · Healthcare UX / UX Research

CEP Clinical Tools Redesign

Redesigning a clinical tool website for better wayfinding and clearer hierarchy.

CEP Clinical Tools Redesign — case study cover
Year
2026
Category
Healthcare UX / UX Research
Role
UX Researcher & Product Designer
Deliverables
Stakeholder interviews, user interviews, synthesis, information architecture, interaction design, prototyping, usability testing
Project overview
This UX capstone focused on redesigning the Centre for Effective Practice's clinical tool pages to improve information architecture, wayfinding, and interaction clarity in time-sensitive clinical contexts. The work was grounded in research, translated into actionable design decisions, and validated through testing.
Research and strategy
Planned, conducted, and synthesized stakeholder and user interviews to identify needs and opportunities. Translated findings into clearer information architecture and interaction priorities, helping reduce friction for clinicians navigating guidance under time pressure.
Design validation
Used prototype based testing and feedback to validate concepts and reduce ambiguity before implementation. Comparative usability testing against the existing CEP experience showed stronger wayfinding, faster related tool discovery, and a less disruptive feedback flow.

Challenge

Clinical pages were dense and hard to navigate quickly during time-sensitive care workflows.

Decision

Grounded design decisions in planned research synthesis, then validated with prototype-based testing.

Impact

The redesign direction improved wayfinding, related-tool discovery, and feedback experience efficiency.

At a glance

Project overview

  • Partner: Centre for Effective Practice (CEP)
  • Role: UX Researcher & Product Designer
  • Timeline: 12 weeks · Winter 2026

Overview · Context

Background

  • Who CEP is
  • Centre for Effective Practice is a nonprofit translating complex medical research into point-of-care guidance for clinicians. Primarily funded by the Ontario government.
  • Vision
  • Close the gap between medical evidence and clinical practice.
  • Target users
  • Primary care providers who use CEP for validation, preparation, and fast point-of-care reference.
  • Environmental constraints
  • Three forces compress every visit.
  • Time pressure
  • Fragmented workflow
  • Cognitive load

Research stage · Discovery

How we conducted discovery research

Interview-led discovery paired with live walkthroughs and secondary review, designed to capture behavior, not just opinion.

  • Discovery research
  • 6 clinicians + 3 stakeholders
  • Primary care clinicians (family physicians and nurse practitioners) plus 3 CEP stakeholders. In-depth interviews on usage patterns, workflow constraints, organizational limits, and implementation feasibility.
  • Observation
  • Live walkthroughs during interviews to capture navigation behavior and friction moments.
  • Secondary review
  • Competitive references and internal content structure review to frame feasibility.
  • Research timeline
  • Tap any phase to see what happened in it.
  • Kickoff
  • Alignment and technical context
  • Recruiting
  • Primary care clinicians familiar with CEP tools
  • Fieldwork
  • Interviews and observations
  • Analysis
  • Evaluating the feasibility gap

Research stage · Synthesis

User reality

One quote pair captured the gap we needed to close: between how the tool is used today and how clinicians wish they could use it.

  • Current state
  • I read this as review or when I need a refresher.
  • Pre/post visit
  • Educational reference
  • Self exploration
  • Ideal state
  • I use this as an in-time decision support tool during or between patient encounters.
  • Real-time decision support
  • Quick scanning/navigation

Research stage · Friction audit

Current live site constraints

Annotated snapshot of the existing CEP clinical tool page. Each numbered marker points at the specific UI element surfacing friction in research.

  • tools.cep.health/tool/menopause-management
  • No persistent return path
  • The page has no back-to-directory control. Users relied on browser back or external search to switch tools.
  • Search hard to locate
  • A faint magnifier is tucked into the top-right corner with no label. Search is absent during deep-page navigation.
  • Feedback module disruption
  • A pop-up feedback prompt slides in and interrupts attention during focused reading.
  • Need to expand sections
  • Key detail is hidden behind an Expand All click. Nothing previews the content scope before the user commits.
  • Scroll-heavy page
  • The tool flows as one long scroll. Clinicians lost orientation and time-to-answer increased.
  • Jump-to ambiguity
  • Inline jump-to links do not indicate coverage. Users were unsure whether a link represented the full section scope.

Research stage · Architecture shift

From hidden hierarchy to instant scent

Mapping the structural shift that removed the biggest source of navigation fatigue: box-based reveals replaced by a persistent, expanded table of contents.

  • Box-Based Layout
  • Zero information scent. Clinicians had to click into each box to see what existed.
  • Expanded TOC
  • Maximum information scent. Full hierarchy previewed before any interaction.

Research stage · Insight 01 · Wayfinding

Evidence: clinicians expected breadcrumbs

Across the 6 primary care clinicians interviewed, the strongest frustration signal was disorientation: not knowing where they were within the tool hierarchy.

  • Participants attempted to use breadcrumbs or expected them to exist
  • 5/6
  • Used the browser back button as a workaround (frustration signal)
  • 4/6
  • Needed to understand their location within the clinical hierarchy
  • 100%
  • DESIGN CONCEPT
  • Proposed solution: implement breadcrumbs
  • Show a persistent, clickable hierarchy on every clinical tool detail page.
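A persistent, clickable breadcrumb of this kind can be sketched as a small rendering helper. This is an illustrative sketch, not CEP's implementation: the `Crumb` shape and `renderBreadcrumb` function are hypothetical names, and the markup follows the common accessible breadcrumb pattern (every ancestor is a link; the current page is marked with `aria-current`).

```typescript
// Hypothetical sketch of a persistent breadcrumb trail.
// Each ancestor level is clickable, so clinicians can jump
// anywhere in the clinical tool hierarchy in one click.

interface Crumb {
  label: string; // e.g. "Tools" or "Menopause Management"
  href: string;  // jump target for that level of the hierarchy
}

// Render the trail as accessible markup: ancestors become links,
// the current page is plain text flagged with aria-current="page".
function renderBreadcrumb(trail: Crumb[]): string {
  const items = trail.map((crumb, i) => {
    const isCurrent = i === trail.length - 1;
    return isCurrent
      ? `<li aria-current="page">${crumb.label}</li>`
      : `<li><a href="${crumb.href}">${crumb.label}</a></li>`;
  });
  return `<nav aria-label="Breadcrumb"><ol>${items.join("")}</ol></nav>`;
}
```

Rendering `[Home → Tools → Menopause Management]` through this helper would link the first two segments and leave only the current tool as static text, which is what gives clinicians constant orientation without extra clicks.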

Research stage · Insight 02 · Information depth

Clinical workflow reality

Primary care decisions happen in 5 to 7 minutes, often with a patient waiting in the room. Clinicians need an actionable answer first, not a wall of text, and they may return later for depth. Needs also shift by the moment: a quick check may only need a summary, prescribing needs clear dosage detail, and uncertain cases need full evidence one layer away. The live page still stacks everything in one long scroll, so every moment gets the same depth by default.

  • DESIGN CONCEPT
  • Proposed solution: progressive disclosure via tabs
  • Separate content into tabs so clinicians land on the answer first and dig only when they need to.

Research stage · Insight 03 · Discovery

Observed search patterns

When search failed, clinicians abandoned the platform. The cause was less about the search box and more about vocabulary mismatch and missing topical entry points.

  • Participants experienced failed searches due to synonym mismatches
  • 4/6
  • Gave up after 2 failed search attempts and left the platform
  • 2/6
  • Expressed preference for topic-based browsing when search failed
  • 6/6
  • DESIGN CONCEPT
  • Dual solution approach
  • Improve the search index clinicians actually type, and add a browsable topic layer for when search falls short.

Research stage · Insight 04 · Feedback

Feedback without interrupting the visit

Clinicians were open to giving input when the prompt was easy to defer and stayed small on screen. Full-screen or timed takeovers were skipped during visits; compact, dismissible pop-ups were fine when the cost in time and attention stayed low.

  • Direction
  • Follow common in-product feedback patterns: a persistent entry point, optional flow, and a small overlay or sheet instead of a takeover. Pre-fill tool and timestamp, keep fields optional, and avoid timed interruptions.
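The pre-filled, low-cost flow described above can be sketched as a tiny draft object: context fields are captured automatically, so the only thing a clinician can type is an optional comment. The `FeedbackDraft` shape and `openFeedbackDraft` helper are hypothetical, shown only to make the "pre-fill tool and timestamp, keep fields optional" idea concrete.

```typescript
// Hypothetical sketch of a pre-contextualized feedback draft:
// tool slug and timestamp are filled in automatically when the
// modal opens, so the clinician never re-enters context.

interface FeedbackDraft {
  toolSlug: string;   // pre-filled from the current tool page
  capturedAt: string; // ISO timestamp, pre-filled on open
  comment?: string;   // the only field the clinician may fill, optional
}

function openFeedbackDraft(toolSlug: string, now: Date = new Date()): FeedbackDraft {
  return { toolSlug, capturedAt: now.toISOString() };
}
```

Because every field except the free-text comment is machine-filled, the prompt can stay small and dismissible: closing it costs nothing, and submitting it costs one optional sentence.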

Design stage · Prototyping

Prototyping phase focus

  • From insight to interactive prototype
  • Translated the four research-backed IA solutions into interactive, high-fidelity prototypes so we could validate "speed-to-evidence" improvements before implementation.
  • Week 1
  • Week 2
  • Week 3
  • Week 4
  • Breadcrumb component design
  • Built a responsive breadcrumb with full hierarchy, keyboard access, and click-to-jump on every segment, giving clinicians constant orientation inside the clinical tree.
  • Tabbed layout prototype
  • Restructured each tool page around Assessment · Management · Resources tabs with progressive disclosure, so primary clinical content loads first.
  • Enhanced search + topic browse
  • Prototyped search with synonym expansion, fuzzy matching, and an entry-point topic grid for when clinicians don't know the exact term.
  • Feedback modal redesign
  • Built a persistent feedback button and minimal modal that pre-fills tool context and timestamp, so clinicians can flag issues without leaving their flow.
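The synonym-expansion idea from the search prototype can be sketched in a few lines: expand the raw query with known clinical synonyms, then match tool titles against any expanded term. The synonym table and titles below are illustrative assumptions, not CEP's real index, and the matching here is a plain substring check standing in for the fuzzier matching the prototype used.

```typescript
// Hypothetical sketch of synonym-expanded search. The SYNONYMS
// table maps the words clinicians actually type to the indexed
// clinical vocabulary (e.g. "high blood pressure" → "hypertension").

const SYNONYMS: Record<string, string[]> = {
  "high blood pressure": ["hypertension"],
  "htn": ["hypertension"],
};

// Expand the query with any known synonyms, then match tool titles
// case-insensitively against every expanded term.
function searchTools(query: string, titles: string[]): string[] {
  const q = query.trim().toLowerCase();
  const terms = [q, ...(SYNONYMS[q] ?? [])];
  return titles.filter((title) =>
    terms.some((term) => title.toLowerCase().includes(term))
  );
}
```

The design point is that the expansion happens on the index side of the experience: a clinician who types a lay term or an abbreviation still lands on the clinically titled tool, and the topic grid catches whatever the synonym table misses.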

Design stage · Design guidelines

Design System

Built inside the existing visual language and HTML / WordPress constraints. No new colors, no new type scale. Six reusable components carry the whole tool-page experience. Tap through each layer.

  • Existing brand palette, nothing added, nothing dropped
  • Per stakeholder feedback, no new colors were introduced. The design uses the primary and secondary palettes already documented in the brand guide. Brand Blue and Sky Blue carry headers and active nav states, Teal signals positive clinical indicators, Red is reserved for alerts and citations.
  • Color
  • Existing heading hierarchy, kept intact
  • Typography follows the established heading hierarchy so content remains coherent with the rest of the platform. No new sizes or decorative variants. After testing, breadcrumb text was raised to body size (it was 2px smaller in the first iteration and clinicians missed it).
  • Typography
  • Six reusable components, shared across every tool page
  • A template of interface components so future collaborators can apply the same patterns across different tool pages. Each was built as a Main Component with Default, Hover and Active variants, and each maps directly back to a research finding from discovery.
  • Implementation guardrails, not ornaments
  • The redesign inherits the existing visual language and works within WordPress / HTML constraints. These are the non-negotiables every future tool page has to honor.
  • Principles

Bonus · Out of scope exploration

Parallel study: re-imagining the tools directory

The live directory dumps 40+ clinical tools into a single long feed with flat filters. This study tested a clinician-first layout with search tuned to medical mental models, topic cards, and cards that surface status and usage at a glance.

  • The /tools/ listing sits on WordPress, outside the engagement's scope. It was explored alongside the tool page redesign and shelved once the client marked it out of scope; it is kept here in case the WordPress constraint is ever lifted.
  • Today · Live tools page
  • Proposed · Clinician-first directory
  • Same directional evidence as the tool detail redesign: clinicians pattern-match on condition and patient type, not alphabet or raw traffic numbers.
  • Topic browse uses one CEP-blue system and highlights matches while the full catalog stays in view, so nothing disappears mid-scan.
  • Search suggestions reuse the synonym work from the tool detail page, so both pages speak the same language.

Research stage · Live baseline

Before: the current CEP live site

The same-topic high-fidelity prototype is also embedded at the top of this case study. Here, the production site comes first as the benchmark, followed by the Figma prototype used in comparative usability testing (Menopause Management). Scroll each frame to compare wayfinding, depth, search, and feedback behavior.

  • Before · production CEP site: Live production experience (tools.cep.health).
  • After · design prototype (usability test build): Use the embedded frame for the prototype.

Validation stage · Usability testing

Comparative usability testing process

Five-step protocol run with 5 of the 6 clinicians from discovery research (one could not be rescheduled), comparing matched tasks across the legacy CEP page and the redesigned prototype, with a preference rating at the end.

  • Usability testing
  • 5 returning clinicians (of 6)
  • Running the comparative test with the returning participants controls for familiarity bias and lets each clinician speak directly to the before / after experience.
  • Testing protocol
  • Tap any step to see what happened in it.
  • Orientation
  • Task briefing and consent.
  • Benchmark
  • Legacy CEP page testing.
  • Prototype
  • Redesigned prototype testing.
  • Comparison
  • Direct preference rating.
  • Interview
  • Qualitative debrief.

Validation stage · Response

What testing actually confirmed

Compared holistically (not task-by-task), the redesigned page was consistently described as cleaner, easier to navigate, and less frustrating under time pressure.

  • Overall response
  • Cleaner, easier to navigate, and less frustrating under time pressure.
  • What improved most
  • Wayfinding, related-tool visibility, and feedback interaction.
  • What remained unresolved
  • Search scope clarity (local vs. global). Retained as follow-up refinement.

Validation stage · Measured impact

Measured impact snapshot

5-point efficiency rating across key navigation and discovery tasks, averaged across the 5 returning clinicians in the testing cohort. Search clarity improved but still showed ambiguity between local page search and broader site search.

  • Navigation structure (tabs + contents)
  • Wayfinding and return navigation
  • Related tool discovery
  • Feedback experience
  • Search clarity (local vs global scope)

Closing · Case takeaways

Outcome

A case study of how planned discovery and prototype-based validation reduced ambiguity before implementation, producing a clinical tool page that clinicians can actually use at point-of-care.

  • Research
  • Planned, conducted, and synthesized stakeholder and user interviews
  • Discovery interviews were translated into a structured friction map and an IA opportunity list. Discovery drove the design brief, not assumptions.
  • Usability testing
  • Prototype-based testing reduced ambiguity before implementation
  • A high-fidelity prototype was evaluated against the legacy page with the same cohort, back-to-back. Ratings plus qualitative debriefs validated wayfinding, discovery, and feedback moves while flagging search scope as a follow-up rather than shipping it blind.
  • Design shifts delivered
  • Persistent breadcrumbs
  • Global path replaces browser-back workarounds.
  • Tabbed content
  • Summary / Dosage / Evidence / Related tools.
  • Scoped search
  • Explicit local vs. global CEP-wide search.
  • Persistent feedback
  • Non-intrusive button and pre-contextualized modal.
  • An end-to-end research, design, validation, and refinement arc at staff-designer level: discovery grounded in evidence, IA solutions framed as trade-offs, prototype tested against reality, and a clear implementation path for CEP.
  • What this demonstrates

Closing · Key takeaways

What I carry forward from this project

Two lessons that changed how I plan clinical UX work, made visible through the evidence they produced.

  • Discovery is a design instrument, not a deliverable: Planning 9 interviews (6 clinicians + 3 stakeholders) with a single, falsifiable hypothesis per insight meant the design brief wrote itself. The team stopped debating opinions and started choosing between evidence-backed options.
  • Information architecture is the cheapest lever in clinical UX: Breadcrumbs, tabs, scoped search, and a persistent feedback entry point did more for clinicians than any visual polish. IA moves are fast to prototype and measurable to test.