CDCP / SunLife
01
Discovery & Alignment
02
User Testing & Validation
03
Design Decisions & Iteration
04
Delivery & Optimization
01. Understanding the Big Picture
When we kicked off the CDCP Public Site, I knew we were designing for impact. This wasn’t just about making a site look nice—it was about making sure Canadians could actually get the care they needed. So my goal was simple: make signing in, searching for providers, and getting answers as easy and intuitive as possible—for both members and providers.



02. Setting the Testing Foundation
I crafted an unmoderated testing plan for speed and depth. Using an interactive prototype helped us move fast without losing quality.
We tested with 15 participants across 6 rounds of testing and made sure we had a good range of professional users, aged 30 to 65+. I was particularly focused on two things: how confident people felt using the site, and how easily they could complete key tasks.


03. What We Found (and What I Took Away)
Sign In:
This tested really well. Users found it easily and felt confident using it (6.6/7 on both ease and confidence). This gave me peace of mind—it means we built a solid foundation here.
Registration:
This one gave us trouble. Users couldn’t find it easily because it was tucked under Sign In. The score (5.0/7 on ease) reflected the confusion. Unfortunately, due to platform limitations, this stays as a modal for now—but I flagged it for future Helios design updates.
About Sun Life CTA:
Honestly, this didn’t land. Low confidence, low success rate, and low page traffic confirmed that it’s not delivering value right now. After reviewing Adobe analytics, we’ve decided to deprioritize it in the next phase.
Provider Search Tool:
Overall strong performance. A few users hesitated, but with an 84% score and growing traffic, this one stays as-is.
FAQs (Members & Providers):
Both sets of FAQs tested well. Some users struggled slightly with specific tasks, but everyone eventually found what they needed. Search functionality came up as a nice-to-have—something we should consider for future improvements.


04. Key Steps I Took
01. Framed the Problem:
I made sure the testing objectives aligned with actual user pain points: sign-in confusion, content overload, and unclear pathways.
02. Built a Smart Plan:
I chose an unmoderated method for speed and scale, making sure we’d see real patterns fast.
03. Captured and Analyzed Feedback:
I looked beyond scores. I read every comment, mapped pain points, and tracked user paths to understand not just what failed—but why.
04. Collaborated with Teams:
I connected with analytics, design, and dev teams (especially re: AEM constraints and Helios roadmap) to flag what’s working now vs. what needs long-term redesign.
05. Planned the Next Phase:
I kept an eye on the future—tracking how Phase 4 insights feed into Phase 5 and Helios implementation. I also flagged analytics gaps we need to solve going forward.

05. Final Thoughts
This phase of CDCP was all about finding the friction and making it smoother. It’s one thing to say we’re creating an accessible public site—it’s another to watch real people use it and uncover what’s getting in their way. That’s the part I love most—closing that gap, one small fix at a time.
06. Design Outputs