EF Product pages
At EF, teachers aren’t just buying a trip — they’re making a high-stakes decision on behalf of their students, their school, and themselves. Our challenge wasn’t to redesign a product page. It was to re-architect how an institution earns trust, communicates educational value, and builds confidence at the moment of choice.
As Creative Director, I led the transformation of EF’s tour pages from a fragmented template into a decision-driven product experience — balancing aspiration with accountability, and story with proof.
How we defined success
We deliberately avoided treating this as a pure conversion exercise (though the targets were real: grow the number of new teachers by 18% year over year and lift conversion from 0.8% to 1.5%). We defined success across three layers of impact:
First, behavioral:
Would more teachers meaningfully progress into planning, rather than skim and leave?
Second, cognitive:
Would teachers understand the full shape of the experience earlier, with fewer late-stage surprises?
Third, emotional:
Would teachers feel excited by this experience?
Would teachers feel empowered to act and not just pressured to convert?

Personas as decision frameworks
We did not treat personas as demographic segments or marketing profiles. We treated them as decision frameworks: models of how different teachers evaluate risk, value, and readiness at the moment of commitment.
The purpose of these personas was not personalization. It was prioritization.
When we faced design tradeoffs — what to surface first, what to defer, what to simplify, what to make explicit — these two decision patterns anchored the work.
Primary persona: Ari — the Believer
Ari arrives inclined to say yes. She's inspired by purpose and impact, but hesitates later when she begins to imagine the consequences of committing. Her core tension is not desire. It is the fear of making a poorly defended decision. For Ari, the design problem was late-stage reassurance.
This led us to:
Let story and destination lead early.
Delay operational detail until motivation was activated.
Make itinerary, inclusions, and learning outcomes most legible when her questions shifted from why to how.
Goal:
Let her feel inspired first — then remove every reason she might talk herself out of acting.
Secondary persona: Ellen — the Analyst
Ellen arrives skeptical by default. She leads with accountability and will not allow herself to feel inspired until the experience is defensible. Her core tension is not fear of commitment. It is the fear of being irresponsible. For Ellen, the design problem was early credibility.
This led us to:
Surface the itinerary early and make it the structural backbone.
Pair inclusions directly with price and benefits.
Elevate standards and learning outcomes into the core narrative.
Avoid vague or purely emotional claims.
Goal:
Let her trust the system first — then give her permission to feel inspired.
Strategy and plan
Our persona research revealed something critical: sequencing mattered more than content. We weren't just designing components—we were designing a decision journey that matched how different teachers actually think through high-stakes commitments. Purpose-driven teachers needed to feel inspired before we hit them with logistics. Risk-averse teachers needed proof points up front before we asked them to trust us.
That shift reframed the entire page. It wasn't a product description anymore—it was a tool that helped teachers decide. The three phases that followed were built to test that thinking: first by aligning on the vision, then by stress-testing it against real constraints, and finally by proving it could work at scale.
PHASE 1
Strategy and design exploration
We started by creating high-fidelity desktop designs—not as something to ship, but as a way to get everyone on the same page. We needed senior leadership to see where this product could go, especially since teachers would be using it to make high-stakes decisions.
What we were after:
A vision for where this could go long-term, not just a quick facelift
A way to make the experience feel both inspiring and trustworthy at the same time
Different ways to organize itineraries, frame pricing, and guide people through their decisions
This phase gave us the principles and logic that would guide everything that came after.
PHASE 2
Mobile adaptation and behavioral validation
Once leadership was aligned, we translated everything to mobile and got it ready for usability testing. This wasn't just about making things responsive—it was about making sure the experience still worked when space and attention were limited.
Our goals were to:
Make it easy to scan and find the important actions quickly
Help people discover itinerary details earlier
Keep the right emotional tone while optimizing for smaller screens
Adapt the designs so we could test them with real users
PHASE 3
Iterate, launch, and scale
After refining the designs based on testing feedback, we rolled out the new experience on five tour pages first. We wanted to validate that it actually worked before expanding to the full portfolio of 250+ tours.
Goals for the rollout:
Measure impact on leads and engagement, not just conversions
See how real users actually navigated the experience
Make sure the system could scale across different destinations, prices, and tour types
Monitor and iterate on both design and content
Phase 1
Strategy and design exploration
To align stakeholders and stress-test our strategy, we developed three distinct desktop prototypes—each one exploring a different theory of how emotion, clarity, and sequencing could shape decision-making and drive conversion.
The Mosaic concept
Mosaic was built around a simple idea: let teachers enter the experience however they want, without forcing them down one prescribed path. We intentionally used existing assets and code from other EF products to see if we could build something decision-focused without starting from scratch.
This let us test a few things:
Could modular patterns support different decision styles?
Could we increase our velocity by reusing existing code and patterns?
Would familiar components build trust and speed for risk-averse teachers?
Could a modular system work across 250+ tours without breaking?
What we learned
We didn't choose this direction. It was efficient and scalable, but it didn't create enough momentum. Teachers could find information easily, but the experience didn't tell a story. It felt safe, but flat—built for browsing, not believing.
The Immersive concept
Immersive was built around one belief: the order you encounter information shapes how you feel about it. Every section was designed to move teachers through a clear arc—from inspiration, to proof, to trust, to commitment.
It wasn't immersive just because of big visuals or fancy interactions. It was immersive because:
We controlled the sequence tightly
We moved learning outcomes earlier in the scroll
We limited early choices so teachers wouldn't get lost or overwhelmed
We introduced story and credibility in a rhythm that built momentum
We balanced dreaming with grounding details
We optimized for confidence, not exploration
What we learned
This direction resonated with stakeholders—the narrative was easier to follow and the tone felt right. But some sections still felt flat, and the itinerary wasn't getting the attention it deserved. We had the right arc, but not enough texture. The experience moved people forward, but it didn't always hold their attention along the way.
The Editorial concept
Editorial was built on a different premise: what if we could build confidence by bringing the feeling of being on tour to life?
Instead of controlling the sequence, we leaned into atmosphere and visual storytelling. The idea was to let teachers explore what the trip felt like before getting into logistics or structure.
It wasn't about guiding decisions—it was about creating immersion:
We prioritized discovery over direction
We used imagery and editorial tone to build emotion
We let teachers wander and absorb rather than follow a prescribed path
We tested how far desire could carry the decision on its own
What we learned
This direction created a strong emotional pull, but it didn't guide teachers clearly enough through a high-stakes choice. Scroll depth was high—people were engaged—but they weren't converting. It made them want to go, but not necessarily know how to decide. The feeling was there, but the framework wasn't.
Phase 2
Mobile adaptation and behavioral validation

Mobile adaptation
Based on stakeholder feedback and research insights, we adapted the experience for mobile with a few critical shifts:
We elevated the itinerary, making it easier to access and more detailed.
We paired pricing directly with inclusions to build trust earlier.
We reduced large hero images that slowed load times and created cognitive friction.
We tightened explanatory copy to maintain momentum without losing clarity.
We made CTAs more visible and added microcopy explaining what happens next, so teachers never felt lost or uncertain about the next step.
The goal was to preserve the emotional arc of the desktop experience while honoring the constraints and behaviors of mobile decision-making.
Testing with users
We put the mobile prototype in front of 24 teachers through remote, unmoderated sessions—about 30 minutes each, over 500 minutes of recorded feedback.
The focus was simple: does this build clarity? Does it earn trust? Does it move people toward action?
We tested comprehension, usefulness, and flow—not visual polish. We needed to know if the structure worked before we refined the surface. And we mixed teachers who knew EF with those who didn't, to see how the experience performed across different levels of familiarity and skepticism.
What we learned
Itineraries matter most
Teachers expect a full day-by-day breakdown—travel times, meals, accommodations, activities. The interactive map tested well, but felt disconnected from the itinerary itself. We needed to merge them—letting users click through days or filter by activity type so the two work together instead of competing for attention.
Inclusions vs. benefits created confusion
Users couldn't tell the difference. They want to know what's included, who it's for (teachers vs. students), and how it ties to pricing. The solution was to build a clear, structured inclusions section with expandable details and transparent pricing context—so nothing felt hidden or vague.
Educational value needs to be explicit
Teachers want to see curriculum ties, academic credit options, and learning outcomes—not just implied educational value. We explored using badges or icons to surface academic connections, and creating a dedicated section that speaks directly to learning goals instead of burying them in body copy.
Pricing builds or breaks trust
Hidden pricing kills confidence. Teachers want transparency before they reach out. We saw an opportunity to introduce dynamic pricing tools and show cost alongside inclusions—so value is clear upfront and trust isn't conditional on reaching a sales rep.
"Start Planning" felt like a black box
Teachers didn't know what happens after they click, and that uncertainty stalled action. The fix was simple: add microcopy that explains next steps ("We'll reach out within 24 hours"), unify CTA language across the page, and make actions more visible so teachers always know where they stand.
—
View the full research report synthesized by my team.
Phase 3
Iterate, launch, and scale
Launch as MVP, not a finish line
With the core experience validated through testing, we intentionally treated the initial rollout as a Minimum Viable Design—a complete, coherent decision system built to learn in the real world. We launched the experience on five tour pages as a controlled pilot before expanding across the broader portfolio, allowing us to observe real teacher behavior under real stakes.
This pilot was designed to validate three things:
Whether the experience improved meaningful engagement and progression.
How different decision behaviors actually played out in production.
Whether the system could adapt across destinations, prices, and tour types without losing clarity or trust.
Current impact
The MVP is already showing promising results. Conversion data shows these tours are either the top or second-best performers in their categories. The new template isn't hurting conversion—if anything, it's holding its own against established pages. We're continuing to track how leads from these pages move through the pipeline to get a fuller picture of impact over time.
Built for now and next
Along with the MVP launch we built a future-state version that shows where this could go. A lot of what we imagined for the future needs deeper work with both content development and engineering—things like new backend logic, more flexible CMS tools, and components that don't exist yet. By splitting it into phases, we could be ambitious without forcing our engineers to build workarounds or one-off hacks.
The future version takes what we learn from the pilot and layers in smarter content strategy and richer storytelling that feels more like the brand EF is becoming—while keeping the core decision-making experience intact.
Together, these two versions turn the tour page into something that can grow. We're launching with a solid foundation, learning from real people, and evolving as both our users and our platform mature. This wasn't just about shipping a new page—it was about setting up the design, data, and systems we need so the experience can get better over time.
Contributors
Creative Direction: Adam Schwartz
Writing: Maddie Poulin
UX/UI: Amanda Bentley
Marketing Strategy: John Cowan
Web Content Manager: Brian McQueen