5 Surprising A/B Test Results That Changed Our Understanding of User Behavior

In growth marketing and conversion optimization, "best practices" often act as guiding principles—until they don’t. Every now and then, an A/B test comes along that challenges established wisdom, revealing unexpected insights about user behavior. These findings aren’t just quirky anomalies; they illuminate how real users think, act, and make decisions. Below, we delve into five real-world case studies where the results defied expectations and taught us invaluable lessons about what truly drives clicks, conversions, and long-term engagement. Together, these examples tell a powerful story: even the most trusted "rules" can fall short, which is why testing remains essential.
1. When Adding Friction Improved Conversions
Conventional wisdom tells us to reduce friction—whether that’s simplifying forms, cutting steps, or streamlining user flows. But sometimes, making things too easy can backfire. CRO expert Shiva Manjunath shared a surprising case study involving a multi-step lead generation form. Initially, the form had six steps, which the team reduced to three, expecting a boost in sign-ups. Instead, conversions plummeted.
Rather than abandoning the test, Shiva’s team went against intuition and reintroduced an additional step. The result? Conversion rates increased. Adding friction seemed counterproductive but ended up building trust and commitment. This additional step reassured users by giving them more context, making the process feel more credible.
This wasn’t an isolated incident. A UK job search site experienced similar results after removing a “distance” filter from its search bar, assuming it would make the process simpler. Instead, total searches dropped. Users missed the ability to refine results by commute radius, which had given them a sense of control over their searches. Similarly, Booking.com found that adding a “number of children” filter improved relevance for families, boosting satisfaction and usage.
These examples challenge the idea that "less is more." In some contexts, perceived complexity can actually add value by providing reassurance or relevance. The takeaway? Never blindly follow simplicity for its own sake—test to understand how users truly interact with your design.
2. Value Before CTA: How Delaying the Call-to-Action Increased Sales
“Always put your call-to-action (CTA) above the fold” is a common mantra in web design. The logic? The sooner users see your CTA, the more likely they are to act. But an experiment by Lydia Ogles, an experimentation manager, showed that timing and context can matter more than placement alone.
Working with a company offering architecture exam prep, Lydia tested two versions of a landing page. One had a prominent CTA at the top, while the other delayed the CTA, opting instead to lead with value propositions: what the product offered and why it mattered. The result? The second version, which postponed the CTA, increased revenue by 70%.
The reason was simple: by the time users encountered the CTA, they were already sold on the product's value. Instead of rushing customers to act, the page took time to educate and build trust. For complex or high-stakes products, this approach can be transformative. Users don’t always want to be pitched immediately—they want to understand why they should care. This test highlights a valuable insight: a rushed click isn’t always a ready click. Slowing down the pitch can sometimes speed up conversions.
3. Removing Cross-Sells Boosted Checkout Revenue
Cross-sells and upsells are staples of e-commerce, designed to increase average order value by tempting users with complementary items. But Claire More, head of optimization at an agency, discovered this approach isn’t always effective—especially on mobile.
In an A/B test for a client, Claire tested three variations of the mobile checkout flow: one with the usual two cross-sell modules, one with a single cross-sell, and one with no cross-sells at all. Surprisingly, the no-cross-sell version outperformed the others, delivering the highest conversion rate, revenue per user, and average order value.
The reason? On small screens, cross-sells distracted users and made the checkout page feel cluttered. Without these interruptions, users completed their purchases more quickly and confidently. Interestingly, on desktop, where screen space is less limited, the difference was much smaller. This highlights the importance of context: what works on one device may fail on another. Sometimes, the best way to increase revenue is to simplify the path to purchase.
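To make that comparison concrete, here is a minimal sketch of how an analyst might tabulate the three metrics Claire’s team tracked, conversion rate, revenue per user, and average order value, across the checkout variants, and sanity-check the conversion-rate lift. The Variant helper and every number in it are illustrative assumptions, not data from the actual test; a real analysis would read from logged checkout events.

```python
# Hypothetical sketch: comparing checkout variants on CR, revenue per user, and AOV.
from dataclasses import dataclass
from math import sqrt, erf

@dataclass
class Variant:
    name: str
    visitors: int      # users who reached checkout with this variant
    orders: int        # completed purchases
    revenue: float     # total revenue attributed to the variant

    @property
    def conversion_rate(self) -> float:
        return self.orders / self.visitors

    @property
    def revenue_per_user(self) -> float:
        return self.revenue / self.visitors

    @property
    def avg_order_value(self) -> float:
        return self.revenue / self.orders

def two_proportion_p_value(a: Variant, b: Variant) -> float:
    """Two-sided z-test on conversion rates (normal approximation)."""
    p_pool = (a.orders + b.orders) / (a.visitors + b.visitors)
    se = sqrt(p_pool * (1 - p_pool) * (1 / a.visitors + 1 / b.visitors))
    z = (b.conversion_rate - a.conversion_rate) / se
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided p-value

# Illustrative numbers only -- not the agency's real data.
variants = [
    Variant("two cross-sells", visitors=10_000, orders=820, revenue=61_500.0),
    Variant("one cross-sell",  visitors=10_000, orders=855, revenue=64_980.0),
    Variant("no cross-sells",  visitors=10_000, orders=930, revenue=72_540.0),
]

for v in variants:
    print(f"{v.name:16s}  CR={v.conversion_rate:.2%}  "
          f"RPU=${v.revenue_per_user:.2f}  AOV=${v.avg_order_value:.2f}")

print("p-value (control vs. no cross-sells):",
      round(two_proportion_p_value(variants[0], variants[2]), 4))
```

Even when one variant appears to win on every metric, a quick significance check like this helps rule out noise before committing to the simpler checkout.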
4. Scarcity Messaging That Scared Customers Away
Urgency and scarcity are powerful motivators in marketing. Countdown timers and “only 3 left in stock!” notices are proven tactics to create FOMO and drive action. Yet, when used in the wrong context, these strategies can backfire.
Jason O’Dwyer, a conversion specialist, tested adding a “Product in High Demand” alert during the checkout stage of a gaming e-commerce site, hoping to reduce cart abandonment. Instead, conversions dropped. Customers seemed put off by the last-minute urgency, interpreting it as pushy or anxiety-inducing rather than helpful.
This result underscores the importance of timing and user mindset. Early in the customer journey, scarcity messaging can nudge undecided users toward action. But at checkout, where users need reassurance, the same tactic can feel intrusive or even suspicious. Persuasion tactics must match the user’s stage in the funnel—what motivates at one point may deter at another.
5. Visual Proof Outperformed Textual Assurance
How you communicate trust can matter as much as what you say. A high-end gemstone retailer learned this lesson the hard way, facing an astonishing 90% drop-off rate on product pages for expensive items, despite including detailed certification information in the descriptions.
Tania Bhattacharya, an experimentation strategist, hypothesized that users weren’t noticing or trusting the text. Her team added a simple “Certified” badge with logos from leading certification authorities near the Add to Cart button. The result? Conversions jumped by 177%.
Why? Many users skim rather than read, and a visual trust symbol was far more impactful than paragraphs of explanation. This discovery highlighted a crucial insight: “seeing is believing.” Visual cues like badges, icons, or lifestyle images can convey trust and value instantly, often outperforming text-heavy descriptions. For users, a compelling image can be worth far more than words.
Conclusion: Embrace the Unexpected to Truly Understand Users
Each of these case studies started with a reasonable assumption, supported by best practices or research, that ultimately proved wrong. The surprises pointed the opposite way: more form steps can boost conversions; delaying CTAs can increase sales; removing cross-sells can improve checkout performance; scarcity messaging can backfire; and visual proof can far outweigh textual reassurance.
These surprises remind us that user behavior is complex, nuanced, and sometimes counterintuitive. A tactic that works in one context may fail in another, influenced by factors like device type, user intent, or even subtle psychological cues. That’s why testing is essential—not just to validate ideas, but to uncover deeper insights about what users truly value.
Ultimately, the only "rule" that always holds true is to continuously experiment, learn, and adapt. Each unexpected test result is an opportunity to better understand your audience and craft experiences that meet their needs. By embracing the unexpected, we grow closer to delivering meaningful, effective designs that truly resonate with users.