Recipe for success
The impact of different messaging strategies on SaaS demo sign-ups
Collaborators
Laura Lubben | Richard Lumpi
In a Nutshell
This is a Causal Inference project using a real-world A/B test to measure how changing homepage copy from an accountant focus to a restaurant-operator focus affects demo sign-ups for a restaurant SaaS platform.
Robust statistical methods were employed to estimate the causal effect of the new copy on user behavior.
Objectives
Examine Messaging Impact: Assess whether a restaurant-focused, operations-oriented messaging strategy could meaningfully elevate the click-through rate (CTR) on the “Get a Demo” button.
Boost Demo Sign-Ups: Determine if emphasizing the product’s value for restaurant professionals encourages more visitors to complete the demo sign-up form, a critical step in the sales funnel.
Overview
Product: DineLogic*, a B2B SaaS tool for restaurants (owners, chefs, general managers, and accountants)
Experiment: Tested two homepage copy versions—an accountant-oriented control and an operations-focused treatment—to examine engagement differences.
Outcome: Measured user behavior in terms of clicks on “Get a Demo” and subsequent demo sign-ups across both conditions.
Methodology
Between-subjects A/B design with random assignment of ~57,000 sessions.
Applied rigorous data cleaning to exclude bot traffic and login-only sessions.
Deployed regression analysis with robust standard errors to quantify causal effects.
Key Results
After adjusting for anomalous data, the treatment copy did not consistently outperform the control in driving demo sign-ups.
This suggests that while copy changes can influence user behavior, they may not alone produce significant lifts in conversion for this niche SaaS product.
This project is part of the curriculum for “Course 241 — Causal Inference and Experimentation” at the UC Berkeley School of Information.
If you enjoyed this project and would like to explore the code or learn more about my work, you can find all the details at the links below.
*DineLogic is not a real company; it is a fictional name used to anonymize the identity of the actual organization referenced in this project. Additionally, the images of the landing pages for the control and treatment groups have been altered to preserve the anonymity of the real company and its proprietary materials.
Project Introduction
This project explores whether changing the homepage messaging on DineLogic*’s website can boost click-through rates and demo sign-ups.
The experiment tests different messages aimed at restaurant operators, drawing on consumer psychology to tailor copy for high-awareness visitors and highlight unique features like automated inventory management.
By comparing click-through and demo sign-up rates across variations, the team seeks the most effective messaging to increase engagement, especially since the product’s high setup costs and lack of a free trial mean demos are essential for evaluation.
The research employs a between-subjects A/B test, assigning visitors to either the current accountant-focused copy or a newly revised version targeting restaurant operators.
Insights gleaned from this test will inform broader marketing strategies, ensuring the website’s messaging resonates with different customer segments and drives more demo conversions.
Control copy shown on DineLogic landing page, targeting an accountant audience in tone and content.
Treatment copy shown on DineLogic landing page, targeting restaurant operators in tone and content.
Concept under investigation
This experiment examines how copywriting impacts user behavior, specifically the likelihood of clicking the “Get a Demo” button and completing a demo sign-up.
It focuses on two critical sales funnel metrics: click-through rate (CTR) and conversion rate.
In this study, CTR is defined as the percentage of visitors who click the “Get a Demo” button, while conversion rate refers to the percentage who complete the demo sign-up form.
These metrics are essential for evaluating the effectiveness of different messaging strategies in driving user behavior during a critical stage of the sales funnel.
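As a rough illustration, both metrics reduce to simple ratios over homepage visitors; the sketch below uses placeholder counts, not the experiment's actual data.

```python
def ctr(demo_clicks: int, homepage_visitors: int) -> float:
    """Click-through rate: share of homepage visitors who click 'Get a Demo'."""
    return demo_clicks / homepage_visitors

def conversion_rate(demo_signups: int, homepage_visitors: int) -> float:
    """Conversion rate: share of homepage visitors who complete the demo sign-up form."""
    return demo_signups / homepage_visitors

# Placeholder counts for illustration only
print(f"CTR: {ctr(150, 19_000):.2%}, conversion: {conversion_rate(20, 19_000):.2%}")
```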
Experimental Design
This experiment employs a between-subjects A/B test design, where participants are randomly assigned to either the control or treatment group without being aware of which version they are viewing.
The control group is exposed to the existing homepage copy, while the treatment group sees a modified version.
The primary focus is the click-through rate (CTR), which is directly influenced by the updated copy and serves as the experiment's key measure.
Additionally, we assess whether there is a significant change in demo sign-ups—a crucial step further along the customer acquisition funnel, making it a metric of particular interest to the company.
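For concreteness, random assignment in a between-subjects test is often implemented as deterministic bucketing on a session identifier. The sketch below is illustrative only: HubSpot handled the split in the actual experiment, and the function and parameter names are assumptions.

```python
import hashlib

def assign_variant(session_id: str, treatment_share: float = 0.5) -> str:
    """Deterministically bucket a session into control or treatment.

    Illustrative sketch only: in the actual experiment, HubSpot performed
    the random assignment of sessions.
    """
    bucket = int(hashlib.sha256(session_id.encode()).hexdigest(), 16) % 10_000
    return "treatment" if bucket < treatment_share * 10_000 else "control"

print(assign_variant("session-0001"))  # stable across repeat lookups of the same ID
```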
Hypothesis
This experiment tests whether a high-awareness-focused copy, tailored to different customer subgroups, improves click-through and demo sign-up rates on DineLogic’s website.
The hypothesis suggests that high-awareness customers prefer solution-oriented content, and subgroup-specific language can influence decision-making.
First Hypothesis
H0: A copy targeting restaurant operators does not have a different CTR than a copy targeting accountants.
HA: A copy targeting restaurant operators has a different CTR than a copy targeting accountants.
Second Hypothesis
H0: A copy targeting restaurant operators does not encourage a different demo sign-up rate than a copy targeting accountants.
HA: A copy targeting restaurant operators encourages a different demo sign-up rate than a copy targeting accountants.
We did not test one-sided hypotheses, as no prior internal company research or experiments in this context suggested a specific direction.
In practice, however, only improvements in performance would justify the cost of implementing the change.
Diagram: what we tested (the treatment copy), as compared to the control copy, and what really matters for DineLogic (demo sign-ups).
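One simple way to formalize these two-sided comparisons is a two-proportion z-test, sketched below with placeholder counts; the project's main analysis instead used regression with robust standard errors, described under Statistical Analysis.

```python
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

# Placeholder counts, not the experiment's data: [control, treatment]
clicks = np.array([150, 140])
sessions = np.array([19_214, 19_214])

# H0: equal click-through rates; HA: different click-through rates
z_stat, p_value = proportions_ztest(count=clicks, nobs=sessions, alternative="two-sided")
print(f"z = {z_stat:.2f}, p = {p_value:.3f}")  # reject H0 only if p < 0.05
```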
Recruitment and Data Collection
The participants in this experiment were visitors to the DineLogic website.
Using HubSpot, a customer relationship management (CRM) platform, website visitors were randomly assigned to either the control or treatment group.
The CRM facilitated tracking of the primary outcome variable, the click-through rate (CTR) on the “Get a Demo” button, as well as demo sign-up completions. While session-level data could not be directly downloaded, estimates were derived from aggregated data on page views and click-throughs.
To enhance the validity of our results, we excluded sessions where users visited the homepage solely to log into the application, as these represented activity by existing clients or staff rather than potential new leads. Such sessions accounted for approximately 33% of all homepage views.
Individual-level exclusions were not possible due to system and data-collection limitations, so we instead applied this proportional share to both the Control and Treatment groups, reducing each group's count of visitors who did not click the “Get a Demo” button accordingly. Random assignment ensures that existing clients and staff were evenly distributed between the two conditions, so this adjustment does not bias the comparison.
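A minimal sketch of this proportional adjustment, assuming only group-level totals of page views and demo clicks (the counts below are placeholders):

```python
LOGIN_SHARE = 0.33  # estimated share of homepage views that were login-only sessions

def adjust_group_counts(page_views: int, demo_clicks: int,
                        login_share: float = LOGIN_SHARE) -> tuple[int, int]:
    """Remove the estimated login-only sessions from a group's non-clicker count."""
    adjusted_views = round(page_views * (1 - login_share))
    non_clickers = adjusted_views - demo_clicks
    return adjusted_views, non_clickers

# Applied identically to the Control and Treatment groups; placeholder counts shown
print(adjust_group_counts(page_views=28_500, demo_clicks=150))
```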
After adjusting for login sessions, we addressed anomalies in the click data.
While both the Control and Treatment copies showed consistent daily click patterns, an unusual spike in the Treatment copy’s clicks on November 8, 2024, was identified as bot activity. To mitigate its impact, the anomalous value was replaced with the average of the preceding and following days.
This adjustment ensured the data’s reliability and allowed for a fair comparison between the Control and Treatment conditions.
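The correction itself is a single-value replacement; here is a sketch, assuming a daily click series indexed by date (the series name is hypothetical):

```python
import pandas as pd

def smooth_spike(daily_clicks: pd.Series, spike_date: str) -> pd.Series:
    """Replace one anomalous day's clicks with the mean of the adjacent days."""
    cleaned = daily_clicks.astype(float)          # copy with a numeric dtype
    pos = cleaned.index.get_loc(pd.Timestamp(spike_date))
    cleaned.iloc[pos] = (cleaned.iloc[pos - 1] + cleaned.iloc[pos + 1]) / 2
    return cleaned

# e.g., treatment_daily_clicks = smooth_spike(treatment_daily_clicks, "2024-11-08")
```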
Statistical Analysis
We used regression analysis with robust standard errors to estimate how the new operations-based copy (Treatment) compares to the existing accounting-focused copy (Control) in influencing click-through and demo sign-up rates.
Overall, these findings underscore how a small set of anomalous user behaviors can significantly affect estimates, emphasizing the importance of careful data adjustments when interpreting treatment effects.
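As a hedged illustration of the regression described above, a linear probability model with HC1 robust standard errors in statsmodels would look like the sketch below; the DataFrame and column names are assumptions, since session-level data had to be reconstructed from aggregates.

```python
import pandas as pd
import statsmodels.formula.api as smf

def fit_lpm(sessions: pd.DataFrame, outcome: str):
    """Linear probability model of a binary outcome on treatment, with HC1 robust SEs.

    Assumes `sessions` has one row per session with 0/1 columns:
    treat, clicked, signed_up (column names are illustrative).
    """
    return smf.ols(f"{outcome} ~ treat", data=sessions).fit(cov_type="HC1")

# ctr_model = fit_lpm(sessions, "clicked")
# signup_model = fit_lpm(sessions, "signed_up")
# print(ctr_model.summary())  # the coefficient on `treat` estimates the ATE
```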
Power Analysis
Simulation analyses show that our adjusted sample size (38,428) provides enough statistical power to detect average treatment effects (ATEs) on the click-through rate that would warrant further investment in revising the copy.
We assumed a baseline click-through rate of 1% and an anticipated increase of 0.5 percentage points, which gives nearly 100% power at this sample size.
Though the actual baseline rate was lower than expected, and the tested copy produced a negative rather than positive effect, the size of that effect relative to the baseline is still around 50%.
Despite ongoing challenges in detecting demo sign-up effects, the strong power for the click-through rate demonstrates that the experiment effectively meets its primary goal.
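A simulation of this kind can be reproduced with a short script; the sketch below assumes a 50/50 split, the stated 1% baseline, and a 0.5 percentage-point lift, with all other details illustrative.

```python
import numpy as np
from scipy.stats import norm

def simulated_power(n_total: int = 38_428, baseline: float = 0.01,
                    lift: float = 0.005, alpha: float = 0.05,
                    n_sims: int = 2_000, seed: int = 42) -> float:
    """Estimate power to detect a CTR lift with a two-sided difference-in-means test."""
    rng = np.random.default_rng(seed)
    n = n_total // 2                      # sessions per arm under a 50/50 split
    z_crit = norm.ppf(1 - alpha / 2)
    rejections = 0
    for _ in range(n_sims):
        control = rng.binomial(1, baseline, n)
        treatment = rng.binomial(1, baseline + lift, n)
        diff = treatment.mean() - control.mean()
        se = np.sqrt(control.var(ddof=1) / n + treatment.var(ddof=1) / n)
        rejections += abs(diff / se) > z_crit
    return rejections / n_sims

print(simulated_power())  # close to 1.0 for these inputs, consistent with the write-up
```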
Conclusions
Key Findings
After initial data suggested an increase in click-through rates (CTR) for the operations-focused copy, adjusting for likely bot activity revealed a slight decrease in CTR and no significant change in demo sign-ups.
Overall, CTR and demo sign-ups remained low, indicating that simple copy changes alone may not drastically shift user behavior.
Insights
Although the accounting-focused copy appeared more appealing, it did not yield meaningful gains further down the funnel.
A broader strategy may be necessary to drive stronger engagement, with alternative acquisition methods (like outbound sales or affiliate marketing) potentially offering higher returns than copy changes on the homepage.
Limitations
Adjusting for bot activity may not have fully eliminated data distortions, and session-level analysis did not consider repeat visitors who might have viewed both versions.
Time and resource constraints also prevented linking session data to individual users or incorporating additional covariates such as device type or location.
Furthermore, the low number of demo sign-ups reduced statistical power, limiting generalizability to other contexts outside of this specific restaurant software setting.