A/B Testing
Test what works and pick winners
What Is A/B Testing?
A/B testing lets you compare different versions of an email to see which one performs better. The platform splits your audience automatically and tracks the results for you.
- Variant A: “Your weekly update is here” (sent to 50% of the audience)
- Variant B: “5 tips you don't want to miss” (sent to 50% of the audience)
A/B testing removes the guesswork. Instead of wondering which subject line is better, let your audience tell you with real data.
Creating a Test
Go to A/B Tests
Click A/B Tests in the sidebar.
Click New Test
Give your test a name that describes what you're testing (e.g. “Subject Line Test - March Newsletter”).
Create variants
Add 2–4 variants. Each variant can have a different subject line, email content, or both.
Choose your metric
Select how to measure success: open rate (best for subject line tests) or click rate (best for content tests).
Select audience and send
Choose a segment, and the platform splits your audience evenly between variants.
Screenshot: Create A/B test (set up variants and choose how to measure success)
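Conceptually, the even split in the last step is simple: shuffle the contact list, then deal contacts out round-robin across variants. Here is a minimal Python sketch of that idea; the function name, seed, and email addresses are illustrative, not the platform's actual code:

```python
import random

def split_audience(contacts, n_variants=2, seed=42):
    """Evenly split a contact list across variants, shuffled first so
    each group is a random sample (illustrative sketch only)."""
    shuffled = contacts[:]
    random.Random(seed).shuffle(shuffled)
    # Round-robin assignment keeps group sizes within one contact of each other.
    return [shuffled[i::n_variants] for i in range(n_variants)]

emails = [f"user{i}@example.com" for i in range(1000)]
variant_a, variant_b = split_audience(emails)
print(len(variant_a), len(variant_b))  # 500 500
```

Shuffling before splitting matters: assigning the first half of an alphabetized list to Variant A could bias the groups.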
Reading Results
Once emails are sent, the results page shows side-by-side stats for each variant:
| Metric | Variant A | Variant B |
|---|---|---|
| Sent | 500 | 500 |
| Opened | 120 (24%) | 185 (37%) |
| Clicked | 45 (9.0%) | 72 (14.4%) |
Screenshot: A/B test results (compare performance across variants at a glance)
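The percentages in the table are just each count divided by the number sent. A quick Python sketch reproduces them (the function name is illustrative):

```python
def summarize(sent, opened, clicked):
    """Compute the open and click rates shown in the results table."""
    return {
        "open_rate": round(100 * opened / sent, 1),
        "click_rate": round(100 * clicked / sent, 1),
    }

a = summarize(sent=500, opened=120, clicked=45)
b = summarize(sent=500, opened=185, clicked=72)
print(a)  # {'open_rate': 24.0, 'click_rate': 9.0}
print(b)  # {'open_rate': 37.0, 'click_rate': 14.4}
```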
Wait at least 24–48 hours before declaring a winner. Early results can be misleading, since not everyone opens their email right away.
Picking a Winner
After enough data comes in, you have two options:
- Manual — Review the stats and click Pick Winner on the variant you prefer
- Auto-select — Let the platform automatically pick the best performer based on your chosen metric
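An auto-select rule can be as simple as taking the variant with the highest value for your chosen metric. A hypothetical sketch (the function, field names, and numbers are illustrative, not the platform's API):

```python
def pick_winner(variants, metric="open_rate"):
    """Auto-select sketch: return the variant with the highest value
    for the chosen metric."""
    return max(variants, key=lambda v: v[metric])

results = [
    {"name": "A", "open_rate": 24.0, "click_rate": 9.0},
    {"name": "B", "open_rate": 37.0, "click_rate": 14.4},
]
print(pick_winner(results)["name"])                       # B
print(pick_winner(results, metric="click_rate")["name"])  # B
```

Note that the metric you selected when creating the test determines the winner, which is why open rate suits subject-line tests and click rate suits content tests.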
Use the winning variant for your next send. Over time, you'll learn what resonates with your audience and improve all your emails.
What to Test
Subject Lines
Test short vs long, question vs statement, emoji vs no emoji, personalized vs generic.
Call-to-Action
Test different button text: “Learn More” vs “Get Started” vs “Try Free”.
Email Length
Test a short, punchy email vs a detailed, longer version to see what your audience prefers.
Best Practices
Change only one element per test (e.g. just the subject line). Testing multiple changes at once makes it impossible to know what caused the difference.
Send each variant to at least 200 contacts. Smaller audiences can give unreliable results.
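If you want a rough sanity check that a difference is real rather than noise, a standard two-proportion z-test works on the open counts. This is a general statistical technique, not a platform feature; the sketch below uses only the Python standard library and the numbers from the results table above:

```python
from math import sqrt, erf

def two_proportion_z(opens_a, n_a, opens_b, n_b):
    """Two-proportion z-test: z-score and two-sided p-value for the
    difference between two open rates."""
    p_a, p_b = opens_a / n_a, opens_b / n_b
    p_pool = (opens_a + opens_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = two_proportion_z(120, 500, 185, 500)
print(round(z, 2), p)  # a small p (below 0.05) suggests a real difference
```

With 500 contacts per variant, 24% vs 37% is comfortably significant; with 50 per variant the same rates often would not be, which is the intuition behind the 200-contact minimum.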
What works today may not work next month. Run A/B tests on your most important emails regularly to keep improving.