Run an A/B Test on a Display
With A/B testing, you can compare multiple versions of a display to see which one performs better with your visitors. When someone visits a relevant page, a predefined ratio of the traffic sees each variation, and you can compare how each version performs using engagement analytics such as views, signups, and conversion rate.
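You don't need to manage the split yourself, but as a mental model, assigning each visitor to a variation is essentially a weighted random draw. Here's a minimal sketch of that idea in Python; the variation names and weights are hypothetical, and this is an illustration, not Convert's actual implementation:

```python
import random

# Hypothetical variations mapped to relative traffic weights.
# Equal weights mean an even split; a weight of 2.0 would draw
# twice as much traffic as a weight of 1.0.
variations = {"Original": 1.0, "Variant A": 1.0}

def assign_variation(weights: dict[str, float]) -> str:
    """Pick one variation with probability proportional to its weight."""
    names = list(weights)
    return random.choices(names, weights=list(weights.values()), k=1)[0]

print(assign_variation(variations))  # e.g., "Variant A"
```

In practice, A/B testing tools typically also make the assignment sticky per visitor (for example, via a cookie) so a returning visitor keeps seeing the same variation throughout the test.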
Plan your test
It might seem obvious, but planning your test ahead of time is crucial. The elements that often produce the most significant changes in visitor behavior include the following:
- Headline: The headline is usually the first thing your visitors will notice when viewing a display, so it's essential to create one that attracts attention. Consider testing different lengths, tones, or design elements like the headline's font, size, or color.
- Offer: Once the headline has the visitor's attention, the offer needs to get them to convert. Consider making different offers to determine what your visitors find most appealing. Maybe it's a set discount instead of a percentage discount or a free add-on instead of free shipping.
- Form: A display's form can easily convert new customers, or it can discourage conversions if it is too lengthy or poorly arranged. Aim for a short form (three fields or fewer) that's easy to use.
- Call-to-action: A compelling CTA entices visitors to click or submit your display. Consider experimenting with its color, alignment, font, and copy.
- Images: Images can make a display more visually appealing and help your visitors understand the display's offer. However, that doesn't mean an image is always the answer. Consider testing displays with and without images. For displays with images, consider comparing image sources (e.g., stock vs. custom), types (e.g., still vs. GIF), and placements.
- Display Type: The Convert tool allows you to present similar information in various ways. Consider offering the display in different formats (e.g., Fly-out vs. Banner).
- Display Settings: The Convert tool allows you to target visitors based on various rulesets. Consider experimenting with the When To Show settings.
Whatever you decide to test, each variation should reflect a single change; this makes it much easier to know which change led to more engagement. It's also recommended that you limit the number of variations in play, which keeps your results easier to interpret. Consider starting with the original versus a single variation before branching into more complex tests.
Create an A/B test
First, access the A/B testing dashboard by selecting Convert > A/B Tests from the main navigation.
To create an A/B test for a display from the dashboard:
- Select the New A/B test button.
- In the prompt, provide an internal name for the experiment, select the display you'd like to test from the dropdown menu, and then click the Create experiment button.
- Click on Test Settings and decide how you want to configure the experiment. You can choose to end the test after a set amount of time or after a set number of participants. Save.
- Click into the Original variation, provide a name, set a traffic ratio, and Save. Leaving the traffic ratio at 1 divides the traffic evenly across all variants. Once you update the traffic ratio of one variant, the remaining traffic percentages update automatically (see the sketch after these steps).
- Click the Add Variation option, provide an internal name for the variant, and then click the Create Variation button. The variant will start as an exact copy of the original display so that you can control what changes, from the settings to the design.
- Make the desired changes, save, and then select the experiment's name next to the Variation label to return to the A/B test builder.
- Add additional variations, if desired, by repeating the previous two steps.
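To make the traffic-ratio behavior from the steps above concrete: the exact recalculation isn't spelled out here, but ratios like these typically work proportionally, with each variation's share equal to its ratio divided by the sum of all ratios. The numbers below are invented for illustration:

```python
# Hypothetical ratios: Original bumped to 2, two variants left at 1.
ratios = {"Original": 2.0, "Variant A": 1.0, "Variant B": 1.0}

# Each variation's traffic share is its ratio over the total.
total = sum(ratios.values())
shares = {name: ratio / total for name, ratio in ratios.items()}

print(shares)  # {'Original': 0.5, 'Variant A': 0.25, 'Variant B': 0.25}
```

This is why leaving every ratio at 1 yields an even split: each variation's share is 1 divided by the number of variations.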
Start the A/B test
Once you've configured the test's settings and created the desired variations, select the Start test button at the top-right of the A/B test builder to launch the experiment.
Review the results
To review the results of an ongoing or finalized A/B test, navigate to A/B Tests and select the desired experiment. A summary of the experiment outlines various performance metrics, including views, signups, and conversion rates broken down by participation and views. If an experiment is complete, you will also see the variation with the highest conversion rate identified as the winner.
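For reference, conversion rate is typically signups divided by views, and the winner is simply the variation where that figure is highest. A quick illustrative computation with invented numbers:

```python
# Hypothetical per-variation results as (views, signups) pairs.
results = {"Original": (1200, 84), "Variant A": (1180, 112)}

# Conversion rate = signups / views for each variation.
rates = {name: signups / views for name, (views, signups) in results.items()}

# The winner is the variation with the highest conversion rate.
winner = max(rates, key=rates.get)
print(rates)   # {'Original': 0.07, 'Variant A': 0.0949...}
print(winner)  # 'Variant A'
```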
That's it! You're ready to incorporate your findings into even more effective future displays.