You can add content items to an A/B test, review the results after a period of time, and, if required, deploy a single item from the test. If you have the Optimization module, you can also set the system to automatically end an A/B test and deploy the winner.
Some key points about A/B testing with Fresh Relevance:
The core metrics recorded against the items within an A/B test are the same metrics we record elsewhere in the system, for example, Impressions and Attributed revenue.
If you have the Optimization module, you can set the test to automatically end and have access to additional reporting metrics.
Once a shopper views a piece of content, they always see that same item in that A/B test for the next 90 days. After 90 days, they may be shown a different item from the one they saw on their original visit, as sketched below.
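The 90-day stickiness can be pictured as a per-shopper, per-test assignment that only changes once the window lapses. Below is a minimal sketch of that behaviour, assuming a hypothetical in-memory store, shopper IDs, and weighted variants; it is not Fresh Relevance's implementation.

    import random
    import time

    NINETY_DAYS = 90 * 24 * 60 * 60  # the sticky window, in seconds

    # Hypothetical store of {(shopper_id, test_id): (variant, assigned_at)}.
    assignments = {}

    def get_variant(shopper_id, test_id, variants, weights):
        """Return the sticky variant for a shopper, re-drawing only after 90 days."""
        now = time.time()
        key = (shopper_id, test_id)
        record = assignments.get(key)
        if record and now - record[1] < NINETY_DAYS:
            return record[0]  # within 90 days: always the same item
        # First view, or the 90-day window has lapsed: draw a variant by weight
        variant = random.choices(variants, weights=weights, k=1)[0]
        assignments[key] = (variant, now)
        return variant

    print(get_variant("shopper-1", "homepage-test", ["A", "B"], [50, 50]))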
Some use cases for A/B testing include:
Recommendations SmartBlocks that use different tactics, to determine the most successful tactic or position for your website.
Trigger programs, to determine the best time to send an email after abandonment.
Different homepage layouts using Experiences, to determine the most successful page layout.
Create and manage an A/B test
Before setting up an A/B Test, it’s useful to create the content that you want to include in the optimization. This could be multiple SmartBlocks (to be added to a Slot), Trigger programs, or Experiences.
If you don't have the Optimization module, any SmartBlocks in your test must be clickable to get useful data at the end of the test, because revenue is only attributed to clickable SmartBlocks.
Once you have created your content:
Go to Optimize in the left-hand navigation menu.
Select CREATE OPTIMIZATION.
Expand the Location drop-down menu, and select where you want to add the A/B Test:
Experiences
Website slots
Email slots
Triggered email slots
Triggered email programs
Select the option (Slot, program, or Experience) you want to use, then select CONTINUE.
Drag the items (SmartBlocks, Trigger programs or Experiences) that you want to use from the left into the decision tree on the right. Position the items on top of each other.
When you have a group of multiple items, select the top row to open the Optimization panel on the left.
Select A/B Split.
Enter the percentage of visitors that should see each variation (a configuration sketch follows these steps).
Expand the Select when to end the split test drop-down menu and select from:
Don’t end automatically
End on specific date
Enter the date you want the A/B test to complete.
End when number of impressions reached
Enter the number of impressions you want the A/B test to reach before completion.
Expand the Select the goal for this Optimization drop-down menu and select from:
Increase Conversion
Increase Average Order Value
Increase Identification Rate
Decrease Bounce Rate
Increase Site Visits
Increase Average Time On Site
Increase number of items in purchase
Increase revenue
Select SAVE.
To change the name of the test, select the pencil icon at the top of the A/B Test configuration panel.
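The split percentages, end condition, and goal from the steps above can be pictured as a small configuration object. The sketch below is a minimal illustration under that assumption, using invented field names rather than Fresh Relevance's data model, together with a simple check for the two automatic end conditions.

    from datetime import datetime, timezone

    # Hypothetical configuration mirroring the options above; the field names
    # and values are illustrative, not Fresh Relevance's data model.
    test_config = {
        "split": {"Variation A": 50, "Variation B": 50},  # percentage of visitors per item
        "end_condition": "impressions",                   # "never", "date", or "impressions"
        "end_date": None,
        "impression_target": 10_000,
        "goal": "Increase Conversion",
    }

    def should_end(config, impressions_so_far, now=None):
        """Check whether the split test has reached its configured end point."""
        now = now or datetime.now(timezone.utc)
        if config["end_condition"] == "date":
            return config["end_date"] is not None and now >= config["end_date"]
        if config["end_condition"] == "impressions":
            return impressions_so_far >= config["impression_target"]
        return False  # "never": the test runs until you end it yourself

    print(should_end(test_config, impressions_so_far=12500))  # True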
The A/B Test is now set up and ready to go. If the Slot, Trigger or Experience Rule is live, then the A/B Test becomes live as well.
Your new A/B Test now appears in the Optimize Center.
The coloured dot indicates whether it is active.
Select the row to return to the optimization setup.
Reporting
To access the A/B Test reporting:
Go to Optimize in the left-hand navigation menu.
For the A/B Test that you want to see reporting for, select the options (three dots) menu.
Select the Open report (graph) icon.
This takes you to the reporting for the A/B Test. The items in the A/B Test are automatically displayed and you can select which data you want to see from a range of metrics.
Core data is based on attributed revenue.
If you have the Optimization module, you also see data points starting with Optimization. These additional data points are based on the item in the test being seen by the visitor, with the data then calculated over the 24-hour period after it was seen (or seven days for Triggers that are sent).
These data points are useful if the item is not clickable but you still want to know which item is more successful against a goal.
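To make the attribution windows concrete, here is a minimal sketch of the timing rule described above. The names, timestamps, and function are invented for illustration; the actual calculation is performed by Fresh Relevance.

    from datetime import datetime, timedelta

    # Attribution windows described above; the constant names are illustrative only.
    SEEN_WINDOW = timedelta(hours=24)    # Optimization metrics for content that was seen
    TRIGGER_WINDOW = timedelta(days=7)   # Optimization metrics for Triggers that were sent

    def counts_toward_goal(seen_at, event_at, is_trigger=False):
        """Return True if a goal event falls inside the attribution window for a seen item."""
        window = TRIGGER_WINDOW if is_trigger else SEEN_WINDOW
        return seen_at <= event_at <= seen_at + window

    seen = datetime(2024, 5, 1, 9, 0)
    purchase = datetime(2024, 5, 1, 20, 0)
    print(counts_toward_goal(seen, purchase))  # True: within 24 hours of being seen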