As the name implies, A/B testing means running an experiment on two (or more) versions of something to learn which version performs best against a particular goal over time. An important note: to run an A/B test most effectively, make sure only one variable is being tested. For a basic Sponsored Brand ad, that could mean testing different versions of copy while keeping the product lineup the same, or vice versa.
Establish your copy or product lineup first: which variable are you planning to test? Do you want to better understand whether certain pieces of copy resonate with your customers more than others, or whether customers care about the star ratings and review counts of one set of products more than another?
Focusing on one variable will let you know with greater certainty what worked and what didn't. If you test multiple variables at once, you won't know which one caused the success.
For a product test, name the campaigns as you normally would but append " – Product Test A" or " – Product Test B". This makes it easy to search "Product Test" in the Seller Central Ad Console search bar to find them, and to see at a glance how each is performing over time, side by side.
Having a great variable to test won’t help you until you actually test it. Create any needed assets and get into Seller Central to build out your A/B test.
After two months (a solid test period, long enough to rule out a single unusually good or bad month), test your next set of variables to determine the best possible performers within your product and copy sets.
For a dog toy brand, an A/B test was run to determine which set of products featured in a Sponsored Brand ad would resonate with customers and perform most efficiently in that particular market: two different versions of a stuffed animal toy, two different versions of a treat, and the brand's flagship dog toy in both. (An A/B test had previously been run on different versions of copy, with an outcome that indicated which copy to use for the brand's Sponsored Brand ads going forward.)
Beginning the product test, the copy for each Sponsored Brand remained the same, and the test ran for two months targeting the same keywords with the same bids and budgets. Over the course of the test period, Test A outperformed Test B handily, with a higher overall return on investment (3.42 vs. Test B's 1.26), a higher click-through rate (2.61% vs. 2.22%), and more consistent performance overall. After analyzing the learnings, Test B was turned off in favor of Test A, which was then allocated the remaining budget.
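If you want to check whether a CTR gap like the one above is more than noise, a standard two-proportion z-test is a quick sanity check. The click and impression counts below are hypothetical (the article reports only the rates, not the volumes); the function is a generic statistics sketch, not part of any Amazon Ads tooling.

```python
import math

def two_proportion_z(clicks_a, imps_a, clicks_b, imps_b):
    """Two-proportion z-test: is CTR A meaningfully different from CTR B?"""
    p_a = clicks_a / imps_a
    p_b = clicks_b / imps_b
    # Pooled click rate under the null hypothesis that both CTRs are equal
    p = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = math.sqrt(p * (1 - p) * (1 / imps_a + 1 / imps_b))
    z = (p_a - p_b) / se
    return p_a, p_b, z

# Hypothetical volumes chosen to match the reported 2.61% vs. 2.22% CTRs
p_a, p_b, z = two_proportion_z(clicks_a=261, imps_a=10_000,
                               clicks_b=222, imps_b=10_000)
print(f"CTR A: {p_a:.2%}, CTR B: {p_b:.2%}, z = {z:.2f}")
# |z| > 1.96 would indicate significance at the 95% confidence level
```

At these assumed volumes the CTR gap alone would fall just short of the 95% threshold, which is one reason a longer test window and the ROI difference matter as much as the raw rates.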
While running an A/B test may seem like a small detail, it can save advertisers and brands a lot of heartache in the long run. In the test above, for example, running the A/B test allowed us to see what people were actually looking for and were more likely to convert on. Simply choosing the products you think will perform best, without testing the options, leaves room for lost sales and a disconnect with the market.
If only the products and copy in Test B had been built out, without testing them alongside Test A, we might have been left with an underperforming Sponsored Brand ad and drawn incorrect conclusions about what needed to change. In the advertising space, adaptability will always be necessary as markets and circumstances change, and with A/B testing you can rest assured that the best possible Sponsored Brand ad for your brand is the one left standing at the end.
Want to learn more about advertising best practices? Check out our blogs on Keyword Match Types on Amazon and Hidden Tricks to Maximize Efficiency on Amazon DSP, or download our ebook: The Experts' Guide to Amazon Advertising Strategy.