
    A/B Testing Tutorial

    A/B Testing is one of the best ways to compare two or more versions of an application or a web page. It enables you to determine which version performs better and generates a higher conversion rate. It is also one of the easiest ways to analyze an application or a web page and create a more effective version. This brief tutorial covers the fundamentals of A/B Testing, with examples that illustrate how you can put it into practice.

    A/B Testing (also known as split testing) is a way to compare two versions of an application or a web page to determine which one performs better. You create a modified version of the original, then compare the conversion rates of both versions to find the better performer.

    Example
    Let us assume that there is a web page and all the traffic is directed to it. As part of an A/B test, you make some minor changes, such as headlines or numbering, on a copy of the page and direct half of the traffic to this modified version. You now have version A and version B of the same web page, and you can monitor visitors’ actions using statistics and analysis to determine which version yields the higher conversion rate.

    A conversion rate is the percentage of visitors to your website who perform a desired action. A/B Testing enables you to determine the best online marketing strategy for your business. For instance, version A might yield a conversion rate of 15% while version B yields a conversion rate of 22%.
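
    The conversion rate comparison above can be sketched in a few lines of Python. The visitor and conversion counts below are hypothetical, chosen to reproduce the 15% and 22% figures:

```python
def conversion_rate(conversions: int, visitors: int) -> float:
    """Fraction of visitors who performed the desired action."""
    return conversions / visitors if visitors else 0.0

# Hypothetical counts for each version of the page.
rate_a = conversion_rate(conversions=75, visitors=500)   # version A
rate_b = conversion_rate(conversions=110, visitors=500)  # version B

print(f"Version A: {rate_a:.0%}")  # 15%
print(f"Version B: {rate_b:.0%}")  # 22%
```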



    • What is Mobile A/B Testing?
    An A/B test is an experiment that simultaneously compares two different versions of the same website while measuring key metrics. Mobile A/B testing is the process of using these experiments to optimize a mobile app. For a quick explanation by Leanplum’s very own CEO, read this post.

    Usually A/B tests only change one variable at a time, in order to correlate the variable with the result. An A/B test that changes multiple variables at the same time is called a multivariate test, or MVT. These tests are useful for measuring the interaction between different variables on the same page, but they must be designed carefully in order to find meaningful correlations.
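
    To make the one-variable-at-a-time idea concrete, here is a minimal sketch of variant assignment (the function and experiment names are hypothetical illustrations, not Leanplum’s API). Hashing the user ID keeps each user in the same bucket across sessions, and a multivariate test simply enumerates every combination of variables:

```python
import hashlib
from itertools import product

def assign(user_id: str, experiment: str, variants: list) -> object:
    """Deterministically place a user into one of the variants."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Single-variable A/B test: one variable, two values.
print(assign("user-42", "button-color", ["blue", "green"]))

# Multivariate test (MVT): every combination of two variables.
mvt_variants = list(product(["blue", "green"], ["Sign up", "Join now"]))
print(assign("user-42", "color-x-copy", mvt_variants))
```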
    • What Should I Mobile A/B Test?
    For mobile, nearly everything can (and should) be A/B tested. You can refer to this handy list of in-app content that should be A/B tested.

    Many people think of A/B testing in terms of superficial design changes, like tweaking the color of a login button. Ideally, your A/B testing process would encompass much more than UI elements.

    Leanplum’s A/B testing system is powerful because it can test any variable, both inside and outside the app.

    If you want to optimize your marketing and content, it makes sense to test every variable. Statistically significant changes will be automatically highlighted on the Leanplum analytics dashboard, so don’t worry about being overwhelmed by results. We offer out-of-the-box data science to make analyzing test and campaign results easy.
    • What Do I Gain From Mobile A/B Testing?
    Besides the obvious upside of a higher conversion rate, there’s one big reason why you should A/B test every change to your app. Testing is the only way to make sure that your change didn’t accidentally make the user experience worse.

    In our case study with App Annie, product manager Eric MacKinnon explains that enterprise apps worry more about breaking something than about improving conversions. In his own words, A/B testing for popular apps is all about “ensuring that necessary changes (such as updating and modernizing the app) don’t have an adverse effect on user behavior.”

    Foregrounding both negative and positive effects is a big part of Leanplum’s mobile A/B testing platform. We’re proud to offer two-tailed testing, which means that the negative changes of an A/B test are displayed alongside the positive ones. For example, if you’re A/B testing a messaging campaign, you won’t miss the case where a push notification increased your open rates but also increased app uninstalls. You get a holistic picture of app performance, so you understand each campaign’s tradeoffs.
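
    The statistical counterpart of this idea is a two-tailed hypothesis test, which detects a change in either direction. Here is a minimal sketch using the standard two-proportion z-test (this is textbook statistics, not Leanplum’s implementation, and the counts are hypothetical):

```python
import math

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-tailed z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

z, p = two_proportion_z_test(conv_a=75, n_a=500, conv_b=110, n_b=500)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant at the 0.05 level
```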

    We can daydream about optimizations that boost conversions without any consequences, but in reality, every change has a tradeoff. You can’t make good optimization decisions without knowing what you’re giving up.
    • What Makes a Good Mobile A/B Test?
    Deciding to A/B test your product is a step in the right direction, but it’s not the last step. You need to design an effective test in order for analytics software to help you.

    For quick reference, we made a list of the four commandments of mobile A/B testing.

    In general, A/B tests should be thorough. Take the time to design a test for every meaningful variable you can imagine. Part of our analytics philosophy is that you don’t know what you don’t know, so we automatically show users all statistically significant changes in their campaigns. Even if a variable doesn’t seem crucial to you, the data might tell a different story.

    Once your test is set up, you’ll have to consider which segment of your users to target. This decision depends on the goals of your test.
    If you’re implementing a brand new feature, it might be best to only expose the feature to your most dedicated users. You can send that segment a message announcing the new feature, and offering a channel for them to provide feedback. They’ll have an easier time understanding the feature, and they’ll provide valuable usage data. Using our Time Estimator, you’ll know exactly how long the campaign will take to reach statistical significance, before you press go on the test.
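
    Leanplum’s Time Estimator is proprietary, but the underlying idea can be sketched with the standard two-proportion sample-size formula: given a baseline rate, the lift you hope to detect, and your daily traffic, you can estimate how long a test must run to reach statistical significance (all numbers below are hypothetical):

```python
import math
from statistics import NormalDist

def sample_size_per_variant(p_base: float, p_target: float,
                            alpha: float = 0.05, power: float = 0.8) -> int:
    """Standard two-proportion sample-size formula (users per variant)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-tailed significance
    z_beta = NormalDist().inv_cdf(power)           # desired statistical power
    variance = p_base * (1 - p_base) + p_target * (1 - p_target)
    return math.ceil((z_alpha + z_beta) ** 2 * variance
                     / (p_target - p_base) ** 2)

n = sample_size_per_variant(p_base=0.15, p_target=0.22)
daily_users = 1000  # assumed eligible traffic per day, split across variants
days = math.ceil(2 * n / daily_users)
print(f"{n} users per variant, ~{days} day(s) at {daily_users} users/day")
```

    Note how the required sample size grows sharply as the lift you want to detect shrinks: confirming a one-point improvement takes far more users than confirming a seven-point one.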

    Once you’ve tested user reactions and you’re ready to roll out that feature to your full audience, you can send all your users a message about the update.

    Alternatively, if you’re making a minor layout change, you can place frequent users and infrequent users into different segments. Perhaps infrequent users will be more confused by the change than frequent users (or vice versa). If the two groups are lumped together, you’ll be given a less-than-helpful mean that doesn’t tell you much about whether the layout change was good or bad.
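
    A small sketch (with hypothetical numbers) of why the lumped mean can mislead: the overall rate can sit far from either segment’s rate, hiding the fact that the two groups reacted very differently.

```python
# Hypothetical per-segment results after a layout change.
segments = {
    "frequent":   {"conversions": 90, "visitors": 300},
    "infrequent": {"conversions": 30, "visitors": 700},
}

total_conv = sum(s["conversions"] for s in segments.values())
total_vis = sum(s["visitors"] for s in segments.values())
print(f"lumped mean: {total_conv / total_vis:.1%}")  # 12.0%

# Per-segment rates tell a very different story: 30.0% vs ~4.3%.
for name, s in segments.items():
    print(f"{name}: {s['conversions'] / s['visitors']:.1%}")
```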
    • Ready to Start Your First Mobile A/B Test?
    A/B testing is the crux of a successful mobile marketing campaign. If you’re using mobile marketing software like Leanplum, it’s worth taking the time to set up detailed tests, especially if you’re making frequent changes to your app. Fortunately, Leanplum’s tools make implementing tests a breeze. Start small, and build up to more sophisticated tests once you’ve gotten the hang of the platform.


