A/B Testing in Chatbots
Few tools or platforms currently offer A/B testing for chatbots. Nevertheless, A/B testing is one of the most important marketing strategies to apply when deploying your chatbot to production and making it accessible to real users.
Although A/B testing for chatbots has its own peculiarities, it serves essentially the same purpose: it helps you understand which part of the flow, for example a particular block or card with certain copy and assets, converts more users than another variation of the same block or card.
With chatbots, it is not only specific blocks that need to be A/B tested but also whole sequences of blocks, such as different types of onboarding scenarios or purchase flows. This lets us measure drop-off rates and make changes that improve the user experience.
The most important metrics that need to be measured in A/B testing of chatbots are:
- Retention rate
- Drop-off rate
- Main conversions
The retention rate is the ratio of users who return to the chatbot over a given period to the number of users who were active at the start of that period. Its purpose is to monitor how well the chatbot attracts and retains its users. A well-structured, engaging UX in messenger chatbots can reach retention rates of up to 70%, compared with up to 40% for widely used email marketing campaigns.
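As an illustration, retention over a period can be computed from sets of active users; the user IDs and the helper name below are hypothetical, not part of any platform's API:

```python
def retention_rate(active_at_start, active_at_end):
    """Share of users active at the start of a period who are
    still active at the end of it (hypothetical helper)."""
    if not active_at_start:
        return 0.0
    retained = active_at_start & active_at_end
    return len(retained) / len(active_at_start)

# Example: 7 users started the week, 5 of them came back.
week_start = {"u1", "u2", "u3", "u4", "u5", "u6", "u7"}
week_end = {"u1", "u2", "u4", "u6", "u7", "u9"}
print(round(retention_rate(week_start, week_end), 2))  # 0.71
```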
A drop-off marks the place in the flow where a visitor left the intended main path. The key takeaway from this metric is that a drop-off shows where your chatbot's flow has holes through which people are "dropping off": some blocks or other UX elements are never reached because visitors get bored or frustrated before arriving at them.
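A minimal sketch of computing drop-off per step, assuming you already count how many users reach each block of the intended flow (the funnel numbers below are made up):

```python
def drop_off_rates(step_counts):
    """Given ordered counts of users reaching each block of the
    intended flow, return the share lost at each transition."""
    rates = []
    for reached, advanced in zip(step_counts, step_counts[1:]):
        rates.append(0.0 if reached == 0 else round(1 - advanced / reached, 4))
    return rates

# Hypothetical funnel: welcome -> menu -> product card -> checkout.
print(drop_off_rates([1000, 800, 400, 100]))  # [0.2, 0.5, 0.75]
```

The biggest rate points at the transition to fix first.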
Conversions usually represent important user actions counted when someone interacts with the points of your chatbot's UX that are most valuable to your business, for example tapping a particular CTA in the user flow, starting a live chat with an agent, or completing the checkout process.
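Counting conversions can be as simple as checking which users triggered a goal event at least once; the event names here are assumptions for illustration, not any platform's schema:

```python
# Hypothetical event log: (user_id, event_name) pairs.
events = [
    ("u1", "flow_start"), ("u1", "cta_tap"),
    ("u2", "flow_start"),
    ("u3", "flow_start"), ("u3", "cta_tap"), ("u3", "checkout_complete"),
]

def conversion_rate(events, goal):
    """Share of users who triggered the goal event at least once."""
    users = {u for u, _ in events}
    converted = {u for u, e in events if e == goal}
    return len(converted) / len(users) if users else 0.0

print(conversion_rate(events, "cta_tap"))  # 2 of 3 users converted
```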
Some chatbot-building platforms already offer ready-made A/B testing solutions. If the available tools do not satisfy you, or you need a custom solution and can code, then setting up custom event tracking and visualizing the data as simple graphs or tables will do the trick.
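If you go the custom route, a minimal tracker can append one JSON line per event; the schema, field names, and file name below are illustrative assumptions:

```python
import json
import time

def log_event(user_id, event, variant, path="events.jsonl"):
    """Append one tracking event as a JSON line (hypothetical schema)."""
    record = {"ts": time.time(), "user": user_id,
              "event": event, "variant": variant}
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

log_event("user_1", "cta_tap", "variant_a", path="demo_events.jsonl")
```

A file of JSON lines like this is easy to load into a spreadsheet or notebook for the graphs and tables mentioned above.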
When A/B testing, it is important to set up proper experiments that yield valid information you can use to improve your chatbot's flow and performance. Each experiment may include two or more variations of the same feature or a particular UX element that requires validation. For example, you can set up three different purchase-prompt cards that differ only in the copy on the main CTA button. Alternatively, the variations may cover the onboarding flow: a welcome message that is a simple image versus a text card with copy and buttons, or a menu presented as a gallery versus a list view. The options are numerous.
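To serve such variations consistently, a common approach (not tied to any particular platform) is to hash the user ID into a bucket, so each user always sees the same variant; the variant and experiment names below are invented:

```python
import hashlib

def assign_variant(user_id, experiment, variants):
    """Deterministically bucket a user into one of the variants
    so repeat visits always show the same version."""
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Hypothetical experiment: three CTA copies on the purchase card.
cta_variants = ["cta_buy_now", "cta_get_yours", "cta_shop_deal"]
print(assign_variant("user_42", "purchase_card_test", cta_variants))
```

Seeding the hash with the experiment name means the same user can land in different buckets across different experiments.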
So here is the step by step plan on how to A/B test your chatbot:
- Choose an A/B testing tool or platform, or get ready to set up custom tracking (for example, logging JSON events) if you can code.
- Review the UX flow and determine which points should be A/B tested before going live. If you launch without testing the most crucial points of the user journey, you may lose valuable data that could have helped you build more engaging flows later.
- Set up the first experiment and measure incoming data after deploying your bot to production and opening it to real-world users.
- Compare and analyze the collected data and adjust the flow based on the insights.
- Create another experiment based on the assumptions from the previous one and repeat the cycle.
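When comparing two variants in the analysis step, a standard sanity check is a two-proportion z-test; the conversion counts below are invented for illustration, and the significance threshold is up to you:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)          # pooled rate
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results: variant A 120/1000, variant B 150/1000 conversions.
z, p = two_proportion_z(120, 1000, 150, 1000)
print(f"z={z:.2f}, p={p:.3f}")
```

A small p-value suggests the difference between variants is unlikely to be random noise; otherwise, keep the experiment running or collect more users.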
By doing this, your chatbot's retention and conversion rates should gradually increase, providing more value to the business or company it serves.
As an active Chatfuel user, I recommend using sequences in Chatfuel for A/B testing; they allow you to create flows consisting of one or more cards that can be served to subscribed users based on specific or random rules and timing.
If you are interested in A/B testing your chatbot, feel free to contact me; the MoC team also offers its own solutions for A/B testing cross-platform chatbots.