Your Bot Doesn’t Need Personality, It Needs User-Centric Design

Justin Kestelyn
Published in Chatbots Magazine
3 min read · Jan 5, 2018


“Personality” is one of the more controversial topics in the world of bots. For some, successfully simulating that mysterious aspect of human behavior makes all the difference between a bot that works and one that doesn’t.

That emphasis may be misplaced, however. In 2016 research by Retale, 55% of survey respondents who had interacted with bots ranked “improved accuracy” above all other requirements, while 28% wished their interactions had been more “natural.” In other words, where bot characteristics are concerned, most users value productivity over personality. (It’s reasonable to presume that most of these users had encountered bots in the context of customer service, not entertainment.)

That said, with some exceptions (sense of humor, for example), personality is easily confused with user-centric design. A bot that asks your name and then customizes the conversation, or that anticipates your questions or needs thanks to machine learning, is not being personable but rather user-centric. In fact, user centricity should be the ultimate goal of all learning systems, because what they learn is based on knowledge of past behaviors (and thus can enable anticipation of future ones).

Even a bot that learns probably won’t get you all the way to that goal, however. (A scripted or rule-based bot most certainly won’t even get you part of the way there.) Unless you have access to many thousands of conversations for training your model, there simply won’t be enough data available to ensure bot accuracy and reliability out of the gate.

Here are some examples of gaps that bot builders are likely to miss, and that are unlikely to be completely resolved by models with limited data for training. At Chatbase, we call those gaps “UMM” errors:

  • Unsupported-request errors
    Are users asking questions that your bot wasn’t designed to answer?
  • Mishandled-request errors
    Is your bot misfiring on user intent (i.e., not considering the context of the question, or answering a slightly different question than the one asked)?
  • Missed-request errors
    Is your bot correctly understanding user intent but failing to anticipate certain phrases, terminology, or even products and services that you don’t offer? Or is it misfiring on requests with compound intents?

The only realistic way to address these errors is to continuously analyze user interactions (through message data) and then optimize your bot to fill the gaps. But unless you have a team of data scientists or analysts on the payroll to piece together these insights from logs, your best bet is automated tools, particularly those that rely on well-trained and well-designed machine learning models. (And teams that do have those resources may find the manual approach less efficient than hoped, or may prefer to put those people to work on other things.)
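To make the analysis concrete, here is a minimal sketch of how logged messages might be bucketed into the three UMM categories. The log fields, confidence thresholds, and classification rules below are illustrative assumptions for this example, not a real Chatbase schema or API:

```python
# Hypothetical sketch: bucketing logged bot messages into "UMM" error
# categories. Field names and the 0.5 threshold are assumptions made
# for illustration only.
from dataclasses import dataclass
from typing import Optional, List, Dict


@dataclass
class LoggedMessage:
    text: str                        # raw user utterance
    matched_intent: Optional[str]    # intent the bot matched, or None
    confidence: float                # model confidence in the match (0-1)
    handled: bool                    # did the bot produce a substantive reply?


def classify_umm(msg: LoggedMessage) -> str:
    """Assign one UMM bucket to a logged message (illustrative rules)."""
    if msg.matched_intent is None:
        # No intent matched at all: the bot wasn't designed for this request.
        return "unsupported"
    if msg.handled and msg.confidence < 0.5:
        # Bot answered, but on a shaky match: likely answered the wrong question.
        return "mishandled"
    if not msg.handled:
        # Intent recognized but no useful reply: a phrasing or coverage gap.
        return "missed"
    return "ok"


def umm_report(log: List[LoggedMessage]) -> Dict[str, int]:
    """Count messages per bucket so the biggest gaps can be prioritized."""
    counts = {"unsupported": 0, "mishandled": 0, "missed": 0, "ok": 0}
    for msg in log:
        counts[classify_umm(msg)] += 1
    return counts
```

Run over a day’s message data, a report like this surfaces which category of gap is costing the most conversations, which is exactly the kind of triage the automated tools above do at scale.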

Solving these issues is the quickest path toward a user-centric bot that meets high expectations for reliability and accuracy. As a result, you may find more use cases for your bot than previously imagined, even if it has no “personality.”

About Chatbase

Chatbase gives builders of conversational interfaces (or bots) sophisticated tools for creating better, and stickier, consumer experiences than ever before — leading to better conversion rates and retention. Chatbase is a cloud service that easily integrates with any bot platform and type, voice or text, and is free to use.

Among other features, Chatbase uniquely relies on Google’s machine learning capabilities to automate the identification of bot problems and opportunities that would otherwise take a lot of time, leading to faster optimizations and better bot accuracy.

Chatbase is brought to you by Area 120, an incubator for early-stage products operated by Google.


Product marketing polymath for Google Cloud. Burrito Addict, Clash City Rocker, Romanophile. Views expressed = mine.