Will the chatbot’s bubble deflate as fast as it swelled? That, at least, is what several converging signals suggest.
The chatbot: at the crossroads of usage habits and conversational interfaces
To go back to its origins, the chatbot had (and still has) all the makings of an excellent idea, one that was bound to work, for two reasons.
The first is that chat has become the killer app in which we spend most of our time. It has replaced email and even phone calls in many situations, and has even become the “community” tool of choice for many. Why? Because by constantly adding contacts on Facebook, for example, the “just between us” feeling of the early days has disappeared, and group chats make it possible to recreate that intimacy, restricted to a small circle of people.
The second is the inevitable evolution of human-machine interfaces. The button-based interfaces we have always known can, and in a number of cases will, be replaced by more “natural” ones. And what could be more natural than a conversation, rather than having to open a dedicated tool to trigger an action? That conversation can be spoken or written, but it is the written case that interests us here.
The combination of these two trends gave rise to the chatbot: a way to interact with an application or service through a conversation held in the tool where you already spend most of your time, the chat app. It also lets you rationalize your usage: a single front door for talking to your friends and using your favourite services at the same time.
Too much conversation kills the conversation
But if the chatbot’s use case is indisputable, the limits of the system appear when the initially marginal usage grows exponentially, and a fortiori when the services become proactive.
A typical use case: take, for example, what an airline chatbot might do.
1°) You ask it for the departure time, the boarding gate or your boarding pass. It’s simple: one request, one answer, and you are unlikely to abandon the exchange along the way, not only because it is short but because you actually need the answer. The exchange works in a “one question, one answer” mode.
2°) Then the chatbot becomes proactive. It informs you that check-in is open and asks whether the seat you have been allocated suits you. It reminds you that you have placed a hold on a flight, that the hold is about to expire, and asks whether you will confirm your purchase. And since you fly the next day, it asks whether you need to book a car or a hotel room.
In this case you are not the initiator of the conversation, so, being busy with other things, you may not answer right away. And in the meantime, the chatbot may ask you two, three or four different things.
When you return to the application, these questions come to mind:
– If I answer, which question will it associate my reply with? If I say “yes”, will it take that to mean the seat suits me, or that I confirm my purchase?
– Logically, the reply goes to the last question asked. So how do I answer the pile of “open” questions I left unanswered while new ones arrived in between?
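The ambiguity described above can be made concrete with a minimal sketch. The model below is purely illustrative (the class and method names are my own, not from any chatbot platform): a flat chat thread accumulates pending bot questions, and the only workable convention is to bind a free-text reply to the most recent one, leaving the earlier questions dangling.

```python
from dataclasses import dataclass, field


@dataclass
class PendingQuestion:
    topic: str
    text: str


@dataclass
class ChatThread:
    """Hypothetical model of a single flat chatbot thread."""
    pending: list = field(default_factory=list)

    def bot_asks(self, topic: str, text: str) -> None:
        self.pending.append(PendingQuestion(topic, text))

    def user_answers(self, text: str):
        # In a flat thread, the only sane convention is to bind the
        # reply to the most recent open question: "last question wins".
        if not self.pending:
            return None
        question = self.pending.pop()
        return question.topic, text


thread = ChatThread()
thread.bot_asks("seat", "Does seat 12A suit you?")
thread.bot_asks("purchase", "Your hold expires soon. Confirm your purchase?")

topic, answer = thread.user_answers("yes")
# The "yes" lands on the purchase question; the seat question stays open.
print(topic)
print([q.topic for q in thread.pending])
```

The point of the sketch: nothing in the thread itself tells the bot which question “yes” was meant for, so any binding rule silently strands the other questions.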
Having looked at the subject at some length, I think it is even worse in the context of an internal company chatbot, where a single thread can serve a large number of uses and quickly becomes saturated if you are not careful.
In short, as chatbot usage increases, it ends up creating a real experience problem, to the point of making the tool unusable.
Chatbots: not such a good idea?
While such a thought might have seemed iconoclastic just a year ago, it is no longer incongruous to wonder whether chatbots were such a good idea. Even at Google they now frankly acknowledge that chatbots, at least in their current form, were not a good idea. And exactly for the reason I mentioned above.
The future of the chatbot according to Google? A more visual system built on “cards” between which the user can navigate. One subject = one card, which would solve the problem of entangled conversations.
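The one-subject-per-card idea can be sketched in a few lines. This is my own illustrative reading of the approach, not an actual Google API: each subject gets its own mini-thread, so a reply is always made inside a card and is therefore bound to exactly one question.

```python
class CardBot:
    """Hypothetical sketch of a card-per-subject conversational UI."""

    def __init__(self):
        # One entry per subject: each card holds its own question/answer pair.
        self.cards = {}

    def ask(self, topic: str, question: str) -> None:
        self.cards[topic] = {"question": question, "answer": None}

    def answer(self, topic: str, reply: str) -> None:
        # The user replies *inside* a card, so there is no ambiguity
        # about which question the reply belongs to.
        self.cards[topic]["answer"] = reply

    def open_topics(self) -> list:
        return [t for t, c in self.cards.items() if c["answer"] is None]


bot = CardBot()
bot.ask("seat", "Does seat 12A suit you?")
bot.ask("purchase", "Confirm your purchase?")

bot.answer("seat", "yes")  # unambiguously about the seat
print(bot.open_topics())   # the purchase card is still visibly open
```

Unlike the flat thread, unanswered subjects remain visible as open cards instead of scrolling away above the latest question.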
Until voice interfaces close the debate? That’s another story.