Repetition in Chatbots
Let's imagine the following situation: a co-worker asks if you would like to have a salad for lunch, and you answer: "I'd love to eat together, but I'd rather have pasta." The morning goes by, noon arrives, and he asks you again: "Would you like to have a salad?"
What would you think about that co-worker? If someone repeats the same question, all of our human instincts tell us that he or she is not exactly "okay". That is because humans have a remarkable ability to "remember" the state of a conversation and to continue from it as the situation develops. Repeating the same question in the same words is perceived as a serious problem in human communication, even when the question itself is small and seemingly unimportant.
Would you like coffee?
Would you like coffee?
Chatbot software architecture
The ability to retain the content of a conversation, even when the conversation is interrupted, is very hard to translate into a software architecture feature. In practice, this is one of the biggest challenges facing companies and organizations that develop Chatbots. To be honest, it is hard to believe that anyone will be able to make a machine mimic a human at this level. But that complexity shouldn't make Chatbot developers give up on the challenge.
Not every repetition is problematic
There are situations in which the Chatbot can repeat the same answer over and over and the conversation is unharmed, and there are situations in which repetition is a serious blow to the end user's belief that the conversation is effective.
First, let's look at the types of repetitive discourse separately. There are situations in which repetition is not very problematic. For example:
User: "How are you?"
Chatbot: "Excellent, I'm a robot, I'm always good."
Later on in the conversation, for some reason, the question arises again:
User: "How are you?"
Chatbot: "Excellent, I'm a robot, I'm always good."
Is this serious damage to the conversation? It does not seem to be. After all, it was the user who asked the same question, and he or she shouldn't be surprised to receive the same answer. However, there are situations in which repetition will convince the Chatbot user that he or she is "talking to a wall" and that the conversation is totally pointless. What characterizes these situations?
First, when the Chatbot presents the end user with a question. For example:
Chatbot: "Where do you want to rent your car from?"
User: "From Tel Aviv"
From there the conversation moves on to other topics. After a while, the Chatbot asks again:
Chatbot: "Where do you want to rent your car from?"
This is a very damaging situation, one that will likely cause the end user to lose trust in the conversation. If the Chatbot did not understand where the customer wants to rent the car from, will it understand anything at all? Isn't the whole conversation a waste of time? The damage of repeating the same question is enormous and unnecessary. The recurring question is one of the most difficult problems in the "decision tree" Chatbot model. You can read more about this in the article "Who asks and who answers - a look at free conversation with the Chatbot".
To avoid this kind of repetition, we developed the Smart Data Collection. This mechanism is, in essence, the ability to "remember" what information has already been received from the user, whether as an answer to a question or as a statement the customer volunteered, and to ensure that the Chatbot will not ask for it again unless the end user specifically requests to return to that piece of information. In that way, at least on a basic level, it is possible to ensure that the Chatbot does not make serious conversational errors that harm the user's trust.
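The article does not describe the actual implementation, but the idea can be sketched as a simple slot memory: a record of which pieces of information have already been collected, consulted before any question is asked. All names below (SlotMemory, store, forget, next_question) are hypothetical illustrations, not the real Smart Data Collection code:

```python
# A minimal sketch of a slot memory that prevents a Chatbot from
# re-asking questions it already has answers to. Names are assumptions.

class SlotMemory:
    """Remembers information already collected from the user."""

    def __init__(self):
        self._slots = {}

    def store(self, slot, value):
        # Record an answer to a question, or a statement the user volunteered.
        self._slots[slot] = value

    def forget(self, slot):
        # The user explicitly asked to revisit this piece of information.
        self._slots.pop(slot, None)

    def next_question(self, questions):
        # Ask only questions whose slot has not been filled yet.
        for slot, text in questions:
            if slot not in self._slots:
                return text
        return None  # nothing left to ask


memory = SlotMemory()
questions = [
    ("pickup_city", "Where do you want to rent your car from?"),
    ("pickup_date", "When do you want to pick it up?"),
]

print(memory.next_question(questions))   # asks for the pickup city first
memory.store("pickup_city", "Tel Aviv")
print(memory.next_question(questions))   # moves on; the city is never asked again
```

The key design point is that the check happens on every turn, so the question is skipped regardless of how the slot was filled, and calling the hypothetical `forget` is the only way the question can legitimately come back.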