How will the DialogFlow market evolve in the coming years?

A training course student recently asked me:

Would be interested in hearing your views on how the DF market will develop in the coming years and scope for profit. 

Let me start with a disclaimer.

I am only answering this question because someone asked, and I am probably not the best person to answer it. I am far too focused on the technical aspects of chatbot tooling (and NLU in general) to be any kind of authority on the Dialogflow “market”, which, in my view, calls for an understanding of market forces and the like. If you completely disagree with my commentary at the end, that’s fine too.

So instead of giving a direct answer, I will point you to one of the best interviews I have heard regarding the chatbot ecosystem.

Recently, Jeff from Software Engineering Daily interviewed Sylvain Perron, the founder of Botpress. The conversation was balanced and set expectations where they should be. It also helped that Jeff seemed quite skeptical about the future of chatbots, because that skepticism produced some excellent questions.

Here is the list of questions Jeff asked on the show (thankfully, a PDF transcript is available, so I just copied the questions over):

[00:02:09] JM: You are the founder of Botpress, which is a system for building chatbots. And whenever I do a show about chatbots, the first thing I always like to ask is what happened? Like there was the period of great chatbot hype. And then things basically died down. But you still see chatbots everywhere, when you hit an airline website or a customer service center. So how big is the market for chatbots? Is there still a big demand for chatbots? What's going on there?

[00:04:49] JM: What are those struggles of getting chatbot systems to work?

[00:07:16] JM: Can you give me a little bit idea of the tooling around chatbots? Like there's Facebook Messenger, and Intercom, and Drift, I know, that can be used for various chatbot applications. But are you providing infrastructure that plugs into those different systems? Or just give me the lay of the land in terms of tooling that people use and what you're plugging into.

[00:09:27] JM: Can you give me an example of a chatbot application that somebody might build using Botpress?

[00:10:54] JM: And how does the human in the loop play a role? Like how do you build a system where, if necessary, the chatbot can kick a request to a human?

[00:12:42] JM: What role does natural language understanding and natural language processing play in building an effective chatbot?

[00:15:41] JM: And can you tell me more about your NLP, NLU stack? Like are you taking libraries off the shelf? Or like how do you train your libraries? Just tell me more about what you've built or what you've taken off the shelf?

[00:18:45] JM: What are some of the prototypical problems that exist for chatbots today that you think will be ironed out in the near future?

[00:21:47] JM: Describe how you've differentiated yourself from the other bot platforms out there.

[00:23:50] JM: So you have a system, like a UI for designing chat workflows. Can you talk a little bit more about the design of the UI and how people within a company use it?

[00:25:56] JM: So what aspects of building my chatbot do I need to use a workflow editor for and what do I need to use actual programming for?

[00:27:57] JM: How has Botpress evolved over time as you've engaged with different users of the system and you've seen people build different kinds of chat bots? How has the product evolved?

[00:30:23] JM: How do you see the chatbot category evolving in the near future? What predictions do you have about what people want out of chatbots and how consumers will be engaging with them?

[00:34:53] JM: So what's the pace of innovation there? Because it feels like – I mean my sense is that the state of the art of bot interactions has improved at a snail's pace ever since bots were a thing, four years ago, or a mainstream thing. When do we get this kind of next level step function integration? How long till that actually comes to fruition?

[00:38:17] JM: Well, since you mentioned Siri, is there a connection between chatbot applications and voice interfaces?

[00:39:53] JM: As we begin to wrap up, maybe we can talk a little bit about your infrastructure. Can you just tell me about the building blocks of Botpress? Tell me about some infrastructure decisions you've made and the architecture for the company.

So once you listen to that episode, you will actually get a fairly good idea of how things are evolving with respect to the entire chatbot ecosystem.

I will add the following commentary:

Slot filling

In my view, one of the biggest problems with early chatbot adoption was the obsession with “slot filling”. I don’t recommend using slot filling in Dialogflow ES. But the problem is not the concept itself, which is genuinely useful; it is the early implementations. In fact, the way slot filling works is one of the biggest improvements in Dialogflow CX.
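
To make the concept concrete, here is a minimal sketch of the slot filling idea: keep asking follow-up questions until every required parameter (“slot”) for the intent has a value. The intent name, slot names, and prompts are hypothetical and not tied to any Dialogflow API.

```python
from typing import Dict, Optional

# Hypothetical required slots and re-prompts for a single intent.
REQUIRED_SLOTS: Dict[str, Dict[str, str]] = {
    "book_flight": {
        "destination": "Where would you like to fly to?",
        "travel_date": "What date do you want to travel?",
    }
}

def next_prompt(intent: str, filled: Dict[str, str]) -> Optional[str]:
    """Return the re-prompt for the first missing slot, or None once all are filled."""
    for slot, prompt in REQUIRED_SLOTS.get(intent, {}).items():
        if not filled.get(slot):
            return prompt
    return None

# A multi-turn interaction, filling one slot per turn:
collected = {}
print(next_prompt("book_flight", collected))   # -> "Where would you like to fly to?"
collected["destination"] = "Paris"
print(next_prompt("book_flight", collected))   # -> "What date do you want to travel?"
collected["travel_date"] = "June 1"
print(next_prompt("book_flight", collected))   # -> None: all slots filled
```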

State Machine

I think multi-turn conversations will have to incorporate some kind of state machine. That is already possible in Dialogflow CX, and my expectation is that other chatbot frameworks will soon follow suit. While other bot frameworks may “simulate” state-machine-like behavior, I am talking about exposing state machine primitives directly in the API.
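
For readers who have not worked with one, here is a minimal sketch of a multi-turn conversation modeled as an explicit state machine. The states, intents, and transitions are hypothetical; Dialogflow CX exposes a much richer version of this idea through its own primitives.

```python
from typing import Dict, Tuple

# (current_state, recognized_intent) -> next_state
TRANSITIONS: Dict[Tuple[str, str], str] = {
    ("start", "order_pizza"): "choosing_size",
    ("choosing_size", "size_given"): "confirming_order",
    ("confirming_order", "yes"): "order_placed",
    ("confirming_order", "no"): "start",
}

def advance(state: str, intent: str) -> str:
    """Move to the next state, or stay put if this intent is not handled here."""
    return TRANSITIONS.get((state, intent), state)

state = "start"
for intent in ["order_pizza", "size_given", "yes"]:
    state = advance(state, intent)
print(state)  # -> "order_placed"
```

The point is that the flow is declared as data you can inspect and reason about, rather than being buried inside webhook code.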

Machine Learning

Trying to turn everything into a Machine Learning problem is detrimental. When the bot does not understand what the user said on the first attempt, you can often get better results by re-prompting with buttons (i.e. suggestion chips) than by re-prompting for free-form text. The former requires only good tooling; the latter requires “better” Machine Learning. If you still prefer the latter, you have a solution in search of a problem.
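
Here is a rough sketch of what such a fallback handler might return. The response shape is loosely modeled on the Dialogflow ES webhook response format, but treat the exact field names as illustrative and verify against the current documentation (suggestion chips are only rendered on integrations that support them); the chip titles are hypothetical.

```python
import json

def fallback_response():
    # Instead of "Sorry, please rephrase", offer a small set of buttons the
    # user can tap, so the next turn does not depend on free-form NLU at all.
    return {
        "fulfillmentMessages": [
            {"text": {"text": ["Sorry, I didn't get that. Did you mean one of these?"]}},
            {
                "suggestions": {
                    "suggestions": [
                        {"title": "Check order status"},
                        {"title": "Talk to support"},
                        {"title": "Cancel my order"},
                    ]
                }
            },
        ]
    }

print(json.dumps(fallback_response(), indent=2))
```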

Voice Bots

My biggest disagreement with Sylvain’s responses was the suggestion at [38:17] to “decouple” voice and NLU. I don’t think that is a good approach. The auto speech adaptation feature in Dialogflow is a step in the right direction: it is effectively an acknowledgement that if you can reduce the search space of things the user might say, based on what they said before, you will get better ASR (Automated Speech Recognition), the step that precedes NLU.
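
A small sketch of the underlying idea: use the current dialog state to narrow down what the user is likely to say next, and hand those phrases to the speech recognizer as hints (the kind of phrase biasing that speech-to-text services expose, and that auto speech adaptation applies automatically). The state names and phrases below are hypothetical.

```python
from typing import Dict, List

# Hypothetical mapping from dialog state to the phrases we expect next.
EXPECTED_PHRASES: Dict[str, List[str]] = {
    "choosing_size": ["small", "medium", "large", "extra large"],
    "confirming_order": ["yes", "no", "cancel the order"],
}

def phrase_hints(dialog_state: str) -> List[str]:
    """Phrases to bias the speech recognizer toward, given where we are in the flow."""
    return EXPECTED_PHRASES.get(dialog_state, [])

# These hints would be attached to the ASR request for the next user utterance,
# shrinking the search space and improving recognition before NLU ever runs.
print(phrase_hints("choosing_size"))
```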

Open Source

The usability gap between open source chatbot frameworks such as Rasa NLU and closed source frameworks such as Dialogflow is still too wide for non-technical folks. So while Botpress is open source (I haven’t used it yet), I don’t think being open source is, by itself, a sufficient selling point for wider adoption. Open source chatbot tools need to become much easier for non-technical folks to use, because the incumbent closed source tools are clearly much easier to use today.

Open problem

An interesting unsolved (or at least poorly solved) problem in chatbots is distinguishing between an utterance that contains multiple intent fragments and one that expresses a single intent. If you look at my article on automatically inferring Dialogflow intents from chat logs, you will see that many people write long sentences spanning multiple intents, and none of the current bot frameworks handles this particularly well. Since we cannot ask people to stop writing long sentences, we need to get better at identifying the individual intent fragments. It is probably better to treat multiple intent fragments as exactly that, instead of asking the machine to synthesize them into a single intent. Not only does this give the user more flexibility, it also makes it easier to reason about the bot’s behavior.
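
A naive sketch of what handling fragments separately could look like: split a long utterance on simple conjunctions and classify each fragment on its own, instead of forcing the whole sentence into a single intent. The splitting heuristic and the classify() stub are hypothetical placeholders; a real system would need something far more robust.

```python
import re
from typing import List

def split_fragments(utterance: str) -> List[str]:
    """Very crude fragment splitter based on conjunctions and punctuation."""
    parts = re.split(r"\b(?:and also|and|but)\b|[.;]", utterance)
    return [p.strip() for p in parts if p.strip()]

def classify(fragment: str) -> str:
    """Stand-in for a real intent classifier."""
    if "refund" in fragment:
        return "request_refund"
    if "order" in fragment:
        return "check_order_status"
    return "fallback"

utterance = "I want to check on my order and also ask about a refund for last month"
for fragment in split_fragments(utterance):
    print(fragment, "->", classify(fragment))
```

Each fragment gets its own intent, so the bot can respond to both requests rather than picking one and silently dropping the other.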