“Dialogflow CX provides a new way of designing agents, taking a state machine approach to agent design,” Google explained in the documentation for CX. “This gives you clear and explicit control over a conversation, a better end-user experience, and a better development workflow.”
Stand Out Features
- Any language other than English (en)
- System entity extension
- Training data import
As a Google technology partner, we are really excited about this new version of Dialogflow. If you want to learn more about Dialogflow CX and how an advanced chatbot can help your company, please contact us to discuss further.
The Rapid Response Virtual Agent program includes open source templates for companies to add coronavirus content to their own chatbots.
Artificial intelligence and machine learning are continuing to take a front-row seat in fighting COVID-19, with Google Cloud launching an AI chatbot on Wednesday. The chatbot, which it calls the Rapid Response Virtual Agent program, will provide information to battle the COVID-19 pandemic, as announced in a Google blog.
The program will enable Google Cloud customers to respond more quickly to questions from their own customers about the coronavirus. It’s designed for organizations that need to provide information related to the COVID-19 pandemic to their customers, such as government agencies, healthcare and public health organizations, as well as the travel, financial services and retail industries.
Google also offers Contact Center AI for 24/7 self-service support on COVID-19 questions via a chatbot or over the phone. Google also allows for businesses to add COVID-19 content to their own virtual agents with the ability to integrate open-source templates from organizations that have already launched similar initiatives. For instance, Verily partnered with Google Cloud to launch the Pathfinder virtual agent template for health systems and hospitals. It enables customers to create chat or voice bots that answer questions about COVID-19 symptoms and provide guidance from public health authorities such as the Centers for Disease Control and Prevention and World Health Organization (WHO), according to the Google blog.
The Contact Center AI’s Rapid Response Virtual Agent program is available in any of the 23 languages supported by Dialogflow.
We’ve been looking in more detail at this template and created our own chatbot. This is a work in progress and will be something which we are updating and improving daily. You can interact with this chatbot in the bottom right of this page.
A Mega Agent… so what?!
OK, so you could argue that I need to get out more… but I was excited to notice yesterday that a new feature has sneaked into the Dialogflow console: the concept of a Mega Agent. You can now set an agent’s type to mega agent, which lets you combine multiple agents into one single agent.
So why is this so important? At The Bot Forge, some of our Dialogflow agents can have thousands of intents, particularly if they provide an information service for a knowledge base. Unfortunately, the knowledge base functionality can be limiting, as discussed in my post Dialogflow Knowledge Connectors, so it’s often necessary to create one intent per FAQ to get the required accuracy and control. This can quickly use up an agent’s 2,000-intent limit.
We have recently had to look at creating our own version of a mega agent, to be used in a website chatbot implementation which would serve as a gatekeeper for initial enquiries, handing the conversation over to a specific chatbot covering a particular knowledge domain. Not really ideal, and it involved extra middleware complexity, particularly as we were planning to maintain some form of context across all the agents.
There are some caveats: it’s still one GCP project, and there is a maximum of 10 sub-agents per mega agent.
A Quick look at Mega Agents
It’s also important to remember this feature is in beta! You can read more about setting up the new Mega Agent here. At the time of writing the link on the add agent page is incorrect.
I took a really quick look at the new mega agent functionality.
Adding a mega agent
Adding a Mega Agent is pretty straightforward: when you add a new agent, you just select the switch:
Your mega agents are then listed in the agent list:
Adding a sub-agent
Once the agent is selected then a Sub Agent button is enabled:
I had already created a test agent to use as my sub-agent, so after selecting the Sub Agents button I connected it.
When adding sub-agents you can select an environment and choose whether to include or exclude the knowledge base. There is also a handy link to the sub-agent:
My test agent was a simple default agent with one added intent:
Does_mega_agent_work with one training phrase “does mega agent work”
Testing it out
So far so good. Just to recap: I have created a mega agent and another agent to act as my sub-agent. So now for a test drive of my Mega Agent in the Dialogflow simulator.
Unfortunately, I didn’t get the result I hoped for:
Basically, to interact with a mega agent in the Dialogflow simulator, the service account that is linked to your mega agent in the Dialogflow Console needs a role with detect intent access for all sub-agents. To achieve this I went to the IAM permissions page for the sub-agent and added the mega agent’s service account email address as a member of the project with a role of Dialogflow API Client.
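As a sketch, the same role grant can also be made from the command line with the gcloud CLI; the project ID and service-account email below are hypothetical placeholders, so substitute your own values:

```shell
# Grant the mega agent's service account the "Dialogflow API Client"
# role (roles/dialogflow.client) on the sub-agent's project.
# Both the project ID and the email address are placeholders.
gcloud projects add-iam-policy-binding my-sub-agent-project \
  --member="serviceAccount:mega-agent@my-mega-agent-project.iam.gserviceaccount.com" \
  --role="roles/dialogflow.client"
```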
Going back to the simulator and trying out does mega agent again resulted in the correct response from the sub-agent!
Where to go from here with mega agents
For me, this is a major step for chatbots which have large numbers of intents (more than 2,000), or where different teams need to manage a particular knowledge area, use case or topic for one chatbot.
This post has really only taken a quick view of the new Dialogflow mega agent functionality. In a later post, I want to investigate leveraging contexts between agents and use a more complex example.
There are still some areas which need work though. The biggest one which springs to mind is that the training pages area of the console for a mega agent needs to support the concept of sub-agents, so that utterances can be assigned to sub-agent intents. It’s still just a beta feature, so hopefully more to come!
In this post, I’m going to look at the new Knowledge Connectors feature in Google Dialogflow. As I look at the features in more detail I’m assuming you understand the more common Dialogflow terms and features – agents, intents & entities.
It’s also important to remember this feature is in beta.
We’ve been working on chatbot projects for two years now, and a large number of our chatbot projects have shared a similar requirement: the ability to answer a large number of questions on a particular subject. This may be to answer technical questions about a product offering or to offer information about a particular service.
Often the information related to these types of questions is held on our chatbot customers’ own websites as FAQ pages, or in specific PDFs or unstructured text documents. These types of knowledge bases can hold large amounts of information, so technically they could provide answers to thousands of chatbot questions.
The challenge for a successful chatbot is utilising this often unstructured information to understand a question and provide the correct answer. To meet this challenge we can look at two approaches: the traditional one, and using the new Dialogflow Knowledge Connectors.
Stepping back a bit, it’s important to briefly go over the traditional approach to creating chatbot conversational ability. There are a number of different chatbot frameworks out there, such as Google Dialogflow, IBM Watson, the Microsoft Bot Framework, Rasa etc., and they all largely use the same concepts. A user submits a voice or text query, and this utterance is matched to an intent, with any entities extracted. The matched intent either provides a static response or relies on some form of application layer to perform the required action and provide the response to the user.
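To make the intent-matching idea concrete, here is a deliberately naive bag-of-words sketch. This is not how Dialogflow (or any of these frameworks) actually classifies intents; the intent names, phrases and threshold are all made up for illustration:

```python
# Toy illustration of intent matching: score an utterance against each
# intent's training phrases and pick the best match above a threshold.
# NOT a real NLU model -- just word overlap.

def score(utterance: str, phrase: str) -> float:
    """Fraction of the training phrase's words present in the utterance."""
    u_words = set(utterance.lower().split())
    p_words = set(phrase.lower().split())
    return len(u_words & p_words) / len(p_words)

def match_intent(utterance, intents, threshold=0.5):
    """Return (intent_name, confidence) for the best-scoring intent,
    or ('fallback', 0.0) if nothing clears the threshold."""
    best_name, best_score = "fallback", 0.0
    for name, phrases in intents.items():
        s = max(score(utterance, p) for p in phrases)
        if s > best_score:
            best_name, best_score = name, s
    if best_score < threshold:
        return "fallback", 0.0
    return best_name, best_score

# Hypothetical mini agent with two intents:
intents = {
    "opening_hours": ["what are your opening hours", "when are you open"],
    "book_table": ["book a table", "reserve a table for two"],
}

print(match_intent("when are you open on sundays", intents))
```

In a real framework the scoring function is a trained machine learning model rather than word overlap, but the overall shape (utterance in, best intent plus confidence out, fallback below a threshold) is the same.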
This approach can be easy. However, things can get complex and difficult to manage if the scope of intents is very large and/or the information is constantly being updated. If we want to support questions with knowledge base information, then each question needs to be created as an intent and the correct response formulated. This can lead to problems such as:
- Growth of the intent classification model causing more incorrect classifications.
- The effort required to keep adding training data to the model to ensure that the accuracy of the intent classification remains high. Fortunately, Dialogflow provides a training UI in the web console to help keep track of misclassified utterances, analyse them and add them to the training data; however, this does take time.
- Creating and managing intents to support new information in document stores.
Enter Knowledge Connectors
Knowledge connectors are a beta feature, released in 2019, which complement the traditional intent approach. When your agent doesn’t match an incoming user query to an intent, you can configure it to look in the knowledge base(s) for a response.
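Conceptually, the routing just described looks like the sketch below. This is my own simplification, not Dialogflow’s internals; the threshold and the shapes of the inputs are illustrative:

```python
# Sketch of the routing described above: try the intent classifier first,
# then fall back to knowledge base answers if no intent clears the
# threshold. Names and the threshold value are illustrative only.

INTENT_THRESHOLD = 0.7

def route(intent_result, knowledge_answers):
    """intent_result: (intent_name, confidence) from the classifier.
    knowledge_answers: list of (answer_text, match_confidence) pairs
    returned by the knowledge base lookup."""
    name, confidence = intent_result
    if name != "fallback" and confidence >= INTENT_THRESHOLD:
        return ("intent", name)
    if knowledge_answers:
        # Pick the knowledge answer with the highest match confidence.
        best_answer, _ = max(knowledge_answers, key=lambda a: a[1])
        return ("knowledge", best_answer)
    return ("fallback", None)

# A matched intent wins; otherwise the best knowledge answer is used:
print(route(("opening_hours", 0.92), []))
print(route(("fallback", 0.0), [("Apply through UCAS.", 0.97)]))
```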
The knowledge data source(s) can be a document (currently supported content types are text/csv, text/html, application/pdf and plain text) or a web URL which has been provided to the Dialogflow agent.
Using Knowledge Connectors
To be able to use knowledge connectors, you will need to click “Enable beta features and APIs” on your agent’s settings page.
It’s also worth mentioning that knowledge connector settings are not currently included when exporting, importing or restoring agents. I’m hoping this is something currently being put in place by the Dialogflow team.
Knowledge connectors can be configured for your agent either through the web console or using the client library, which is available in Java, Node.js and Python. You can also configure them from the command line.
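As a sketch of the API shape, creating a knowledge base programmatically boils down to a single POST to the v2beta1 REST endpoint with a display name in the body. The project ID and display name below are made up, and authentication (an OAuth2 bearer token) is omitted; this just builds the request rather than sending it:

```python
# Sketch: building the request for the Dialogflow v2beta1 REST API's
# projects.knowledgeBases.create method. The project ID and display
# name are illustrative placeholders; auth is omitted.

def create_knowledge_base_request(project_id: str, display_name: str):
    """Return the (url, json_body) pair for a knowledge base create call."""
    url = (f"https://dialogflow.googleapis.com/v2beta1/"
           f"projects/{project_id}/knowledgeBases")
    body = {"displayName": display_name}
    return url, body

url, body = create_knowledge_base_request("my-project", "FAQ knowledge base")
print(url)
print(body)
```

In practice you would send this with your HTTP client of choice (or just use the official client library), attaching an OAuth2 access token in the Authorization header.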
To create a knowledge base from the web console, log in to Dialogflow and go to the Knowledge tab. The process is fairly straightforward and involves providing a knowledge base name, then adding a document to the knowledge base.
After you’ve done that, you just need to add an intent to return the response. It’s also worth keeping in mind that you can send all the usual response types, including rich responses, which I think is pretty cool.
Trying out knowledge connectors
OK, so it’s time to try out these wondrous new knowledge connectors. There are two different knowledge base document types: FAQ and Extractive Question Answering. These choices govern which supported content can be used. There are also a number of caveats for each content type, which you can read more about here.
Based on these two document types, I looked at a couple of common use cases which we often encounter at The Bot Forge and which correlate well with the supported document types:
- Chatbot FAQ functionality using an existing FAQ webpage in a fairly structured format to provide answers from.
- Chatbot FAQ functionality using information in an unstructured format to provide answers from.
I carried out my tests using a blank Dialogflow agent with beta features enabled.
1 – An FAQ Knowledge Base (Knowledge Type: FAQ)
For my knowledge base I used the UCAS frequently asked questions webpage, supplying its URL as my data source. Dialogflow processes the URL, which is in the correct format, and creates a series of question/answer pairs which can be enabled or disabled in the console. Pretty neat!
So, giving this a spin, my first test was “how do I apply” and the result was spot on:
matchConfidenceLevel: HIGH matchConfidence: 0.97326803
Whilst different variations on the same question also returned a good result.
"im not sure how to apply" matchConfidenceLevel: HIGH matchConfidence: 0.9685159
"can you tell me about how I can apply" matchConfidenceLevel: HIGH matchConfidence: 0.968346
Unfortunately, when I tried something a bit less obvious I got an incorrect result, as it matched the wrong intent.
"how do I submit my application" matchConfidenceLevel: HIGH, matchConfidence: 0.9626459
In this case it’s matching the “How can I make a change to my application” intent with high confidence, but it’s the wrong intent. Normally the fix would be to fine-tune the model and re-assign the training phrase (utterance) to the intended intent; the limitation is that in the knowledge base you can’t fine-tune responses. If you want more control, you will need to move this FAQ over to its own intent.
This problem is compounded by the fact that the training feature of the console just lists each response intent as “Default Fallback Intent”, so it’s hard to check which responses have been answered incorrectly. One way around this is to look in the History area of the console at the raw interaction log of each response.
One really useful feature is that you can convert a specific extracted FAQ from the knowledge document into an intent: just click View Detail in the document list, select the question and click the Convert to Intents button. This creates a new intent and disables the current question/answer pair. So, overall, pretty impressive: if you have a webpage or document of structured FAQs, you can use this to power an FAQ chatbot pretty effectively with some monitoring.
2 – A more unstructured FAQ Knowledge Base (Knowledge Type: Extractive Question Answering)
In this use case, I wanted to try out the ability of the knowledge connectors to return answers from more unstructured data.
Again, there are caveats about which data sources you can use; you can read more about this here.
For my test, I used a standard drug leaflet (a PDF) covering Priorix, from www.medicines.org.uk. I created a new knowledge base, added a new document and made sure I selected the knowledge type “Extractive Question Answering”. Once imported, the PDF is listed in the document list. My aim was to see whether Dialogflow could extract some fairly simple answers from the document. Now for some testing:
"What is Priorix" matchConfidenceLevel: HIGH matchConfidence: 0.88257504 answer: "Priorix, powder and solvent for solution for injection in a pre-filled syringe Measles, Mumps and Rubella vaccine (live)"
Unfortunately, although the response had a high confidence level and match score, it was actually incorrect. Ideally, the answer should have been:
“Priorix is a vaccine for use in children from 9 months up, adolescents and adults to protect them against illnesses caused by measles, mumps and rubella viruses.”
I tried another test:
"how is priorix given" matchConfidenceLevel: HIGH, matchConfidence: 0.8826 answer: The other ingredients are: Powder: amino acids, lactose (anhydrous), mannitol, sorbitol
Again this was an incorrect response. I would have expected the correct response to be:
“How Priorix is given
Priorix is injected under the skin or into the muscle, either in the upper arm or in the outer thigh.”
So, unfortunately, not great results in extracting answers from the PDF I used. It would be interesting to look at a selection of other types of documents and corpora.
Do Knowledge Connectors work?
Again, it’s important to point out that this is a beta feature. There are definitely challenges, and in some functional areas there is much more to be done with knowledge connectors. It’s also important to recognise that the two use cases and knowledge base document types I looked at produced very different results, so it’s worth looking at each one separately.
Chatbot FAQ functionality using an existing FAQ webpage in a fairly structured format.
If you want to convert your FAQ page into a chatbot, or if you have a similarly structured document such as a PRFAQ for a product or service, then knowledge connectors work well.
Just supplying the URL of the FAQ page as a data source to the knowledge connectors is fantastic and provides fairly good results. However, it’s worth keeping in mind that there may still be match errors, so the history log is invaluable for checking for them. Thankfully, it’s fairly easy to manage any question/answer pair which has been handled incorrectly by converting it to its own intent.
Chatbot FAQ using a document in an unstructured format.
I found my test results with this use case rather disappointing: the accuracy of the extracted answers was fairly poor, although you may get better results with different document sources.
The extracted answers look more like matches based on keywords with some additional coverage; they do not appear to consider the context in which the question is asked. Also, this type of knowledge connector does not provide the full control that intents give you in terms of context, priority of matching training phrases and so on, so there is no way of fixing bad responses. A feature where you can evaluate and train responses would be a great addition to the knowledge base, so hopefully that is in the Dialogflow team’s pipeline.
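In the absence of a built-in evaluation feature, a simple harness (entirely my own sketch, with made-up test cases; the `get_answer` callable stands in for a real detect-intent call) can at least track what fraction of test questions get an expected answer:

```python
# Sketch of a tiny evaluation harness for knowledge connector answers.
# The canned answers and test cases below are illustrative stand-ins
# for real Dialogflow detect-intent calls.

def evaluate(test_cases, get_answer):
    """test_cases: list of (question, expected_substring) pairs.
    get_answer: callable mapping a question to the bot's answer text.
    Returns the fraction of questions whose answer contains the
    expected substring (case-insensitive)."""
    correct = 0
    for question, expected in test_cases:
        if expected.lower() in get_answer(question).lower():
            correct += 1
    return correct / len(test_cases)

# Stand-in for the bot: one good answer, one keyword-only mismatch.
canned = {
    "what is priorix": "Priorix is a vaccine for use in children",
    "how is priorix given": "The other ingredients are: Powder",
}
cases = [
    ("what is priorix", "vaccine"),
    ("how is priorix given", "injected under the skin"),
]
print(evaluate(cases, lambda q: canned[q]))
```

Run regularly against the history log, something like this would make regressions in knowledge base answers much easier to spot.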
Should I use Dialogflow Knowledge Connectors?
If you have some FAQ information in a structured format then Knowledge connectors are worth a try with some caveats.
If you have unstructured documents which you want your chatbot to use to extract answers to questions, then at the moment knowledge connectors are not a magic bullet. It’s a big ask, but for me this is where the real value will lie, particularly if you want to support large knowledge bases with a chatbot. Knowledge connectors are an experimental feature, so hopefully they will improve as the technology advances.
Dialogflow is Google’s platform for building human-computer interactions based on natural language conversations. At The Bot Forge, Dialogflow is our platform of choice for chatbot construction.
There are three main reasons why we’re amongst companies such as Domino’s and Ticketmaster who make Dialogflow their chatbot platform of choice.
- Flexible coding: Thanks to Dialogflow’s in-line code editor, the time taken to complete code-related tasks is quicker than with other platforms. The prime benefit here is that we’re then able to spend more time perfecting the conversational experience.
- Scalability: Whether you start with 1,000 or 100,000 users, the platform can scale to your needs. As Dialogflow is hosted on the Google Cloud Platform, this allows the potential to support a user base of hundreds of millions, if required.
- Inbuilt machine learning: Arguably the biggest benefit of the platform in comparison to others is the availability of machine learning and natural language processing technologies. Access to these features allows us to create a richer and more natural conversational experience for your users. Dialogflow makes this possible by allowing us to extract data from a given conversation in order to train our agents to understand user intents. Plus, as the technologies are already built into the platform, we’re able to construct your application much faster.
To ensure that we’re using the right platform for our clients’ needs, we continuously refresh our knowledge of other bot construction tools, such as the Microsoft Bot Framework. A benefit of that platform from a developer’s perspective is the availability of templates, which allow for more time-efficient development. IBM Watson Assistant is another platform a developer may favour, as testing the bot is simpler than on other competing platforms. If a priority is to offer your bot across a wide range of locations, Recast.AI may be a good option for its availability on 14 different platforms.
But these platforms aren’t without their weaknesses. Unlike Dialogflow, the Microsoft Bot Framework lacks the tools which help to create the “brains” of the bot, which are important for the sophistication that users are beginning to expect. A downside of IBM Watson Assistant is the unintuitive relationship between intents (representations of the user’s meaning) and entities (expressions recognised in categories). If you’re interested in how Dialogflow utilises intents and entities, we will be covering this in a future blog post.
Although we understand that there are features of other platforms which can make the development process more efficient, the inbuilt machine learning features of Dialogflow mean we can deliver a bot that produces a much richer conversational experience.
The Dialogflow team announced that they would be deprecating the V1 version of the Dialogflow API in October 2019.
The Bot Forge have been following the progress of the latest V2 API since its official launch in April this year, and it’s no surprise that the Dialogflow team have made this announcement as they concentrate their efforts on the new API. However, it does have some serious implications for existing chatbots utilising the V1 API.
You can see some more details about upgrading from V1 to V2 in the official guide here. We also aim to provide some more detailed information about carrying out an upgrade on this blog so watch out for that.
Anyone who has already built website chatbots using the V1 API should start planning for the migration sooner rather than later, and any new features should be added after the upgrade. The migration is potentially a non-trivial task, considering some chatbots have fairly complex code driving their fulfilment. If you have a live bot in production, our advice is to set up an upgrade chatbot as a copy of your existing bot project and work through the upgrade there. Changing to V2 is likely to break fulfilment and API calls, so once the upgrade is complete, re-testing all bot functionality is strongly advised before setting it live.
Chatbot Web Interfaces
We would recommend that everyone creating custom website chatbots does so using the V2 API; all our new chatbots are built with it.
The big change in V2 is that it uses Google’s OAuth2 for authentication; with V1 you could simply use the client access token when calling the API. Implementing the features required to authenticate against the new V2 API means some significant extra development effort.
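To illustrate the difference in sketch form (both token values below are placeholders, not real credentials): V1 accepted a static client access token copied from the agent’s settings page, while V2 needs a short-lived OAuth2 access token minted from service-account credentials, for example via the google-auth library:

```python
# Sketch of the authentication change between V1 and V2.
# Both token values passed in are placeholders, not real credentials.

def v1_headers(client_access_token: str) -> dict:
    # V1: a static token copied from the agent's settings page.
    return {"Authorization": f"Bearer {client_access_token}"}

def v2_headers(oauth2_access_token: str) -> dict:
    # V2: a short-lived OAuth2 token. In practice it would come from
    # service-account credentials, e.g. with google-auth:
    #   credentials, _ = google.auth.default(
    #       scopes=["https://www.googleapis.com/auth/cloud-platform"])
    #   credentials.refresh(google.auth.transport.requests.Request())
    #   token = credentials.token
    return {"Authorization": f"Bearer {oauth2_access_token}"}

print(v1_headers("PLACEHOLDER_CLIENT_TOKEN"))
```

The header shape is the same in both cases; the extra effort lies in obtaining and refreshing the OAuth2 token rather than in how it is sent.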
If you need assistance or advice with your own chatbot v2 upgrade please get in touch, we are Dialogflow experts and would be happy to help!
At the Bot Forge, we specialise in building chatbots, book a free consultation to learn more.