What is Conversational UI

So what is Conversational UI? Historically, interfaces have been made up of visual elements: buttons, dropdown lists, date pickers, carousels. Now we have the technology to provide conversational interfaces as well, via voice- or text-enabled platforms such as Amazon Echo or Facebook Messenger. Some applications leverage the best of both worlds, combining traditional UI elements with conversational capability; Facebook Messenger is a good example.

The most important advancement in Conversational UI has been Natural Language Processing (NLP). This is the field of computing that deals with deciphering the exact words a user says and parsing out of them their actual intent and context. You can read about some of the terminology here. If the bot is the interface, NLP is the brain behind its conversational ability.
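To make this concrete, here is a toy sketch of the intent-and-entity parsing an NLP engine performs. Real platforms (Dialogflow, LUIS, Rasa and the like) use machine-learned models rather than keyword patterns, and the intent names and time regex below are illustrative assumptions, but the basic shape is the same: map raw text to an intent plus any entities found in context.

```python
import re

# Illustrative intent patterns -- a real NLP engine would use a
# trained statistical model instead of hand-written regexes.
INTENT_PATTERNS = {
    "book_appointment": re.compile(r"\b(book|schedule|appointment)\b", re.I),
    "check_weather": re.compile(r"\b(weather|forecast|rain)\b", re.I),
}

def parse(utterance):
    """Return (intent, entities) for a user utterance."""
    for intent, pattern in INTENT_PATTERNS.items():
        if pattern.search(utterance):
            # Naive entity extraction: pull out a time like "3pm" if present.
            time_match = re.search(r"\b\d{1,2}(:\d{2})?\s?(am|pm)\b",
                                   utterance, re.I)
            entities = {"time": time_match.group(0)} if time_match else {}
            return intent, entities
    return "fallback", {}

print(parse("Can you book me a haircut at 3pm?"))
# -> ('book_appointment', {'time': '3pm'})
```

A production system would also track conversation state, so the same sentence can resolve to different intents depending on what was said before.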

Natural Conversational Experiences

The Bot Forge creates natural conversational experiences for voice and text-based applications leveraging the latest AI technology.

Our team is made up of natural language processing (NLP) and natural language understanding (NLU) experts, artificial intelligence specialists, conversational architects, project managers, and interaction designers, all focused on forging engaging voice and text-based Conversational UI powered by NLP.

Our conversational interfaces can be deployed on websites, mobile applications, messaging applications and voice-enabled devices.

We use the most advanced machine learning technology to power our solutions so that recognising user intent and context works reliably and seamlessly. Our goal is to create awesome conversational experiences.

Google gave an amazing demo of their Google Assistant making a phone call.


Onstage at I/O 2018: Showcasing Google Assistant


On the first day, Google showed some of the amazing new capabilities of Google Assistant. One of them is the ability to make phone calls on your behalf. You ask Google Assistant to make an appointment, and it makes the call in the background. The demo has to be seen to be believed.

CEO Sundar Pichai played back a phone call recording that he said was placed by the Assistant to a hair salon to book an appointment.

The voice sounded totally natural; the person at the salon had no idea they were talking to an automated AI assistant. The Assistant even managed some small talk, dropping an "mmhmm" into the conversation.

Pichai reiterated that this was a real call using Assistant and not some staged demo. "The amazing thing is that Assistant can actually understand the nuances of conversation," he said. "We've been working on this technology for many years. It's called Google Duplex." Pichai also made the point that Duplex was still under development and that Google plans to conduct early testing of Duplex inside Assistant this summer. Their stated aim: "The technology is directed towards completing specific tasks, such as scheduling certain types of appointments."

Google has a blog post with more Duplex information here, which includes many more examples of Duplex in action using different voices; a really interesting one, for example, is a call to a restaurant to book a table.

Google again states that these are real-world examples:

“While sounding natural, these and other examples are conversations between a fully automatic computer system and real businesses.”

This post also does a good job of highlighting some of the real complexities of conducting a conversation successfully, with many sentences having different meanings depending on the current context. Early in one conversation, the Assistant also handles misinterpretation when the person called mentions a table number based on something she has misheard. Google Assistant seems to handle this perfectly.

This looks set to be groundbreaking technology:

For users, Google Duplex is making supported tasks easier. Instead of making a phone call, the user simply interacts with the Google Assistant, and the call happens completely in the background without any user involvement.

We are looking forward to seeing more of it in summer and using the technology in our projects.

Google has also announced the rebranding of its Google Research division to Google AI. The move shows how Google has increasingly focused its R&D on natural language processing and neural networks.

It looks like Google are setting their sights on being the world’s biggest artificial intelligence (AI) company.