Everything looks promising in the world of bots: big players are pushing platforms to build them (Google, Amazon, Facebook, Microsoft, IBM, Apple), large retail companies are adopting them (Starbucks, Domino’s, British Airways), the press is excited about movies becoming reality, and we users are eager to try them. However, one dark spot remains in this picture: the bot development process.
To automate the generation of training data for conversational bots, we combine our Natural Language Generation solution, which automatically expands a sample sentence into hundreds of variations, with our slot generation technology, which automatically tags each sentence with the relevant intents and entities. Below in this post, we explain the process in three steps.
Conversational bot development takes time, and the final delivery of a bot with good understanding is not guaranteed. This happens because creating a bot relies on manual work, which is time-consuming and error-prone. We end up with expensive projects that are hard to monetize and unhappy customers who feel disengaged.
One of the key areas in bot development is bot training: making the bot understand user requests so it can match them to answers accurately. Training involves feeding the bot different variations of what users may say, and hand-tagging the relevant information or entities. For example, the sentence "turn on the lights in the living room" can be asked in different ways:
- turn on the lights in the living room
- turn on the living room lights
- I’d like to turn on the lights in the living room
- can you turn on the living room lights?
- please, turn on the living room lights
For each sentence, we have to hand-tag “turn on” as the action to be performed, “lights” as an object, and “living room” as a place.
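To make the hand-tagging concrete, here is a minimal sketch of what such annotations could look like as data. The field names (`intent`, `entities`, `type`) and the intent label `switch_on` are illustrative assumptions, not Bitext's actual schema:

```python
# Hypothetical hand-tagged training example for the "turn on the lights"
# request; field names and the intent label are assumptions for illustration.
examples = [
    {
        "text": "turn on the living room lights",
        "intent": "switch_on",
        "entities": [
            {"value": "turn on", "type": "action"},
            {"value": "lights", "type": "object"},
            {"value": "living room", "type": "place"},
        ],
    },
]

def find_span(text, value):
    """Locate an entity's character offsets inside the sentence."""
    start = text.index(value)
    return start, start + len(value)

# Attach character offsets so the tags are unambiguous even when an
# entity string could appear more than once.
for ex in examples:
    for ent in ex["entities"]:
        ent["start"], ent["end"] = find_span(ex["text"], ent["value"])

print(examples[0]["entities"][2])  # the "living room" place entity
```

Doing this by hand for every variation of every intent is exactly the repetitive work that makes manual bot training so slow.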
Imagine how much we could reduce the training time if we were able to teach the bot that all these requests are variations of the same intent and have the same meaning. Bitext NLP middleware for bot training automates corpus creation and collection, as well as the manual coding of the hundreds of sentences your machine learning algorithm needs to train your chatbot.
How do we do it?
1. Step one: take the original sentence, or a description such as “I want my assistant to be able to control alarms”:

   enable the alarm

2. Step two: expand the sentence into variations:

   enable the alarm
   enable the alarm, please
   can you enable the alarm?
   I want to enable the alarm

3. Step three: automatically tag the sentences with the relevant intents, entities, and attributes such as "polarity": "affirmative".
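The three steps above can be sketched in a few lines of code. This is a deliberately simplified stand-in for Bitext's NLG and slot generation technology: the templates, the intent label `control_alarm`, and the tag field names are all assumptions made for illustration.

```python
import json

# Minimal sketch (not Bitext's actual NLG engine): expand a seed command
# into surface variations with templates, then tag each variation.
TEMPLATES = [
    "{cmd}",
    "{cmd}, please",
    "can you {cmd}?",
    "I want to {cmd}",
]

def expand(seed):
    """Step two: generate surface variations of a seed sentence."""
    return [t.format(cmd=seed) for t in TEMPLATES]

def tag(sentence, action, obj):
    """Step three: attach illustrative tags (field names are assumptions)."""
    return {
        "text": sentence,
        "intent": "control_alarm",
        "action": action,
        "object": obj,
        "polarity": "affirmative",
    }

# Step one: the original sentence.
corpus = [tag(s, action="enable", obj="alarm") for s in expand("enable the alarm")]
print(json.dumps(corpus[1], indent=2))
```

A real system would of course generate far more variations (syntactic, lexical, register) than four fixed templates, but the pipeline shape is the same: seed, expand, tag.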
The resulting tagged corpus can be imported directly into every major bot training platform, such as Api.ai, Wit.ai, LUIS, Lex, Watson, and other machine-learning-powered systems. Through this process, Bitext NLP middleware for bot training reduces bot development time from months to weeks, and it can be integrated with existing bots to quickly expand their level of understanding.
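As a rough illustration of what "importable" means in practice, here is a sketch that converts one tagged example into a LUIS-style utterance record. The exact import schema varies by platform and version, so treat the field names below as an approximation rather than a definitive mapping:

```python
# Sketch of exporting one tagged example to a LUIS-style utterance record.
# Field names approximate LUIS's app-JSON format; verify against the
# current schema of your target platform before importing.
def to_luis_utterance(example):
    return {
        "text": example["text"],
        "intent": example["intent"],
        "entities": [
            {
                "entity": ent["type"],
                "startPos": ent["start"],
                "endPos": ent["end"] - 1,  # LUIS positions are inclusive
            }
            for ent in example["entities"]
        ],
    }

tagged = {
    "text": "enable the alarm",
    "intent": "control_alarm",
    "entities": [{"type": "object", "start": 11, "end": 16}],
}
print(to_luis_utterance(tagged))
```

Each platform wants a slightly different envelope, but since the corpus already carries intents, entities, and offsets, the conversion is a mechanical reshaping rather than more manual tagging.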
Bitext works on improving the understanding between humans and machines, and having great conversational bots that engage users is fundamental to that goal. We believe that the best way to achieve maturity in the bot market is with short, transparent bot development cycles that deliver great results and have a positive impact on revenue from day one of deployment.
You can check my team's publications at Chatbots Magazine here: