Building an agent

In Activechat, you can use external NLP providers to build natural language chatbots. There’s simply no point in investing time and money in complex AI and machine learning when advanced solutions from giants like Google, Facebook, or Microsoft are readily available. So, the primary goal for Activechat was to simplify the integration process and make this amazing technology accessible and easy to use.

In this manual, we’ll walk through the process of building a simple Dialogflow agent and connecting it to your chatbot. Each of the intents in your Dialogflow agent will become a skill in Activechat and you will be able to build that skill visually and use our powerful integrations to connect it to your business assets.

Building a Dialogflow agent

Good news – Dialogflow is free until you reach the limit of 180 requests per minute. For many chatbot applications that’s enough – and when you need more, you can easily upgrade to the Enterprise edition with 600 free requests per minute and $0.002 per request over that limit. Refer to the Dialogflow pricing and quotas pages for more info.
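For a quick sanity check of the numbers above, here’s a back-of-the-envelope estimate of Enterprise-edition overage costs at a sustained request rate (our own sketch using the $0.002-per-request figure quoted above; always confirm against the official pricing pages):

```python
def monthly_overage_cost(requests_per_minute, free_rpm=600, price_per_request=0.002):
    """Rough monthly cost of requests above the free 600/min allowance."""
    overage_rpm = max(0.0, requests_per_minute - free_rpm)
    minutes_per_month = 60 * 24 * 30  # assume a 30-day month
    return overage_rpm * minutes_per_month * price_per_request

# A sustained 700 requests/minute would cost roughly:
print(monthly_overage_cost(700))  # 8640.0
```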

To start using natural language in your chatbot, first go to the Dialogflow homepage and click the “Sign up for free” button. Sign in with your Google account and give Dialogflow permission to access your account. Confirm your country and accept the Terms of Use.

Once you do this, the Dialogflow console will open and you will be able to create your first NLP agent. Click “Create agent” to do so.

Choose a name for your new agent (do not use whitespace; replace spaces with underscores if necessary), set the time zone and default language, and click “Create”.

Default Dialogflow intents

When a new agent is created, the list of intents will open (click the “Intents” item in the menu on the left to open it anytime). In the beginning there will be two default intents – the Default Welcome intent and the Default Fallback intent.

The Default Welcome intent will be triggered when users greet the bot or interact with it for the first time (this is similar to the “start” skill in Activechat). You will not need this intent for Activechat integration, so let’s skip it for now.

The Default Fallback intent will be triggered when Dialogflow cannot recognize any of the intents in the user’s utterance. Usually, it contains some pre-defined responses like “Sorry, could you say that again?” or “What was that?” Each intent in Dialogflow can contain multiple responses (we’ll get to this in a moment), and these responses will be chosen randomly to make the chatbot feel more human and less repetitive.

Anatomy of a Dialogflow intent

Let’s start with a simple example and create two intents for our chatbot. One will be telling the current time (we’ll name it “check_time”) and another will return the current date (we’ll name it “check_date”). Later we’ll be building some more advanced features on top of these intents.

Click the “CREATE INTENT” button in the top right corner.

Give your intent a name (in our case – “check_time”). Although Dialogflow allows whitespace in intent names, we advise you to use underscores or dots instead. When you connect your Dialogflow agent to Activechat, intent names are turned into skill and event names, and whitespace can lead to confusion.
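If you already have intents named with spaces, a tiny helper like this (our own illustrative snippet, not part of either product) shows the normalization we have in mind:

```python
import re

def sanitize_intent_name(name: str) -> str:
    """Replace runs of whitespace with underscores and trim the edges."""
    return re.sub(r"\s+", "_", name.strip())

print(sanitize_intent_name("  check  current time "))  # check_current_time
```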

There are six main sections in the intent definition in Dialogflow:

  • Contexts – defines input and output contexts for the intent. We’ll get to this in more detail in the “Context management” section of the manual.

  • Events – allows you to trigger intents from within Dialogflow. Somewhat similar to Activechat’s events, this part is not necessary for Activechat integration, so we’ll not be using it.

  • Training phrases – the heart of your intent definition and food for the machine learning algorithm. Here you should add various utterances that can be used by your customers to express this specific intent.

  • Action and parameters – here you can define which entities can be used in that intent and which action should be associated with that intent. The “action” part is not necessary for Activechat integration, and we’ll discuss parameters in the Defining and using entities part of this manual.

  • Responses – again, not necessary for Activechat integration. Usually, this section contains response phrases that your chatbot should send to the user when this intent is identified, but with Activechat, a chatbot skill will be triggered for each intent instead. If you’re connecting an existing Dialogflow agent that you designed before Activechat, there may already be responses defined. If that’s the case, you can use these responses in your Activechat skills – the response sent by Dialogflow will be available as the $_nlp_speech system attribute in the chatbot.

  • Fulfillment – this part of the intent definition is not used with Activechat integration. It’s for custom-designed chatbots built with Dialogflow and server scripts, and Activechat removes that heavy lifting from the chatbot development process.

For Activechat integration, you will need only the most essential part of the intent definition – training phrases.
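To summarize, an intent can be pictured as a record with those sections. The sketch below is purely illustrative – the field names are ours, not Dialogflow’s actual API schema:

```python
# A rough, illustrative model of an intent definition; the field names
# are ours, not Dialogflow's actual API schema.
check_time_intent = {
    "name": "check_time",
    "contexts": {"input": [], "output": []},  # covered later in this manual
    "events": [],                             # unused with Activechat
    "training_phrases": [                     # the essential part
        "What time is it?",
        "Tell me the time",
        "I want to know the time",
    ],
    "parameters": {},                         # entities, covered later
    "responses": [],                          # optional with Activechat
    "fulfillment": None,                      # unused with Activechat
}

# For Activechat integration, only the training phrases really matter:
print(len(check_time_intent["training_phrases"]))  # 3
```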

Adding training phrases (utterances)

Click the “Add training phrases” link in the “Training phrases” section of the intent definition to add some user utterances that should trigger that intent. Since we’re building the “check_time” intent, the phrases could be something like this:

  • What time is it?

  • Tell me the time

  • I want to know the time

  • … etc

Add as many relevant phrases as possible, but try to avoid utterances that could be confused with other intents in your chatbot conversation. Google will use machine learning to generalize from these phrases and detect the intent automatically even if your customer uses different (but similar) utterances or makes typos.

Type phrases one by one and hit “Enter” to start a new one.

When you add enough training phrases, click the “SAVE” button in the top right corner and give Dialogflow some time to update the database (the gear icon to the left of the agent name will be rotating while the process is running, usually it takes just a couple of seconds).

Testing the intent in the console

Once the training phrases are in place, you can test your new intent in the Dialogflow console (it’s on the right). Type the utterance that you’d like to test (or even click the microphone icon and speak it) and see which intent will be detected.

Notice that the phrase we used for testing (“tell the time”) is not in the list of training phrases that we’ve set in the intent parameters. That’s the power of AI – the correct intent was detected automatically even though there was no direct match. Typos, misspellings, and phrase variations will work similarly.

The more training phrases you add, the better for your agent – AI needs data to work with. Once your chatbot goes live and starts having actual conversations, you will be able to review actual utterances that your customers are using and add them to the list of training phrases.
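For intuition only, here is a deliberately naive word-overlap matcher that mimics what intent detection does with training phrases (Dialogflow’s real machine-learning models are far more capable – this toy cannot handle typos, for instance):

```python
def tokenize(text):
    """Lowercase a phrase and split it into a set of words."""
    return set(text.lower().replace("?", "").split())

INTENTS = {
    "check_time": ["What time is it?", "Tell me the time", "I want to know the time"],
    "check_date": ["What is the date today?", "Tell me the date"],
}

def detect_intent(utterance):
    """Pick the intent whose training phrases share the most words."""
    words = tokenize(utterance)
    def score(phrases):
        return max(len(words & tokenize(p)) for p in phrases)
    return max(INTENTS, key=lambda name: score(INTENTS[name]))

# "tell the time" is not a training phrase, yet it still matches:
print(detect_intent("tell the time"))  # check_time
```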

Adding simple Dialogflow responses

Although this is not necessary for Activechat integration, let’s add some responses to our intent to see how it works. Open the “Responses” section in the intent definition and add a couple of phrases. Don’t forget to click “SAVE” when you’re done!

Repeat the process for the second intent – “check_date”. Click the “Intents” item in the menu on the left to return to the list of your intents and click “CREATE INTENT” again.

Connecting Dialogflow to Activechat

When you’re done with building a simple Dialogflow agent, it’s time to switch back to Activechat and connect that agent to your actual chatbot.

First, switch from our built-in NLP engine to Dialogflow in the bot settings:

Next, go to “Automation - Dialogflow” in the main menu (on the left) and click the blue “Log in” button to connect your Dialogflow agent to Activechat.

Again, authorize with your Google account (make sure it’s the same account that you’ve been using when building a simple Dialogflow agent before) and give Activechat permission to access your Dialogflow agents (click “Allow”).

Once you do this, you will be able to see the list of your Dialogflow agents in the “Connected to:” dropdown list in the top left corner of the integration settings screen. Choose the agent that you’ve just created from the list.

There will be a warning that connecting a Dialogflow agent will create skills in your Activechat bot for every intent that you have in Dialogflow. That’s exactly how the integration works – Activechat sends the user’s message to your Dialogflow agent to detect the intent, and then triggers one of these skills to provide a response to that intent. This allows you to use the power of AI and NLP to understand what the user wants, and then use the simplicity of the visual chatbot builder to handle the conversation and fulfill that intent, delivering actual value to the user.

Click “Create skills” and a list of all your Dialogflow intents will appear. For each of these intents, Activechat will create an event and automatically create a skill to handle that event. The skill and event names will mimic the intent names in Dialogflow, prefixed with “_DF_” to help you filter them in the list of skills. Once an intent is detected, its skill will be triggered automatically.

You can click the event names to open related skills in the visual builder. Let’s click “_DF_check_time” to see the skill.

The skill is very simple – just two blocks: CATCH to listen to the event from Dialogflow, and TEXT to display the response text obtained from Dialogflow. This simple structure allows you to start using your existing Dialogflow agents immediately. But once you want to customize the conversation triggered by a specific intent, you can delete the TEXT block and build the conversation visually, with all the power provided by Activechat.

IMPORTANT: when you make changes to your Dialogflow agent, don’t forget to click the “Refresh intents” button in the integration settings window to make Activechat aware of the changes made. Do this when you add new intents or update existing ones with new entities. Refreshing is NOT necessary when you add more training phrases to the agent in Dialogflow.

When you remove the connection between Activechat and Dialogflow (or switch to another Dialogflow agent in the integration settings), existing skills that were created (when the connection was established) will NOT be removed. Delete them manually if you do not need them anymore for your new agent.

Sending user messages to Dialogflow

There’s one important step left before our Dialogflow-to-Activechat integration goes live – we have to start sending users’ messages to Dialogflow for intent detection. This is done in the “default” skill, which is triggered automatically every time there’s a new message from the user to the chatbot.

Open the “default” skill in the visual builder and add the NLP block there. Use the $_last_user_input system attribute to pass the message received by the chatbot to the connected Dialogflow agent.

Click the “RUN” button in the visual builder to deploy your new skills to the cloud, and you’ll be ready to test your new natural language understanding chatbot.

Testing the integration

Open the conversation with your new chatbot on Messenger (you can use the “Test your chatbot” button at the top of the visual builder) and try some phrases that we’ve used for intents while building a simple Dialogflow agent.

Notice how Dialogflow responses are being forwarded to the bot user through Activechat. Here’s what’s going on:

  1. When the user sends a message to the chatbot, the “default” skill is triggered.

  2. The NLP block that we’ve added to the “default” skill sends that message (from the $_last_user_input system attribute) to the connected Dialogflow agent.

  3. An intent is detected by Dialogflow and a corresponding event is fired into Activechat.

  4. The chatbot skill that was automatically created by the Dialogflow integration to process that event is triggered.

  5. A TEXT block in that skill sends the content of the $_nlp_speech system attribute, which contains the response from the Dialogflow intent.
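The five steps above can be sketched end-to-end as a toy simulation (all names and the trivial matcher are our own stand-ins for the real Activechat blocks and the Dialogflow API):

```python
# Toy end-to-end simulation of the message flow; all names are ours.
AGENT = {  # intent -> (training phrases, canned Dialogflow response)
    "check_time": (["what time is it", "tell me the time"], "It's tea time!"),
    "check_date": (["what is the date", "tell me the date"], "It's today!"),
}

SKILLS = {}  # event name -> handler, like the auto-created "_DF_..." skills

def skill(event_name):
    def register(handler):
        SKILLS[event_name] = handler
        return handler
    return register

@skill("_DF_check_time")
@skill("_DF_check_date")
def echo_nlp_speech(attributes):
    # Step 5: the TEXT block sends $_nlp_speech back to the user.
    return attributes["_nlp_speech"]

def detect_intent(text):
    # Step 3 stand-in: naive word overlap instead of real Dialogflow ML.
    words = set(text.lower().strip("?!.").split())
    return max(AGENT, key=lambda i: max(len(words & set(p.split()))
                                        for p in AGENT[i][0]))

def default_skill(last_user_input):
    # Steps 1-2: every user message goes through the NLP block.
    intent = detect_intent(last_user_input)
    # Step 4: the matching "_DF_" skill fires with $_nlp_speech set.
    return SKILLS["_DF_" + intent]({"_nlp_speech": AGENT[intent][1]})

print(default_skill("What time is it?"))  # It's tea time!
```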

Adding Activechat features to Dialogflow chatbot

Now let’s improve this simple bot slightly by adding some Activechat features to our simple intents. We’ll add actual date/time data so that the responses from the bot bring real value to the user.

All we have to do is use some system attributes for date and time in our “check_date” and “check_time” skills. Here’s the actual skill for the “check_time” intent:

What’s happening here? Once the intent is detected by Dialogflow, the skill will be triggered and the TEXT block will display the current time to the chatbot user. Since there are no blocks connected to that TEXT block, the chatbot will go into idle mode after this, so once the user sends another message, the “default” skill will be triggered again and the new message will be sent to Dialogflow for intent detection.
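As an illustration of what the skill’s TEXT block produces, here is a sketch that formats the current time the way a template like “The current time is $_time” might (the attribute name and the exact format are our assumptions):

```python
from datetime import datetime

def check_time_response(now=None):
    """Mimics a TEXT block template like "The current time is $_time"."""
    now = now or datetime.now()  # the bot would substitute its own clock
    return "The current time is " + now.strftime("%H:%M")

print(check_time_response(datetime(2020, 5, 4, 9, 30)))  # The current time is 09:30
```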

Notice that we’re not using the $_nlp_speech attribute anymore. It’s just not needed since the actual bot response is completely generated within Activechat. So, you don’t have to set responses for your intents in Dialogflow – $_nlp_speech is used only if you wish to connect an existing Dialogflow agent with responses already set there.

Obviously, this is a very simple example. Use the power of the visual chatbot builder to create complex conversations for each of your intents (if necessary). For example, your “order_pizza” intent can trigger a complex skill that will ask the user a number of questions and send the order data to your chatbot back-end (like Google Sheets or custom JSON endpoint).

Overall integration overview

So, the overall process for building a smart and valuable natural language chatbot usually looks like this:

  1. Build a list of intents that the chatbot should be able to help with.

  2. Prepare as many utterances for each of the intents as possible.

  3. Build skills in Activechat for each of these intents.

  4. Create a Dialogflow agent and prepare the intents there.

  5. Connect the agent to Activechat and add the NLP block to your “default” skill.

We’ll dive deeper into more complex NLP tech in the next part of this manual – stay tuned!
