Have you ever been stuck at work while a pulsating cricket match was going on? You need to meet a deadline but you just can’t concentrate because your favorite team is locked in a fierce battle for a playoff spot. Sound familiar?
I’ve been in this situation a lot in my professional career, and checking my phone every 5 minutes was not really an option! Being a data scientist, I looked at this challenge through the lens of an NLP enthusiast. A chatbot that could fetch me the scores from the ongoing IPL (Indian Premier League) tournament would be a lifesaver.
So I did just that! Using the awesome Rasa stack for NLP, I built a chatbot that I could use on my computer anytime. No more looking down at the phone and getting distracted.
And the cherry on top? I deployed the chatbot to Slack, the de facto platform for team communication. That’s right – I could check the score anytime without having to visit any external site. Too good an opportunity to pass up, right?
In this article, I will guide you on how to build your own Rasa chatbot in minutes and deploy it in Slack. With the ICC Cricket World Cup around the corner, this is a great time to get your chatbot game on and feed your passion for cricket without risking your job.
Table of Contents
- Why should you use the Rasa Stack for Building Chatbots?
- Anatomy of our IPL Chatbot
- Extracting User Intent from a Message
- Making Interactive Conversations
- Talking to our IPL Chatbot
- Getting IPL Data using CricAPI
- Bringing our Chatbot to Life (Integrating Rasa and Slack)
Why should you use the Rasa Stack for Building Chatbots?
The Rasa Stack is a set of open-source NLP tools focused primarily on chatbots. In fact, it’s one of the most effective and time efficient tools to build complex chatbots in minutes. Below are three reasons why I love using the Rasa Stack:
- It lets you focus on improving the “Chatbot” part of your project by providing readymade code for other background tasks like deploying, creating servers, etc.
- The default setup of Rasa works really well right out of the box for intent extraction and dialogue management, even with limited training data
- Rasa stack is open-source, which means we know exactly what is happening under the hood and can customize things as much as we want
These features differentiate Rasa from other chatbot building platforms, such as Google’s DialogFlow. Here’s a sneak peek into the chatbot we’ll soon be building:
Anatomy of our IPL Chatbot
Let’s understand how our Rasa powered IPL chatbot will work before we get into the coding part. Understanding the architecture of the chatbot will go a long way in helping us tweak the final model.
Overview of the Rasa Chatbot
There are various approaches we can take to build this chatbot. How about simply using the quickest and most efficient method? Check out a high-level overview of our IPL chatbot below:
Let’s break down this architecture (keep referring to the image to understand this):
- As soon as Rasa receives a message from the end user, it tries to predict or extract the “intent” and “entities” present in the message. This part is handled by Rasa NLU
- Once the user’s intent is identified, the Rasa Stack performs an action called action_match_news to get the updates from the latest IPL match
- Rasa then tries to predict what it should do next. This decision is taken considering multiple factors and is handled by Rasa Core
- In our example, Rasa is showing the result of the most recent match to the user. It has also predicted the next action that our model should take – to check with the user whether the chatbot was able to solve his/her query
Setting up the IPL Chatbot
I have created two versions of the project on GitHub:
- Complete Version – This is a complete chatbot that you can deploy right away in Slack and start using
- Practice Version – Use this version when you’re going through this article. It will help you understand how the code works
So, go ahead and clone the ‘Practice Version’ project from GitHub:
git clone https://github.com/mohdsanadzakirizvi/iplbot.git && cd iplbot
Then cd into the practice_version folder:

cd practice_version
A quick note on a couple of things you should be aware of before proceeding further:
- Rasa currently only supports Python version <= 3.6. If you have a higher version of Python, you can set up a new environment in conda using the following command:
conda create -n rasa python=3.6
conda activate rasa
- You will need a text editor to work with the multiple files of our project. My personal favorite is Sublime Text which you can download here
Installing Rasa and its Dependencies
You can use the code below to install all the dependencies of the Rasa Stack:
pip install -r requirements.txt
This step might take a few minutes because there are quite a few files to install. You will also need to install a spaCy English language model:
python -m spacy download en
Let’s move on!
Extracting User Intent from a Message
The first thing we want to do is figure out the intent of the user. What does he or she want to accomplish? Let’s utilize Rasa and build an NLU model to identify user intent and its related entities.
Look into the practice_version folder you downloaded earlier:
The two files we will be using are highlighted above.
- data/nlu_data.md – This is the file where you will save your training data for extracting the user intent. There is some data already present in the file:
As you can see, the format of training data for ‘intent’ is quite simple in Rasa. You just have to:
- Start the line with “## intent:intent_name”
- Supply all the examples in the following lines
Let’s write some intent examples for the scenario when the user wants to get IPL updates:
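For instance, a training block in nlu_data.md could look like the following (the intent name current_matches matches the stories later in this article; the example utterances themselves are just illustrative):

```markdown
## intent:current_matches
- what is the score of the match
- any IPL updates for today
- tell me the latest cricket news
- who won the last match
```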
You can include as many examples as you want for each intent. In fact, make sure to include slangs and short forms that you use while texting. The idea is to make the chatbot understand the way we type text. Feel free to refer to the complete version where I have given plenty of examples for each intent type.
- nlu_config.yml – This file lets us create a text processing pipeline in Rasa. Luckily for us, Rasa comes with two default settings based on the amount of training data we have:
- “spacy_sklearn” pipeline if you have less than 1000 training examples
- “tensorflow_embedding” pipeline if you have a large amount of training data
Let’s choose the former as it suits our example:
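A minimal nlu_config.yml for this choice only needs the language and the pipeline keys:

```yaml
language: "en"
pipeline: "spacy_sklearn"
```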
Training the NLU classifier
If you have made it this far, you have already done most of the work for the intent extraction model. Let’s train it and see it in action!
You can train the classifier by running the following Python command:
python -m rasa_nlu.train -c nlu_config.yml --data data/nlu_data.md -o models --fixed_model_name nlu --project current --verbose
Predicting the Intent
Let’s test how well our model performs by giving it a sample text it hasn’t been trained on. Open an iPython/Python shell and run the following commands:
>>> from rasa_nlu.model import Interpreter
>>> nlu_model = Interpreter.load('./models/current/nlu')
>>> nlu_model.parse('what is happening in the cricket world these days?')
Here is what the output looks like:
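The parse output is a dictionary; a trimmed, illustrative example (the confidence values here are made up) looks something like this:

```json
{
  "intent": {"name": "current_matches", "confidence": 0.97},
  "entities": [],
  "intent_ranking": [
    {"name": "current_matches", "confidence": 0.97},
    {"name": "greet", "confidence": 0.01}
  ],
  "text": "what is happening in the cricket world these days?"
}
```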
Not only does our NLU model perform well on intent extraction, but it also ranks the other intents based on their confidence scores. This is a nifty little feature that can be really useful when the classifier is confused between multiple intents.
Making Interactive Conversations
One of the most important aspects of a chatbot application is its ability to be interactive. Think back to a chatbot you’ve used before. Our interest is naturally piqued if the chatbot can hold a conversation, right?
The chatbot is expected to extract all the necessary information needed to perform a particular task using the back and forth conversation it has with the end user.
Designing the conversational flow
Take a moment to think of the simplest conversation our chatbot can have with a user. What would be the flow of such a conversation? Let’s write it in the form of a story!
Me: Hi
Iplbot: Hey! How may I help you?
Me: What was the result of the last match?
Iplbot: Here are some IPL quick info:
1. The match between Rajasthan Royals and Delhi Capitals was recently held and Delhi Capitals won.
2. The next match is Warriors vs Titans on 22 April 2019
Iplbot: Did that help you?
Me: yes, thank you!
Iplbot: Glad that I could help! :)
Let’s see how we can teach a simple conversation like that to Rasa:
The general format is:
## news path 1        <--- story name, for debugging purposes
* greet               <--- intent detected from the user
  - utter_greet       <--- the action the bot should take
* current_matches     <--- the next intent in the conversation
This is called a user story path. I have provided a few stories in the data/stories.md file for your reference. This is the training data for Rasa Core.
The way it works is:
- Give some examples of sample story paths that the user is expected to follow
- Rasa Core combines them randomly to create more complex user paths
- It then builds a probabilistic model out of that. This model is used to predict the next action Rasa should take
Check out the data/stories.md file in the complete_version of the project for more such examples. Meanwhile, here is a nice visualization of the basic story paths generated by Rasa for our IPL chatbot:
The above illustration might look complicated, but it’s simply listing out various possible user stories that I have taught Rasa. Here are a few things to note from the above graph:
- Except for the START and END boxes, all the colored boxes indicate user intent
- All the white boxes are actions that the chatbot performs
- Arrows indicate the flow of the conversation
- action_match_news is where we hit the CricAPI to get IPL information
Write the following in your stories.md file:
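A couple of story paths in that format could look like this (greet, current_matches, and action_match_news come from the sections above; utter_did_that_help, affirm, and utter_gratitude are illustrative names – check the complete version for the exact ones):

```markdown
## news path 1
* greet
  - utter_greet
* current_matches
  - action_match_news
  - utter_did_that_help
* affirm
  - utter_gratitude

## news path 2
* current_matches
  - action_match_news
  - utter_did_that_help
```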
Now, generate a similar graph for your stories using the following command:
python -m rasa_core.visualize -d domain.yml -s data/stories.md -o graph.html
This is very helpful when debugging the conversational flow of the chatbot.
Defining the Domain
Now, open up the domain.yml file. You will be familiar with most of the features mentioned here:
The domain is the world of your chatbot. It contains everything the chatbot should know, including:
- All the actions it is capable of doing
- The intents it should understand
- The template of all the utterances it should tell the user, and much more
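Under the same naming assumptions as the stories, a minimal domain.yml might look like the following sketch (the file in the complete version is more complete):

```yaml
intents:
  - greet
  - current_matches
  - affirm

actions:
  - utter_greet
  - utter_did_that_help
  - utter_gratitude
  - action_match_news

templates:
  utter_greet:
    - text: "Hey! How may I help you?"
  utter_did_that_help:
    - text: "Did that help you?"
  utter_gratitude:
    - text: "Glad that I could help! :)"
```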
Rasa Core generates the training data for the conversational part using the stories we provide. It also lets you define a set of policies to use when deciding the next action of the chatbot. These policies are defined in the policies.yml file.
So, open that file and copy the following code:
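A policies.yml in the style Rasa Core expects might look like the following (the max_history and threshold values here are illustrative – tune them against the Rasa Core docs for your version):

```yaml
policies:
  - name: KerasPolicy
    max_history: 5
  - name: MemoizationPolicy
    max_history: 5
  - name: FallbackPolicy
    nlu_threshold: 0.4
    core_threshold: 0.3
    fallback_action_name: "action_default_fallback"
```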
Here are a few things to note about the above policies (taken from Rasa Core’s policies here):
- KerasPolicy uses a neural network implemented in Keras to select the next action. The default architecture is based on an LSTM (Long Short Term Memory) model
- MemoizationPolicy memorizes the conversations in your training data. It predicts the next action with confidence 1.0 if this exact conversation exists in the training data, otherwise, it predicts ‘None’ with confidence 0.0
- FallbackPolicy invokes a fallback action if the intent recognition has confidence below nlu_threshold or if none of the dialogue policies predict action with confidence higher than core_threshold
- One important hyperparameter for Rasa Core policies is the max_history. This controls how much dialogue history the model looks at to decide which action to take next
Training the Conversation Model
You can train the dialogue model by running the following Python command:
python -m rasa_core.train -d domain.yml -s data/stories.md -o models/current/dialogue -c policies.yml
This will train the Rasa Core model and we can start chatting with the bot right away!
Talking to your IPL chatbot
Before we proceed further, let’s try talking to our chatbot and see how it performs. Open a new terminal and type the following command:
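With the legacy Rasa stack, the command-line chat can be started with the bare Python module (the model paths match the training commands above):

```shell
python -m rasa_core.run -d models/current/dialogue -u models/current/nlu --endpoints endpoints.yml
```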
Once it loads up, try having a conversation with your chatbot. You can start by saying “Hi”. The following video shows my interaction with the chatbot:
I got an error message when trying to get IPL updates:
Encountered an exception while running action 'action_match_news'. Bot will continue, but the actions events are lost. Make sure to fix the exception in your custom code.
The chatbot understood my intent to get news about the IPL. So what went wrong? It’s simple – we still haven’t written the backend code for that! So, let’s build up the backend next.
Getting IPL Data using CricAPI
We will use CricAPI to fetch IPL-related news. It is free for up to 100 requests per day, which (I hope) is more than enough to satiate your cricket-crazy passion.
You first need to sign up on the website to get access to their API:
You should be able to see your API Key once you are logged in:
Save this key as it will be really important for our chatbot. Next, open your actions.py file and update it with the following code:
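To give you a feel for what goes into actions.py, here is a sketch of the response-formatting part of action_match_news. The helper name summarise_matches is mine, and the payload keys ("matches", "team-1", "team-2", "date") are assumptions about the CricAPI response shape – verify them against the CricAPI documentation and the complete version of the repo:

```python
# Sketch of part of actions.py: the response-formatting logic behind
# action_match_news. In the real custom action, the payload would come
# from a call like
#   requests.get("https://cricapi.com/api/matches", params={"apikey": API_KEY}).json()
# and the summary would be sent back via dispatcher.utter_message(...).

API_KEY = "paste-your-cricapi-key-here"  # your CricAPI key goes here

def summarise_matches(payload, limit=2):
    """Build a short text summary from a CricAPI-style matches payload.

    The keys used here ("matches", "team-1", "team-2", "date") are
    assumptions about the payload shape.
    """
    lines = []
    for i, match in enumerate(payload.get("matches", [])[:limit], start=1):
        lines.append("{0}. {1} vs {2} on {3}".format(
            i, match.get("team-1"), match.get("team-2"), match.get("date")))
    return "Here are some IPL quick info:\n" + "\n".join(lines)

# Illustrative payload mimicking the assumed response shape:
sample = {
    "matches": [
        {"team-1": "Rajasthan Royals", "team-2": "Delhi Capitals",
         "date": "2019-04-22"},
        {"team-1": "Warriors", "team-2": "Titans",
         "date": "2019-04-22"},
    ]
}
print(summarise_matches(sample))
```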
Fill in the API_KEY with the one you got from CricAPI and you should be good to go. Now, you can again try talking to your chatbot. This time, be prepared to be amazed.
Open a new terminal and start your action server:
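The action server ships with the rasa_core_sdk package and can be started like this:

```shell
python -m rasa_core_sdk.endpoint --actions actions
```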
This will activate the server that is running on the actions.py file and will be working in the background for us. Now, restart the chatbot in the command line:
And this time, it should give you some IPL news when asked. Isn’t that awesome? We have already built a complete chatbot without doing any complex steps!
Bringing the Chatbot to Life (Integrating Rasa and Slack)
So we have the chatbot ready. It’s time to deploy it and integrate it into Slack as I promised at the start of this article. Fortunately for us, Rasa handles 90% of the deployment part on its own.
Note: You need to have a workspace in Slack before proceeding further. If you do not have one, then you can refer to this.
Creating a Slack Application
Now that we have a workspace to experiment with, we need an application to attach our bot to. Create the app at https://api.slack.com/apps:
1. Click on “Create App”, give a name to the app, and select your workspace:
This will redirect you to your app dashboard. From there, you can select the “Bots” option:
2. Click “Add a Bot User” –> Give a name to your bot. In my case, I have named it “iplbot”. Now, we need to add it to our workspace so we can chat with it! Go back to the above app dashboard and scroll down to find the “Install App to Workspace” option:
Once you do that, Slack will ask you to “authorize” the application. Go ahead and accept the authorization.
3. Before we are able to connect any external program to our Slack bot, we need to have an “auth token” that we need to provide when trying to connect with it. Go back to the “app dashboard” and select the “OAuth & Permissions” option:
4. This will open the permission settings of the app. Select the “Bot User OAuth Access Token” and save it (I have hidden mine for security reasons). This token is instrumental in connecting to our chatbot.
Our work isn’t over yet. We need one more handy tool to deploy our chatbot to Slack – ngrok. You can download it here: https://ngrok.com/download
We are now one step away from deploying our own chatbot! Exciting times await us in the next section.
Pushing the Chatbot to Slack
We need only five commands to get this done as Rasa takes care of everything else behind the scenes.
- Open your slack_credentials.yml file and paste the “Bot User OAuth Access Token” in place of the Slack token:
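The file follows the credentials format of Rasa’s Slack connector; as a sketch (replace the placeholder with your real token):

```yaml
slack:
  slack_token: "xoxb-your-token-here"
```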
- Go to a new terminal and start the action-server:
- You’ll see that the server is running on port 5055, so let’s point ngrok at this port. Open another terminal and type the following:
ngrok http 5055
This will give an output like the below image:
The highlighted link is the link on the internet that is connected to your computer’s port 5055. This is what ngrok does – it lets your computer’s local programs be exposed on the internet. In a way, this is a shortcut for using a cloud service to deploy your app.
- Open your endpoints.yml file and replace the “http://localhost:5055/webhook” with the above URL like this:
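After the replacement, endpoints.yml should look something like this (the ngrok subdomain below is a placeholder for the one you got in the previous step):

```yaml
action_endpoint:
  url: "https://<your-ngrok-subdomain>.ngrok.io/webhook"
```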
- Deploy the Rasa chatbot using the following command:
python -m rasa_core.run -d models/current/dialogue -u models/current/nlu --port 5002 --connector slack --credentials slack_credentials.yml --endpoints endpoints.yml
You will get a message like this:
Notice that the Rasa Core server is running at port 5002.
- Now, deploy port 5002 to the internet:
ngrok http 5002
- Go to your app dashboard on Slack, click on Event Subscriptions, and then on the “Enable Event Subscriptions” toggle. Under the Request URL field, paste the ngrok URL of your Rasa Core server with the Slack webhook route appended, in this format (replace the ngrok part with your own URL): https://<your-ngrok-subdomain>.ngrok.io/webhooks/slack/webhook
- Under the Subscribe to Bot Events, click on the Add Bot User Event button. It will reveal a text field and a list of events. You can enter terms into this field to search for events you want your bot to respond to. Here’s a list of events that I suggest adding:
Once you’ve added the events, click the Save Changes button at the bottom of the screen.
Now you can just refresh your Slack page and start chatting right away with your bot! Here’s a conversation with my chatbot:
Where should you go from here?
You’ll find the below links useful if you are looking for similar challenges. I have built a Zomato-style chatbot for the restaurant search problem using both Rasa Core and Rasa NLU. I teach this in much more detail in our course on Natural Language Processing.
The links to the course are below for your reference:
- Certified Course: Natural Language Processing (NLP) using Python
- Certified Program: NLP for Beginners
- The Ultimate AI & ML BlackBelt Program
I would love to see different approaches and techniques from our community. Try to use different pipelines in Rasa Core, explore more Policies, fine-tune those models, check out what other features CricAPI provides, etc. There are so many things you can try! Don’t stop yourself here – go on and experiment.
Feel free to discuss and provide your feedback in the comments section below. The full code for my project is available here.
You should also check out these two articles on building chatbots:
- A Guide to Building an Intelligent Chatbot for Slack using DialogFlow API
- Building a FAQ Chatbot in Python, The Future of Information Searching