Build an AI Research Assistant Using CrewAI and Composio

Sunil Kumar Dash 22 May, 2024
10 min read

Introduction

With every iteration of LLM development, we move closer to the age of AI agents: an age where AI agents are deeply integrated into software workflows, handling everything from personal productivity tasks like scheduling meetings, managing emails, and organizing daily to-do lists, to enterprise operations such as automating customer support, optimizing supply-chain logistics, and enhancing data analysis and decision-making. This lets businesses operate more efficiently, reduce costs, and focus human effort on more creative work. But building agents that actually work, especially in production, is not easy, and much effort is going into the tooling ecosystem that makes agents useful and reliable. This article explores two such tools, CrewAI and Composio, for building useful AI agents using Claude Sonnet.


Learning Objective

  • Understand AI agents.
  • Learn about CrewAI – a tool for orchestrating AI agents.
  • Explore Composio – a platform for integrating tools with agents.
  • Build an AI research assistant with Slack and Notion integration.

This article was published as a part of the Data Science Blogathon.

What are AI Agents?

The term AI agents is being used in every modern AI discourse lately. So, what are AI agents? Agents are pieces of software that can dynamically interact with their environment via a set of tools, and the “AI” in “AI agents” refers to the Large Language Models or Large Multi-modal Models that drive them.

As we know, LLMs possess a condensed form of human knowledge in their weights, which enables them to analyze and reason through a complex task step by step. When LLMs have access to the right tools, they can break down a problem statement and use the right tool to execute each sub-task as and when needed. The best example of this is the ChatGPT app itself: it has access to a code interpreter, the internet, and DALL·E, and based on the given prompt, it decides what to use and when. So, AI agents are LLMs augmented with tools and goals.

However, straightforwardly pairing tools with a single LLM may not be enough for complex tasks. We often need to orchestrate multiple agents, each with definite tools and goals.

What is CrewAI?

CrewAI is an open-source framework for orchestrating multiple AI agents to accomplish complex tasks. It provides a collaborative approach where agents can assume roles, delegate tasks, and share goals, akin to a real-world crew. These are the core features of CrewAI (a minimal code sketch follows the list below).

  • Agents: Agents are autonomous units responsible for reasoning, task delegation, and communicating with other agents, akin to members of a real-world team.
  • Tasks: Tasks are specific assignments given to agents. Each task details the steps and actions an agent needs to take to achieve the required objective.
  • Tools: Tools are necessary for work that is beyond the LLM's native capabilities, such as web scraping, responding to emails, task scheduling, etc.
  • Process: The processes in CrewAI orchestrate the execution of tasks by agents. This ensures tasks are distributed and executed efficiently by agents in a predefined manner. The process is either sequential, where tasks are completed one after another, or hierarchical, where tasks are executed based on a managerial hierarchy.
  • Crews: The Crews in CrewAI are groups of collaborative agents, with their tasks and tools, working towards accomplishing a complex objective.
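To make these concepts concrete, here is a minimal single-agent sketch. The role, task wording, and topic are illustrative placeholders rather than part of the project we build later, and CrewAI will fall back to its default OpenAI model unless you pass an llm explicitly.

from crewai import Agent, Crew, Process, Task

# A hypothetical one-agent crew, just to illustrate the core concepts
writer = Agent(
    role="Tech Writer",
    goal="Summarize a given topic in plain language",
    backstory="You explain technical topics clearly and concisely.",
)

summary_task = Task(
    description="Write a short summary of {topic}.",
    expected_output="A three-sentence summary of {topic}.",
    agent=writer,
)

# Sequential process: tasks run one after another
crew = Crew(agents=[writer], tasks=[summary_task], process=Process.sequential)
result = crew.kickoff(inputs={"topic": "AI agents"})
print(result)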

Here is a mind map for CrewAI.

[Image: CrewAI mind map]

What is Composio?

Composio is a platform that provides 100+ tools, such as Slack, GitHub, and Discord, with actions and triggers to integrate them into AI workflows. It can be used with LangChain, Autogen, and CrewAI to make agents useful and reliable, and it makes it much easier for agents to work with external apps. In this article, we will use the Slack and Notion tools from Composio to build an AI assistant that listens to a Slack channel and writes a full report on the topic to a Notion page. So, let’s hop on to the coding part.

Building an AI Research Assistant

Let’s define the goal of our AI research assistant. The assistant consists of a set of agents working in collaboration to accomplish a task. It listens to a specific Slack channel via a Slack bot, and when a message is sent to the channel, the AI crew springs into action. It takes the Slack message as the main task and distributes work sequentially: a research agent gathers the pressing points, an analyst describes and expands on those points, a Notion agent creates a document and writes the contents to it, and finally a Slack agent responds in the Slack channel with the finished report.

Here’s the visual representation of the workflow.

[Image: AI research assistant workflow]

So, here’s how we will build it.

  • Set up a development environment using Venv or Poetry.
  • Set up the Composio toolset for Notion and Slack.
  • Build agents using CrewAI, Composio toolsets, and Anthropic’s Claude Sonnet (can use other models).
  • Build a Flask server with Ngrok tunneling that listens to the Slack bot.

Step 1: Set Up the Environment

To start, create a virtual environment and install the dependencies listed below.

crewai[tools]==0.*
composio-crewai==0.2.*
composio-langchain==0.2.*
composio-core==0.2.*
flask==3.0.3
python-dotenv==1.0.1
langchain-anthropic

If you use Poetry, clone the repository and run:

poetry install
poetry shell

We also need a secure Ngrok tunnel so the local Flask server can receive triggers from the Slack bot. Ngrok is a secure tunneling service that exposes a local server to the internet for development and testing. Install Ngrok for your operating system; you may need to create an account with them. Once installed, point Ngrok at port 2000, where we will host our Flask server.

ngrok http 2000

This should start an Ngrok tunnel to port 2000.


Also, we need to define a few environment variables: an Anthropic API key, a Slack channel ID, and a trigger ID. Create these variables in a .env file.

ANTHROPIC_API_KEY=
CHANNEL_ID=
TRIGGER_ID=

Add your API key and Slack channel ID. You can find the Slack channel ID in the channel’s URL; it usually starts with “C”. For example, C074RUF0UQ5 is the channel ID of the channel https://app.slack.com/client/T074SRB4FGS/C073RUF0UQ5. We will set the TRIGGER_ID in the next step.

Step 2: Set Up the Tools

Now, we need to set up our Notion and Slack toolsets from Composio. To do so, use the Composio CLI to add the tools.

poetry run composio-cli add notion
poetry run composio-cli add slack

Now, set the trigger callback to the Ngrok URL. This will connect the Slack bot to the Flask server via the Ngrok tunnel. 

poetry run composio-cli set global-trigger-callback "<ngrok-url>"

Replace <ngrok-url> with the .app URL from Ngrok shown in the terminal.

Now, enable the Slack receive trigger. This will enable the Slack bot to receive messages from the Slack channel and send them to the tunneled server.

poetry run composio-cli enable-trigger slack_receive_message

It will output a TRIGGER_ID. Add it to the .env file.

Step 3: Build Agents with CrewAI

Now, let’s build the agents. In this setup, we define four different agents: a researcher to break down the task, an analyst to analyze and expand on it, a Notion agent to write the content to a Notion document, and a Slack agent to send confirmation of the completed task, along with the report, to the Slack channel.

Define Agent Roles and Tasks

First, we need to define the agent roles and tasks. We define these in two YAML config files, one for agents and one for tasks. Here is the config file for agents (config/agents.yaml).

researcher:
  role: >
    {topic} Senior Data Researcher
  goal: >
    Uncover cutting-edge developments in {topic}
  backstory: >
    You're a seasoned researcher with a knack for uncovering the latest
    developments in {topic}. Known for your ability to find the most relevant
    information and present it clearly and concisely.

reporting_analyst:
  role: >
    {topic} Reporting Analyst
  goal: >
    Create detailed reports based on {topic} data analysis and research findings
  backstory: >
    You're a meticulous analyst with a keen eye for detail. You're known for
    your ability to turn complex data into clear and concise reports, making
    it easy for others to understand and act on the information you provide.

notion_agent:
  role: >
    Notion Updater
  goal: >
    You take action on Notion using the Notion API
  backstory: >
    You are an AI agent that is responsible for taking actions on Notion on
    the user's behalf. You need to take action on Notion using the Notion APIs.

slack_agent:
  role: >
    Slack Updater
  goal: >
    You take action on Slack using the Slack API
  backstory: >
    You are an AI agent that is responsible for taking actions on Slack on
    the user's behalf. You need to take action on Slack using the Slack APIs.

Define a similar file for the tasks (config/tasks.yaml).

research_task:
  description: >
    Conduct a thorough research about {topic}
    Make sure you find any interesting and relevant information given
    the current year is 2024.
  expected_output: >
    A list with 10 bullet points of the most relevant information about {topic}.

reporting_task:
  description: >
    Review the context you got and expand each topic into a full section for a report.
    Make sure the report is detailed and contains any and all relevant information.
  expected_output: >
    A fully fledged report with the main topics, each with a full section of information.
    Formatted as markdown without missing anything.

notion_task:
  description: >
    Write a document on Notion summarizing the given content.
  expected_output: >
    A Notion document with a title and contents.

slack_task:
  description: >
    Write a message on the Slack channel 'random' that summarizes the complete
    CrewAI research activity. Write a summary of your findings and attach the report.
  expected_output: >
    A Slack message with a summary of all actions taken and the final output.

Build Agents

Create a new file, client.py, and import the libraries and environment variables.

import os

from composio_crewai import App, ComposioToolset
from crewai import Agent, Crew, Process, Task
from crewai.project import CrewBase, agent, crew, task
from langchain_anthropic import ChatAnthropic


# ANTHROPIC_API_KEY is expected to be in the environment;
# main.py calls load_dotenv() before importing this module.
ANTHROPIC_API_KEY = os.environ.get("ANTHROPIC_API_KEY")

if ANTHROPIC_API_KEY is None:
    print("Please set ANTHROPIC_API_KEY environment variable in the .env file")
    exit(1)

Define the LLM and the Composio toolsets.

llm = ChatAnthropic(
    model_name="claude-3-sonnet-20240229",
    api_key=ANTHROPIC_API_KEY
)

notion_composio_toolset = ComposioToolset(apps=[App.NOTION])
slack_composio_toolset = ComposioToolset(apps=[App.SLACK])

CrewAI provides decorators for agents, tasks, and crew to define them conveniently.

@CrewBase
class ClientCrew:
    """Class representing the Client crew"""

    agents_config = "config/agents.yaml"
    tasks_config = "config/tasks.yaml"

    @agent
    def researcher(self) -> Agent:
        """Create a researcher agent"""
        return Agent(
            config=self.agents_config["researcher"],
            verbose=True,
            llm=llm,
        )

    @agent
    def reporting_analyst(self) -> Agent:
        """Create a reporting analyst agent"""
        return Agent(
            config=self.agents_config["reporting_analyst"], verbose=True, llm=llm
        )

    @agent
    def notion_agent(self) -> Agent:
        """Create a notion agent"""
        return Agent(
            config=self.agents_config["notion_agent"],
            verbose=True,
            tools=notion_composio_toolset,
            llm=llm,
        )

    @agent
    def slack_agent(self) -> Agent:
        """Create a slack agent"""
        return Agent(
            config=self.agents_config["slack_agent"],
            verbose=True,
            tools=slack_composio_toolset,
            llm=llm,
        )

    @task
    def research_task(self) -> Task:
        """Create a research task"""
        return Task(config=self.tasks_config["research_task"], agent=self.researcher())

    @task
    def reporting_task(self) -> Task:
        """Create a reporting task"""
        return Task(
            config=self.tasks_config["reporting_task"],
            agent=self.reporting_analyst(),
            output_file="report.md",
        )

    @task
    def notion_task(self) -> Task:
        """Create a notion task"""
        return Task(
            config=self.tasks_config["notion_task"],
            agent=self.notion_agent(),
            tools=notion_composio_toolset,
        )

    @task
    def slack_task(self) -> Task:
        """Create a slack task"""
        return Task(
            config=self.tasks_config["slack_task"],
            agent=self.slack_agent(),
            tools=slack_composio_toolset,
        )

    @crew
    def crew(self) -> Crew:
        """Create the Client crew"""
        return Crew(
            agents=self.agents, tasks=self.tasks, process=Process.sequential, verbose=2
        )

That’s it. We have defined the required agents, tasks, process, and crew.
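Before wiring this up to Slack, you can sanity-check the crew directly from a Python shell. This is just a quick smoke test, assuming the .env file with your ANTHROPIC_API_KEY sits in the current directory:

from dotenv import load_dotenv

load_dotenv()  # load ANTHROPIC_API_KEY before importing the crew

from client import ClientCrew

# Run the whole pipeline once with a sample topic
ClientCrew().crew().kickoff(inputs={"topic": "open-source LLM agents"})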

Step 4: Create the Flask Server

Now, we need to define a Flask server. The server exposes a single endpoint that receives the message from the Slack bot and kicks the crew into action.

# main.py

import os

from dotenv import load_dotenv
from flask import Flask, request

load_dotenv()

from client import ClientCrew

app = Flask(__name__)

TRIGGER_ID = os.environ.get("TRIGGER_ID", None)
CHANNEL_ID = os.environ.get("CHANNEL_ID", None)

if TRIGGER_ID is None or CHANNEL_ID is None:
    print("Please set TRIGGER_ID and CHANNEL_ID environment variables in the .env file")
    exit(1)

def run_crew(topic: str):
    # Use the Slack message text as the research topic for the crew
    inputs = {"topic": topic}
    ClientCrew().crew().kickoff(inputs=inputs)

async def async_run_crew(channel, text, user):
    # Only run the crew for messages from the configured channel
    if channel == CHANNEL_ID:
        run_crew(text)
    return "Crew run initiated", 200


@app.route("/", methods=["POST"])
async def webhook():
    # Endpoint that receives Slack message triggers from Composio via the Ngrok tunnel
    payload = request.json

    message_payload = payload.get("payload", {})
    channel = message_payload.get("channel", "")

    if channel == CHANNEL_ID:
        print("Payload received", payload)

    text = message_payload.get("text", "")
    user = message_payload.get("user", "")

    return await async_run_crew(channel, text=text, user=user)


if __name__ == "__main__":
    app.run(port=2000, debug=True)

Now run main.py to fire up the Flask server on localhost port 2000, where Ngrok has been configured.

python main.py

Go to the Slack channel that you selected and send a message. The server will instantly pick it up, and the crew workflow will start. The success of the task depends on the quality of the model. Bigger and better models like GPT-4/GPT-4o and Claude Opus tend to perform better, and Sonnet does a good job.
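If you want to try a different model, only the llm definition needs to change. As a sketch, here is how you might swap in GPT-4o through LangChain; this assumes you additionally install the langchain-openai package and set an OPENAI_API_KEY, neither of which is part of the dependency list above.

import os

from langchain_openai import ChatOpenAI  # requires: pip install langchain-openai

# Drop-in replacement for the ChatAnthropic llm defined earlier
llm = ChatOpenAI(
    model="gpt-4o",
    api_key=os.environ["OPENAI_API_KEY"],
)

Pass this llm to the Agent definitions in ClientCrew exactly as we did with ChatAnthropic.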

Here is what the AI agent wrote when asked to write an SRS (Software Requirements Specification) for a React, NodeJS, and SQLite chat app.

[Image: the SRS document generated by the agent]

Here is the repository with the code: sunilkumardash9/CrewAIxComposio-Research-Assistant

Way Forward

We created an AI research assistant that can produce a well-structured report and save it to a Notion page. You can add agents with internet access via SERP tools to make the assistant more versatile (see the sketch below). Composio supports 100+ tools with actions and triggers that let agents freely interact with third-party services, and you can use them to make your agents better and more useful.
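For example, here is a rough sketch of a web-search-capable researcher using the SerperDevTool from crewai_tools (installed with crewai[tools]); it assumes a SERPER_API_KEY environment variable for serper.dev and is not part of the project code above.

from crewai import Agent
from crewai_tools import SerperDevTool

# Reads SERPER_API_KEY from the environment
search_tool = SerperDevTool()

web_researcher = Agent(
    role="Web Researcher",
    goal="Find up-to-date information on {topic} from the web",
    backstory="You search the internet and collect the latest sources on a topic.",
    tools=[search_tool],
    verbose=True,
    llm=llm,  # reuse the ChatAnthropic llm defined earlier
)

You could then give this agent its own task and add it to the crew alongside the existing agents.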

Conclusion

The development of AI agents is in full swing and is now the hottest topic in the AI space. As the tooling and LLM infrastructure improve, we can expect the next generation of software systems to have AI agents built into them, with many mundane workflows handled by agents equipped with reliable and useful tools. We saw a small glimpse of this while building our AI research assistant with Slack and Notion integration.

Key Takeaways

  • AI agents are LLMs or Large Multi-modal Models enhanced with tools and goals, enabling them to interact dynamically with their environment and perform tasks.
  • CrewAI is an open-source tool that orchestrates multiple AI agents to collaboratively accomplish complex tasks by assigning roles, delegating tasks, and sharing goals.
  • Composio is a platform offering over 100 tools, such as Slack and GitHub, with actions and triggers for AI workflows. It seamlessly integrates with LangChain, Autogen, and CrewAI.
  • Integrate Composio toolsets with AI agent frameworks like CrewAI to automate workflows that require planning and decision-making.

The media shown in this article is not owned by Analytics Vidhya and is used at the Author’s discretion.

Frequently Asked Questions

Q1. What are AI agents?

A. AI agents are LLMs or Large Multi-modal Models enhanced with tools and goals, enabling them to interact dynamically with their environment and perform tasks.

Q2. What is CrewAI?

A. CrewAI is an open-source agent orchestration framework for building role-playing and collaborative agents. 

Q3. What is Composio?

A. Composio is a platform that integrates tools with agent frameworks so agents can interact with third-party services like Discord, Slack, and GitHub to accomplish complex tasks.

Q4. What can CrewAI do?

A. CrewAI can create collaborative agents that plan, reason, and delegate tasks to other agents, akin to a real-world crew, to accomplish complex tasks.

Q5. What is the difference between CrewAI and Autogen?

A. In Autogen, orchestrating agents’ interactions requires additional programming, which can become complex and cumbersome as the scale of tasks grows. CrewAI, by contrast, provides built-in role-based agents and sequential or hierarchical processes for orchestrating them.
