Dynamic AI Workflows Through LangGraph ReAct Function Calling

Himanshu Ranjan | Last Updated: 23 Oct, 2024 | 10 min read

The LangGraph ReAct Function-Calling Pattern offers a powerful framework for integrating various tools like search engines, calculators, and APIs with an intelligent language model to create a more interactive and responsive system. This pattern is built upon the Reasoning + Acting (ReAct) approach, which allows a language model to not only reason through queries but also take specific actions, such as calling external tools to retrieve data or perform computations.


Learning Objectives

  • Understand the ReAct Approach: Learners will be able to explain the Reasoning + Acting (ReAct) approach and its significance in enhancing the capabilities of language models.
  • Implement Tool Integration: Participants will gain the skills to integrate various external tools (e.g., APIs, calculators) into language models, facilitating dynamic and interactive responses to user queries.
  • Develop Graph-Based Workflows: Learners will be able to construct and manage graph-based workflows that effectively route user interactions between reasoning and tool invocation.
  • Create Custom Tools: Participants will learn how to define and incorporate custom tools to extend the functionality of language models, allowing for tailored solutions to specific user needs.
  • Evaluate User Experience: Learners will assess the impact of the LangGraph ReAct Function-Calling Pattern on user experience, understanding how real-time data retrieval and intelligent reasoning enhance engagement and satisfaction.

This article was published as a part of the Data Science Blogathon.

What is a ReAct Prompt?

The traditional ReAct prompt for the assistant sets up the following framework:

  • Assistant’s Capabilities: The assistant is introduced as a powerful, evolving language model that can handle various tasks. The key part here is its ability to generate human-like responses, engage in meaningful discussions, and provide insights based on large volumes of text.
  • Access to Tools: The assistant is given access to various tools such as:
    • Wikipedia Search: Used to fetch data from Wikipedia.
    • Web Search: Used for performing general searches online.
    • Calculator: Used for performing arithmetic operations.
    • Weather API: Used for retrieving weather data.

These tools extend the assistant’s capabilities beyond text generation to real-time data fetching and mathematical problem-solving.

Tool Usage Format

The ReAct pattern uses a structured format for interacting with tools to ensure clarity and efficiency. When the assistant determines that it needs to use a tool, it follows this pattern:

Thought: Do I need to use a tool? Yes
Action: [tool name]
Action Input: [input to the tool]
Observation: [result from the tool]

For example, if the user asks, “What’s the weather in London?”, the assistant’s thought process might be:

Thought: Do I need to use a tool? Yes
Action: weather_api
Action Input: London
Observation: 15°C, cloudy

Once the tool provides the result, the assistant then responds with a final answer:

Final Answer: The weather in London is 15°C and cloudy.
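If you want to wire this format up by hand rather than rely on LangGraph’s prebuilt routing (which we use later in this article), a system prompt along the following lines could encode it. This is an illustrative sketch only; the tool names here are placeholders, not constants from any library:

# Illustrative ReAct-style system prompt (hand-rolled sketch; tool names are placeholders)
REACT_PROMPT = """You are a powerful assistant with access to the following tools:
wikipedia_search, web_search, calculator, weather_api.

When you need a tool, respond in exactly this format:
Thought: Do I need to use a tool? Yes
Action: [one of the tool names above]
Action Input: [input to the tool]
Observation: [result from the tool]

When you have enough information, respond with:
Thought: Do I need to use a tool? No
Final Answer: [your answer to the user]"""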

Implementation of the LangGraph ReAct Function Calling Pattern

Let’s now implement the LangGraph ReAct Function-Calling Pattern by integrating the reasoner node and constructing a workflow that enables our assistant to interact effectively with the tools we’ve defined.

Environment Setup

First, we’ll set up the environment to use the OpenAI model by importing the necessary libraries and initialising the model with an API key:

import os
from google.colab import userdata
from langchain_openai import ChatOpenAI

# Set the OpenAI API key from Colab's user data store
os.environ['OPENAI_API_KEY'] = userdata.get('OPENAI_API_KEY')

# Initialise the language model
llm = ChatOpenAI(model="gpt-4o")

Tool Definitions

Next, we define the arithmetic tools that the assistant can use:

# Arithmetic functions that will be exposed to the LLM as tools
def multiply(a: int, b: int) -> int:
    """Multiply a and b.

    Args:
        a: first int
        b: second int
    """
    return a * b

def add(a: int, b: int) -> int:
    """Add a and b.

    Args:
        a: first int
        b: second int
    """
    return a + b

def divide(a: int, b: int) -> float:
    """Divide a by b.

    Args:
        a: first int
        b: second int
    """
    return a / b

In addition to these arithmetic functions, we include a search tool that allows the assistant to retrieve information from the web:

# search tools
from langchain_community.tools import DuckDuckGoSearchRun
search = DuckDuckGoSearchRun()
# Example search query to get Brad Pitt's age
search.invoke("How old is Brad Pitt?")

Output:

Brad Pitt. Photo: Amy Sussman/Getty Images. Brad Pitt is opening up about growing older. The Oscar winner, 60, and George Clooney, 63, spoke with GQ in an interview published on Tuesday, August 13 ... Brad Pitt marked his 60th birthday with a celebration at Mother Wolf in Los Angeles this week. One onlooker says the actor 'looked super happy' at the party, and 'everyone had a smile on their faces.' Brad Pitt is an American actor born on December 18, 1963, in Shawnee, Oklahoma. He has starred in various films, won an Academy Award, and married Angelina Jolie. Brad Pitt rang in his six-decade milestone in a big way — twice! Pitt celebrated his 60th birthday on Monday, along with friends and his girlfriend, Ines de Ramon, 33, with "low key ... Brad Pitt's net worth is estimated to be around $400 million. His acting career alone has contributed significantly to this, with Pitt commanding as much as $20 million per film. ... Born on December 18, 1963, Brad Pitt is 61 years old. His zodiac sign is Sagittarius who are known for being adventurous, independent, and passionate—traits ...

Binding Tools to the LLM

We then bind the defined tools to the language model:

tools = [add, multiply, divide, search]

llm_with_tools = llm.bind_tools(tools)
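As a quick sanity check (my own addition, not part of the original walkthrough), you can inspect the tool call that the bound model proposes for a simple arithmetic prompt. The exact contents depend on the model’s response:

from langchain_core.messages import HumanMessage

# Ask a question that should trigger the multiply tool; the bound model returns
# an AIMessage whose tool_calls list names the chosen tool and its arguments.
ai_msg = llm_with_tools.invoke([HumanMessage(content="What is 3 multiplied by 4?")])
print(ai_msg.tool_calls)  # e.g. [{'name': 'multiply', 'args': {'a': 3, 'b': 4}, ...}]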

Defining the Reasoner

The next step is implementing the reasoner function, which serves as the assistant’s decision-making node. This function will use the bound tools to process user input:

from langgraph.graph import MessagesState
from langchain_core.messages import HumanMessage, SystemMessage

# System message
sys_msg = SystemMessage(content="You are a helpful assistant tasked with using search and performing arithmetic on a set of inputs.")

# Node implementation
def reasoner(state: MessagesState):
    return {"messages": [llm_with_tools.invoke([sys_msg] + state["messages"])]}

Building the Graph Workflow

Now that we have our tools and the reasoner defined, we can construct the graph workflow that routes between reasoning and tool invocation:

from langgraph.graph import START, StateGraph
from langgraph.prebuilt import tools_condition  # routes to the tools node if the last message contains a tool call
from langgraph.prebuilt import ToolNode
from IPython.display import Image, display

# Graph
builder = StateGraph(MessagesState)

# Add nodes
builder.add_node("reasoner", reasoner)
builder.add_node("tools", ToolNode(tools)) # for the tools

# Add edges
builder.add_edge(START, "reasoner")
builder.add_conditional_edges(
    "reasoner",
    # If the latest message (result) from the reasoner is a tool call -> tools_condition routes to tools
    # If the latest message (result) from the reasoner is not a tool call -> tools_condition routes to END
    tools_condition,
)
builder.add_edge("tools", "reasoner")
react_graph = builder.compile()

# Display the graph
display(Image(react_graph.get_graph(xray=True).draw_mermaid_png()))

Using the Workflow

We can now handle queries and use the assistant with the graph built. For instance, if a user asks, “What is 2 times Brad Pitt’s age?”, the system will first search for Brad Pitt’s age using the DuckDuckGo search tool and then multiply that result by 2.

Here’s how you would invoke the graph for a user query:

Example query: What is 2 times Brad Pitt’s age?

messages = [HumanMessage(content="What is 2 times Brad Pitt's age?")]
messages = react_graph.invoke({"messages": messages})
# Display the response
for m in messages['messages']:
    m.pretty_print()
Output

Adding a Custom Tool for Stock Prices

To enhance our assistant’s capabilities, we will add a custom tool that retrieves stock prices using the Yahoo Finance library. This will allow the assistant to answer finance-related queries effectively.

Step 1: Install the Yahoo Finance Package

Before we begin, ensure that the yfinance library is installed. This library will enable us to access stock market data.

!pip -q install yfinance

Step 2: Import Required Libraries

Next, we import the necessary library to interact with Yahoo Finance and define the function that fetches the stock price based on the ticker symbol:

import yfinance as yf

def get_stock_price(ticker: str) -> float:
    """Gets a stock price from Yahoo Finance.

    Args:
        ticker: ticker str
    """
    stock = yf.Ticker(ticker)
    return stock.info['previousClose']
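Note that stock.info['previousClose'] relies on Yahoo’s info endpoint, which can occasionally be slow or missing fields. A more defensive variant, sketched below using yfinance’s history() method (my own addition, not part of the original walkthrough), falls back to the most recent closing price:

def get_stock_price_safe(ticker: str) -> float:
    """Gets the most recent closing price for a ticker from Yahoo Finance.

    Args:
        ticker: ticker str
    """
    stock = yf.Ticker(ticker)
    price = stock.info.get('previousClose')
    if price is None:
        # Fall back to the last close in the recent price history
        history = stock.history(period="5d")
        price = float(history["Close"].iloc[-1])
    return price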

Step 3: Test the Custom Tool

To verify that our tool is functioning correctly, we can make a test call to fetch the stock price of a specific company. For example, let’s get the price for Apple Inc. (AAPL):

get_stock_price("AAPL")

Output

222.5

Step 4: Define the Reasoner Function

Next, we need to modify the reasoner function to accommodate stock-related queries. The updated function reads the query from the graph state, appends it to the message history as a human message, and lets the LLM, with its bound tools, decide whether to call the stock price tool:

from langchain_core.messages import HumanMessage, SystemMessage
def reasoner(state):
    query = state["query"]
    messages = state["messages"]
    # System message indicating the assistant's capabilities
    sys_msg = SystemMessage(content="You are a helpful assistant tasked with using search, the yahoo finance tool and performing arithmetic on a set of inputs.")
    message = HumanMessage(content=query)
    messages.append(message)
    # Invoke the LLM with the messages
    result = [llm_with_tools.invoke([sys_msg] + messages)]
    return {"messages":result}

Step 5: Update the Tools List

Now we need to add the newly created stock price function to our tools list. This will ensure that our assistant can access this tool when needed:

# Update the tools list to include the stock price function
tools = [add, multiply, divide, search, get_stock_price]
# Re-initialize the language model with the updated tools
llm = ChatOpenAI(model="gpt-4o")
llm_with_tools = llm.bind_tools(tools)


# Inspect the newly added tool
tools[4]
Output

We will further enhance our assistant’s capabilities by implementing a graph-based workflow for managing queries related to both arithmetic and stock prices. This section involves defining the state for our workflow, establishing nodes, and executing various queries.

Step 1: Define the Graph State

We’ll start by defining the state for our graph using a TypedDict. This allows us to manage and type-check the different elements of our state, including the query, finance data, final answer, and message history.

from typing import Annotated, TypedDict
import operator
from langchain_core.messages import AnyMessage

class GraphState(TypedDict):
    """State of the graph."""
    query: str
    finance: str
    final_answer: str
    messages: Annotated[list[AnyMessage], operator.add]
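The Annotated[list[AnyMessage], operator.add] annotation tells LangGraph to append each node’s returned messages to the running list instead of overwriting it. Here is a tiny standalone illustration of the reducer’s behaviour (my own example, not from the original walkthrough):

import operator
from langchain_core.messages import AIMessage, HumanMessage

# Each graph step returns a partial state; the operator.add reducer concatenates the message lists.
existing = [HumanMessage(content="What is the stock price of Apple?")]
update = [AIMessage(content="Calling the get_stock_price tool...")]
print(operator.add(existing, update))  # both messages are preserved, in order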

Step 2: Create the State Graph

Next, we will create an instance of the StateGraph class. This graph will manage the different nodes and transitions based on the state of the conversation:

from langgraph.graph import START, StateGraph
from langgraph.prebuilt import tools_condition  # routes to the tools node if the last message contains a tool call
from langgraph.prebuilt import ToolNode

# Graph
workflow = StateGraph(GraphState)

# Add nodes
workflow.add_node("reasoner", reasoner)
workflow.add_node("tools", ToolNode(tools))

Step 3: Add Edges to the Graph

We will define how the nodes interact with each other by adding edges to the graph. Specifically, we want to ensure that after the reasoning node processes the input, it either calls a tool or terminates the workflow based on the outcome:

# Add edges (the nodes were already added in Step 2)
workflow.add_edge(START, "reasoner")
workflow.add_conditional_edges(
    "reasoner",
    # If the latest message (result) from the reasoner is a tool call -> tools_condition routes to tools
    # If the latest message (result) from the reasoner is not a tool call -> tools_condition routes to END
    tools_condition,
)
workflow.add_edge("tools", "reasoner")
react_graph = workflow.compile()

Step 4: Visualise the Graph

We can visualise the constructed graph to understand how our workflow is structured. This is useful for debugging and ensuring the logic flows as intended:

# Show
display(Image(react_graph.get_graph(xray=True).draw_mermaid_png()))

Step 5: Execute Queries

Now that our workflow is set up, we can execute various queries to test its functionality. We will provide different types of questions to see how well the assistant can respond.

Query 1: What is 2 times Brad Pitt’s age?

response = react_graph.invoke({"query": "What is 2 times Brad Pitt's age?", "messages": []})
response['messages'][-1].pretty_print()

Output

Query 2: What is the stock price of Apple?

response = react_graph.invoke({"query": "What is the stock price of Apple?", "messages": []})
for m in response['messages']:
    m.pretty_print()

Output

Query 3: What is the stock price of the company that Jensen Huang is CEO of?

response = react_graph.invoke({"query": "What is the stock price of the company that Jensen Huang is CEO of?", "messages": []})
for m in response['messages']:
    m.pretty_print()

Output

Query 4: What will be the price of Nvidia stock if it doubles?

response = react_graph.invoke({"query": "What will be the price of nvidia stock if it doubles?", "messages": []})
for m in response['messages']:
    m.pretty_print()

Output

Conclusion

The LangGraph ReAct Function-Calling Pattern provides a powerful framework for integrating tools with language models, enhancing their interactivity and responsiveness. Combining reasoning and action enables the model to process queries intelligently and execute actions, such as retrieving real-time data or performing calculations. The structured workflow allows for efficient tool usage, enabling the assistant to handle diverse inquiries, from arithmetic operations to stock price retrieval. Overall, this pattern significantly enhances the capabilities of intelligent assistants and paves the way for more dynamic user interactions.

Also, to understand Agentic AI better, explore: The Agentic AI Pioneer Program

Key Takeaways

  • Dynamic Interactivity: The pattern integrates external tools with language models, enabling more engaging and responsive user interactions.
  • ReAct Approach: By combining reasoning and action, the model can intelligently process queries and invoke tools for real-time data and computations.
  • Versatile Tool Integration: The framework supports various tools, allowing the assistant to handle a wide range of inquiries, from basic arithmetic to complex data retrieval.
  • Customizability: Users can create and incorporate custom tools, tailoring the assistant’s functionality to specific applications and enhancing its capabilities.

The media shown in this article is not owned by Analytics Vidhya and is used at the Author’s discretion.

Frequently Asked Questions

Q1. What is the LangGraph ReAct Function-Calling Pattern?

Ans. The LangGraph ReAct Function-Calling Pattern is a framework that integrates external tools with language models to enhance their interactivity and responsiveness. It enables models to process queries and execute actions like data retrieval and calculations.

Q2. How does the ReAct approach work?

Ans. The ReAct approach combines reasoning and acting, allowing the language model to reason through user queries and decide when to call external tools for information or computations, thereby producing more accurate and relevant responses.

Q3. What types of tools can be integrated using this pattern?

Ans. Various tools can be integrated, including search engines (e.g., Wikipedia, web search), calculators for arithmetic operations, real-time data APIs (e.g., weather, stock prices), and more.

Q4. How does the structured tool usage format function?

Ans. The structured format guides the assistant in determining whether to use a tool based on its reasoning. It involves a series of steps: determining the need for a tool, specifying the action and input, and finally observing the result to generate a response.

Q5. Can this pattern handle complex queries?

Ans. Yes, the LangGraph ReAct Function-Calling Pattern is designed to handle complex queries by allowing the assistant to combine reasoning and tool invocation. For instance, it can fetch real-time data and perform calculations based on that data.

Hi there! I’m Himanshu Ranjan, and I have a deep passion for data, everything from crunching numbers to finding patterns that tell a story. For me, data is more than just numbers on a screen; it’s a tool for discovery and insight. I’m always excited by the possibility of what data can reveal and how it can solve real-world problems.

But it’s not just data that grabs my attention. I love exploring new things, whether that’s learning a new skill, experimenting with new technologies, or diving into topics outside my comfort zone. Curiosity drives me, and I’m always looking for fresh challenges that push me to think differently and grow. At heart, I believe there’s always more to learn, and I’m on a constant journey to expand my knowledge and perspective.
