Learning LangGraph: A Structured Approach
LangGraph enables you to define stateful message passing, integrate tool calls, and handle conditional logic efficiently.
In this blog post, we will walk through a step-by-step guide to using LangGraph, covering key concepts such as defining tools, conditional edges, and compiling a reactive graph.
1. Setting Up the Environment
To get started, we first need to set up API keys and load necessary libraries.
import os
from dotenv import load_dotenv
load_dotenv()
os.environ["GROQ_API_KEY"] = os.getenv("GROQ_API_KEY")
os.environ["OPENAI_API_KEY"] = os.getenv("OPENAI_API_KEY")

This code loads environment variables from a .env file and sets them as system variables. These keys authenticate requests to external APIs such as OpenAI.
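One pitfall with the snippet above: if a key is absent from the .env file, os.getenv returns None and the os.environ assignment fails. A small sketch of a guard (require_key is a hypothetical helper, not part of any library):

```python
import os

# Hypothetical helper: fail fast with a clear error when a key is missing,
# instead of letting None propagate into os.environ.
def require_key(name: str) -> str:
    value = os.getenv(name)
    if not value:
        raise RuntimeError(f"Missing environment variable: {name}")
    return value

os.environ["DEMO_API_KEY"] = "sk-demo"  # stand-in value for illustration
print(require_key("DEMO_API_KEY"))
```

Failing early like this gives a much clearer error message than the TypeError you would otherwise hit later.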
2. Defining Tools
In LangGraph, tools represent functions that the assistant can call dynamically. Here, we define three basic arithmetic operations:
def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

# This will be a tool
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

def divide(a: int, b: int) -> float:
    """Divide two integers."""
    return a / b

- multiply(): Multiplies two integers.
- add(): Adds two integers.
- divide(): Divides two integers and returns a float.
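Because the tools are plain Python functions with type hints and docstrings (which LangChain uses to infer the tool schemas), they can be exercised directly before wiring them into a graph:

```python
def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

def divide(a: int, b: int) -> float:
    """Divide two integers."""
    return a / b

# Ordinary function calls, no LLM involved:
print(add(4, 2))         # 6
print(multiply(6, 8))    # 48
print(divide(48, 2))     # 24.0
```

Testing tools in isolation like this is a cheap sanity check before handing them to the model.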
These functions act as independent tools that can be called when required. We then register these functions as tools:
tools = [add, multiply, divide]

3. Initializing the Language Model with Tools
To integrate our tools with an LLM, we use the ChatOpenAI model:
from langchain_openai import ChatOpenAI
llm = ChatOpenAI(model="gpt-4o")
llm_with_tools = llm.bind_tools(tools, parallel_tool_calls=False)

- ChatOpenAI(model="gpt-4o"): Initializes OpenAI's GPT-4o model.
- .bind_tools(tools, parallel_tool_calls=False): Associates the defined tools with the model so the assistant can invoke them dynamically; parallel_tool_calls=False restricts the model to one tool call per turn, which keeps the execution trace easy to follow.
4. Defining Message States
We now define the state and the assistant node. The state stores the conversation as a list of messages, and the add_messages reducer appends new messages rather than overwriting the list:

from typing import Annotated
from typing_extensions import TypedDict
from langchain_core.messages import AnyMessage, HumanMessage, SystemMessage
from langgraph.graph.message import add_messages

# Define the structure of the state
class MessagesState(TypedDict):
    messages: Annotated[list[AnyMessage], add_messages]

# Define a system message
sys_msg = SystemMessage(content="You are a helpful assistant tasked with performing arithmetic on a set of inputs.")

# Function for the assistant node
def assistant(state: MessagesState):
    return {"messages": [llm_with_tools.invoke([sys_msg] + state["messages"])]}

- MessagesState: Defines the structure of the state, storing the conversation messages. (LangGraph also ships an equivalent prebuilt MessagesState in langgraph.graph.)
- sys_msg: A predefined system message that guides the assistant's behavior.
- assistant(): Processes incoming messages by invoking the LLM with the bound tools.
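The Annotated reducer pattern is the key idea here: an update returned by a node is merged with the existing value instead of replacing it. A minimal sketch with a stand-in reducer (append_reducer is illustrative only; LangGraph's real add_messages additionally tracks message IDs so edited or re-sent messages merge correctly):

```python
from typing import Annotated
from typing_extensions import TypedDict

# Stand-in reducer with the same shape as add_messages:
# it appends new items rather than replacing the list.
def append_reducer(existing: list, new: list) -> list:
    return existing + new

class DemoState(TypedDict):
    messages: Annotated[list, append_reducer]

state = {"messages": ["hello"]}
update = {"messages": ["hi there"]}  # what a node might return
state = {"messages": append_reducer(state["messages"], update["messages"])}
print(state["messages"])  # ['hello', 'hi there']
```

Without a reducer, a node returning {"messages": [...]} would wipe out the conversation history on every step.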
5. Building the Graph
Using StateGraph, we define nodes and edges to structure the workflow:
from langgraph.graph import START, StateGraph
from langgraph.prebuilt import tools_condition, ToolNode
from IPython.display import Image, display
builder = StateGraph(MessagesState)
# Define nodes
builder.add_node("assistant", assistant)
builder.add_node("tools", ToolNode(tools))
# Define edges
builder.add_edge(START, "assistant")
# Define conditional edges
builder.add_conditional_edges(
    "assistant",
    tools_condition,  # Routes to tools if a tool call is needed, else ends
)
builder.add_edge("tools", "assistant")
# Compile the graph
react_graph = builder.compile()

- StateGraph(MessagesState): Creates a new state graph using the MessagesState structure.
- builder.add_node(): Defines the nodes in the graph.
- builder.add_edge(): Connects the nodes to define execution flow.
- tools_condition: A prebuilt condition that routes execution based on whether a tool needs to be called.
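The routing decision made by tools_condition can be sketched with stand-in objects (FakeMessage and route are hypothetical; the real prebuilt inspects the last message in the state for pending tool calls):

```python
# Hypothetical stand-ins to illustrate the routing logic only.
class FakeMessage:
    def __init__(self, tool_calls=None):
        self.tool_calls = tool_calls or []

def route(state) -> str:
    """Mimic tools_condition: go to the 'tools' node if the last
    AI message requested a tool call, otherwise end the run."""
    last = state["messages"][-1]
    return "tools" if last.tool_calls else "__end__"

print(route({"messages": [FakeMessage(tool_calls=[{"name": "add"}])]}))  # tools
print(route({"messages": [FakeMessage()]}))                              # __end__
```

This is the ReAct loop in miniature: assistant → tools → assistant, repeating until the model answers without requesting a tool.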
6. Visualizing the Graph
LangGraph provides built-in visualization for better understanding:
display(Image(react_graph.get_graph().draw_mermaid_png()))

This command renders a Mermaid diagram of the compiled graph, showing how the nodes and edges interact.
7. Running the Assistant
Finally, we can test our assistant by sending a message:
messages = [HumanMessage(content="Add 4 and 2, then multiply the result by 8, and add 2 to the final answer.")]
messages = react_graph.invoke({"messages": messages})
for m in messages['messages']:
    m.pretty_print()

- A HumanMessage containing an arithmetic request is created.
- The message is processed through the graph, utilizing tools as needed.
- The final output is printed in a readable format.
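Tracing the request by hand with the tool functions from earlier shows the sequence of calls the graph should produce:

```python
def add(a: int, b: int) -> int:
    return a + b

def multiply(a: int, b: int) -> int:
    return a * b

step1 = add(4, 2)           # 6
step2 = multiply(step1, 8)  # 48
final = add(step2, 2)       # 50
print(final)  # 50
```

If the assistant's final answer differs from this manual trace, the tool definitions or routing are the first places to look.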
Conclusion
In this blog post, we covered the fundamentals of LangGraph: setting up API keys, defining tools, integrating an LLM, structuring stateful message handling, building and visualizing a graph, and running an assistant. LangGraph enables structured yet flexible AI-driven workflows, making it an excellent choice for building intelligent agents. Now try extending this example by:
- Adding more complex tools.
- Refining conditions for tool invocation.
- Integrating additional workflows for different problem-solving scenarios.