LangGraph Tutorial: Understanding Concepts, Functionalities, and Project Implementation

Introduction

LangGraph is a powerful, open-source library within the LangChain ecosystem designed to build stateful, multi-agent applications with Large Language Models (LLMs). Unlike traditional linear workflows, LangGraph enables the creation of cyclic, graph-based workflows, making it ideal for complex, agent-driven systems that require state management, conditional logic, and human-in-the-loop interactions. This tutorial explains the core concepts of LangGraph, its main functionalities, and demonstrates how to use it in a practical project—a simple customer support chatbot.

Core Concepts of LangGraph

1. Graph-Based Architecture

LangGraph represents workflows as directed graphs, where:

  • Nodes: Represent individual units of work, such as an LLM call, tool invocation, or data processing function.
  • Edges: Define the flow of execution between nodes, including direct transitions or conditional logic.
  • State: A shared data structure that persists context across nodes, updated as the workflow progresses.

This structure allows for dynamic, non-linear workflows, unlike the directed acyclic graphs (DAGs) used in traditional LangChain chains.

2. State Management

LangGraph’s state management is a standout feature. The state object:

  • Tracks information across interactions (e.g., conversation history, user inputs).
  • Is automatically updated by nodes and passed between them.
  • Supports persistence, enabling workflows to pause and resume, which is crucial for long-running tasks or human-in-the-loop scenarios.

3. Conditional Edges

Conditional edges allow dynamic routing based on a node’s output. For example, after an LLM processes input, the workflow can branch to different nodes depending on the result (e.g., invoke a tool or end the workflow). This enables decision-making and adaptability in agent behavior.

4. Human-in-the-Loop (HITL)

LangGraph supports human intervention, allowing users to review, edit, or approve agent actions before proceeding. This is particularly useful for sensitive tasks like customer support or content generation.

5. Multi-Agent Collaboration

LangGraph excels at coordinating multiple agents, each represented as a node. Agents can communicate via the shared state, enabling collaborative workflows like a writer-critique loop or a research assistant summarizing findings.

Main Functionalities

1. Cyclic Workflows

LangGraph supports cycles, allowing agents to iterate over tasks (e.g., repeatedly calling an LLM to refine output or invoking tools until a condition is met). This is critical for agent-like behaviors where decision-making evolves over multiple steps.
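
Here is a minimal, self-contained sketch of such a loop. The refine node and the three-revision cap are illustrative stand-ins for a real LLM call and stopping condition:

from typing import TypedDict
from langgraph.graph import StateGraph, START, END

class DraftState(TypedDict):
    draft: str
    revisions: int

def refine(state: DraftState) -> DraftState:
    # In a real graph this node would call an LLM to improve the draft
    return {"draft": state["draft"] + " (revised)", "revisions": state["revisions"] + 1}

def should_continue(state: DraftState) -> str:
    # Route back into the same node until the stopping condition is met
    return "refine" if state["revisions"] < 3 else END

loop = StateGraph(DraftState)
loop.add_node("refine", refine)
loop.add_edge(START, "refine")
loop.add_conditional_edges("refine", should_continue)  # the edge back to "refine" forms the cycle
app = loop.compile()
print(app.invoke({"draft": "First draft", "revisions": 0}))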

2. Tool Integration

Agents can interact with external tools (e.g., web search, APIs, databases) via nodes. LangGraph simplifies tool invocation by defining tools as callable functions and routing their outputs back to the agent.
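
As a sketch of this pattern, LangGraph's prebuilt ToolNode and tools_condition can wire a tool-calling LLM into a graph. The get_weather tool here is a hypothetical mock, and OPENAI_API_KEY is assumed to be set in the environment:

from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langgraph.graph import StateGraph, START, MessagesState
from langgraph.prebuilt import ToolNode, tools_condition

@tool
def get_weather(city: str) -> str:
    """Return the weather for a city (mock implementation)."""
    return f"It is sunny in {city}."

llm_with_tools = ChatOpenAI(model="gpt-4o-mini").bind_tools([get_weather])

def agent(state: MessagesState) -> MessagesState:
    # The LLM decides whether to answer directly or request a tool call
    return {"messages": [llm_with_tools.invoke(state["messages"])]}

builder = StateGraph(MessagesState)
builder.add_node("agent", agent)
builder.add_node("tools", ToolNode([get_weather]))
builder.add_edge(START, "agent")
builder.add_conditional_edges("agent", tools_condition)  # routes to "tools" or END
builder.add_edge("tools", "agent")  # route tool output back to the agent
tool_graph = builder.compile()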

3. Persistence and Memory

LangGraph’s built-in persistence layer saves the state to a durable store, ensuring workflows can recover from errors or interruptions. It also supports short- and long-term memory for maintaining conversation context.
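
A minimal sketch of thread-scoped memory, with a stand-in respond node in place of a real LLM call:

from langchain_core.messages import AIMessage, HumanMessage
from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import StateGraph, START, END, MessagesState

def respond(state: MessagesState) -> MessagesState:
    # Stand-in node; a real graph would call an LLM here
    return {"messages": [AIMessage(content=f"Message {len(state['messages'])} received.")]}

memory_builder = StateGraph(MessagesState)
memory_builder.add_node("respond", respond)
memory_builder.add_edge(START, "respond")
memory_builder.add_edge("respond", END)
persistent_graph = memory_builder.compile(checkpointer=MemorySaver())

# State is saved and restored per thread_id, so follow-up turns keep context
config = {"configurable": {"thread_id": "user-42"}}
persistent_graph.invoke({"messages": [HumanMessage(content="Hello")]}, config)
result = persistent_graph.invoke({"messages": [HumanMessage(content="Hello again")]}, config)
print(len(result["messages"]))  # 4: both turns are kept in the checkpointed state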

4. Streaming Support

LangGraph provides token-by-token streaming of agent outputs and intermediate steps, offering real-time visibility into agent reasoning and actions.
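
Reusing persistent_graph from the sketch above, streaming looks like this (stream_mode="updates" emits each node's state update as it finishes; stream_mode="messages" yields LLM tokens as they are generated):

for chunk in persistent_graph.stream(
    {"messages": [HumanMessage(content="Hi")]},
    {"configurable": {"thread_id": "user-43"}},
    stream_mode="updates",
):
    print(chunk)  # e.g. {"respond": {"messages": [...]}}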

5. LangGraph Studio

LangGraph Studio is a visual IDE for designing, debugging, and interacting with LangGraph workflows. It allows developers to visualize graphs, modify states, and test scenarios, enhancing the development experience.

6. Deployment Options

LangGraph applications can be deployed via:

  • LangGraph Cloud: A managed SaaS solution.
  • Self-Hosted: Run locally or on your infrastructure.
  • Bring Your Own Cloud (BYOC): Deploy within your VPC.

These options ensure scalability and flexibility for production use.

Project Example: Building a Customer Support Chatbot

Let’s walk through a project to create a simple customer support chatbot using LangGraph. The chatbot will:

  • Classify user queries (e.g., order status, refund request, or general inquiry).
  • Invoke tools (e.g., check order status via a mock API).
  • Maintain conversation history.
  • Allow human review for refund requests.

Prerequisites

  • Python 3.9+
  • Install required packages:
    pip install langgraph langchain langchain-openai
  • An OpenAI API key (or substitute with another LLM provider).

Step 1: Define the State

The state will store the conversation history and query classification.

from typing import TypedDict, Annotated, List
from langchain_core.messages import BaseMessage
from langgraph.graph.message import add_messages

class ChatState(TypedDict):
    messages: Annotated[List[BaseMessage], add_messages]
    classification: str

Step 2: Define Nodes

We’ll create nodes for:

  • Classifying the user’s query.
  • Handling order status checks.
  • Handling refund requests (with human review).
  • Generating general responses.

from langchain_core.messages import HumanMessage, AIMessage
from langchain_openai import ChatOpenAI

# ChatOpenAI reads OPENAI_API_KEY from the environment; avoid hardcoding keys
llm = ChatOpenAI(model="gpt-4o-mini")

def classify_query(state: ChatState) -> ChatState:
    question = state["messages"][-1].content
    prompt = (
        "Classify the following query as 'order_status', 'refund_request', or 'general'. "
        f"Respond with only the label: {question}"
    )
    # Normalize the label so it matches the routing table exactly
    classification = llm.invoke(prompt).content.strip().lower()
    return {"classification": classification}

def handle_order_status(state: ChatState) -> ChatState:
    # Naively take the last word as the order ID, stripping trailing punctuation
    order_id = state["messages"][-1].content.split()[-1].strip("?.!,")
    response = check_order_status(order_id)  # mock tool, defined in Step 3
    return {"messages": [AIMessage(content=response)]}

def handle_refund_request(state: ChatState) -> ChatState:
    response = "Refund request noted. A human representative will review your request."
    return {"messages": [AIMessage(content=response)]}

def handle_general_query(state: ChatState) -> ChatState:
    question = state["messages"][-1].content
    response = llm.invoke(f"Answer the following general query: {question}").content
    return {"messages": [AIMessage(content=response)]}

Step 3: Define Tools

We’ll define the mock order-status tool that the handle_order_status node calls.

def check_order_status(order_id: str) -> str:
    return f"Order {order_id} is shipped and expected to arrive in 3 days."

Step 4: Build the Graph

We’ll use StateGraph to define nodes and edges, including conditional routing based on query classification.

from langgraph.graph import StateGraph, START, END

workflow = StateGraph(ChatState)

# Add nodes
workflow.add_node("classify_query", classify_query)
workflow.add_node("handle_order_status", handle_order_status)
workflow.add_node("handle_refund_request", handle_refund_request)
workflow.add_node("handle_general_query", handle_general_query)

# Define edges
workflow.add_edge(START, "classify_query")

# Conditional edges based on classification
workflow.add_conditional_edges(
    "classify_query",
    lambda state: state["classification"],
    {
        "order_status": "handle_order_status",
        "refund_request": "handle_refund_request",
        "general": "handle_general_query"
    }
)

# Edges to end
workflow.add_edge("handle_order_status", END)
workflow.add_edge("handle_refund_request", END)
workflow.add_edge("handle_general_query", END)

# Compile the graph
graph = workflow.compile()
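
To sanity-check the wiring before running it, you can print a Mermaid diagram of the compiled graph:

# Renders the node/edge structure as Mermaid markup for quick inspection
print(graph.get_graph().draw_mermaid())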

Step 5: Test the Chatbot

Run the graph with sample inputs to test its behavior.

# Test order status query
input_state = {"messages": [HumanMessage(content="What is the status of order 12345?")]}
result = graph.invoke(input_state)
print(result["messages"][-1].content)
# Output: Order 12345 is shipped and expected to arrive in 3 days.

# Test refund request
input_state = {"messages": [HumanMessage(content="I want a refund for my purchase.")]}
result = graph.invoke(input_state)
print(result["messages"][-1].content)
# Output: Refund request noted. A human representative will review your request.

# Test general query
input_state = {"messages": [HumanMessage(content="What are your store hours?")]}
result = graph.invoke(input_state)
print(result["messages"][-1].content)
# Output varies by run; the LLM has no store data, so ground this with a tool in production.

Step 6: Enhance with Human-in-the-Loop

To add human review for refund requests, use LangGraph’s checkpointing and interrupt features.

from langgraph.checkpoint.memory import MemorySaver

# Add a checkpointer so the graph can pause and resume
checkpointer = MemorySaver()

# Recompile the graph with checkpointing and an interrupt before the refund node
graph = workflow.compile(checkpointer=checkpointer, interrupt_before=["handle_refund_request"])

# Checkpointed graphs need a thread_id so state can be saved per conversation
config = {"configurable": {"thread_id": "refund-1"}}

# Test refund request with interrupt
input_state = {"messages": [HumanMessage(content="I want a refund for my purchase.")]}
graph.invoke(input_state, config)
# The workflow pauses before handle_refund_request, allowing human review

# Once approved, resume from the checkpoint by passing None as the input
result = graph.invoke(None, config)
print(result["messages"][-1].content)
# Output: Refund request noted. A human representative will review your request.

A human can inspect the state, approve the refund, and resume the workflow using LangGraph Studio or custom logic.

Step 7: Deploy the Application

To deploy, use LangGraph Cloud or self-host the application. Create a langgraph.json configuration file, and keep OPENAI_API_KEY in a .env file rather than hardcoding it:

{
  "dependencies": ["."],
  "graphs": {
    "support_chatbot": "./agent.py:graph"
  },
  "env": ".env"
}

Then, deploy using LangGraph CLI or integrate with a web framework like FastAPI.
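
As a rough sketch of the FastAPI option (the /chat endpoint and request model are illustrative, and agent.py is assumed to expose the compiled graph as in langgraph.json above):

from fastapi import FastAPI
from pydantic import BaseModel
from langchain_core.messages import HumanMessage
from agent import graph  # the compiled support-chatbot graph from this tutorial

app = FastAPI()

class ChatRequest(BaseModel):
    message: str

@app.post("/chat")
def chat(request: ChatRequest):
    # Run the compiled graph on the incoming message and return the reply
    result = graph.invoke({"messages": [HumanMessage(content=request.message)]})
    return {"reply": result["messages"][-1].content}

Run it with uvicorn (e.g., uvicorn server:app) and POST a JSON body like {"message": "Where is order 12345?"}.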

Use Cases for LangGraph

LangGraph is versatile and can be applied to various projects:

  • Smarter Chatbots: Build context-aware chatbots that handle complex user requests and maintain conversation history.
  • AI Research Assistants: Create agents that search, summarize, and organize information from multiple sources.
  • Automated Workflows: Streamline business processes like document routing or data analysis with multi-agent coordination.
  • Code Generation Systems: Develop assistants that iteratively generate, test, and refine code.

Best Practices

  • Keep State Simple: Include only necessary data in the state to avoid complexity.
  • Handle Errors: Implement exception handling in nodes and retry mechanisms for transient failures (see the sketch after this list).
  • Test Incrementally: Use LangGraph Studio to visualize and debug workflows during development.
  • Optimize for Scalability: Leverage LangGraph Cloud or BYOC for production-grade deployment.
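
As a minimal illustration of the error-handling bullet, using the tutorial's llm and ChatState (the three-attempt retry and fallback message are illustrative choices, not a LangGraph API):

def safe_llm_node(state: ChatState) -> ChatState:
    # Retry transient failures a few times, then fall back gracefully
    for attempt in range(3):
        try:
            reply = llm.invoke(state["messages"][-1].content).content
            return {"messages": [AIMessage(content=reply)]}
        except Exception:
            if attempt == 2:
                return {"messages": [AIMessage(content="Sorry, something went wrong. Please try again later.")]}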

Conclusion

LangGraph empowers developers to build sophisticated, stateful AI applications with cyclic workflows, multi-agent coordination, and robust state management. By following this tutorial, you’ve learned its core concepts, explored its functionalities, and implemented a customer support chatbot. Experiment with LangGraph Studio, try integrating more tools, and scale your project to production to unlock its full potential.

Happy coding!
