LangGraph with Groq Tutorial
Using LangGraph with Groq: A Step-by-Step Tutorial
This tutorial guides you through building a simple Q&A agent using LangGraph and Groq. LangGraph, part of the LangChain ecosystem, enables complex workflows with Large Language Models (LLMs). Groq provides fast, cost-effective LLM inference via its API. By the end, you'll have a working agent that answers questions using Groq's Llama 3.1 model.
Prerequisites
- Python 3.8+: Ensure Python is installed.
- Groq API Key: Sign up at console.groq.com to get your API key.
- Libraries: Install required packages using pip.
Install Required Libraries
Run the following command in your terminal to install LangChain, LangGraph, and the Groq integration:
pip install langchain langchain-groq langgraph
Step 1: Set Up the Groq API Key
To use Groq's models, set your API key as an environment variable. This keeps your key secure and accessible to the application.
import os
import getpass
if "GROQ_API_KEY" not in os.environ:
    os.environ["GROQ_API_KEY"] = getpass.getpass("Enter your Groq API key: ")
Note: Avoid hardcoding your API key in production code. Use environment variables or a secure vault solution.
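Alternatively, you can export the key in your shell before running the script, so the prompt above is skipped (the key value below is a placeholder, not a real key):

```shell
export GROQ_API_KEY="gsk_your_key_here"
```

Shell-exported variables are visible to the Python process via os.environ, so the check in the snippet above will find the key and not prompt again.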
Step 2: Initialize the Groq Model
Use the ChatGroq class from langchain_groq to initialize a Groq model, such as Llama 3.1.
from langchain_groq import ChatGroq
# Initialize the Groq LLM
llm = ChatGroq(
    model="llama-3.1-8b-instant",  # Other options: "llama3-70b-8192", "mixtral-8x7b-32768"
    temperature=0,                 # Controls randomness (0 = deterministic)
    max_retries=2                  # Number of automatic retries on API failures
)
You can explore available models at console.groq.com.
Step 3: Create a Simple LangGraph Workflow
LangGraph allows you to define a workflow as a graph with nodes (tasks) and edges (connections). We'll create a basic Q&A agent that takes a user question, processes it with the Groq LLM, and returns an answer.
Code Example: Q&A Agent
from langgraph.graph import StateGraph, START, END
from typing import TypedDict, Annotated
import operator
# Define the state schema to store messages
class AgentState(TypedDict):
    messages: Annotated[list, operator.add]
# Define a node that calls the Groq LLM
def call_llm(state: AgentState):
    messages = state["messages"]
    response = llm.invoke(messages)
    return {"messages": [response]}
# Create the workflow
workflow = StateGraph(AgentState)
# Add nodes
workflow.add_node("llm", call_llm)
# Add edges
workflow.add_edge(START, "llm")
workflow.add_edge("llm", END)
# Compile the graph
graph = workflow.compile()
# Invoke the graph with a user question
input_state = {"messages": [("human", "What is the capital of France?")]}
result = graph.invoke(input_state)
# Print the result
print(result["messages"][-1].content)
How It Works
- State: AgentState stores a list of messages, updated as the workflow progresses.
- Node: The call_llm node sends the messages to the Groq LLM and appends the response.
- Edges: The workflow starts at START, calls the llm node, and ends at END.
- Output: The final message contains the LLM's answer (e.g., "The capital of France is Paris.").
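The accumulation described in the State step comes from the operator.add reducer declared in the Annotated type: when a node returns a partial update, LangGraph concatenates the new message list onto the existing one. A minimal pure-Python sketch of that merge (for illustration only, outside LangGraph):

```python
import operator

# Mimics how LangGraph applies the reducer from
# Annotated[list, operator.add] to a node's partial state update.
def apply_update(state: dict, update: dict) -> dict:
    merged = operator.add(state["messages"], update["messages"])  # list concatenation
    return {"messages": merged}

state = {"messages": [("human", "What is the capital of France?")]}
update = {"messages": ["The capital of France is Paris."]}
state = apply_update(state, update)
print(len(state["messages"]))  # 2 -- the question and the answer are both kept
```

This is why call_llm can return only the new response: the reducer, not the node, is responsible for appending it to the running message history.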
Step 4: Run and Test
Save the code above in a file (e.g., qna_agent.py) and run it:
python qna_agent.py
The output should be:
The capital of France is Paris.
Tip: Test with different questions or models (e.g., llama3-70b-8192) to explore Groq's capabilities.
Step 5: Extend the Workflow (Optional)
LangGraph supports advanced features like:
- Tool Calling: Integrate tools (e.g., web search, database queries) with the LLM.
- Conditional Routing: Use conditions to route between nodes based on LLM output.
- Human-in-the-Loop: Add interruptions for human feedback.
For example, to add tool calling, bind tools to the LLM and define additional nodes. Check the LangChain documentation for advanced examples.
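As a rough sketch of the conditional-routing idea (the function and the "tools" and "finish" node names here are hypothetical, not part of the tutorial's graph), the core is a plain function that inspects the last message and returns the name of the next node:

```python
# Hypothetical routing function for use with workflow.add_conditional_edges.
# An AI message produced by a tool-bound LLM carries a tool_calls list
# when the model decided to invoke a tool.
def route_after_llm(state: dict) -> str:
    last = state["messages"][-1]
    if getattr(last, "tool_calls", None):
        return "tools"   # hand off to a tool-executing node
    return "finish"      # no tool requested; end the workflow

# Wiring it into the graph would look roughly like:
# workflow.add_conditional_edges("llm", route_after_llm,
#                                {"tools": "tools", "finish": END})
```

The routing function only reads state and returns a label; the mapping passed alongside it tells LangGraph which node each label leads to.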
Additional Resources
- LangChain Groq Integration
- LangGraph Documentation
- Groq API Reference
- YouTube: LangGraph with Groq Tutorial
Notes
- Groq's Advantage: Groq's LPU ensures low-latency inference, ideal for real-time applications.
- Streaming: Some models (e.g., Mixtral) may not support streaming; set streaming=False if needed.
- Cost: For pricing details, visit groq.com.
Congratulations! You've built a Q&A agent with LangGraph and Groq. Experiment with more complex workflows to unlock the full potential of these tools!