
Unveiling LangGraph

How to Build Applications with LangGraph: A Step-by-Step Guide

Tags: #LangGraph #LangChain #AI #Agents #Python #LLM #StateManagement #Workflows


Hi everyone, I want to tell you a little about LangGraph, a tool I have been studying and building with.

Traditional AI applications often struggle with complex workflows and dynamic state. LangGraph offers a robust solution: stateful agents that can manage complex conversations, make context-based decisions, and execute sophisticated workflows.

This article provides a step-by-step guide to building applications using LangGraph, a framework for creating multi-step agents with state graphs.


Implementation Steps:

  1. Set Up Environment and Install Dependencies
  2. Define Application State
  3. Create Graph Nodes
  4. Configure State Graph
  5. Run the Agent

1. Set Up Environment and Install Dependencies

The first step is to set up the Python environment and install the necessary libraries:

pip install langgraph langchain langchain-openai

Configure your API credentials:

import os
from langchain_openai import ChatOpenAI

# Configure your API Key
os.environ["OPENAI_API_KEY"] = "your-api-key-here"

# Initialize the model
llm = ChatOpenAI(model="gpt-4", temperature=0)

2. Define Application State

LangGraph uses a TypedDict to define the state that will be shared between graph nodes:

from typing import TypedDict, Annotated
from operator import add

class AgentState(TypedDict):
    """State shared between graph nodes"""
    messages: Annotated[list, add]
    user_input: str
    response: str
    next_step: str

This state stores:

  • messages: History of exchanged messages
  • user_input: Current user input
  • response: Response generated by the agent
  • next_step: Next action to be executed
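The `Annotated[list, add]` annotation is what makes `messages` accumulate: LangGraph applies the reducer as `new_value = add(existing, update)` instead of overwriting the channel. This tiny stdlib-only snippet shows the merge semantics in isolation:

```python
from operator import add

# LangGraph applies the reducer as: new_value = add(existing, update)
history = [{"role": "user", "content": "Hi"}]
update = [{"role": "assistant", "content": "Hello!"}]

merged = add(history, update)  # equivalent to history + update
print(len(merged))  # 2: the history grows instead of being replaced
```

This is why nodes only need to return the *new* messages: the reducer takes care of appending them to the shared history.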

Graph State


3. Create Graph Nodes

3.1 - Input Processing Node

This node processes user input and prepares the context:

def process_input(state: AgentState) -> dict:
    """Processes user input and prepares the context"""
    user_message = state["user_input"]
    
    # Return only the keys to update. Because `messages` uses the
    # `add` reducer, the new message is appended to the existing
    # history; mutating the state and returning it whole would
    # duplicate the messages.
    return {
        "messages": [{"role": "user", "content": user_message}],
        "next_step": "analyze",
    }

3.2 - Analysis and Decision Node

This node uses the LLM to analyze input and decide the next action:

from langchain_core.prompts import ChatPromptTemplate

def analyze_request(state: AgentState) -> dict:
    """Analyzes the request and decides the next action"""
    
    prompt = ChatPromptTemplate.from_messages([
        ("system", "You are an intelligent assistant. Analyze the user's request and determine the best way to respond."),
        ("user", "{input}")
    ])
    
    chain = prompt | llm
    
    result = chain.invoke({
        "input": state["user_input"]
    })
    
    # Return only the updated keys; `messages` is left untouched
    return {
        "response": result.content,
        "next_step": "respond",
    }

3.3 - Response Node

This node formats and returns the final response:

def generate_response(state: AgentState) -> dict:
    """Generates the final response"""
    
    # The `add` reducer appends this message to the history
    return {
        "messages": [{"role": "assistant", "content": state["response"]}],
        "next_step": "END",
    }

4. Configure State Graph

4.1 - Create the Graph

Now let's connect all nodes in a state graph:

from langgraph.graph import StateGraph, END

# Create the graph
workflow = StateGraph(AgentState)

# Add nodes
workflow.add_node("process_input", process_input)
workflow.add_node("analyze", analyze_request)
workflow.add_node("respond", generate_response)

# Define entry point
workflow.set_entry_point("process_input")

# Add transitions (edges)
workflow.add_edge("process_input", "analyze")
workflow.add_edge("analyze", "respond")
workflow.add_edge("respond", END)

# Compile the graph
app = workflow.compile()
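To see what the compiled graph does with the state, here is a pure-Python sketch of the same three-step flow, with the LLM call replaced by a stub so it runs without an API key (the stub and its echo output are illustrative assumptions, not LangGraph APIs):

```python
# Sketch of the graph's execution order with a stubbed analysis step.
def stub_analyze(state: dict) -> dict:
    # Stands in for analyze_request; a real run would call the LLM here
    return {**state, "response": f"Echo: {state['user_input']}", "next_step": "respond"}

def run_pipeline(user_input: str) -> dict:
    state = {"messages": [], "user_input": user_input, "response": "", "next_step": ""}
    # process_input: record the user message
    state = {**state,
             "messages": state["messages"] + [{"role": "user", "content": user_input}],
             "next_step": "analyze"}
    # analyze: decide on a response (stubbed)
    state = stub_analyze(state)
    # respond: record the assistant message and finish
    state = {**state,
             "messages": state["messages"] + [{"role": "assistant", "content": state["response"]}],
             "next_step": "END"}
    return state

final = run_pipeline("hello")
print(final["next_step"])      # END
print(len(final["messages"]))  # 2
```

The compiled `app` performs the same sequence, except that LangGraph drives the transitions and applies the reducers for you.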

4.2 - Visualize the Graph

LangGraph allows you to visualize the graph structure:

from IPython.display import Image, display

try:
    display(Image(app.get_graph().draw_mermaid_png()))
except Exception:
    print("Graph visualization requires additional dependencies")

Graph Flow


5. Run the Agent

5.1 - Execute a Simple Query

def run_agent(user_input: str):
    """Runs the agent with user input"""
    
    # Initial state
    initial_state = {
        "messages": [],
        "user_input": user_input,
        "response": "",
        "next_step": ""
    }
    
    # Execute the graph
    result = app.invoke(initial_state)
    
    return result["response"]

# Test the agent
response = run_agent("What is the capital of France?")
print(f"Response: {response}")

Expected output:

Response: The capital of France is Paris.

5.2 - Execute with Streaming

For interactive applications, you can use streaming:

async def run_agent_stream(user_input: str):
    """Runs the agent with streaming"""
    
    initial_state = {
        "messages": [],
        "user_input": user_input,
        "response": "",
        "next_step": ""
    }
    
    async for event in app.astream(initial_state):
        for node_name, node_state in event.items():
            print(f"\n--- {node_name} ---")
            if "response" in node_state and node_state["response"]:
                print(f"Partial response: {node_state['response']}")

Advanced Features

Checkpoints and Persistence

LangGraph supports checkpoints to save state:

from langgraph.checkpoint.memory import MemorySaver

# Add memory to the graph
memory = MemorySaver()
app_with_memory = workflow.compile(checkpointer=memory)

# Execute with persistence
config = {"configurable": {"thread_id": "user-123"}}
result = app_with_memory.invoke(initial_state, config)
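Conceptually, the checkpointer keys saved state by `thread_id`, so each conversation resumes where it left off while other threads stay isolated. This dict-based toy (not the real `MemorySaver` API) illustrates the idea:

```python
# Toy illustration of thread-scoped persistence; the real MemorySaver
# has a different interface and also stores per-step checkpoints.
class ToyCheckpointer:
    def __init__(self):
        self._store = {}

    def save(self, thread_id: str, state: dict) -> None:
        self._store[thread_id] = state

    def load(self, thread_id: str) -> dict:
        # An unknown thread starts from an empty state
        return self._store.get(thread_id, {"messages": []})

cp = ToyCheckpointer()
cp.save("user-123", {"messages": [{"role": "user", "content": "Hi"}]})

print(len(cp.load("user-123")["messages"]))  # 1: state survives between calls
print(len(cp.load("user-456")["messages"]))  # 0: other threads are isolated
```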

Conditions and Dynamic Routing

You can add conditional logic for routing:

def router(state: AgentState) -> str:
    """Determines the next node based on state"""
    
    if "urgent" in state["user_input"].lower():
        return "priority_handler"
    else:
        return "normal_handler"

# Add conditional routing
workflow.add_conditional_edges(
    "analyze",
    router,
    {
        "priority_handler": "priority_node",
        "normal_handler": "normal_node"
    }
)
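Because the router is a plain function, you can check which branch it selects without building the graph at all (the router below restates the one above so the snippet is self-contained):

```python
def router(state: dict) -> str:
    """Determines the next node based on state"""
    if "urgent" in state["user_input"].lower():
        return "priority_handler"
    else:
        return "normal_handler"

# Exercise both branches directly:
print(router({"user_input": "URGENT: server down"}))  # priority_handler
print(router({"user_input": "What time is it?"}))     # normal_handler
```

Note that both branch names returned by the router must appear as keys in the mapping passed to `add_conditional_edges`, and the target nodes ("priority_node" and "normal_node" here) must be added to the graph before compiling.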

Use Cases

LangGraph is ideal for:

  1. Complex Chatbots: Managing multi-turn conversations with context
  2. Autonomous Agents: Creating agents that make state-based decisions
  3. Processing Workflows: Orchestrating data processing pipelines
  4. Multi-Agent Systems: Coordinating multiple specialized agents

See It in Action

For more details and practical examples, visit:


Conclusion

LangGraph offers a powerful and flexible approach to building stateful AI applications. By combining state graphs with LLMs, you can create sophisticated systems that manage complex conversations, make contextual decisions, and execute dynamic workflows.

LangGraph's modular structure allows you to scale from simple chatbots to complex multi-agent systems while keeping your code organized and maintainable.

 

Thanks!

