
Last Updated: 3/11/2026


Graph API Reference

Complete reference for the LangGraph Graph API.

StateGraph

The main class for building graphs.

Constructor

Python:

```python
from langgraph.graph import StateGraph

graph = StateGraph(
    state_schema,        # TypedDict, Pydantic model, or dataclass
    input_schema=None,   # Optional input schema
    output_schema=None,  # Optional output schema
    config_schema=None,  # Optional runtime config schema
)
```

JavaScript:

```javascript
import { StateGraph, StateSchema } from "@langchain/langgraph";

const graph = new StateGraph({
  state: StateSchema,      // Required
  input: InputSchema,      // Optional
  output: OutputSchema,    // Optional
  context: ContextSchema,  // Optional
});
```

Methods

add_node

Add a node to the graph.

Python:

```python
graph.add_node(
    name: str,                         # Node name
    action: Callable,                  # Node function
    retry_policy: RetryPolicy = None,  # Retry configuration
    cache_policy: CachePolicy = None,  # Cache configuration
)
```

JavaScript:

```javascript
graph.addNode(
  name: string,
  action: Function,
  options?: {
    retryPolicy?: RetryPolicy,
    cachePolicy?: CachePolicy,
    ends?: string[]  // For Command routing
  }
);
```

add_edge

Add a static edge.

```python
graph.add_edge(from_node: str, to_node: str)
```

add_conditional_edges

Add conditional routing.

Python:

```python
graph.add_conditional_edges(
    source: str,                       # Source node
    path: Callable,                    # Routing function
    path_map: list[str] | dict = None  # Optional destination mapping
)
```

JavaScript:

```javascript
graph.addConditionalEdges(
  source: string,
  path: Function,
  pathMap?: string[] | Record<string, string>
);
```

compile

Compile the graph for execution.

Python:

```python
app = graph.compile(
    checkpointer: BaseCheckpointSaver = None,  # For persistence
    store: BaseStore = None,                   # For long-term memory
    interrupt_before: list[str] = None,        # Nodes to interrupt before
    interrupt_after: list[str] = None,         # Nodes to interrupt after
    debug: bool = False                        # Enable debug mode
)
```

JavaScript:

```javascript
const app = graph.compile({
  checkpointer?: BaseCheckpointSaver,
  store?: BaseStore,
  interruptBefore?: string[],
  interruptAfter?: string[],
  debug?: boolean
});
```

CompiledGraph

The compiled graph returned by compile().

invoke

Run the graph synchronously.

Python:

```python
result = app.invoke(
    input: dict,                              # Graph input
    config: RunnableConfig = None,            # Thread ID, etc.
    stream_mode: str | list[str] = "values",  # Stream mode(s)
    debug: bool = False
)
```

JavaScript:

```javascript
const result = await app.invoke(
  input: object,
  config?: {
    configurable?: { thread_id?: string },
    streamMode?: string | string[],
    recursionLimit?: number
  }
);
```

stream

Run the graph with streaming.

Python:

```python
for chunk in app.stream(
    input: dict,
    config: RunnableConfig = None,
    stream_mode: str | list[str] = "values",
    subgraphs: bool = False
):
    process(chunk)
```

JavaScript:

```javascript
for await (const chunk of await app.stream(
  input: object,
  config?: {
    streamMode?: string | string[],
    subgraphs?: boolean
  }
)) {
  process(chunk);
}
```

get_state

Get current state snapshot.

```python
state = app.get_state(config)
# state.values - Current state
# state.next   - Next nodes to execute
# state.tasks  - Pending tasks
```

get_state_history

Get execution history.

```python
for snapshot in app.get_state_history(config):
    print(snapshot.values)
```

update_state

Manually update state.

```python
app.update_state(
    config,
    values: dict,        # State updates
    as_node: str = None  # Treat as coming from this node
)
```

Special Nodes

START

Virtual entry node.

```python
from langgraph.graph import START

graph.add_edge(START, "first_node")
```

END

Virtual exit node.

```python
from langgraph.graph import END

graph.add_edge("last_node", END)
```

State Types

MessagesState

Prebuilt state with message history.

Python:

```python
from langgraph.graph import MessagesState

class MyState(MessagesState):
    custom_field: str
```

JavaScript:

```javascript
import { StateSchema, MessagesValue } from "@langchain/langgraph";

const MyState = new StateSchema({
  messages: MessagesValue,
  customField: z.string()
});
```

Utility Types

Command

Combine state updates with routing.

Python:

```python
from langgraph.types import Command

return Command(
    update={"field": "value"},
    goto="next_node"
)
```

JavaScript:

```javascript
import { Command } from "@langchain/langgraph";

return new Command({
  update: { field: "value" },
  goto: "nextNode"
});
```

interrupt

Pause for human input.

Python:

```python
from langgraph.types import interrupt

response = interrupt({"message": "Approve?"})
```

JavaScript:

```javascript
import { interrupt } from "@langchain/langgraph";

const response = interrupt({ message: "Approve?" });
```

Configuration

RunnableConfig

Python:

```python
config = {
    "configurable": {
        "thread_id": "conversation-1",
        "checkpoint_id": "...",  # Optional
        "checkpoint_ns": ""      # Optional
    },
    "recursion_limit": 25,       # Max steps
    "tags": ["production"],      # For tracing
}
```

JavaScript:

```javascript
const config = {
  configurable: {
    thread_id: "conversation-1",
    checkpoint_id: "...",  // Optional
    checkpoint_ns: ""      // Optional
  },
  recursionLimit: 25,      // Max steps
  tags: ["production"]     // For tracing
};
```

Next Steps

  • Error Handling: Common errors and solutions
  • Building Graphs: Practical guide to graph construction