Why persistence matters
A graph without a checkpointer is stateless. Each call to invoke() runs from the beginning with no memory of previous interactions. This is fine for single-shot tasks, but most real applications need conversation history.
Adding a checkpointer changes this. The graph saves its state after every node execution. When you invoke it again with the same thread ID, it picks up exactly where it left off. The agent remembers what the user said, what tools it called, and what decisions it made.
This is also what makes human-in-the-loop work. When a graph pauses for human input (via interrupt()), the checkpointer saves the state. The human can respond hours later, and the graph resumes from where it stopped.
Types of memory
LangGraph separates memory into layers. Each one handles a different scope.
Short-term (Checkpointer)
Per thread. Saves graph state after each node execution within a thread. Enables multi-turn conversations where the agent remembers earlier messages. State is tied to a thread_id.
Long-term (Store)
Across threads. Key-value storage that persists across threads. Store user preferences, learned facts, or any data that should survive beyond a single conversation.
Conversation history
Per thread. The messages list in your state, managed by the add_messages reducer. Each turn appends new messages. Combined with a checkpointer, this gives you full conversation history.
Getting started with InMemorySaver
The simplest checkpointer. Stores everything in process memory. Use it for development and testing. Pass a thread_id in the config to maintain separate conversation sessions.
from langgraph.checkpoint.memory import InMemorySaver
from langgraph.graph import StateGraph, START, END
# Create a checkpointer (development only)
checkpointer = InMemorySaver()
# Compile with the checkpointer
graph = StateGraph(State)
graph.add_node("chatbot", chatbot)
graph.add_edge(START, "chatbot")
graph.add_edge("chatbot", END)
app = graph.compile(checkpointer=checkpointer)
# Each thread maintains its own conversation history
config = {"configurable": {"thread_id": "user-123"}}
result = app.invoke(
    {"messages": [("user", "Hi, my name is Alice")]},
    config=config,
)

# Same thread_id continues the conversation
result = app.invoke(
    {"messages": [("user", "What's my name?")]},
    config=config,
)
# The agent remembers: "Your name is Alice"

Production checkpointers
For production, use a checkpointer backed by a real database. Install the corresponding package and swap out InMemorySaver.
InMemorySaver
Package: langgraph. Stores state in process memory. Fast but not persistent. Use for development and testing only.
PostgresSaver
Package: langgraph-checkpoint-postgres. Stores state in PostgreSQL. Persistent and production-ready. Supports async with AsyncPostgresSaver.
RedisSaver
Package: langgraph-checkpoint-redis. Stores state in Redis. Fast reads/writes with persistence. Good for high-throughput applications.
MongoDBSaver
Package: langgraph-checkpoint-mongodb. Stores state in MongoDB. Flexible document storage. Supports async with AsyncMongoDBSaver.
from langgraph.checkpoint.postgres import PostgresSaver
# Connect to PostgreSQL (production)
DB_URI = "postgresql://user:pass@localhost:5432/mydb"
with PostgresSaver.from_conn_string(DB_URI) as checkpointer:
    # Create tables on first use
    checkpointer.setup()
    # Compile with persistent checkpointer
    app = graph.compile(checkpointer=checkpointer)
    # State survives process restarts
    config = {"configurable": {"thread_id": "user-123"}}
    result = app.invoke(
        {"messages": [("user", "Hello")]},
        config=config,
    )

Long-term memory with stores
Checkpointers save per-thread state. But sometimes you need memory that works across threads. Stores give you key-value storage for user preferences, learned facts, or any data that should persist beyond a single conversation.
from langgraph.checkpoint.memory import InMemorySaver
from langgraph.store.memory import InMemoryStore
# Short-term memory (per thread)
checkpointer = InMemorySaver()
# Long-term memory (across threads)
store = InMemoryStore()
app = graph.compile(
    checkpointer=checkpointer,
    store=store,
)
# Inside a node, access the store:
def my_node(state: State, config, *, store):
    user_id = config["configurable"]["user_id"]
    # Read long-term memories
    memories = store.search(
        ("user", user_id),
        query="preferences",
    )
    # Save a new memory
    store.put(
        ("user", user_id),
        "favorite_color",
        {"value": "blue"},
    )
    return {"messages": [response]}

Memory on Crewship
Short-term memory in runs
In-memory state works within each run. Agents maintain context during execution, but the memory is discarded when the run completes.
Persistent state with Threads
For persistent memory across interactions, use the Threads API. Threads maintain conversation state across multiple runs, so your agent remembers past interactions.
Structured data with Tables
Use Tables to store structured data that persists across executions. Agents can read from and write to tables during runs.