How to deploy LangGraph to production

A step-by-step guide to deploying your LangGraph agents to production with Crewship. From install to live API in under 5 minutes.

Why deploy LangGraph to production?

Running LangGraph locally works for development. Production is different. Your agent needs an API, it needs to handle concurrent requests, and secrets need to stay out of your code.

Self-hosting means Docker, infrastructure, an API layer, and scaling -- all on you. Crewship handles that with a single command. You keep writing Python.

Prerequisites

  • Python 3.10+ installed on your machine
  • A LangGraph project with a compiled graph
  • A Crewship account (sign up free)

Deploy in 6 steps

Each step builds on the previous one. The whole process takes a few minutes.

1. Install the CLI

One-line install on macOS and Linux.

```bash
curl -fsSL https://www.crewship.dev/install.sh | bash
```
2. Configure crewship.toml

Add a configuration file to your project root. Set the framework to "langgraph" and point the entrypoint to your compiled graph.

```toml
[deployment]
framework = "langgraph"
entrypoint = "my_agent.graph:app"
profile = "slim"
python = "3.10"

[build]
exclude = [ "tests" ]
```
3. Set environment variables

Add API keys your agents need. Secrets are encrypted and injected at runtime.

```bash
crewship env set OPENAI_API_KEY sk-...
crewship env set ANTHROPIC_API_KEY sk-ant-...
```
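At runtime those secrets surface as ordinary environment variables, so agent code reads them the usual way. A minimal sketch (the helper name is mine, not a Crewship API):

```python
import os

def require_env(name: str) -> str:
    """Fetch a required secret from the environment, failing fast if absent."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

# e.g. openai_api_key = require_env("OPENAI_API_KEY")
```

Failing fast at startup beats a cryptic authentication error mid-run.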
4. Deploy

A single command packages your code, builds the image, and deploys it.

```bash
$ crewship deploy

📦 Packaging agent...
⬆️  Uploading build context...
🔨 Building image...

✅ Deployed successfully!

   Deployment: dep_abc123xyz
   Version:    1
   Console:    console.crewship.dev/deployments/dep_abc123xyz
```
5. Run via API

Trigger an execution with a POST request.

```bash
curl -X POST https://api.crewship.dev/v1/runs \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "deployment_id": "dep_abc123xyz",
    "input": {
      "messages": [{"role": "user", "content": "Hello"}]
    }
  }'
```
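The same call from Python, using only the standard library. The endpoint and payload mirror the curl example above; the function name is mine, and the sketch only builds the request — pass it to urllib.request.urlopen() to actually send it:

```python
import json
import urllib.request

API_URL = "https://api.crewship.dev/v1/runs"

def build_run_request(api_key: str, deployment_id: str, user_message: str) -> urllib.request.Request:
    """Construct the POST request for a run; send with urllib.request.urlopen()."""
    payload = {
        "deployment_id": deployment_id,
        "input": {"messages": [{"role": "user", "content": user_message}]},
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```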
6. Stream execution with SSE

Add stream=true as a query parameter to watch agent output in real time via Server-Sent Events.

```bash
curl -N -X POST "https://api.crewship.dev/v1/runs?stream=true" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "deployment_id": "dep_abc123xyz",
    "input": {
      "messages": [{"role": "user", "content": "Research AI agents"}]
    }
  }'

# SSE events:
# data: {"type":"node_start","node":"agent","content":"..."}
# data: {"type":"tool_call","tool":"search","input":"AI agents 2026"}
# data: {"type":"run_complete","output":"..."}
```
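An SSE stream is plain text: each event arrives as a `data: <json>` line. A minimal consumer sketch, assuming events carry a "type" field as in the examples above:

```python
import json

def parse_sse_lines(lines):
    """Yield decoded JSON events from an iterable of SSE text lines."""
    for line in lines:
        line = line.strip()
        if line.startswith("data:"):
            yield json.loads(line[len("data:"):].strip())

# Feed it lines read from the streaming HTTP response, e.g.:
#   for event in parse_sse_lines(response): handle(event)
```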

Advanced configuration

Customize your deployment with these crewship.toml options.

Build profiles

Choose "slim" for lightweight agents or "browser" for agents that need Playwright and Chromium to scrape websites.

```toml
profile = "browser"
```

Python versions

Specify Python 3.10, 3.11, or 3.12 depending on your dependencies.

```toml
python = "3.12"
```

Graph entrypoints

Point to your compiled StateGraph. The entrypoint should be the compiled graph object.

```toml
entrypoint = "my_project.agent:app"
```
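The module:attribute form mirrors the common Python entrypoint convention: everything before the colon is an importable module, everything after is an attribute on it. Crewship's actual loader isn't shown in this guide, but the resolution can be sketched with the standard library:

```python
import importlib

def load_entrypoint(spec: str):
    """Resolve a 'package.module:attribute' string to the object it names."""
    module_path, _, attr = spec.partition(":")
    module = importlib.import_module(module_path)
    return getattr(module, attr)

# load_entrypoint("my_project.agent:app") would return the compiled graph.
```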

Build exclusions

Keep your build lean by excluding test files, data, and other non-essential directories.

```toml
exclude = [ "tests", "data", "notebooks" ]
```
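One way to picture the effect (a sketch of plausible semantics — the guide doesn't document Crewship's exact matching rules): each entry names a top-level directory to drop from the uploaded build context.

```python
from pathlib import PurePosixPath

EXCLUDE = ["tests", "data", "notebooks"]

def is_excluded(relative_path: str, exclude=EXCLUDE) -> bool:
    """True if the path's top-level directory appears in the exclude list."""
    return PurePosixPath(relative_path).parts[0] in exclude

# is_excluded("tests/test_agent.py") vs. is_excluded("my_agent/graph.py")
```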


Ready to deploy your agent?

Get started for free. No credit card required.