Why deploy LangGraph.js to production?
Running LangGraph.js locally works for development. Production is different. Your agent needs an API, it needs to handle concurrent requests, and secrets need to stay out of your code.
Self-hosting means Docker, infrastructure, an API layer, and scaling -- all on you. Crewship handles that with a single command. You keep writing TypeScript.
Prerequisites
- Node.js 18+ installed on your machine
- A LangGraph.js project with a compiled graph
- A Crewship account — sign up free
Deploy in 6 steps
Each step builds on the previous one. The whole process takes a few minutes.
Install the CLI
One-line install on macOS and Linux.
curl -fsSL https://www.crewship.dev/install.sh | bash
Configure crewship.toml
Add a configuration file to your project root. Set the framework to "langgraphjs" and point the entrypoint to your compiled graph.
[deployment]
framework = "langgraphjs"
entrypoint = "src/graph.ts:app"
profile = "slim"
node = "20"
[build]
exclude = [ "tests", "dist" ]Set environment variables
Add API keys your agents need. Secrets are encrypted and injected at runtime.
crewship env set OPENAI_API_KEY sk-...
crewship env set ANTHROPIC_API_KEY sk-ant-...
Deploy
A single command packages your code, builds the image, and deploys it.
$ crewship deploy
📦 Packaging agent...
⬆️ Uploading build context...
🔨 Building image...
✅ Deployed successfully!
Deployment: dep_abc123xyz
Version: 1
Console: console.crewship.dev/deployments/dep_abc123xyz
Run via API
Trigger an execution with a POST request.
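From TypeScript, the same request can be issued with the global fetch available in Node 18+. A sketch (the endpoint and body mirror the curl call below; the API key and deployment ID are placeholders):

```typescript
type Message = { role: "user" | "assistant"; content: string };

// Build the JSON body for POST /v1/runs.
function buildRunPayload(deploymentId: string, messages: Message[]) {
  return { deployment_id: deploymentId, input: { messages } };
}

// Issue the request. Node 18+ ships fetch globally.
async function createRun(apiKey: string, deploymentId: string, messages: Message[]) {
  const res = await fetch("https://api.crewship.dev/v1/runs", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(buildRunPayload(deploymentId, messages)),
  });
  if (!res.ok) throw new Error(`Run request failed: ${res.status}`);
  return res.json();
}
```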
curl -X POST https://api.crewship.dev/v1/runs \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"deployment_id": "dep_abc123",
"input": {
"messages": [{"role": "user", "content": "Hello"}]
}
}'
Stream execution with SSE
Add stream=true to watch agent output in real-time via Server-Sent Events.
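Consuming the stream from code comes down to splitting the response on newlines and parsing each `data:` frame as JSON. A minimal parser (the event shapes follow the sample events shown below):

```typescript
type SseEvent =
  | { type: "node_start"; node: string; content: string }
  | { type: "tool_call"; tool: string; input: string }
  | { type: "run_complete"; output: string };

// Parse one line of the SSE stream. Returns null for blank
// keep-alive lines, comments, and anything that isn't a data frame.
function parseSseLine(line: string): SseEvent | null {
  if (!line.startsWith("data:")) return null;
  const json = line.slice("data:".length).trim();
  if (json === "") return null;
  return JSON.parse(json) as SseEvent;
}
```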
curl -N -X POST "https://api.crewship.dev/v1/runs?stream=true" \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"deployment_id": "dep_abc123",
"input": {
"messages": [{"role": "user", "content": "Research AI agents"}]
}
}'
# SSE events:
# data: {"type":"node_start","node":"agent","content":"..."}
# data: {"type":"tool_call","tool":"search","input":"AI agents 2026"}
# data: {"type":"run_complete","output":"..."}Production features included
Every deployment includes monitoring, debugging, and production tooling.
Execution Traces
See every node execution, tool call, and LLM request in a timeline view.
Versioning
Every deployment gets a version number. Roll back to any previous version if something breaks.
Threads
Multi-turn conversations with persistent state across messages.
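The exact way a run is attached to a thread is not shown above; one plausible request shape, where `thread_id` is an assumed field name and not a confirmed part of the API, would be:

```typescript
// HYPOTHETICAL: attach a run to an existing thread so conversation
// state carries over between messages. The `thread_id` field is an
// assumption; check the console or API reference for the real name.
function buildThreadedRunPayload(deploymentId: string, threadId: string, content: string) {
  return {
    deployment_id: deploymentId,
    thread_id: threadId, // assumed field name
    input: { messages: [{ role: "user" as const, content }] },
  };
}
```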
Webhooks
Get notified when runs complete, fail, or hit milestones.
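On the receiving side, a webhook is just an HTTP POST to a URL you control. A minimal Node handler sketch (the payload field names `event` and `run_id` are assumptions, not a documented schema):

```typescript
import { createServer } from "node:http";

// HYPOTHETICAL payload shape: { event: "run.completed" | "run.failed", run_id: string }
function describeEvent(body: { event: string; run_id: string }): string {
  return `run ${body.run_id}: ${body.event}`;
}

const server = createServer((req, res) => {
  if (req.method !== "POST") {
    res.writeHead(405).end();
    return;
  }
  let data = "";
  req.on("data", (chunk) => (data += chunk));
  req.on("end", () => {
    console.log(describeEvent(JSON.parse(data)));
    res.writeHead(200).end("ok"); // acknowledge receipt promptly
  });
});
// server.listen(3000);
```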
Artifacts
Download files your agents generate: reports, CSVs, images.
Advanced configuration
Customize your deployment with these crewship.toml options.
Build profiles
Use "slim" for lightweight agents. Use "browser" if your agent needs Playwright and Chromium for web scraping.
profile = "browser"Node.js versions
Specify Node.js 18, 20, or 22 depending on your dependencies.
node = "22"Graph entrypoints
Point to the file and export name of your compiled StateGraph.
entrypoint = "src/agent/graph.ts:app"Build exclusions
Keep your build lean by excluding test files, dist output, and other non-essential directories.
exclude = [ "tests", "dist", "__tests__" ]FAQ
Deployment questions
Ready to deploy your agent?
Get started for free. No credit card required.