LangGraph
+
Crewship

Deploy LangGraph agents to production

Crewship is the easiest way to host your stateful LangGraph agents. Deploy with a single command, get a production API endpoint, and let us handle the infrastructure.

One-Command Deploy

Deploy with `crewship deploy`. No Docker or Kubernetes required.

Real-time Streaming

Stream graph execution and node outputs in real time via Server-Sent Events (SSE).
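Because the stream is plain SSE, any SSE-capable client can consume it. A minimal parsing sketch in Python, assuming an illustrative event payload shape (the actual wire format is not specified here):

```python
import json

def parse_sse(lines):
    """Yield decoded `data:` payloads from an SSE stream.

    Events are separated by blank lines; an event may span several
    `data:` lines, which are joined with newlines before decoding.
    """
    buffer = []
    for raw in lines:
        line = raw.rstrip("\n")
        if line.startswith("data:"):
            buffer.append(line[len("data:"):].lstrip())
        elif line == "" and buffer:
            yield json.loads("\n".join(buffer))
            buffer = []

# Example: two streamed node-output events (payloads are illustrative)
stream = [
    'data: {"node": "research", "output": "Found 3 sources"}\n',
    "\n",
    'data: {"node": "summarize", "output": "Done"}\n',
    "\n",
]
events = list(parse_sse(stream))
```

In a real client the `lines` iterable would come from the open HTTP response rather than a list.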

Artifact Storage

Agents write files to artifacts/. Reports, data, images -- collected and downloadable via API.
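From node code, producing a collectible output is just a file write into artifacts/. A sketch of a small helper for this (the helper itself is illustrative; only the directory name comes from the feature above):

```python
from pathlib import Path

ARTIFACTS_DIR = Path("artifacts")

def save_artifact(name: str, content: bytes) -> Path:
    """Write a file into artifacts/ so the platform can collect it."""
    ARTIFACTS_DIR.mkdir(exist_ok=True)
    path = ARTIFACTS_DIR / name
    path.write_bytes(content)
    return path

report = save_artifact("report.md", b"# Weekly summary\n")
```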

Slack Integration

@mention for conversations, slash commands for one-shot runs.

Auto-scaling

Scale to zero when idle, scale up automatically under load.

Secure by Default

Isolated environments and encrypted secrets management.

Deploy in one command

Run `crewship deploy` and you're live. Crewship detects your LangGraph project, builds a container, and gives you a production API endpoint.

  • Works with any LangGraph project structure
  • Automatic dependency detection
  • Secure secrets injection at runtime
Terminal
$ crewship deploy

📦 Packaging agent...
⬆️  Uploading build context...
🔨 Building image...

✅ Deployed successfully!

   Deployment: dep_abc123xyz
   Version:    1
   Console:    console.crewship.dev/deployments/dep_abc123xyz

Two APIs for every use case

Every deployment comes with two API modes. Use the Runs API for stateless, one-shot executions. Use the Threads API for multi-turn conversations that remember context across messages.

  • Runs -- fire-and-forget tasks like research, analysis, and data processing
  • Threads -- persistent state for chatbots, support agents, and iterative workflows
  • Real-time streaming, webhook callbacks, and token authentication
curl -X POST https://api.crewship.dev/v1/runs \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "deployment_id": "dep_abc123",
    "input": {
      "messages": [{"role": "user", "content": "Research AI agents"}]
    }
  }'
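The same run can be started from Python. A sketch using only the standard library, building the exact request shown in the curl example (the send step is commented out so the snippet stays side-effect free):

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder, as in the curl example

payload = {
    "deployment_id": "dep_abc123",
    "input": {
        "messages": [{"role": "user", "content": "Research AI agents"}]
    },
}

req = urllib.request.Request(
    "https://api.crewship.dev/v1/runs",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# Uncomment to actually start the run:
# with urllib.request.urlopen(req) as resp:
#     run = json.load(resp)
```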

Full visibility

Monitor every agent run

The Crewship console gives you complete visibility into your agent deployments, runs, and outputs.

console.crewship.dev

Overview of runs and artifacts

Integration

Add your agents to Slack

Connect any deployed LangGraph agent to Slack in minutes. Your team can interact with AI agents directly from Slack channels -- no custom code required.

  • @mention the bot for threaded conversations with full context
  • Slash commands to trigger one-shot runs from any channel
  • Connect via Settings with a pre-configured Slack app
#engineering
Alex  3:12 PM

@Crewship Summarize the latest PRs in the backend repo

Crewship APP  3:12 PM

Found 4 merged PRs this week. #312 adds retry logic to the webhook handler, #309 migrates auth to JWT tokens, #307 fixes a connection pool leak...

How does Crewship compare to LangGraph Platform?

LangGraph Platform is LangChain's own hosting solution. Crewship is an independent alternative that supports both LangGraph and CrewAI, with a CLI-first workflow and transparent pricing.

  • 5x more runs -- 500 vs 100 at $25/mo
  • $0 overage fees -- no per-execution charges
  • 2+ frameworks -- LangGraph + CrewAI support
  • Instant rollback -- version management built in

What is LangGraph?

LangGraph is a library for building stateful, multi-actor applications with LLMs. You define flows as graphs with cycles, enabling agents that loop, branch, and maintain state across multiple steps.

Agents call tools, route based on results, and persist conversations using checkpointers. Human-in-the-loop workflows are built in.
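The loop-and-branch pattern described above is easy to see without the library: nodes are functions sharing state, and edges decide what runs next. A framework-free sketch of that control flow (node names and state shape are illustrative, not LangGraph's actual API):

```python
def call_tool(state):
    # Node: pretend to call a search tool and record the result.
    state["results"].append(f"result-{state['step']}")
    state["step"] += 1
    return state

def route(state):
    # Conditional edge: loop back to the tool until we have enough
    # results, then branch to the finishing node.
    return "call_tool" if state["step"] < 3 else "summarize"

def summarize(state):
    state["answer"] = f"Found {len(state['results'])} results"
    return state

nodes = {"call_tool": call_tool, "summarize": summarize}

# Run the graph: start at call_tool and follow edges until done.
state = {"step": 0, "results": [], "answer": None}
current = "call_tool"
while True:
    state = nodes[current](state)
    if current == "summarize":
        break
    current = route(state)
```

LangGraph formalizes exactly this: typed state, declared nodes and edges, plus checkpointers so the state survives across process restarts and conversation turns.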

Crewship handles the production infrastructure for LangGraph: deployment, scaling, state persistence, and real-time streaming. You focus on building great graphs; we handle the rest.

Read our complete guide to LangGraph to learn about graphs, agents, state, tools, and memory.

FAQ

Common questions about deploying LangGraph

Can't find what you're looking for? Reach out to us.

Ready to deploy your LangGraph agent?

Get started for free. No credit card required.