LangGraph + Crewship
Coming Soon

Deploy LangGraph agents to production

Crewship is adding support for LangGraph. Deploy your stateful, multi-actor applications with a single command. Get a production API endpoint with real-time streaming.

One-Command Deploy

Deploy with `crewship deploy`. No Docker or Kubernetes required.

Real-time Streaming

Stream graph execution and node outputs in real-time.

Auto-scaling

Scale to zero when idle, scale up automatically under load.

Secure by Default

Isolated environments and encrypted secrets management.

What we're building

LangGraph support will bring all the features you love from Crewship to your graph-based agent workflows. Deploy complex, stateful applications without managing infrastructure.

  • Full LangGraph and LangChain compatibility
  • State persistence across graph executions
  • Stream node-by-node execution in real-time (see the sketch after this list)
  • Visual graph execution traces
  • Human-in-the-loop support
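
To make the streaming point concrete: LangGraph already exposes node-by-node events locally through its `stream()` API, and the idea is that a Crewship deployment surfaces the same kind of events from your hosted endpoint. The two-node graph and state fields below are placeholders for illustration, not part of any Crewship API.

```python
# Sketch of node-by-node streaming with LangGraph's stream() API.
# The two-node graph and state fields are placeholders for a real workflow.
from typing import TypedDict

from langgraph.graph import StateGraph, START, END


class RunState(TypedDict):
    question: str
    answer: str


def research(state: RunState) -> dict:
    # Stand-in for a retrieval or tool-calling node.
    return {"answer": f"notes on {state['question']}"}


def summarize(state: RunState) -> dict:
    return {"answer": state["answer"].upper()}


builder = StateGraph(RunState)
builder.add_node("research", research)
builder.add_node("summarize", summarize)
builder.add_edge(START, "research")
builder.add_edge("research", "summarize")
builder.add_edge("summarize", END)
graph = builder.compile()

# stream_mode="updates" yields one event per node as it finishes,
# e.g. {'research': {'answer': 'notes on what is langgraph?'}}
for event in graph.stream({"question": "what is langgraph?", "answer": ""}, stream_mode="updates"):
    print(event)
```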
$ crewship deploy

Detecting framework... LangGraph
Building container...
Deploying to production...

✓ Deployed to my-graph
✓ API endpoint ready
✓ Streaming enabled

https://api.crewship.dev/v1/runs

What is LangGraph?

LangGraph is a library for building stateful, multi-actor applications with LLMs. Built on top of LangChain, it adds the ability to define flows as graphs with cycles, enabling more complex agent behaviors.

Unlike simple chains, LangGraph allows agents to loop, branch, and maintain state across multiple steps. This makes it ideal for building sophisticated AI applications like autonomous agents, multi-agent systems, and human-in-the-loop workflows.
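
As a rough illustration of what that looks like in code, here is a minimal LangGraph graph that keeps a revision counter in its state and loops back to the same node until a condition is met. The node name, state fields, and stopping rule are invented for the example.

```python
# Minimal sketch of a LangGraph graph that loops and maintains state.
# Node name, state fields, and the stopping rule are illustrative only.
from typing import TypedDict

from langgraph.graph import StateGraph, START, END


class DraftState(TypedDict):
    draft: str
    revisions: int


def write(state: DraftState) -> dict:
    # A real node would call an LLM here; we just append a marker.
    return {"draft": state["draft"] + " +edit", "revisions": state["revisions"] + 1}


def should_continue(state: DraftState) -> str:
    # Loop back to "write" until three revisions have been made.
    return "write" if state["revisions"] < 3 else END


builder = StateGraph(DraftState)
builder.add_node("write", write)
builder.add_edge(START, "write")
builder.add_conditional_edges("write", should_continue)

graph = builder.compile()
final_state = graph.invoke({"draft": "outline", "revisions": 0})
print(final_state)  # {'draft': 'outline +edit +edit +edit', 'revisions': 3}
```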

Crewship will handle the production infrastructure for LangGraph: deployment, scaling, state persistence, and real-time streaming. You focus on building great graphs.

Get notified when LangGraph support launches

Join the waitlist and be the first to deploy LangGraph on Crewship.