What are Flows?
A CrewAI Crew is great for a team of agents working on a set of tasks. But sometimes you need more — multiple steps that depend on each other, persistent state across interactions, or conditional routing based on results.
That's what Flows are for. A Flow is a Python class that defines a pipeline of methods. Each method can call agents, crews, or plain functions. State flows between methods automatically, and you control the execution order with decorators.
Flows vs Crews
Not everything needs a Flow. A Crew is enough when a single team of agents can finish the work in one pass. Reach for a Flow when you need a multi-step pipeline, state that persists across interactions, or conditional routing based on intermediate results.
Flow decorators
Three decorators control how methods in a flow execute.
@start()
Marks an entry point of a flow. This method runs when you call flow.kickoff(). A flow needs at least one @start method; if it has several, they all run when the flow kicks off.
@listen(method)
Triggers after the specified method completes. You can chain multiple listeners to build multi-step pipelines. Pass a method reference or a string name.
@router(method)
Runs after the specified method and returns a string that determines which @listen handler runs next. Use this for conditional branching and routing logic.
Flow basics: state and steps
Define your state as a Pydantic BaseModel, then pass it as a type parameter to Flow[YourState]. Access state via self.state in any method.
```python
from crewai.flow.flow import Flow, start, listen
from pydantic import BaseModel

class MyState(BaseModel):
    query: str = ""
    result: str = ""
    score: float = 0.0

class MyFlow(Flow[MyState]):
    @start()
    def process_input(self):
        # First step — runs when flow.kickoff() is called
        self.state.result = analyze(self.state.query)
        return self.state.result

    @listen(process_input)
    def evaluate(self):
        # Triggered after process_input completes
        self.state.score = score(self.state.result)
        return self.state.score
```

Real example: a chat flow
Here's a real Flow from a production chatbot. It maintains conversation history, runs a chat agent, and generates follow-up questions after each response.
```python
from crewai.flow.flow import Flow, start, listen
from pydantic import BaseModel

class ChatState(BaseModel):
    query: str = ""
    messages: list[dict] = []
    suggested_questions: list[str] = []

class ChatFlow(Flow[ChatState]):
    @start()
    def chat(self):
        # Build conversation history for the agent
        history = format_messages(self.state.messages)
        # Run the chat agent with context
        response = chat_agent.kickoff(
            inputs={"query": self.state.query, "history": history}
        )
        # Update state with new messages
        self.state.messages.append(
            {"role": "user", "content": self.state.query}
        )
        self.state.messages.append(
            {"role": "assistant", "content": response.raw}
        )
        return response.raw

    @listen(chat)
    def suggest(self):
        # Generate follow-up questions after each response
        result = suggest_agent.kickoff(
            inputs={"messages": str(self.state.messages[-4:])}
        )
        self.state.suggested_questions = parse_questions(result.raw)
        return self.state.suggested_questions
```

Conditional routing with @router
The @router decorator lets you branch the pipeline based on a method's return value. Return a string that matches a @listen handler's name to route execution there.
```python
from crewai.flow.flow import Flow, start, listen, router
from pydantic import BaseModel

class TriageState(BaseModel):
    input: str = ""
    category: str = ""

class TriageFlow(Flow[TriageState]):
    @start()
    def classify(self):
        # Classify the incoming request
        self.state.category = classifier.kickoff(
            inputs={"text": self.state.input}
        ).raw
        return self.state.category

    @router(classify)
    def route(self):
        # Route to different handlers based on classification
        if self.state.category == "urgent":
            return "handle_urgent"
        elif self.state.category == "question":
            return "handle_question"
        return "handle_general"

    @listen("handle_urgent")
    def handle_urgent(self):
        # Urgent handler with priority agent
        ...

    @listen("handle_question")
    def handle_question(self):
        # Question handler with knowledge agent
        ...

    @listen("handle_general")
    def handle_general(self):
        # General handler
        ...
```

Deploying flows
Deploy a Flow the same way you deploy a Crew — just point the entrypoint to your Flow class. Crewship detects the Flow pattern and handles it automatically.
For chat-based flows, set input_key and output_key in the [chat] section to enable the Threads API for multi-turn conversations.
```toml
# crewship.toml
[deployment]
framework = "crewai"
entrypoint = "example_crew_chat.flows.chat_flow:ChatFlow"
python = "3.11"
profile = "slim"

[chat]
input_key = "query"
output_key = "messages"
```

FAQ
Deploy your flows to production
Built your flow? Deploy it with a single command.