# Best Practices and Rules
Follow these guidelines to build reliable, maintainable Conversational Graph agents. These patterns are drawn from real-world usage and help you avoid common pitfalls.
## Use the Right Tool for Each Job
Conversational Graph provides several ways to communicate and control flow. Choosing the right one for each situation ensures your graph behaves predictably.
| Need | Use | Why |
|---|---|---|
| Deterministic speech | `ctx.say("message")` | Exact words via TTS |
| Data collection | `collect(prompt=...)` | Built-in extractor pulls structured data from user input |
| Advance to a node | `Route("target")` | Explicit flow control |
| Retry current node | `Interrupt(say)` | Re-ask with a new message |
| Announce (no extraction) | `ctx.ask("instruction")` | LLM rephrases the instruction |
## Prompt Audience
Understanding who reads your text - the LLM or the user - is critical to writing effective prompts and messages.
- `collect(prompt=...)` and `ctx.ask()` are LLM-facing - write them as instructions to the model.
- `Interrupt(say)` and `ctx.say()` are user-facing - write them as direct speech.
`Interrupt()` holds the graph and retries the same node; `ctx.say()` just streams speech and moves ahead.
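The difference can be sketched with a hypothetical validation node. The `Interrupt` and `Route` dataclasses below are stand-ins for the framework's types, and the node body is illustrative, not the engine's real retry machinery:

```python
from dataclasses import dataclass


@dataclass
class Interrupt:
    """Stand-in: hold the graph and re-ask the current node."""
    say: str


@dataclass
class Route:
    """Stand-in: advance to the named node."""
    target: str


def check_age(state: dict):
    # Returning Interrupt re-runs this same node with a user-facing
    # message; returning Route moves the graph forward.
    if state.get("age") is None or state["age"] < 18:
        return Interrupt(say="Please tell me your age. You must be 18 or older.")
    return Route("collect_email")


held = check_age({"age": None})     # Interrupt: same node will retry
moved = check_age({"age": 30})      # Route: graph advances
```

Note the `Interrupt` message is written as direct speech to the user, while a `Route` target is pure flow control the user never hears.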
## Positive Guards
Write guards that check for specific known values rather than negating unknown ones. This makes your branching logic clearer and less error-prone.
```python
# Good
if state.employment_status == "unemployed":
    return Route("reject_unemployed", update={"income": 0})

# Avoid
if not state.employment_status or state.employment_status != "employed":
    ...
```
## Skip-Ahead Guards
If a value has already been collected (e.g., from a previous session via checkpointing), skip the collection node and route to the next step. This avoids asking the user for information they've already provided.
```python
async def collect_name(state, ctx):
    if state.name:
        return Route("collect_email")  # already have it, skip
    ...
```
## Explicit Constraints
Include constraints in both `collect()` prompts (LLM-facing) and `Interrupt` messages (user-facing). This ensures the LLM knows what to extract and the user knows what's expected if they need to retry.
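For example, a date-of-birth step might state the same format constraint in both places. This is a sketch: the `collect()` stub below only records its prompt, and the `Interrupt` dataclass is a stand-in for the framework's type.

```python
from dataclasses import dataclass


@dataclass
class Interrupt:
    """Stand-in for the framework's Interrupt; holds a user-facing message."""
    say: str


def collect(prompt: str) -> dict:
    """Stand-in for the framework's collect(); just records the LLM-facing prompt."""
    return {"prompt": prompt}


# LLM-facing: tell the model exactly what to extract and in what format.
step = collect(prompt="Extract the user's date of birth as YYYY-MM-DD.")

# User-facing: state the same constraint as direct speech for the retry.
retry = Interrupt(say="Sorry, I didn't catch that. Please say your date of "
                      "birth, for example 21 May 1990.")
```

The constraint (a parseable date) appears twice on purpose: once phrased for the extractor, once phrased for a human.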
## Keep Nodes Small
One node = one task. Don't combine collection, validation, and routing in the same node. Smaller nodes are easier to test, reuse, and debug.
## LLM Requirements
Keep in mind these rules about how the LLM interacts with the graph engine.
- `extractor.collect()` requires an LLM in the pipeline. Use `ctx.say()` for TTS-only flows.
- The LLM does not control routing - it only responds and extracts.
## Limits
These are the built-in safeguards that prevent runaway or misconfigured graphs.
- Recursion: Node depth capped at 10 (`GraphRecursionError`).
- Retries: Per-node `Interrupt` count capped at `max_retries` (default 10).
- Validation: Invalid extractions silently dropped - field stays `None`.
- Replay/Resume from END: No-op - target a checkpoint with outgoing transitions.
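To make the retry cap concrete, here is a sketch of how a per-node `max_retries` limit behaves. The loop and the `MaxRetriesExceeded` exception name are illustrative stand-ins, not the engine's actual implementation:

```python
class MaxRetriesExceeded(Exception):
    """Illustrative stand-in for the error a graph engine might raise."""


def run_node_with_retries(node, max_retries: int = 10):
    # Re-run the node while it keeps interrupting, up to the cap.
    for attempt in range(max_retries):
        result = node(attempt)
        if result != "interrupt":
            return result
    raise MaxRetriesExceeded(f"node interrupted {max_retries} times")


# A node that succeeds on its third attempt returns normally;
# a node that always interrupts hits the cap and raises.
ok = run_node_with_retries(lambda attempt: "ok" if attempt >= 2 else "interrupt")
```

The takeaway is that an `Interrupt` loop is bounded: a node that can never validate its input will eventually fail loudly rather than re-asking forever.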
Got a question? Ask us on Discord.

