AgentFlow is configured through environment variables and Python config classes.

Environment Variables

Create a .env file at the project root:
# Required
OPENAI_API_KEY=sk-...
DATABASE_URL=postgresql://user:pass@localhost:5432/agentflow

# Optional
ENVIRONMENT=development          # development | staging | production
CORS_ALLOWED_ORIGINS=http://localhost:3000
LOG_LEVEL=INFO                   # DEBUG | INFO | WARNING | ERROR
PERPLEXITY_API_KEY=pplx-...     # For research agent web search

# Observability
DD_TRACE_ENABLED=true
DD_LLMOBS_ENABLED=true

# Auth
AUTH0_DOMAIN=your-tenant.auth0.com
AUTH0_AUDIENCE=https://api.yourapp.com
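Libraries such as python-dotenv handle this file format for you; as a rough illustration of the parsing rules at play (KEY=VALUE pairs, blank lines and # comments ignored), here is a minimal sketch — the helper name and inline-comment handling are assumptions, not AgentFlow internals:

```python
import os

def load_dotenv_lines(text: str) -> dict[str, str]:
    """Parse KEY=VALUE lines from .env-style text, skipping blanks and # comments."""
    env: dict[str, str] = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        # Drop a trailing inline comment, e.g. "INFO   # DEBUG | INFO | ..."
        value = value.split("#", 1)[0].strip()
        env[key.strip()] = value
    return env

sample = """
# Optional
ENVIRONMENT=development
LOG_LEVEL=INFO                   # DEBUG | INFO | WARNING | ERROR
"""
parsed = load_dotenv_lines(sample)
os.environ.update(parsed)
```

Note that naive inline-comment stripping would mangle values containing `#` (such as URLs with fragments), which is one reason to prefer a maintained loader in practice.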

LLM Defaults

These defaults are used when no explicit llm_config is provided:
Purpose      Model                  Temperature   Max Tokens
Agent        openai/gpt-5.4         1.0           16,384
Tool         openai/gpt-5.4-mini    0.0           20,480
Planning     openai/gpt-5.4-mini    0.5           2,048
Reflection   openai/gpt-5.4-mini    0.0           1,024
Override per-agent via the llm_config parameter in the @agent decorator or the management API.

Operational Defaults

Setting                           Default
DEFAULT_EXECUTABLE_TIMEOUT        300s (5 min)
DEFAULT_MAX_TURNS                 10
DEFAULT_MAX_REFLECTION_ATTEMPTS   3
DEFAULT_TOOL_MAX_RETRIES          3
DEFAULT_TOOL_RETRY_DELAY          1s
DEFAULT_TOOL_BACKOFF_FACTOR       2.0
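Assuming the conventional exponential schedule (delay multiplied by the backoff factor after each attempt — the source does not spell out the formula), the tool retry defaults above produce waits of 1s, 2s, and 4s:

```python
def retry_delays(max_retries: int = 3, retry_delay: float = 1.0,
                 backoff_factor: float = 2.0) -> list[float]:
    """Delay before each retry: retry_delay * backoff_factor ** attempt."""
    return [retry_delay * backoff_factor ** i for i in range(max_retries)]

retry_delays()  # [1.0, 2.0, 4.0]
```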

Multi-Provider LLM Support

AgentFlow uses LiteLLM for multi-provider access. Prefix models with the provider:
"openai/gpt-4o"
"anthropic/claude-sonnet-4-20250514"
"google/gemini-2.0-flash"
"xai/grok-2"
Set the corresponding API key environment variable for each provider you use.
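As a quick reference for which key each prefix needs, here is a small helper — the env var names follow common LiteLLM conventions, but the exact mapping (especially for Google models) is an assumption; check the LiteLLM provider docs:

```python
def provider_key_env(model: str) -> str:
    """Return the API key env var assumed for a provider-prefixed model string."""
    keys = {
        "openai": "OPENAI_API_KEY",
        "anthropic": "ANTHROPIC_API_KEY",
        "google": "GEMINI_API_KEY",   # assumption; some setups use GOOGLE_API_KEY
        "xai": "XAI_API_KEY",
    }
    provider, _, _ = model.partition("/")
    return keys[provider]

provider_key_env("anthropic/claude-sonnet-4-20250514")  # "ANTHROPIC_API_KEY"
```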