Overview

Agentor agents are built on top of the Agent class from the agents library, providing a high-level abstraction for creating production-ready AI agents with tools, skills, and external integrations.

Agent Class

The core Agentor class (src/agentor/core/agent.py:138) provides the primary interface for building agents:
from agentor import Agentor

agent = Agentor(
    name="Weather Agent",
    instructions="You are a helpful weather assistant",
    model="gpt-5-mini",
    tools=["get_weather"],
)

Constructor Parameters

name (str, required)
  Agent name used in logs, traces, and A2A protocol agent cards.
instructions (str)
  System prompt defining agent behavior and personality.
model (str | LitellmModel, default: "gpt-5-nano")
  Model identifier. Supports any LiteLLM provider format:
    • "gpt-5-mini" - OpenAI models
    • "gemini/gemini-2.5-pro" - Google models
    • "anthropic/claude-3.5" - Anthropic models
tools (List[FunctionTool | str | MCPServerStreamableHttp | BaseTool])
  Tools available to the agent. Can be:
    • String names from the tool registry (e.g., "get_weather")
    • FunctionTool instances decorated with @function_tool
    • BaseTool subclasses with @capability methods
    • MCP server connections
output_type (type[Any] | AgentOutputSchemaBase)
  Pydantic model for structured output validation.
model_settings (ModelSettings)
  Model configuration including temperature, top_p, and max_tokens.
skills (List[str])
  Paths to skill directories (see Skills).
enable_tracing (bool, default: false)
  Enable Celesto AI tracing and observability.
api_key (str)
  API key for the LLM provider.
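To illustrate what output_type provides, the sketch below validates a raw model response against a hypothetical WeatherReport Pydantic model. This is roughly the validation applied when a Pydantic model is passed as output_type; the model name and fields here are invented for illustration:

```python
from pydantic import BaseModel

class WeatherReport(BaseModel):
    city: str
    temperature_c: float
    conditions: str

# A structured JSON response from the model, validated against the schema.
# Missing or mistyped fields would raise a ValidationError instead.
raw = '{"city": "London", "temperature_c": 14.5, "conditions": "overcast"}'
report = WeatherReport.model_validate_json(raw)
print(report.city, report.temperature_c)
```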

Creating Agents from Markdown

Agents can be defined in markdown files with YAML frontmatter (src/agentor/core/agent.py:236):
---
name: WeatherBot
tools: [get_weather]
model: gpt-4o-mini
temperature: 0.3
---
You are a concise weather assistant.
Load the agent:
from agentor import Agentor

agent = Agentor.from_md("agent.md")
result = agent.run("Weather in Paris?")

Agent Lifecycle

Synchronous Execution

The run() method (src/agentor/core/agent.py:367) provides synchronous execution:
result = agent.run("What is the weather in London?")
print(result)

Asynchronous Execution

The arun() method (src/agentor/core/agent.py:370) supports async execution with batch processing:
import asyncio

# Single prompt
result = await agent.arun("What is the weather in London?")

# Batch processing with concurrency control
results = await agent.arun(
    ["Weather in London?", "Weather in Paris?", "Weather in Tokyo?"],
    limit_concurrency=10,
    max_turns=20
)
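To see what limit_concurrency does conceptually, the batching behavior can be approximated with an asyncio.Semaphore. The run_one and run_batch names below are illustrative stand-ins, not Agentor APIs:

```python
import asyncio

async def run_one(sem: asyncio.Semaphore, prompt: str) -> str:
    # At most `limit_concurrency` prompts execute inside this block at once
    async with sem:
        await asyncio.sleep(0)  # stand-in for the actual model call
        return f"answer to: {prompt}"

async def run_batch(prompts: list[str], limit_concurrency: int = 10) -> list[str]:
    sem = asyncio.Semaphore(limit_concurrency)
    # gather preserves input order even though prompts finish concurrently
    return await asyncio.gather(*(run_one(sem, p) for p in prompts))

results = asyncio.run(run_batch(["Weather in London?", "Weather in Paris?"]))
print(results)
```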

Fallback Models

Handle rate limits gracefully with fallback models (src/agentor/core/agent.py:415):
result = await agent.arun(
    "Complex task",
    fallback_models=["gpt-4o-mini", "gemini/gemini-pro"]
)
If the primary model fails with a rate-limit or API error, Agentor automatically retries with the fallback models in order.
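Conceptually, the fallback logic is a loop over models that retries on retryable errors. The sketch below uses made-up names (run_with_fallbacks, RateLimitError, fake_call) to illustrate the pattern; it is not Agentor's internal implementation:

```python
class RateLimitError(Exception):
    """Stand-in for a provider rate-limit / API error."""

def run_with_fallbacks(call, models):
    last_exc = None
    for model in models:
        try:
            return call(model)
        except RateLimitError as exc:
            last_exc = exc  # record the failure and try the next model
    raise last_exc  # every model failed: surface the last error

def fake_call(model):
    # Simulate the primary model being rate-limited
    if model == "primary":
        raise RateLimitError("429 Too Many Requests")
    return f"ok from {model}"

result = run_with_fallbacks(fake_call, ["primary", "gpt-4o-mini"])
print(result)  # ok from gpt-4o-mini
```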

Streaming Responses

Stream agent responses in real-time (src/agentor/core/agent.py:487):
async for chunk in agent.chat("Tell me about AI", stream=True):
    print(chunk, end="", flush=True)
The stream_chat() method (src/agentor/core/agent.py:498) returns an async iterator of AgentOutput objects:
async for event in agent.stream_chat("Question", serialize=False):
    if event.message:
        print(event.message)

Model Configuration

Configure model behavior with ModelSettings (src/agentor/core/agent.py:211):
from agentor import Agentor, ModelSettings

model_settings = ModelSettings(
    temperature=0.7,
    top_p=0.9,
    max_tokens=2000
)

agent = Agentor(
    name="Creative Writer",
    model="gpt-4o",
    model_settings=model_settings
)

Multi-Agent Systems

Agentor supports hierarchical multi-agent orchestration. The framework includes specialized agents:
  • Concept Research Agent - Topic research and information gathering
  • Coder Agent - Code-related operations
  • Google Agent - Workspace integration
  • Main Triage Agent - Request routing and delegation
See src/agentor/agenthub/main.py for the orchestration implementation.
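As a purely illustrative sketch of the triage pattern, the function below routes a request to a specialized agent by keyword. The actual implementation in src/agentor/agenthub/main.py may route differently (for example, via model-driven delegation); the keywords and function name here are invented:

```python
def triage(request: str) -> str:
    """Route a request to a specialized agent by keyword (illustrative only)."""
    routes = {
        "research": "Concept Research Agent",
        "code": "Coder Agent",
        "calendar": "Google Agent",
    }
    for keyword, agent_name in routes.items():
        if keyword in request.lower():
            return agent_name
    # No match: the main triage agent handles the request directly
    return "Main Triage Agent"

print(triage("Write code to parse a CSV file"))  # Coder Agent
```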

Agent Context

Agents receive a RunContextWrapper with configuration (src/agentor/tools/registry.py:33):
from agents import RunContextWrapper
from agentor.tools.registry import CelestoConfig

@register_global_tool
def get_weather(wrapper: RunContextWrapper[CelestoConfig], city: str) -> str:
    """Returns the weather in the given city."""
    api_key = wrapper.context.weather_api_key
    # Use API key to fetch weather
    return f"Weather in {city}"

Tracing and Observability

Enable automatic tracing with Celesto AI (src/agentor/core/agent.py:119):
agent = Agentor(
    name="My Agent",
    model="gpt-4o",
    enable_tracing=True  # Requires CELESTO_API_KEY
)
Traces are automatically sent to https://celesto.ai/observe. To disable auto-tracing:
export CELESTO_DISABLE_AUTO_TRACING=True

Last modified on March 4, 2026