Pydantic AI
Auto-instrument Pydantic AI for type-safe agents.
Risicare automatically instruments Pydantic AI for type-safe agent development.
Installation
```bash
pip install risicare[pydantic-ai]
# or
pip install risicare pydantic-ai
```

Version Compatibility
Requires pydantic-ai >= 0.1.0.
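If you want to verify the installed version at runtime, a minimal sketch using only the standard library (the helper name and the simple numeric comparison are this example's own; it assumes plain dotted release versions):

```python
from importlib.metadata import PackageNotFoundError, version

def meets_minimum(installed: str, minimum: str = "0.1.0") -> bool:
    # Compare dotted release segments numerically, so "0.10.0" > "0.9.0";
    # pre-release suffixes are not handled in this sketch
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(installed) >= as_tuple(minimum)

try:
    # The PyPI distribution name is "pydantic-ai"
    print(meets_minimum(version("pydantic-ai")))
except PackageNotFoundError:
    print("pydantic-ai is not installed")
```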
Auto-Instrumentation
```python
import risicare
from pydantic_ai import Agent

risicare.init()

agent = Agent(
    "openai:gpt-4o",
    system_prompt="You are a helpful assistant.",
)

# Automatically traced
result = agent.run_sync("Hello!")
```

What's Captured
| Feature | Description |
|---|---|
| Agent Runs | Full run/run_sync/run_stream calls |
| Model Calls | Underlying LLM API calls |
| Tool Calls | Function tool executions |
| Result Validation | Pydantic model validation |
| Dependencies | Injected dependencies |
Span Hierarchy
```
pydantic_ai.agent.run/{name} (AGENT kind)
├── openai.chat.completions.create (provider span, natural child)
└── openai.chat.completions.create (provider span, natural child)
```
Provider Spans
Pydantic AI instrumentation creates agent/framework-level spans. Underlying LLM calls (e.g., OpenAI, Anthropic) are traced separately by provider instrumentation, giving you both framework-level and LLM-level visibility.
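To make the two layers concrete, here is a toy stand-in for that parent/child relationship (a minimal model for illustration only, not Risicare's internal span type):

```python
from dataclasses import dataclass, field

@dataclass
class Span:
    # Toy span: just enough structure to show the nesting
    name: str
    kind: str
    children: list["Span"] = field(default_factory=list)

# Framework-level agent span created by the Pydantic AI instrumentation
agent_span = Span("pydantic_ai.agent.run/assistant", kind="AGENT")

# Provider-level LLM spans created separately by provider instrumentation,
# attached as natural children of the active agent span
agent_span.children.append(Span("openai.chat.completions.create", kind="LLM"))
agent_span.children.append(Span("openai.chat.completions.create", kind="LLM"))

print(agent_span.name, [c.name for c in agent_span.children])
```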
Structured Outputs
Type-safe outputs are captured:
```python
from pydantic import BaseModel
from pydantic_ai import Agent

class CityInfo(BaseModel):
    name: str
    country: str
    population: int

agent = Agent(
    "openai:gpt-4o",
    result_type=CityInfo,
)

result = agent.run_sync("Tell me about Paris")
```
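The "schema captured in span" idea can be sketched with just the standard library (a plain annotated class stands in for the Pydantic model here, and the summary format is purely illustrative, not Risicare's actual attribute layout):

```python
from typing import get_type_hints

# Plain class standing in for the CityInfo Pydantic model above
class CityInfo:
    name: str
    country: str
    population: int

def schema_summary(cls) -> dict:
    # Map each annotated field to its type name -- the kind of
    # lightweight summary an instrumentation layer could record
    return {name: hint.__name__ for name, hint in get_type_hints(cls).items()}

print(schema_summary(CityInfo))
```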
result.output is a validated CityInfo instance, and its schema is captured in the span.

Tools
Tool executions are traced:
```python
from pydantic_ai import Agent, RunContext

agent = Agent("openai:gpt-4o")

@agent.tool
def get_weather(ctx: RunContext, location: str) -> str:
    """Get weather for a location."""
    return f"Sunny in {location}"

# Tool calls appear as child spans
result = agent.run_sync("What's the weather in Paris?")
```

Dependencies
Dependency injection is captured:
```python
from dataclasses import dataclass
from pydantic_ai import Agent, RunContext

@dataclass
class Deps:
    user_id: str
    api_key: str

agent = Agent("openai:gpt-4o", deps_type=Deps)

@agent.tool
def get_user_data(ctx: RunContext[Deps]) -> str:
    return f"Data for {ctx.deps.user_id}"

result = agent.run_sync(
    "Get my data",
    deps=Deps(user_id="123", api_key="secret"),
)
```
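A sketch of the kind of scrubbing this implies, using a hypothetical deny-list of key names (Risicare's actual redaction rules may differ):

```python
# Hypothetical deny-list; real redaction logic may match differently
SENSITIVE_KEYS = {"api_key", "token", "password", "secret"}

def redact(deps: dict) -> dict:
    # Replace values whose key looks sensitive; keep everything else
    return {
        key: "[REDACTED]" if key in SENSITIVE_KEYS else value
        for key, value in deps.items()
    }

print(redact({"user_id": "123", "api_key": "secret"}))
```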
Dependencies are captured on the span, with secret values excluded.

Streaming
```python
# Inside an async function:
async with agent.run_stream("Write a story") as response:
    async for chunk in response.stream():
        print(chunk, end="")
```

Multiple Models
```python
# OpenAI
agent = Agent("openai:gpt-4o")

# Anthropic
agent = Agent("anthropic:claude-3-sonnet-20240229")

# Gemini
agent = Agent("gemini-1.5-pro")

# Each model is traced with the correct provider
```
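Provider attribution can be pictured as reading the model string's prefix. The helper below is a hypothetical sketch, not Risicare's actual resolution logic (Pydantic AI also accepts bare model names, which the framework resolves itself):

```python
def infer_provider(model: str) -> tuple[str, str]:
    # Split "provider:model" strings; bare names fall through to
    # whatever default resolution the framework applies
    provider, sep, name = model.partition(":")
    if not sep:
        return ("inferred-by-framework", model)
    return (provider, name)

print(infer_provider("openai:gpt-4o"))
print(infer_provider("gemini-1.5-pro"))
```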