LLMRequest/LLMResponse) and provider-specific APIs. AFK ships with three built-in adapters and supports custom adapters for any provider.
Built-in providers
OpenAI
Direct integration via the OpenAI Python SDK. Supports all GPT-4.1 and
o-series models.
Anthropic
Direct integration via the Anthropic SDK. Supports Claude models, including Opus 4.5.
LiteLLM
Proxy adapter for 100+ providers (Azure, Bedrock, Gemini, Mistral, local
models, etc.).
Capability comparison
| Feature | OpenAI | Anthropic | LiteLLM |
|---|---|---|---|
| Text generation | ✓ | ✓ | ✓ |
| Tool calling | ✓ | ✓ | ✓ (provider-dependent) |
| Structured output | ✓ | ✓ | Provider-dependent |
| Streaming | ✓ | ✓ | ✓ |
| Vision (image input) | ✓ | ✓ | Provider-dependent |
| Custom endpoints | ✓ | ✓ | ✓ |
Usage
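The concrete client-construction calls are not shown in this excerpt. The core idea, a single provider-neutral request/response pair (LLMRequest/LLMResponse) handled uniformly by every adapter, can be illustrated with stand-in types; the field names and the `complete` method below are assumptions for the sketch, not AFK's actual API:

```python
from dataclasses import dataclass, field

# Hypothetical stand-ins for AFK's LLMRequest/LLMResponse types; the real
# fields are not shown in this excerpt.
@dataclass
class LLMRequest:
    model: str
    messages: list = field(default_factory=list)

@dataclass
class LLMResponse:
    text: str
    provider: str

class EchoAdapter:
    """Toy adapter standing in for a real provider adapter (OpenAI, Anthropic, ...).

    Every adapter exposes the same interface, so calling code never touches
    provider-specific request or response shapes.
    """
    name = "echo"

    def complete(self, request: LLMRequest) -> LLMResponse:
        last = request.messages[-1]["content"] if request.messages else ""
        return LLMResponse(text=last, provider=self.name)

request = LLMRequest(model="demo-model",
                     messages=[{"role": "user", "content": "hello"}])
response = EchoAdapter().complete(request)
print(response.provider, response.text)  # echo hello
```

Because every adapter accepts the same request type and returns the same response type, swapping providers is a one-line change at construction time.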
Custom adapter
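AFK's exact registration API is not shown in this excerpt. The general shape of the mechanism, a class registered under a provider name and later looked up by that name, can be sketched as a register-and-dispatch pattern; every name below is illustrative, not AFK's real API:

```python
# Sketch of an adapter registry; the decorator name and registry dict are
# assumptions, not AFK's actual registration API.
ADAPTERS: dict[str, type] = {}

def register_adapter(name: str):
    """Decorator that records an adapter class under a provider name."""
    def decorator(cls):
        ADAPTERS[name] = cls
        return cls
    return decorator

@register_adapter("my-inference-server")
class MyServerAdapter:
    """Adapter for a hypothetical self-hosted inference server."""
    def __init__(self, base_url: str = "http://localhost:8080"):
        self.base_url = base_url

    def complete(self, prompt: str) -> str:
        # A real adapter would POST to self.base_url and translate the
        # provider's response; stubbed here for illustration.
        return f"[{self.base_url}] {prompt}"

# Callers resolve the adapter by name, never by concrete class.
adapter = ADAPTERS["my-inference-server"]()
print(adapter.complete("ping"))  # [http://localhost:8080] ping
```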
Register your own adapter for unsupported providers or custom inference servers.

Custom transport
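AFK's real transport hook is not documented in this excerpt. The technique this section describes, wrapping or replacing the layer that actually sends requests, can be sketched as a transport wrapper that injects retry logic; all class names below are stand-ins for illustration:

```python
import time

class Transport:
    """Minimal transport interface sketch; AFK's actual hook is not shown here."""
    def send(self, payload: dict) -> dict:
        raise NotImplementedError

class FlakyTransport(Transport):
    """Stand-in for a real HTTP transport that fails twice, then succeeds."""
    def __init__(self):
        self.calls = 0

    def send(self, payload: dict) -> dict:
        self.calls += 1
        if self.calls < 3:
            raise ConnectionError("transient failure")
        return {"status": 200, "echo": payload}

class RetryTransport(Transport):
    """Wraps any transport with exponential-backoff retries, the kind of
    custom logic one would inject at the transport layer."""
    def __init__(self, inner: Transport, attempts: int = 3, base_delay: float = 0.0):
        self.inner = inner
        self.attempts = attempts
        self.base_delay = base_delay

    def send(self, payload: dict) -> dict:
        for attempt in range(self.attempts):
            try:
                return self.inner.send(payload)
            except ConnectionError:
                if attempt == self.attempts - 1:
                    raise  # retries exhausted; surface the error
                time.sleep(self.base_delay * (2 ** attempt))
        raise RuntimeError("unreachable")

transport = RetryTransport(FlakyTransport(), attempts=3)
print(transport.send({"model": "demo"})["status"])  # 200
```

The same wrapper shape works for proxy routing or mTLS: the wrapper rewrites or re-sends the request, and the adapter above it stays unchanged.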
Override the HTTP transport layer for any adapter (useful for proxies, mTLS, or custom retry logic).

Next steps
Control & Session
Retry, caching, rate limiting, and circuit breaking.
Agent Integration
How agents resolve and use LLM clients.