---
summary: Use Anthropic Claude via API keys or Claude CLI in OpenClaw
title: Anthropic
---

# Anthropic (Claude)
Anthropic builds the Claude model family. OpenClaw supports two auth routes:

- **API key** — direct Anthropic API access with usage-based billing (`anthropic/*` models)
- **Claude CLI** — reuse an existing Claude CLI login on the same host

For long-lived gateway hosts, Anthropic API keys remain the clearest and most predictable production path.
Anthropic's current public docs:
- Claude Code CLI reference
- Claude Agent SDK overview
- Using Claude Code with your Pro or Max plan
- Using Claude Code with your Team or Enterprise plan
## Getting started
### Anthropic API key

**Best for:** standard API access and usage-based billing.

<Steps>
<Step title="Get your API key">
Create an API key in the [Anthropic Console](https://console.anthropic.com/).
</Step>
<Step title="Run onboarding">
```bash
openclaw onboard
# choose: Anthropic API key
```
Or pass the key directly:
```bash
openclaw onboard --anthropic-api-key "$ANTHROPIC_API_KEY"
```
</Step>
<Step title="Verify the model is available">
```bash
openclaw models list --provider anthropic
```
</Step>
</Steps>
### Config example
```json5
{
env: { ANTHROPIC_API_KEY: "sk-ant-..." },
agents: { defaults: { model: { primary: "anthropic/claude-opus-4-6" } } },
}
```
### Claude CLI

**Best for:** reusing an existing Claude CLI login without a separate API key.
<Steps>
<Step title="Ensure Claude CLI is installed and logged in">
Verify with:
```bash
claude --version
```
</Step>
<Step title="Run onboarding">
```bash
openclaw onboard
# choose: Claude CLI
```
OpenClaw detects and reuses the existing Claude CLI credentials.
</Step>
<Step title="Verify the model is available">
```bash
openclaw models list --provider anthropic
```
</Step>
</Steps>
<Note>
Setup and runtime details for the Claude CLI backend are in [CLI Backends](/gateway/cli-backends).
</Note>
<Tip>
If you want the clearest billing path, use an Anthropic API key instead. OpenClaw also supports subscription-style options from [OpenAI Codex](/providers/openai), [Qwen Cloud](/providers/qwen), [MiniMax](/providers/minimax), and [Z.AI / GLM](/providers/glm).
</Tip>
## Thinking defaults (Claude 4.6)

Claude 4.6 models default to adaptive thinking in OpenClaw when no explicit thinking level is set. Override per message with `/think:<level>` or in model params:

```json5
{
  agents: {
    defaults: {
      models: {
        "anthropic/claude-opus-4-6": {
          params: { thinking: "adaptive" },
        },
      },
    },
  },
}
```
## Prompt caching

OpenClaw supports Anthropic's prompt caching feature for API-key auth.

| Value | Cache duration | Description |
|---|---|---|
| `"short"` (default) | 5 minutes | Applied automatically for API-key auth |
| `"long"` | 1 hour | Extended cache |
| `"none"` | No caching | Disable prompt caching |

```json5
{
  agents: {
    defaults: {
      models: {
        "anthropic/claude-opus-4-6": {
          params: { cacheRetention: "long" },
        },
      },
    },
  },
}
```

Retention can also vary per agent on the same model:
```json5
{
agents: {
defaults: {
model: { primary: "anthropic/claude-opus-4-6" },
models: {
"anthropic/claude-opus-4-6": {
params: { cacheRetention: "long" },
},
},
},
list: [
{ id: "research", default: true },
{ id: "alerts", params: { cacheRetention: "none" } },
],
},
}
```
Config merge order:
1. `agents.defaults.models["provider/model"].params`
2. `agents.list[].params` (matching `id`, overrides by key)
This lets one agent keep a long-lived cache while another agent on the same model disables caching for bursty/low-reuse traffic.
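The merge order above can be sketched as a plain key-level override. This is an illustrative sketch, not OpenClaw internals: the function and type names below are made up for the example.

```typescript
// Illustrative sketch of the merge order: model-level defaults apply first,
// then the matching agent's params override key by key.
type Params = Record<string, string | number | boolean>;

function resolveParams(modelDefaults: Params, agentParams: Params): Params {
  // The later spread wins per key, mirroring step 2 overriding step 1.
  return { ...modelDefaults, ...agentParams };
}

// Mirrors the config above: "research" inherits, "alerts" overrides.
const modelDefaults: Params = { cacheRetention: "long" };
const research = resolveParams(modelDefaults, {});
const alerts = resolveParams(modelDefaults, { cacheRetention: "none" });
```

Here `research.cacheRetention` stays `"long"` while `alerts.cacheRetention` resolves to `"none"`.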
Bedrock-specific behavior:

- Anthropic Claude models on Bedrock (`amazon-bedrock/*anthropic.claude*`) accept `cacheRetention` pass-through when configured.
- Non-Anthropic Bedrock models are forced to `cacheRetention: "none"` at runtime.
- API-key smart defaults also seed `cacheRetention: "short"` for Claude-on-Bedrock refs when no explicit value is set.
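As a sketch, pass-through on a Bedrock Claude ref uses the same `params` shape; the model ID below is illustrative, since actual Bedrock IDs vary by account and region:

```json5
{
  agents: {
    defaults: {
      models: {
        // Illustrative ref matching amazon-bedrock/*anthropic.claude*
        "amazon-bedrock/anthropic.claude-opus-4-6": {
          params: { cacheRetention: "long" },
        },
      },
    },
  },
}
```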
## Advanced configuration

OpenClaw's shared `/fast` toggle supports direct Anthropic traffic (API-key and OAuth to `api.anthropic.com`).

| Command | Maps to |
|---------|---------|
| `/fast on` | `service_tier: "auto"` |
| `/fast off` | `service_tier: "standard_only"` |

To enable fast mode by default for a model:
```json5
{
agents: {
defaults: {
models: {
"anthropic/claude-sonnet-4-6": {
params: { fastMode: true },
},
},
},
},
}
```
<Note>
- Only injected for direct `api.anthropic.com` requests. Proxy routes leave `service_tier` untouched.
- Explicit `serviceTier` or `service_tier` params override `/fast` when both are set.
- On accounts without Priority Tier capacity, `service_tier: "auto"` may resolve to `standard`.
</Note>
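Per the override note, an explicit tier param pins the behavior regardless of `/fast`. A sketch, using the camelCase `serviceTier` spelling from the note above:

```json5
{
  agents: {
    defaults: {
      models: {
        "anthropic/claude-sonnet-4-6": {
          // An explicit tier wins over the /fast toggle.
          params: { serviceTier: "standard_only" },
        },
      },
    },
  },
}
```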
The bundled Anthropic plugin registers image and PDF understanding. OpenClaw
auto-resolves media capabilities from the configured Anthropic auth — no
additional config is needed.
| Property | Value |
| -------------- | -------------------- |
| Default model | `claude-opus-4-6` |
| Supported input | Images, PDF documents |
When an image or PDF is attached to a conversation, OpenClaw automatically
routes it through the Anthropic media understanding provider.
Anthropic's 1M context window is beta-gated. Enable it per model:
```json5
{
agents: {
defaults: {
models: {
"anthropic/claude-opus-4-6": {
params: { context1m: true },
},
},
},
},
}
```
OpenClaw maps this to `anthropic-beta: context-1m-2025-08-07` on requests.
<Warning>
Requires long-context access on your Anthropic credential. Legacy token auth (`sk-ant-oat-*`) is rejected for 1M context requests — OpenClaw logs a warning and falls back to the standard context window.
</Warning>
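For reference, the equivalent raw Messages API request sets the beta header directly. This is a sketch against Anthropic's public API, independent of OpenClaw, and assumes a valid key in `$ANTHROPic_API_KEY` is exported as `$ANTHROPIC_API_KEY`:

```bash
curl https://api.anthropic.com/v1/messages \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -H "anthropic-beta: context-1m-2025-08-07" \
  -d '{
    "model": "claude-opus-4-6",
    "max_tokens": 64,
    "messages": [{"role": "user", "content": "ping"}]
  }'
```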