chat_anthropic {ellmer} | R Documentation
Chat with an Anthropic Claude model
Description
Anthropic provides a number of chat-based models under the Claude moniker. Note that a Claude Pro membership does not give you the ability to call models via the API; instead, you will need to sign up for (and pay for) a developer account.
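For example, once you have a developer key you might store it in the ANTHROPIC_API_KEY environment variable (ideally via .Renviron rather than in a script) and then create a chat. A minimal sketch, assuming that environment variable is the one ellmer reads by default:

Sys.setenv(ANTHROPIC_API_KEY = "<your-developer-key>")  # better: set this in .Renviron
chat <- chat_anthropic()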
Usage
chat_anthropic(
system_prompt = NULL,
params = NULL,
max_tokens = deprecated(),
model = NULL,
api_args = list(),
base_url = "https://api.anthropic.com/v1",
beta_headers = character(),
api_key = anthropic_key(),
echo = NULL
)
models_anthropic(
base_url = "https://api.anthropic.com/v1",
api_key = anthropic_key()
)
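A fuller call might look like the sketch below. The system prompt, model id, and parameter values are illustrative, and it assumes params() is ellmer's helper for common model parameters (see the params argument):

chat <- chat_anthropic(
  system_prompt = "You are a terse statistical consultant.",  # illustrative prompt
  model = "claude-sonnet-4-20250514",                          # pin a specific model
  params = params(temperature = 0.2, max_tokens = 500)         # assumed params() fields
)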
Arguments
system_prompt |
A system prompt to set the behavior of the assistant. |
params |
Common model parameters, usually created by params(). |
max_tokens |
Maximum number of tokens to generate before stopping. Deprecated; use params instead. |
model |
The model to use for the chat (defaults to "claude-sonnet-4-20250514").
We regularly update the default, so we strongly recommend explicitly specifying a model for anything other than casual use.
Use models_anthropic() to see all options. |
api_args |
Named list of arbitrary extra arguments appended to the body of every chat API call. Combined with the body object generated by ellmer with modifyList(). |
base_url |
The base URL to the endpoint; the default uses Anthropic's API. |
beta_headers |
Optionally, a character vector of beta headers to opt in to Claude features that are still in beta. |
api_key |
API key to use for authentication. You generally should not supply this directly, but instead set the ANTHROPIC_API_KEY environment variable. |
echo |
One of the following options: "none" (don't emit any output), "output" (echo text and tool-calling output as it streams in), or "all" (echo all input and output).
Note this only affects the chat() method. |
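Two of the less obvious pieces are the model listing helper and api_args. The sketch below assumes models_anthropic() returns a data frame of the models available to your key, and uses top_k purely as an illustrative extra body field accepted by the Anthropic API:

# list the models available to your key
models_anthropic()

# pass an extra field straight through in the request body
chat <- chat_anthropic(
  model = "claude-sonnet-4-20250514",
  api_args = list(top_k = 40)  # illustrative extra body parameter
)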
Value
A Chat object.
See Also
Other chatbots:
chat_aws_bedrock(), chat_azure_openai(), chat_cloudflare(), chat_cortex_analyst(), chat_databricks(), chat_deepseek(), chat_github(), chat_google_gemini(), chat_groq(), chat_huggingface(), chat_mistral(), chat_ollama(), chat_openai(), chat_openrouter(), chat_perplexity(), chat_portkey()
Examples
chat <- chat_anthropic()
chat$chat("Tell me three jokes about statisticians")
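# Illustrative additions (not part of the original examples); the model id and
# prompts below are placeholders.
models_anthropic()

chat <- chat_anthropic(
  system_prompt = "Answer in exactly one sentence.",
  model = "claude-sonnet-4-20250514"
)
chat$chat("Why do statisticians love the normal distribution?")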