chat_groq {ellmer}
Chat with a model hosted on Groq
Description
Sign up at https://groq.com.
This function is a lightweight wrapper around chat_openai() with the defaults tweaked for Groq.
Known limitations
Groq does not currently support structured data extraction.
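Because chat_groq() is only a thin wrapper, it can help to see the equivalent chat_openai() call. The sketch below is illustrative, not the actual implementation; it assumes groq_key() reads the GROQ_API_KEY environment variable, and the model name is just an example.

library(ellmer)

# Illustrative sketch only: chat_groq() behaves roughly like calling
# chat_openai() against Groq's OpenAI-compatible endpoint with a Groq key.
# GROQ_API_KEY as the variable read by groq_key() is an assumption.
chat <- chat_openai(
  base_url = "https://api.groq.com/openai/v1",
  api_key = Sys.getenv("GROQ_API_KEY"),
  model = "llama3-8b-8192"  # example model name; pick a current one from Groq
)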
Usage
chat_groq(
  system_prompt = NULL,
  base_url = "https://api.groq.com/openai/v1",
  api_key = groq_key(),
  model = NULL,
  seed = NULL,
  api_args = list(),
  echo = NULL
)
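A minimal usage sketch with the main arguments pinned explicitly; the model name and seed are illustrative assumptions, not recommendations.

library(ellmer)

# Pin a model explicitly rather than relying on the changing default.
chat <- chat_groq(
  system_prompt = "You are a terse assistant.",
  model = "llama3-8b-8192",
  seed = 1014
)
chat$chat("Summarise the history of R in two sentences.")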
Arguments
system_prompt
    A system prompt to set the behavior of the assistant.
base_url
    The base URL to the endpoint; the default is Groq's OpenAI-compatible endpoint.
api_key
    API key to use for authentication. You generally should not supply this directly, but instead set the GROQ_API_KEY environment variable (see the example after this list).
model
    The model to use for the chat (defaults to "llama3-8b-8192"). We regularly update the default, so we strongly recommend explicitly specifying a model for anything other than casual use.
seed
    Optional integer seed that the model uses to try and make output more reproducible.
api_args
    Named list of arbitrary extra arguments appended to the body of every chat API call. Combined with the body object generated by ellmer with modifyList().
echo
    Controls how much of the conversation is echoed as it streams in; the available options are those of chat_openai(). Note this only affects the chat() method.
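As referenced above, here is a hedged sketch of api_key and api_args in practice. It assumes groq_key() reads the GROQ_API_KEY environment variable and that Groq's OpenAI-compatible endpoint accepts a temperature field; both are assumptions rather than guarantees documented on this page.

library(ellmer)

# Set the key here only for interactive experimentation; in real projects put
# it in ~/.Renviron instead. (Assumes groq_key() reads GROQ_API_KEY.)
Sys.setenv(GROQ_API_KEY = "<your-groq-api-key>")

# api_args is merged into the request body, so extra OpenAI-compatible fields
# such as temperature can be passed through (temperature support on Groq's
# side is an assumption).
chat <- chat_groq(
  model = "llama3-8b-8192",
  api_args = list(temperature = 0)
)
chat$chat("Give a one-sentence definition of a p-value.")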
Value
A Chat object.
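The returned Chat object is stateful, so repeated calls to its $chat() method (the only method assumed here, as in the Examples below) continue the same conversation.

library(ellmer)

chat <- chat_groq(system_prompt = "Answer in one sentence.")
chat$chat("What is the central limit theorem?")
# The same object keeps the conversation history, so the follow-up question
# is answered in the context of the previous turn.
chat$chat("Why does it matter for confidence intervals?")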
See Also
Other chatbots: chat_anthropic(), chat_aws_bedrock(), chat_azure_openai(), chat_cloudflare(), chat_cortex_analyst(), chat_databricks(), chat_deepseek(), chat_github(), chat_google_gemini(), chat_huggingface(), chat_mistral(), chat_ollama(), chat_openai(), chat_openrouter(), chat_perplexity(), chat_portkey()
Examples
## Not run:
chat <- chat_groq()
chat$chat("Tell me three jokes about statisticians")
## End(Not run)