chat_snowflake {ellmer}    R Documentation

Chat with a model hosted on Snowflake

Description

The Snowflake provider allows you to interact with the models available through the Cortex LLM REST API.

Authentication

chat_snowflake() picks up ambient Snowflake credentials from your environment when the credentials argument is left as NULL.
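If no ambient credentials are available, you can supply authentication headers explicitly via the credentials argument. The sketch below is illustrative only: my_token is a hypothetical variable holding a Snowflake OAuth token, and the exact headers required depend on how your account is configured.

chat <- chat_snowflake(
  account = "testorg-test_account",
  credentials = list(
    # Assumes `my_token` holds a valid OAuth token (hypothetical)
    Authorization = paste("Bearer", my_token)
  )
)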

Known limitations

Note that Snowflake-hosted models do not support images or tool calling.

See chat_cortex_analyst() to chat with the Snowflake Cortex Analyst rather than a general-purpose model.

Usage

chat_snowflake(
  system_prompt = NULL,
  account = snowflake_account(),
  credentials = NULL,
  model = NULL,
  params = NULL,
  api_args = list(),
  echo = c("none", "output", "all")
)

Arguments

system_prompt

A system prompt to set the behavior of the assistant.

account

A Snowflake account identifier, e.g. "testorg-test_account". Defaults to the value of the SNOWFLAKE_ACCOUNT environment variable.

credentials

A list of authentication headers to pass into httr2::req_headers(), a function that returns them when called, or NULL, the default, to use ambient credentials.

model

The model to use for the chat (defaults to "claude-3-7-sonnet"). We regularly update the default, so we strongly recommend explicitly specifying a model for anything other than casual use.

params

Common model parameters, usually created by params().

api_args

Named list of arbitrary extra arguments appended to the body of every chat API call. These are combined with the body generated by ellmer using modifyList().

echo

One of the following options:

  • none: don't emit any output (default when running in a function).

  • output: echo text and tool-calling output as it streams in (default when running at the console).

  • all: echo all input and output.

Note this only affects the chat() method.

Value

A Chat object.

Examples


chat <- chat_snowflake()
chat$chat("Tell me a joke in the form of a SQL query.")


[Package ellmer version 0.2.1 Index]