llm_config {LLMR}    R Documentation

Create LLM Configuration

Description

Creates a configuration object specifying the provider, model, API key, and any additional generation parameters, for use with call_llm().

Usage

llm_config(
  provider,
  model,
  api_key,
  troubleshooting = FALSE,
  base_url = NULL,
  embedding = NULL,
  ...
)

Arguments

provider

Provider name. One of "openai", "anthropic", "groq", "together", "voyage", "gemini", or "deepseek".

model

Model name to use

api_key

API key for authentication

troubleshooting

Logical. If TRUE, prints all API calls. USE WITH EXTREME CAUTION, as the output includes your API key.

base_url

Optional base URL override

embedding

Logical indicating embedding mode: NULL (the default, which uses prior defaults), TRUE (forces embedding mode), or FALSE (forces generative mode).

...

Additional provider-specific parameters

Value

A configuration object for use with call_llm().

See Also

call_llm(), the main way to use a config object.

Examples

## Not run: 
  cfg <- llm_config(
    provider   = "openai",
    model      = "gpt-4o-mini",
    api_key    = Sys.getenv("OPENAI_API_KEY"),
    temperature = 0.7,
    max_tokens  = 500)

  call_llm(cfg, "Hello!")  # one-shot, bare string
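
  # A hedged sketch of an embedding configuration using the `embedding`
  # argument. The provider is taken from the supported list above; the
  # model name and environment-variable name are illustrative, not verified.
  emb_cfg <- llm_config(
    provider  = "voyage",
    model     = "voyage-3",
    api_key   = Sys.getenv("VOYAGE_API_KEY"),
    embedding = TRUE)  # force embedding mode rather than generation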

## End(Not run)

[Package LLMR version 0.5.0 Index]