azure_openai_embedding {tidyllm}    R Documentation
Generate Embeddings Using OpenAI API on Azure
Description
Generate embeddings for text inputs using an OpenAI embedding model deployed on Azure.
Usage
azure_openai_embedding(
.input,
.deployment = "text-embedding-3-small",
.endpoint_url = Sys.getenv("AZURE_ENDPOINT_URL"),
.api_version = "2023-05-15",
.truncate = TRUE,
.timeout = 120,
.dry_run = FALSE,
.max_tries = 3
)
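A minimal call might look as follows; this is a sketch that assumes AZURE_ENDPOINT_URL and the Azure OpenAI API key expected by tidyllm are already configured in the environment:

library(tidyllm)

# Embed a small character vector (assumes endpoint and API key are configured)
texts <- c("tidy interfaces to large language models",
           "embeddings turn text into numeric vectors")
emb <- azure_openai_embedding(texts)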
Arguments
.input
A character vector of texts to embed or an LLMMessage object.
.deployment
The embedding model identifier (default: "text-embedding-3-small").
.endpoint_url
Base URL for the API (default: Sys.getenv("AZURE_ENDPOINT_URL")).
.api_version
Which API version of the Azure OpenAI API to use (default: "2023-05-15").
.truncate
Whether to truncate inputs to fit the model's context length (default: TRUE).
.timeout
Timeout for the API request in seconds (default: 120).
.dry_run
If TRUE, perform a dry run and return the request object.
.max_tries
Maximum retry attempts for requests (default: 3).
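As an illustration of the request-related arguments, the sketch below uses a hypothetical endpoint URL and .dry_run = TRUE so that the constructed request object is returned without contacting the API:

# Dry run: build and inspect the request without sending it
req <- azure_openai_embedding(
  c("hello world"),
  .deployment   = "text-embedding-3-small",
  .endpoint_url = "https://my-resource.openai.azure.com",  # hypothetical endpoint
  .api_version  = "2023-05-15",
  .dry_run      = TRUE
)
req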
Value
A tibble with two columns: input and embeddings. The input column contains the texts sent to embed, and the embeddings column is a list column where each row contains the embedding vector for the corresponding input.
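A sketch of working with the returned tibble, assuming every embedding vector has the same length (all inputs embedded with the same deployment):

emb <- azure_openai_embedding(c("first text", "second text"))
emb$input                                   # the texts that were embedded
emb_mat <- do.call(rbind, emb$embeddings)   # one row per input text
dim(emb_mat)                                # 2 x embedding dimension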