ollama {tidyllm}    R Documentation
Ollama API Provider Function
Description
The ollama() function acts as an interface for interacting with local AI models via the Ollama API. It integrates seamlessly with the main tidyllm verbs such as chat() and embed().
Usage
ollama(..., .called_from = NULL)
Arguments
...
Parameters to be passed to the appropriate Ollama-specific function, such as model configuration, input text, or API-specific options.

.called_from
An internal argument specifying the verb (e.g., chat()) that called the provider function.
Details
Some functionality, such as ollama_download_model() or ollama_list_models(), is unique to the Ollama API and has no general verb counterpart. These functions can only be accessed directly.
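A minimal sketch of calling these Ollama-specific functions directly. It assumes a local Ollama server is running; the model name "gemma2" is illustrative, and argument signatures may differ in your installed tidyllm version.

```r
library(tidyllm)

# List the models already pulled into the local Ollama instance
models <- ollama_list_models()

# Pull a model so it is available for later chat() or embed() calls
# (the model name here is an assumption, not prescribed by this page)
ollama_download_model("gemma2")
```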
Supported Verbs:

- chat(): Sends a message to an Ollama model and retrieves the model's response.

- embed(): Generates embeddings for input texts using an Ollama model.

- send_batch(): Behaves differently from the other send_batch() verbs, since it processes the answers immediately rather than asynchronously.
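An illustrative use of the supported verbs with the ollama() provider. This is a sketch, not canonical usage: it assumes a running local Ollama server, a pulled model (here "gemma2"), and that the .model argument is forwarded through ... as described under Arguments.

```r
library(tidyllm)

# Send a message to a local Ollama model and get the reply
# as an updated LLMMessage object
conversation <- llm_message("Why is the sky blue?") |>
  chat(ollama(.model = "gemma2"))

# Generate embeddings for several input texts; the result is a
# matrix where each column corresponds to one embedding
embeddings <- c("first text", "second text") |>
  embed(ollama(.model = "gemma2"))
```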
Value
The result of the requested action:

- For chat(): An updated LLMMessage object containing the model's response.

- For embed(): A matrix where each column corresponds to an embedding.