get_logprobs {tidyllm} — R Documentation
Retrieve Log Probabilities from Assistant Replies
Description
Extracts token log probabilities from assistant replies within an LLMMessage
object.
Each row represents a token with its log probability and top alternative tokens.
Usage
get_logprobs(.llm, .index = NULL)
Arguments
.llm
An LLMMessage object containing assistant replies.
.index
A positive integer specifying which assistant reply's log probabilities to extract. If NULL, log probabilities from all assistant replies are returned.
Details
If no log probabilities were requested, an empty tibble is returned. Currently, log probabilities are only available for replies generated with openai_chat().
Columns include:
- reply_index: The index of the assistant reply in the message history.
- token: The generated token.
- logprob: The log probability of the generated token.
- bytes: The byte-level encoding of the token.
- top_logprobs: A list column containing the top alternative tokens with their log probabilities.
Value
A tibble containing log probabilities for the specified assistant reply or all replies.
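Examples

A sketch of typical usage. Running it requires a valid OpenAI API key, and the .logprobs and .top_logprobs argument names passed to openai_chat() are assumptions — check the arguments your tidyllm version accepts for requesting log probabilities.

```r
library(tidyllm)

# Ask for a reply with token log probabilities attached
# (.logprobs / .top_logprobs are assumed argument names)
conv <- llm_message("Say hello in one word.") |>
  openai_chat(.logprobs = TRUE, .top_logprobs = 3)

# Log probabilities from all assistant replies as one tibble
get_logprobs(conv)

# Log probabilities from the first assistant reply only
get_logprobs(conv, .index = 1)
```

Each row of the returned tibble is one generated token; inspect the top_logprobs list column to compare the sampled token against its top alternatives.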