LLMConversation {LLMR}    R Documentation
LLMConversation Class for Coordinating Agents
Description
An R6 class for managing a conversation among multiple Agent objects.
Includes optional conversation-level summarization when 'summarizer_config' is provided.
summarizer_config is a list that can contain:
- llm_config: The llm_config used for the summarizer call (defaults to a basic OpenAI config).
- prompt: A custom summarizer prompt (a default is provided).
- threshold: Word-count threshold (default 3000 words).
- summary_length: Target length of the summary in words (default 400).
Once the total conversation word count exceeds 'threshold', summarization is triggered.
The conversation is replaced with a single condensed message that keeps track of who said what.
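A minimal sketch of such a configuration; the llm_config() values shown (provider, model, API-key lookup) are illustrative choices, not the package defaults:

    library(LLMR)
    summarizer_cfg <- list(
      llm_config = llm_config(
        provider = "openai",
        model    = "gpt-4o-mini",
        api_key  = Sys.getenv("OPENAI_API_KEY")
      ),
      prompt         = "Condense the conversation, attributing key points to their speakers.",
      threshold      = 2000,  # summarize once the conversation exceeds 2000 words
      summary_length = 300    # aim for roughly 300 words
    )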
Public fields
agents
A named list of Agent objects.
conversation_history
A list of speaker/text pairs for the entire conversation.
conversation_history_full
A list of speaker/text pairs for the entire conversation that is never modified and never used directly.
topic
A short string describing the conversation's theme.
prompts
An optional list of prompt templates (may be ignored).
shared_memory
Global store that is also fed into each agent's memory.
last_response
The last response received.
total_tokens_sent
Total number of tokens sent over the conversation.
total_tokens_received
Total number of tokens received over the conversation.
summarizer_config
Config list controlling optional conversation-level summarization.
Methods
Public methods
Method new()
Create a new conversation.
Usage
LLMConversation$new(topic, prompts = NULL, summarizer_config = NULL)
Arguments
topic
Character. The conversation topic.
prompts
Optional named list of prompt templates.
summarizer_config
Optional list controlling conversation-level summarization.
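For example, building on the summarizer_cfg sketched in the Description (the topic value is illustrative):

    conv <- LLMConversation$new(
      topic = "planning a data analysis",
      summarizer_config = summarizer_cfg
    )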
Method add_agent()
Add an Agent to this conversation. The agent is stored under agent$id.
Usage
LLMConversation$add_agent(agent)
Arguments
agent
An Agent object.
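A sketch of adding an agent. The Agent$new() argument names shown here (id, llm_config) are assumptions about the constructor; see the Agent help page for the actual signature:

    solver <- Agent$new(
      id = "Solver",                     # assumed argument name
      llm_config = llm_config(           # assumed argument name
        provider = "openai",
        model    = "gpt-4o-mini",
        api_key  = Sys.getenv("OPENAI_API_KEY")
      )
    )
    conv$add_agent(solver)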
Method add_message()
Add a message to the global conversation log. The message is also appended to shared memory, and summarization may then be triggered if configured.
Usage
LLMConversation$add_message(speaker, text)
Arguments
speaker
Character. Who is speaking?
text
Character. What they said.
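For example, to log a line that did not come from an agent:

    conv$add_message(speaker = "Moderator", text = "Please keep answers under 100 words.")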
Method converse()
Have a specific agent produce a response. The entire global conversation plus shared memory is temporarily loaded into that agent, the new message is recorded in the conversation, and the agent's memory is then reset except for its own new line.
Usage
LLMConversation$converse(
  agent_id,
  prompt_template,
  replacements = list(),
  verbose = FALSE
)
Arguments
agent_id
Character. The ID of the agent that should respond.
prompt_template
Character. The prompt template for the agent.
replacements
A named list of placeholders to fill in the prompt.
verbose
Logical. If TRUE, prints extra info.
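A sketch of a single turn, assuming an agent with ID "Solver" has been added; the {{topic}} placeholder syntax is an assumption about how templates and replacements are matched:

    conv$converse(
      agent_id        = "Solver",
      prompt_template = "Given the topic {{topic}}, propose a first step.",
      replacements    = list(topic = "planning a data analysis"),
      verbose         = TRUE
    )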
Method run()
Run a multi-step conversation among a sequence of agents.
Usage
LLMConversation$run(
  agent_sequence,
  prompt_template,
  replacements = list(),
  verbose = FALSE
)
Arguments
agent_sequence
Character vector of agent IDs in the order they speak.
prompt_template
Single string or named list of strings keyed by agent ID.
replacements
Single list or list-of-lists with per-agent placeholders.
verbose
Logical. If TRUE, prints extra info.
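A sketch of a two-agent round, assuming agents "Solver" and "Critic" have been added; per-agent templates are keyed by agent ID as described above, and the placeholder syntax is again an assumption:

    conv$run(
      agent_sequence  = c("Solver", "Critic"),
      prompt_template = list(
        Solver = "Propose a plan for {{topic}}.",
        Critic = "Critique the previous proposal."
      ),
      replacements    = list(topic = "planning a data analysis")
    )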
Method print_history()
Print the conversation so far to the console.
Usage
LLMConversation$print_history()
Method reset_conversation()
Clear the global conversation and reset all agents' memories.
Usage
LLMConversation$reset_conversation()
Method |>()
Pipe-like operator to chain conversation steps. E.g., conv |> "Solver"(...)
Usage
LLMConversation$|>(agent_id)
Arguments
agent_id
Character. The ID of the agent to call next.
Returns
A function that expects (prompt_template, replacements, verbose).
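A sketch of using the returned function directly; the exact chaining syntax shown in the description may differ, and the placeholder syntax is an assumption:

    respond_as_solver <- conv$`|>`("Solver")
    respond_as_solver(
      "Wrap up the discussion of {{topic}}.",
      replacements = list(topic = "planning a data analysis"),
      verbose = FALSE
    )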
Method maybe_summarize_conversation()
Possibly summarize the conversation if summarizer_config is non-null and the word count of conversation_history exceeds summarizer_config$threshold.
Usage
LLMConversation$maybe_summarize_conversation()
Method summarize_conversation()
Summarize the conversation so far into one condensed message. The new conversation history becomes a single message with speaker = "summary".
Usage
LLMConversation$summarize_conversation()
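Summarization can also be invoked manually, regardless of the word-count threshold:

    conv$summarize_conversation()
    conv$print_history()  # history is now a single message with speaker = "summary"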
Method clone()
The objects of this class are cloneable with this method.
Usage
LLMConversation$clone(deep = FALSE)
Arguments
deep
Whether to make a deep clone.