LlamaLib  v2.0.5
Cross-platform library for local LLMs
LLM_agent.h File Reference

High-level conversational agent interface for LLMs. More...

#include "LLM.h"
#include "LLM_client.h"
Include dependency graph for LLM_agent.h:
This graph shows which files directly or indirectly include this file:


Classes

struct  ChatMessage
 Structure representing a single chat message. More...
 
class  LLMAgent
 High-level conversational agent for LLM interactions. More...
 

Enumerations

enum class  ContextOverflowStrategy { None , Truncate , Summarize }
 Strategy to apply when the chat history would exceed the model's context window. More...
 

Functions

LLMAgent * LLMAgent_Construct (LLMLocal *llm, const char *system_prompt="")
 Construct LLMAgent (C API)
 
void LLMAgent_Set_System_Prompt (LLMAgent *llm, const char *system_prompt)
 Set system prompt (C API)
 
const char * LLMAgent_Get_System_Prompt (LLMAgent *llm)
 Get system prompt (C API)
 
void LLMAgent_Set_Slot (LLMAgent *llm, int slot_id)
 Set processing slot (C API)
 
int LLMAgent_Get_Slot (LLMAgent *llm)
 Get processing slot (C API)
 
const char * LLMAgent_Chat (LLMAgent *llm, const char *user_prompt, bool add_to_history=true, CharArrayFn callback=nullptr, bool return_response_json=false, bool debug_prompt=false)
 Conduct chat interaction (C API)
 
void LLMAgent_Clear_History (LLMAgent *llm)
 Clear conversation history (C API)
 
const char * LLMAgent_Get_History (LLMAgent *llm)
 Get conversation history (C API)
 
void LLMAgent_Set_History (LLMAgent *llm, const char *history_json)
 Set conversation history (C API)
 
void LLMAgent_Add_User_Message (LLMAgent *llm, const char *content)
 Add user message to history (C API)
 
void LLMAgent_Add_Assistant_Message (LLMAgent *llm, const char *content)
 Add assistant message to history (C API)
 
void LLMAgent_Remove_Last_Message (LLMAgent *llm)
 Remove last message from history (C API)
 
void LLMAgent_Save_History (LLMAgent *llm, const char *filepath)
 Save conversation history to file (C API)
 
void LLMAgent_Load_History (LLMAgent *llm, const char *filepath)
 Load conversation history from file (C API)
 
size_t LLMAgent_Get_History_Size (LLMAgent *llm)
 Get conversation history size (C API)
 
void LLMAgent_Set_Overflow_Strategy (LLMAgent *llm, int strategy, float target_ratio, const char *summarize_prompt)
 Configure the context overflow strategy (C API)
 
const char * LLMAgent_Get_Summary (LLMAgent *llm)
 Get the current rolling summary (C API)
 
void LLMAgent_Set_Summary (LLMAgent *llm, const char *summary)
 Set the rolling summary directly (C API)
 
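The C API above can be combined into a simple chat-and-persist flow. The sketch below is a usage illustration, not library code: how the `LLMLocal` handle is obtained (and released) is defined in LLM.h, not on this page, so that step is left as a placeholder.

```cpp
// Hypothetical usage sketch of the LLMAgent C API.
// Assumption: the LLMLocal handle comes from the loader declared in
// LLM.h; its actual signature is not shown on this page.
#include "LLM.h"
#include "LLM_agent.h"

int main() {
    LLMLocal *llm = /* ... load a model via the API in LLM.h ... */ nullptr;

    // Create an agent with a system prompt.
    LLMAgent *agent = LLMAgent_Construct(llm, "You are a helpful assistant.");

    // Conduct a chat turn; with add_to_history=true the user prompt and
    // the reply are both appended to the conversation history.
    const char *reply = LLMAgent_Chat(agent, "Hello!", /*add_to_history=*/true);

    // Persist the conversation and restore it later.
    LLMAgent_Save_History(agent, "history.json");
    LLMAgent_Clear_History(agent);
    LLMAgent_Load_History(agent, "history.json");
    return 0;
}
```

Teardown/destruction functions are not listed on this page; see the class documentation for LLMAgent for the full lifecycle.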

Variables

const std::string SUMMARY_PROMPT
 

Detailed Description

High-level conversational agent interface for LLMs.

Provides a conversation-aware wrapper around LLM functionality with chat history management.

Definition in file LLM_agent.h.

Enumeration Type Documentation

◆ ContextOverflowStrategy

enum class ContextOverflowStrategy
strong

Strategy to apply when the chat history would exceed the model's context window.

Enumerator
None 

No automatic handling — may crash if context is exceeded.

Truncate 

Remove oldest messages (in pairs) from the front until history fits within target_context_ratio.

Summarize 

Summarise the full history (rolling chunks if needed), embed it in the system message, then truncate if still needed.

Definition at line 57 of file LLM_agent.h.
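Configuring the strategy goes through LLMAgent_Set_Overflow_Strategy, which takes the strategy as an int. The sketch below assumes the int values follow the enumerator order (None = 0, Truncate = 1, Summarize = 2) and that a null summarize_prompt falls back to the default SUMMARY_PROMPT; neither mapping is confirmed on this page.

```cpp
// Sketch: enable rolling summarization when the history would
// overflow the context window. Strategy value and null-prompt
// fallback are assumptions (see lead-in).
LLMAgent_Set_Overflow_Strategy(agent,
                               /*strategy=*/2,          // Summarize
                               /*target_ratio=*/0.8f,   // keep history within 80% of context
                               /*summarize_prompt=*/nullptr);

// Inspect or override the rolling summary kept by the agent.
const char *summary = LLMAgent_Get_Summary(agent);
LLMAgent_Set_Summary(agent, "User is debugging a CMake build on Windows.");
```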

Variable Documentation

◆ SUMMARY_PROMPT

const std::string SUMMARY_PROMPT
Initial value:
=
"You are maintaining a concise working memory of an ongoing conversation."
""
"If an existing summary is provided, merge it with the new messages into a single updated summary."
"If no existing summary is provided, create a new summary from the messages."
""
"Rules:"
"- Preserve user goals, decisions made, constraints, preferences, open questions, and pending tasks."
"- Remove anything resolved, superseded, redundant, or purely conversational."
"- Keep only information relevant for future reasoning."
"- Avoid duplicating or rephrasing information unnecessarily."
"- Write in present tense where possible."
"- Keep under 200 words."
"- No bullet points. No preamble. Output only the summary text."

Definition at line 64 of file LLM_agent.h.