LLM for Unity v3.0.1
Create characters in Unity with LLMs!
LLMUnity.LLMEmbedder Class Reference

Class implementing the LLM embedder.

Inheritance diagram for LLMUnity.LLMEmbedder: inherits from LLMUnity.LLMClient.

Public Member Functions

override bool IsAutoAssignableLLM (LLM llmInstance)
 Determines if an LLM instance can be auto-assigned to this client. Override in derived classes to implement specific assignment logic.
 
- Public Member Functions inherited from LLMUnity.LLMClient
virtual void Awake ()
 Unity Awake method that validates configuration and assigns local LLM if needed.
 
virtual async void Start ()
 Unity Start method that initializes the LLM client connection.
 
virtual void SetGrammar (string grammarString)
 Sets grammar constraints for structured output generation.
 
virtual void LoadGrammar (string path)
 Loads grammar constraints from a file.
 
virtual async Task< List< int > > Tokenize (string query, Action< List< int > > callback=null)
 Converts text into a list of token IDs.
 
virtual async Task< string > Detokenize (List< int > tokens, Action< string > callback=null)
 Converts token IDs back to text.
 
virtual async Task< List< float > > Embeddings (string query, Action< List< float > > callback=null)
 Generates embedding vectors for the input text.
 
virtual async Task< string > Completion (string prompt, Action< string > callback=null, Action completionCallback=null, int id_slot=-1)
 Generates text completion.
 
void CancelRequest (int id_slot)
 Cancels an active request in the specified slot.
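
A minimal usage sketch of the inherited text-processing methods listed above, assuming an LLMEmbedder reference assigned in the inspector; the component setup around it is an assumption, while the method signatures (Embeddings, Tokenize, Detokenize) follow the member list:

```csharp
using System.Collections.Generic;
using UnityEngine;
using LLMUnity;

public class EmbeddingExample : MonoBehaviour
{
    // Assign in the inspector (assumed setup).
    public LLMEmbedder embedder;

    async void Start()
    {
        // Generates the embedding vector for the input text.
        List<float> vector = await embedder.Embeddings("hello world");
        Debug.Log($"Embedding dimension: {vector.Count}");

        // Tokenize / Detokenize round-trip using the inherited methods.
        List<int> tokens = await embedder.Tokenize("hello world");
        string text = await embedder.Detokenize(tokens);
        Debug.Log($"Round-trip text: {text}");
    }
}
```

The methods also accept optional callbacks (e.g. `Action<List<float>>` for Embeddings) for consumers that prefer callback-style handling over `await`.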
 

Additional Inherited Members

- Public Attributes inherited from LLMUnity.LLMClient
bool advancedOptions = false
 Show/hide advanced options in the inspector.
 
int numPredict = -1
 Maximum tokens to generate (-1 = unlimited)
 
bool cachePrompt = true
 Cache processed prompts to speed up subsequent requests.
 
int seed = 0
 Random seed for reproducible generation (0 = random)
 
float temperature = 0.2f
 Sampling temperature (0.0 = deterministic, higher = more creative)
 
int topK = 40
 Top-k sampling: limit to k most likely tokens (0 = disabled)
 
float topP = 0.9f
 Top-p (nucleus) sampling: cumulative probability threshold (1.0 = disabled)
 
float minP = 0.05f
 Minimum probability threshold for token selection.
 
float repeatPenalty = 1.1f
 Penalty for repeated tokens (1.0 = no penalty)
 
float presencePenalty = 0f
 Presence penalty: reduce likelihood of any repeated token (0.0 = disabled)
 
float frequencyPenalty = 0f
 Frequency penalty: reduce likelihood based on token frequency (0.0 = disabled)
 
float typicalP = 1f
 Locally typical sampling strength (1.0 = disabled)
 
int repeatLastN = 64
 Number of recent tokens to consider for repetition penalty (0 = disabled, -1 = context size)
 
int mirostat = 0
 Mirostat sampling mode (0 = disabled, 1 = Mirostat, 2 = Mirostat 2.0)
 
float mirostatTau = 5f
 Mirostat target entropy (tau) - balance between coherence and diversity.
 
float mirostatEta = 0.1f
 Mirostat learning rate (eta) - adaptation speed.
 
int nProbs = 0
 Include top N token probabilities in response (0 = disabled)
 
bool ignoreEos = false
 Ignore end-of-stream token and continue generating.
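
The attributes above are public fields and can be set in code as well as in the inspector. A sketch of tuning them before use, with values chosen for illustration (the defaults shown in the list apply otherwise):

```csharp
using UnityEngine;
using LLMUnity;

public class SamplingSetup : MonoBehaviour
{
    // Assign in the inspector (assumed setup).
    public LLMClient client;

    void Awake()
    {
        client.temperature = 0.0f;  // deterministic sampling
        client.topK = 40;           // keep the 40 most likely tokens
        client.topP = 0.9f;         // nucleus sampling threshold
        client.numPredict = 256;    // cap the number of generated tokens
        client.repeatPenalty = 1.1f;// mildly penalize repeated tokens
    }
}
```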
 
- Properties inherited from LLMUnity.LLMClient
bool remote [get, set]
 Whether this client uses a remote server connection.
 
LLM llm [get, set]
 The local LLM instance (null if using remote)
 
string APIKey [get, set]
 API key for remote server authentication.
 
string host [get, set]
 Remote server hostname or IP address.
 
int port [get, set]
 Remote server port number.
 
string grammar [get, set]
 Current grammar constraints for output formatting.
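
A sketch of configuring a client for a remote server through the inherited properties listed above; the host, port, and key values are placeholders, not defaults:

```csharp
using UnityEngine;
using LLMUnity;

public class RemoteSetup : MonoBehaviour
{
    // Assign in the inspector (assumed setup).
    public LLMClient client;

    void Awake()
    {
        client.remote = true;                // use a remote server instead of a local LLM
        client.host = "192.168.1.10";        // placeholder server address
        client.port = 13333;                 // placeholder port
        client.APIKey = "my-api-key";        // only if the server requires authentication
    }
}
```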
 

Detailed Description

Class implementing the LLM embedder.

Definition at line 13 of file LLMEmbedder.cs.

Member Function Documentation

◆ IsAutoAssignableLLM()

override bool LLMUnity.LLMEmbedder.IsAutoAssignableLLM (LLM llmInstance)	[inline, virtual]

Determines if an LLM instance can be auto-assigned to this client. Override in derived classes to implement specific assignment logic.

Parameters
    llmInstance	LLM instance to evaluate

Returns
    True if the LLM can be auto-assigned

Reimplemented from LLMUnity.LLMClient.

Definition at line 24 of file LLMEmbedder.cs.
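
Since IsAutoAssignableLLM is virtual, derived classes can further restrict which LLM instances are auto-assigned, as the description above notes. A hypothetical subclass sketch; the extra null check is illustrative, and the actual criteria live in LLMEmbedder.cs:

```csharp
using LLMUnity;

// Hypothetical derived embedder narrowing auto-assignment.
public class MyEmbedder : LLMEmbedder
{
    public override bool IsAutoAssignableLLM(LLM llmInstance)
    {
        // Defer to the base class check, rejecting null instances first.
        return llmInstance != null && base.IsAutoAssignableLLM(llmInstance);
    }
}
```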


The documentation for this class was generated from the following file: LLMEmbedder.cs