LLM for Unity v2.3.0
Create characters in Unity with LLMs!
LLMUnity.LLMCaller Class Reference

Class implementing calling of LLM functions (local and remote). More...

Inheritance diagram for LLMUnity.LLMCaller:

Public Member Functions

virtual void Awake ()
 The Unity Awake function that initializes the state before the application starts.
 
virtual bool IsValidLLM (LLM llmSet)
 Checks if an LLM is valid for the LLMCaller.
 
virtual bool IsAutoAssignableLLM (LLM llmSet)
 Checks if an LLM can be auto-assigned if the LLM of the LLMCaller is null.
 
virtual void CancelRequests ()
 Cancels the ongoing requests, e.g. Chat, Complete.
 
virtual async Task< List< int > > Tokenize (string query, Callback< List< int > > callback=null)
 Tokenises the provided query.
 
virtual async Task< string > Detokenize (List< int > tokens, Callback< string > callback=null)
 Detokenises the provided tokens to a string.
 
virtual async Task< List< float > > Embeddings (string query, Callback< List< float > > callback=null)
 Computes the embeddings of the provided input.
 

Public Attributes

bool advancedOptions = false
 toggle to show/hide advanced options in the GameObject
 
bool remote = false
 toggle to use a remote LLM server or a local LLM
 
string APIKey
 allows the use of a server with an API key
 
string host = "localhost"
 host to use for the LLM server
 
int port = 13333
 port to use for the LLM server
 
int numRetries = 10
 number of retries to use for the LLM server requests (-1 = infinite)
 

Properties

LLM llm [get, set]
 

Detailed Description

Class implementing calling of LLM functions (local and remote).

Definition at line 16 of file LLMCaller.cs.

Member Function Documentation

◆ Awake()

virtual void LLMUnity.LLMCaller.Awake ( )
inline, virtual

The Unity Awake function that initializes the state before the application starts. The following actions are executed:

  • the corresponding LLM server is defined (if run locally)
  • the grammar is set based on the grammar file
  • the prompt and chat history are initialised
  • the chat template is constructed
  • the number of tokens to keep is set based on the system prompt (if setNKeepToPrompt=true)

Reimplemented in LLMUnity.LLMCharacter.

Definition at line 53 of file LLMCaller.cs.
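
As a hedged illustration, a subclass can extend this initialisation by overriding Awake and calling the base implementation first; the subclass below is hypothetical and not part of the package:

    using LLMUnity;
    using UnityEngine;

    // Hypothetical subclass, shown only to illustrate the override pattern
    public class MyCaller : LLMCaller
    {
        public override void Awake()
        {
            base.Awake();  // run the LLMCaller initialisation (e.g. LLM auto-assignment)
            Debug.Log($"LLM assigned: {llm != null}");
        }
    }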

◆ CancelRequests()

virtual void LLMUnity.LLMCaller.CancelRequests ( )
inline, virtual

Cancels the ongoing requests, e.g. Chat, Complete.

Definition at line 221 of file LLMCaller.cs.
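
A minimal sketch of when this might be called, assuming an LLMCaller-derived component (such as LLMCharacter) referenced from another script; the class and field names are illustrative:

    using LLMUnity;
    using UnityEngine;

    public class ChatWindow : MonoBehaviour
    {
        public LLMCharacter llmCharacter;  // any LLMCaller-derived component

        void OnDisable()
        {
            // Abort any in-flight requests (e.g. Chat, Complete) when the window closes
            llmCharacter.CancelRequests();
        }
    }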

◆ Detokenize()

virtual async Task< string > LLMUnity.LLMCaller.Detokenize ( List< int > tokens, Callback< string > callback = null )
inline, virtual

Detokenises the provided tokens to a string.

Parameters
  tokens    tokens to detokenise
  callback  callback function called with the result string
Returns
  the detokenised string

Definition at line 352 of file LLMCaller.cs.
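
A hedged round-trip sketch combining Tokenize and Detokenize on an LLMCaller-derived component; the component reference and strings are illustrative:

    using System.Collections.Generic;
    using LLMUnity;
    using UnityEngine;

    public class DetokenizeExample : MonoBehaviour
    {
        public LLMCharacter llmCharacter;  // any LLMCaller-derived component

        async void Start()
        {
            List<int> tokens = await llmCharacter.Tokenize("Hello world!");
            // Convert the token ids back into text; the optional callback receives the same string
            string text = await llmCharacter.Detokenize(tokens, s => Debug.Log($"Detokenised: {s}"));
            Debug.Log(text);
        }
    }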

◆ Embeddings()

virtual async Task< List< float > > LLMUnity.LLMCaller.Embeddings ( string query, Callback< List< float > > callback = null )
inline, virtual

Computes the embeddings of the provided input.

Parameters
  query     input to compute the embeddings for
  callback  callback function called with the resulting embeddings
Returns
  the computed embeddings

Definition at line 367 of file LLMCaller.cs.
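
A hedged sketch of requesting embeddings, assuming the caller (e.g. an LLMEmbedder, listed above as an LLMCaller subclass) points at an embedding-capable model; the names are illustrative:

    using System.Collections.Generic;
    using LLMUnity;
    using UnityEngine;

    public class EmbeddingsExample : MonoBehaviour
    {
        public LLMEmbedder llmEmbedder;  // any LLMCaller-derived component with an embedding model

        async void Start()
        {
            List<float> embedding = await llmEmbedder.Embeddings("The quick brown fox");
            Debug.Log($"Embedding dimension: {embedding.Count}");
        }
    }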

◆ IsAutoAssignableLLM()

virtual bool LLMUnity.LLMCaller.IsAutoAssignableLLM ( LLM llmSet)
inline, virtual

Checks if an LLM can be auto-assigned if the LLM of the LLMCaller is null.

Parameters
  llmSet    LLM object
Returns
  bool specifying whether the LLM can be auto-assigned

Reimplemented in LLMUnity.LLMEmbedder.

Definition at line 105 of file LLMCaller.cs.

◆ IsValidLLM()

virtual bool LLMUnity.LLMCaller.IsValidLLM ( LLM llmSet)
inline, virtual

Checks if an LLM is valid for the LLMCaller.

Parameters
  llmSet    LLM object
Returns
  bool specifying whether the LLM is valid

Reimplemented in LLMUnity.LLMCharacter.

Definition at line 95 of file LLMCaller.cs.
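
A brief sketch of using this check before wiring an LLM at runtime; the exact validity criteria depend on the caller subclass, and the names below are illustrative:

    using LLMUnity;
    using UnityEngine;

    public class ValidityCheck : MonoBehaviour
    {
        public LLMCharacter llmCharacter;  // any LLMCaller-derived component
        public LLM candidate;

        void Start()
        {
            // Log whether the candidate LLM would be accepted by this caller
            Debug.Log($"Candidate accepted: {llmCharacter.IsValidLLM(candidate)}");
        }
    }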

◆ Tokenize()

virtual async Task< List< int > > LLMUnity.LLMCaller.Tokenize ( string query, Callback< List< int > > callback = null )
inline, virtual

Tokenises the provided query.

Parameters
  query     query to tokenise
  callback  callback function called with the result tokens
Returns
  list of the tokens

Definition at line 337 of file LLMCaller.cs.
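
A minimal usage sketch on an LLMCaller-derived component; the component reference and query are illustrative:

    using System.Collections.Generic;
    using LLMUnity;
    using UnityEngine;

    public class TokenizeExample : MonoBehaviour
    {
        public LLMCharacter llmCharacter;  // any LLMCaller-derived component

        async void Start()
        {
            // Tokenise a query; the optional callback would receive the same token list
            List<int> tokens = await llmCharacter.Tokenize("Hello world!");
            Debug.Log($"Token count: {tokens.Count}");
        }
    }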

Member Data Documentation

◆ advancedOptions

bool LLMUnity.LLMCaller.advancedOptions = false

toggle to show/hide advanced options in the GameObject

Definition at line 19 of file LLMCaller.cs.

◆ APIKey

string LLMUnity.LLMCaller.APIKey

allows the use of a server with an API key

Definition at line 31 of file LLMCaller.cs.

◆ host

string LLMUnity.LLMCaller.host = "localhost"

host to use for the LLM server

Definition at line 34 of file LLMCaller.cs.

◆ numRetries

int LLMUnity.LLMCaller.numRetries = 10

number of retries to use for the LLM server requests (-1 = infinite)

Definition at line 38 of file LLMCaller.cs.

◆ port

int LLMUnity.LLMCaller.port = 13333

port to use for the LLM server

Definition at line 36 of file LLMCaller.cs.

◆ remote

bool LLMUnity.LLMCaller.remote = false

toggle to use a remote LLM server or a local LLM

Definition at line 21 of file LLMCaller.cs.
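
Taken together, these fields control where requests are sent. A hedged sketch of configuring a caller for a remote server from code (the address and key are placeholders):

    using LLMUnity;
    using UnityEngine;

    public class RemoteSetup : MonoBehaviour
    {
        public LLMCharacter llmCharacter;  // any LLMCaller-derived component

        void Awake()
        {
            llmCharacter.remote = true;             // use a remote LLM server instead of a local LLM
            llmCharacter.host = "192.168.1.10";     // placeholder server address
            llmCharacter.port = 13333;              // default port shown in this reference
            llmCharacter.APIKey = "my-secret-key";  // only if the server requires an API key
            llmCharacter.numRetries = 5;            // retry failed requests up to 5 times (-1 = infinite)
        }
    }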

Property Documentation

◆ llm

LLM LLMUnity.LLMCaller.llm
get, set

Definition at line 24 of file LLMCaller.cs.
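
A hedged sketch of assigning the LLM for a caller from code; if left unset, the caller may auto-assign one during Awake (see IsAutoAssignableLLM above). The names below are illustrative:

    using LLMUnity;
    using UnityEngine;

    public class LLMWiring : MonoBehaviour
    {
        public LLMCharacter llmCharacter;  // any LLMCaller-derived component
        public LLM sceneLLM;               // an LLM component in the scene

        void Start()
        {
            // Point the caller at a specific LLM component
            llmCharacter.llm = sceneLLM;
        }
    }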


The documentation for this class was generated from the following file:
  • LLMCaller.cs