LLMNode
Represents a node in a graph that uses a Large Language Model (LLM).
YAML declaration:

    kind: LLMNode
    name: StartNode
    prompts:
      system:
        en: "You are a helpful assistant."
        ru: "Вы помощник."
    tools:
      - WeatherTool
      - EmailTool
Language:

Prompts can be defined in multiple languages. The fallback_lang is used when a prompt is not available in the requested language. The nesting order of language and prompt keys does not matter, so:

    prompts:
      system:
        en: "You are a helpful assistant."
        ru: "Вы помощник."

is equivalent to:

    prompts:
      en:
        system: "You are a helpful assistant."
      ru:
        system: "Вы помощник."
Usage:

    LLMNode(declaration=yaml_dict)
    # or
    LLMNode(yaml_path="llm_node.yaml")
Attributes

- `__slots__ = BaseNode.__slots__ + ('prompts', 'registry')`
- `spec_type = LLMNodeSpec`
- `state_type = LLMNodeState`
- `registry`: the component registry passed to `__init__`, used for tool and dependency resolution
Functions

`__init__(self, /, spec, registry, *, initial_data=None, yaml_path=None, strict=False, default_lang='en', fallback_lang='en') -> None`

Initialize LLM node with specification and registry.

Parameters:
- `spec` (LLMNodeSpec): LLM node specification defining prompts and tools
- `registry` (Registry): Component registry for tool and dependency resolution
- `initial_data` (dict[str, Any] | None, default None): Optional initial data for the component
- `yaml_path` (str | None, default None): Optional path to the YAML file this node was loaded from
- `strict` (bool, default False): Whether to enforce strict validation
- `default_lang` (str, default 'en'): Default language code for prompt selection
- `fallback_lang` (str, default 'en'): Fallback language code when default is unavailable

Returns: None
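A minimal construction sketch. The import paths and the way the LLMNodeSpec and Registry instances are obtained are assumptions; only the keyword arguments come from the signature above.

```python
# Hypothetical import paths; adjust to the actual package layout.
from liman.nodes import LLMNode, LLMNodeSpec
from liman.registry import Registry

registry = Registry()
spec = LLMNodeSpec(...)  # assumed to be built from the YAML declaration above

node = LLMNode(
    spec,
    registry,
    strict=True,         # enforce strict validation
    default_lang="ru",   # prefer Russian prompts
    fallback_lang="en",  # fall back to English when a prompt is missing
)
```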
`add_tools(self, /, tools) -> None`

Add tool nodes to this LLM node for function calling.

Parameters:
- `tools` (list[ToolNode]): List of ToolNode instances to register with this LLM node

Raises:
- TypeError: If any item in tools is not a ToolNode instance

Returns: None
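A sketch of wiring tools in. Here `weather_tool` and `email_tool` stand for ToolNode instances created elsewhere (for example from the WeatherTool and EmailTool declarations above); how they are built is not covered on this page.

```python
# weather_tool and email_tool are assumed to be ToolNode instances.
node.add_tools([weather_tool, email_tool])

# Non-ToolNode items are rejected.
try:
    node.add_tools(["WeatherTool"])  # a plain string, not a ToolNode
except TypeError as exc:
    print(exc)
```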
`compile(self) -> None`

Compile the LLM node for execution.

Initializes prompts bundle and prepares the node for invocation. Must be called before invoke().

Raises:
- LimanError: If the node is already compiled

Returns: None
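compile() is a one-time step between construction and invoke(); a second call raises LimanError. The LimanError import path below is an assumption.

```python
from liman.errors import LimanError  # hypothetical import path

node.compile()       # builds the prompts bundle; required before invoke()

try:
    node.compile()   # compiling an already-compiled node is an error
except LimanError as exc:
    print(exc)
```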
`invoke(self, /, llm, inputs, lang=None, **kwargs) -> LangChainMessage`

Execute the LLM node with given inputs.

Combines system prompts with input messages and invokes the LLM with available tools. Returns the LLM's response message.

Parameters:
- `llm` (BaseChatModel): Language model instance to use for generation
- `inputs` (Sequence[BaseMessage]): Sequence of input messages for the conversation
- `lang` (LanguageCode | None, default None): Language code for prompt selection (uses default_lang if None)
- `**kwargs` (Any): Additional arguments passed to LLM invocation

Returns: LangChainMessage, the response message from the language model

Raises:
- LimanError: If node is not compiled or tool is not found in registry
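An invocation sketch with a LangChain chat model. ChatOpenAI is just one BaseChatModel implementation, and the message text and lang value are arbitrary examples.

```python
from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI  # any BaseChatModel works

llm = ChatOpenAI(model="gpt-4o-mini")

response = node.invoke(
    llm,
    [HumanMessage(content="What's the weather in Berlin?")],
    lang="en",  # omit to use default_lang
)
print(response.content)
```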
`get_new_state(self) -> LLMNodeState`

Create new state instance for this LLM node.

Returns: Fresh LLMNodeState with empty message history
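Each call returns an independent state with an empty message history, so a fresh state can be kept per conversation; the attribute layout of LLMNodeState is not documented here.

```python
# Fresh, independent state objects, one per conversation.
state_a = node.get_new_state()
state_b = node.get_new_state()
assert state_a is not state_b
```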
`_init_prompts(self) -> None`

Returns: None