# Bots From Extension: llm

LLM based AI System
This extension provides 1 bot.
## Bot @llm:prompt-using-template
Bot Position In Pipeline: Source or Sink
Prepares the contents of a prompt from the specified Query Template and then sends the prompt to the LLM.
This bot expects a Restricted CFXQL.
Each parameter may be specified using the `=` operator and the `AND` logical operation.
The following parameters are expected by this bot:
| Parameter Name | Type | Default Value | Description |
|---|---|---|---|
| prompt_template* | Text | | Query template name to generate the prompt |
| input_data_as_variable | Text | no | Send the input dataframe to the template as the specified variable |
| temperature | Text | 0.3 | Temperature parameter for response generation. Should be between 0.0 and 2.0. Values closer to 0.0 produce more deterministic output; values closer to 1.0 or higher produce more random output. |
| max_tokens | Text | 4096 | Maximum number of tokens in the response. Default is 4096. |
| timeout | Text | 60 | Request timeout in seconds. Default is 60 seconds. Increase for large documents. |
| model_name | Text | | Model name to use for this prompt. If not specified, uses the model from credentials. |
| debug | Text | no | Specify 'yes' to print full data being sent to LLM. Default is 'no' |
This bot also accepts wildcard parameters: any additional `name = 'value'` parameters are passed to the bot.
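The parameters above can be combined in a pipeline fragment. The sketch below is illustrative only: the template name `incident-summary`, the variable name `rows`, and the upstream `@dm:empty` source are assumptions, and the exact pipeline syntax may vary across RDA versions.

```
@dm:empty
    --> @llm:prompt-using-template
            prompt_template = 'incident-summary' and
            input_data_as_variable = 'rows' and
            temperature = '0.2' and
            max_tokens = '2048' and
            timeout = '120'
```

Here a lower `temperature` of 0.2 favors deterministic summaries, and the longer `timeout` of 120 seconds allows for larger documents, as noted in the parameter table.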