configure llmapp
The configure llmapp command configures an LLM application, such as an AI agent, for deployment. In the notebook, a template form guides you through the configuration and prompts you for various inputs, such as the LLM application name, the application type, the LLM model to use, and any tools that the AI agent will use.
Syntax
Parameters
To create an LLM application:
Select New from the LLM App drop-down list, and specify a name for the LLM application in the LLM App Name field.
Select the App Type, which is AI agent.
Enter a brief description of the application in the App Description field. The description will be appended to the LLM system prompt.
Select an LLM provider name from the Provider Name drop-down list, and enter a model name for the application in the LLM Model field.
If applicable, add tools for the AI agent:
Select New from the Tool drop-down list. Specify a name for the tool in the Tool Name field.
Select the Tool Type:
- Custom Functions: a generic Python module with custom Pydantic functions. Functions and their parameters must be adequately annotated. Textual queries are used to generate function calls with appropriate parameter values, which result in direct calls to the Python module.
- RAG Query: a tool that interfaces with a document Store ID from a vector store. Textual queries may be sent to the vector store to retrieve relevant chunks from documents within a Store ID.
- REST Query: a tool that interfaces with a REST endpoint URL. The REST endpoint must provide an adequately annotated OpenAPI specification. Textual queries are used to generate function calls with appropriate parameter values, which are then sent to the REST endpoint via HTTP GET or POST requests.
- SQL Query: a tool that interfaces with a SQL table. The SQL table must be a configured data source or dataset. Textual queries are used to generate SQL statements, which are then sent to a SQL engine.
- Web Search: a tool that searches the web for relevant information by sending textual queries to a web search engine.
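To illustrate what "adequately annotated" means for the Custom Functions tool type, here is a minimal sketch of a Python function whose type hints and docstring give a function-calling LLM enough information to map a textual query onto a concrete call. The function name and its stub behavior are hypothetical, not part of the product, and plain type hints stand in for Pydantic models for brevity.

```python
import inspect

def get_order_status(order_id: str, include_history: bool = False) -> dict:
    """Look up the status of a customer order.

    Args:
        order_id: Unique identifier of the order, e.g. "ORD-1234".
        include_history: If True, also return prior status changes.
    """
    # Hypothetical stub: a real tool would query an order system.
    status = {"order_id": order_id, "status": "shipped"}
    if include_history:
        status["history"] = ["received", "packed", "shipped"]
    return status

# A function-calling LLM relies on exactly this metadata -- the signature
# and the docstring -- to choose the function and fill in parameter values.
print(inspect.signature(get_order_status))
```

An unannotated function (no type hints, no docstring) gives the model nothing to reason about, which is why the annotation requirement applies to both Custom Functions and REST Query (via OpenAPI) tools.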
Enter a brief description of the tool in the Tool Description field. The description will be appended to the LLM system prompt.
Keep the Use App LLM checkbox selected to use the application's LLM. If you clear the checkbox, you will need to select a Provider Name and enter an LLM Model.
Select or enter various tool parameters, depending on the tool type. See Tool Parameters below.
Click the Add Tool button.
Select the Advanced Settings checkbox, and then specify settings for the LLM application's Temperature, Top-P, and Frequency Penalty.
Click the Save Configuration button to save the LLM application.
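The Advanced Settings correspond to standard LLM sampling parameters. The sketch below shows their typical ranges and how such values are commonly passed in an OpenAI-style request body; the parameter names follow that widespread convention and the helper function is hypothetical, so this is not necessarily this product's wire format.

```python
# Typical ranges: temperature 0-2, top_p 0-1, frequency_penalty -2 to 2.
advanced_settings = {
    "temperature": 0.2,        # lower values make output more deterministic
    "top_p": 0.9,              # nucleus sampling: keep the top 90% probability mass
    "frequency_penalty": 0.5,  # positive values discourage token repetition
}

def build_request(model: str, prompt: str, settings: dict) -> dict:
    """Assemble a generic OpenAI-style chat-completion payload (hypothetical helper)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        **settings,
    }

payload = build_request("example-model", "Summarize today's tickets.", advanced_settings)
print(payload["temperature"], payload["top_p"])
```

A low temperature with a moderate top-p, as above, is a common starting point for agents that call tools, since deterministic output makes generated function calls and SQL statements more reliable.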
Tool Parameters
For a Custom Functions tool:
Select the Module File and the Class Name.
Enter the Function Names.
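For the Custom Functions tool type, the Module File, Class Name, and Function Names fields identify code shaped like the following sketch. All names here (the module, the class, and the method) are hypothetical placeholders, shown only to clarify how the three fields relate.

```python
# weather_tools.py -- a hypothetical Module File

class WeatherTools:  # a hypothetical Class Name
    """Methods listed under Function Names are exposed to the AI agent."""

    def current_temperature(self, city: str, unit: str = "C") -> float:
        """Return the current temperature for a city, in Celsius or Fahrenheit."""
        # Stub data; a real implementation would call a weather service.
        celsius = {"Berlin": 18.0, "Cairo": 31.0}.get(city, 20.0)
        return celsius if unit == "C" else celsius * 9 / 5 + 32

tools = WeatherTools()
print(tools.current_temperature("Berlin"))
```

In this sketch, Module File would point at weather_tools.py, Class Name would be WeatherTools, and Function Names would list current_temperature.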