LLM Connector (Repository Option)
Revision as of 11:10, 12 August 2024
2025 BETA
This article covers new or changed functionality in the current or upcoming beta version of Grooper. Features are subject to change before version 2025's GA release. Configuration and functionality may differ in later beta builds and the final 2025 release.
LLM Connector is a Repository Option that enables large language model (LLM) powered AI features for a Grooper Repository.
Glossary
LLM Connector: LLM Connector is a Repository Option that enables large language model (LLM) powered AI features for a Grooper Repository.
Repository Option: Repository Options are optional features that affect the entire repository. These optional features enable functionality that otherwise does not work without first establishing the connections these options provide. Repository Options are added to a Grooper Repository and configured using the database Root node's Options property.
Repository: A "repository" is a general term in computer science referring to where files and/or data are stored and managed. In Grooper, the term "repository" may refer to:
- PRIMARILY a Grooper Repository. This is most commonly what people are referring to when they simply say "repository".
- Less commonly a CMIS Repository
Root: The Grooper database Root node is the topmost element of the Grooper Repository. All other nodes in a Grooper Repository are its children/descendants. The Grooper Root also stores several settings that apply to the Grooper Repository, including the license serial number or license service URL and Repository Options.
About
The only configurable property on the LLM Connector is Service Providers. This property is a collection of LLM Providers, each of which contains properties that establish a connection to an LLM API.
Azure Provider
The main property of concern here is Deployments. This property is a collection of one or more Deployments specific to the Azure LLM API.
OpenAI Provider
FYI
To connect Grooper to OpenAI web services, you will need an OpenAI API key.
URL
This property is the URL at which OpenAI services are hosted. It will typically be left at the default setting, as most users will use OpenAI's hosted service. If you are hosting a ChatGPT clone or another OpenAI-compatible service locally, set this property to the URL of that local instance.
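The role of this property can be sketched as follows. This is an illustrative Python snippet, not Grooper's implementation; the default base URL and the chat-completions path are assumptions based on OpenAI's public API, and the helper name is hypothetical.

```python
# Sketch: how a configured base URL might be combined with the standard
# OpenAI chat-completions path. Swapping the base URL is all that is
# needed to point at a locally hosted, OpenAI-compatible clone.
DEFAULT_BASE_URL = "https://api.openai.com/v1"

def chat_completions_url(base_url: str = DEFAULT_BASE_URL) -> str:
    """Join the configured base URL with the chat-completions path."""
    return base_url.rstrip("/") + "/chat/completions"

print(chat_completions_url())                            # hosted OpenAI service
print(chat_completions_url("http://localhost:8080/v1"))  # local clone
```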
Authorization
This property controls how the API key is handed to the requested service.
- None: No API key required.
- Bearer: "Bearer" authorization is a type of authentication scheme used in HTTP authentication protocols. In this scheme, a "bearer token" is sent in the HTTP header to authenticate requests to protected resources. The token acts as a key, and the server validates it to allow access. The format of the authorization header is typically:
Authorization: Bearer <token>
- This method is widely used in modern web services and APIs, particularly in OAuth 2.0.
- API Key: This is "Microsoft-style" API key authentication, in which the client includes an API key in an HTTP request header to access the service. The key acts as a unique identifier for the client and authorizes access to the requested resources. The typical format of the header is:
Ocp-Apim-Subscription-Key: <api-key>
- This method is simple to implement and is used to track and control how the API is used.
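The three header formats above can be sketched in Python. The helper function and mode names are illustrative only; they mirror the options described here, not Grooper's internal code.

```python
def auth_headers(mode: str, api_key: str = "") -> dict:
    """Build the HTTP headers for each Authorization option."""
    if mode == "None":
        # No API key required; no authentication header is sent.
        return {}
    if mode == "Bearer":
        # OAuth 2.0-style bearer token in the Authorization header.
        return {"Authorization": f"Bearer {api_key}"}
    if mode == "API Key":
        # "Microsoft-style" subscription-key header.
        return {"Ocp-Apim-Subscription-Key": api_key}
    raise ValueError(f"Unknown authorization mode: {mode}")
```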
API Key
This property is the API key to use when accessing web services. This is the main property to consider when connecting to the OpenAI API. Obtain an API key from OpenAI and insert it here.
Use System Messages
This property specifies whether to use "system" messages for contextual information, or to use "user" messages for all interaction. It is off by default on the assumption that you may connect to LLMs such as Mistral or Llama, which do not use system messages.
In the context of large language models, such as those used in conversational AI systems, "system messages" and "user messages" serve distinct roles:
- System Messages: These are instructions or context provided by the system to guide the behavior and responses of the language model. They often include information about the conversation's tone, style, and specific tasks the model should perform. For example, a system message might specify that the model should respond in a formal tone or prioritize certain types of information.
- User Messages: These are inputs or queries provided by the user. They represent the questions, commands, or prompts that the user wants the model to respond to. The model processes these messages to generate appropriate and relevant responses based on the given instructions and context.
Together, system and user messages facilitate dynamic and contextually appropriate interactions between the user and the language model.
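The distinction can be sketched with a chat-style message payload. This is a minimal illustration of the two modes, assuming the common OpenAI-style role/content message format; the function name and example strings are hypothetical.

```python
def build_messages(prompt: str, instructions: str,
                   use_system_messages: bool) -> list:
    """Build a chat payload with or without a system message."""
    if use_system_messages:
        # Contextual instructions travel in a dedicated "system" message.
        return [
            {"role": "system", "content": instructions},
            {"role": "user", "content": prompt},
        ]
    # Models without system-message support receive everything as "user".
    return [{"role": "user", "content": f"{instructions}\n\n{prompt}"}]
```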
How To
Establish an LLM Connector
- Click on the Grooper Root node.
- Click the ellipsis button for the Options property.
- In the "Options" window click the "Add" button.
- Choose "LLM Connector" from the drop-down menu.
- Click the ellipsis button for the "Service Providers" property of the newly added LLM Connector.
- Click the "Add" button in the "Service Providers" window.
- Choose the OpenAI Provider option from the drop-down menu.
- Enter your API Key into the API Key property.