Configuring the AI settings
You can configure the AI settings to integrate an AI service, such as OpenAI, WatsonX, or Ollama. Deploy users use the AI service to troubleshoot deployment failures.
Before you begin
Procedure
- Go to from the Web UI.
- Provide the following AI provider details:
| Field | Description |
| --- | --- |
| AI Provider | Select the AI provider name. The OpenAI, WatsonX, and Ollama providers are supported. The remaining settings depend on your selection. |
OpenAI settings

| Field | Description |
| --- | --- |
| AI Provider Endpoint | Enter the OpenAI service endpoint. If you change the endpoint, you must also update the API key. |
| AI Provider API Key | Enter the API key that is used to make OpenAI API requests. |
| AI Provider Project ID | Enter the OpenAI project ID, if one was created. |
| AI Provider Organization | Specify the unique organization ID. |
| Model Name | Choose a model name. The values in the Model Name field are retrieved automatically based on the AI endpoint, API key, and organization details that you provided. |
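As a way to check the OpenAI details before saving them, the following sketch builds the request that a client would send to list available models (the same lookup that populates the Model Name field). The endpoint, key, and organization values are placeholders, and the header names follow the public OpenAI REST API; this is an illustration, not the product's internal implementation.

```python
import urllib.request

def build_models_request(endpoint, api_key, organization=None, project=None):
    """Build a GET request for the OpenAI models listing.

    The response to this request is the list of models that the
    Model Name field is populated from.
    """
    req = urllib.request.Request(endpoint.rstrip("/") + "/models")
    req.add_header("Authorization", "Bearer " + api_key)
    if organization:
        # Optional header identifying the organization for billing/scoping
        req.add_header("OpenAI-Organization", organization)
    if project:
        req.add_header("OpenAI-Project", project)
    return req

# Placeholder credentials for illustration only
req = build_models_request("https://api.openai.com/v1", "sk-example", "org-example")
```

Sending the request with `urllib.request.urlopen(req)` and receiving a `200` response confirms that the endpoint, API key, and organization values are consistent with each other.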
WatsonX settings

| Field | Description |
| --- | --- |
| AI Provider Endpoint | Enter the WatsonX service endpoint. If you change the endpoint, you must also update the API key. |
| AI Provider API Key | Enter the API key that is used to make WatsonX API requests. |
| AI Provider Access Token Endpoint | Enter the access token endpoint that authorizes access to the WatsonX endpoint. |
| AI Provider Project ID | Enter the WatsonX project ID, if one was created. |
| Model Name | Choose a model name. The values in the Model Name field are retrieved automatically based on the WatsonX endpoint and API key details that you provided. |
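The access token endpoint is what turns the WatsonX API key into a short-lived bearer token. As a rough sketch of that exchange, the snippet below builds the form body that IBM Cloud IAM expects when trading an API key for a token; the grant type shown is the one documented for IBM Cloud IAM, and the key value is a placeholder.

```python
import urllib.parse

def build_iam_token_body(api_key):
    """Build the form-encoded body for an IBM Cloud IAM token request.

    POSTing this body to the access token endpoint returns a bearer
    token that authorizes subsequent WatsonX API calls.
    """
    return urllib.parse.urlencode({
        "grant_type": "urn:ibm:params:oauth:grant-type:apikey",
        "apikey": api_key,  # placeholder; use your WatsonX API key
    })

body = build_iam_token_body("example-api-key")
```

The body is then POSTed to the configured Access Token Endpoint with a `Content-Type: application/x-www-form-urlencoded` header; the JSON response contains the `access_token` value.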
Ollama settings

| Field | Description |
| --- | --- |
| AI Provider Endpoint | Enter the Ollama service endpoint. |
| Model Name | Choose a model name. The values in the Model Name field are retrieved automatically based on the Ollama endpoint details that you provided. |
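For Ollama, no key is needed; the model list comes from the server itself. The sketch below parses the shape of an Ollama `/api/tags` response to extract model names, which is the kind of data the Model Name field is filled from. The sample payload is illustrative, not captured from a live server.

```python
import json

def model_names(tags_json):
    """Extract model names from an Ollama /api/tags response body."""
    return [m["name"] for m in json.loads(tags_json).get("models", [])]

# Illustrative payload shaped like a GET <endpoint>/api/tags response
sample = '{"models": [{"name": "llama3:8b"}, {"name": "mistral:7b"}]}'
names = model_names(sample)
```

Fetching `GET <endpoint>/api/tags` and receiving a non-empty `models` array confirms that the configured Ollama endpoint is reachable and has models available to select.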
- Save your changes.