AWS Bedrock Profile

Use the AWS Bedrock Profile to configure and manage the connection between HCL UnO Agentic AI Builder and the foundation models hosted on Amazon Bedrock.

Before you begin

  • You must have valid Model Access granted within your AWS Bedrock console for the specific models you intend to use.

  • You must have already created a valid AWS Credential in the Credential Library to authenticate this connection.

  • Ensure that all mandatory fields (marked with *) are completed accurately; model inference does not work otherwise.
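Before creating the profile, you can confirm outside the Builder that your AWS credentials can see the models you intend to use. The following is a minimal sketch using boto3, the AWS SDK for Python (the region value is an example; running the check requires AWS credentials configured in your environment):

```python
def model_ids(response):
    """Extract model identifiers from a list_foundation_models response."""
    return [m["modelId"] for m in response.get("modelSummaries", [])]

if __name__ == "__main__":
    import boto3  # AWS SDK for Python

    # Example region; use the AWS region where your Bedrock resources are hosted.
    bedrock = boto3.client("bedrock", region_name="us-east-1")
    print(model_ids(bedrock.list_foundation_models()))
```

Any model you plan to reference in the profile should appear in the printed list; if it does not, grant Model Access for it in the Amazon Bedrock console first.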

Table 1. Mandatory fields

Name

A unique identifier for this configuration instance. This name is used within the Agentic AI Builder to reference this specific model setup.

Credentials

The authentication credential (previously created in the Credential Library) used to authorize the connection to AWS.

Table 2. Optional fields

Model Name

The exact technical identifier of the Large Language Model (LLM) or Small Language Model (SLM) to use (for example, anthropic.claude-v2 or amazon.titan-text-express-v1).

Note: If Discover Models is checked in the corresponding Credentials account, the available models populate a dropdown menu for selection in the Model Name field.

Max Tokens

The maximum number of tokens the model is allowed to generate in the output.

Temperature

The sampling temperature. Higher values make the model take more risks (more creative output); lower values produce more deterministic output.

Top P

The nucleus sampling parameter. Controls diversity by limiting the token pool to the top P probability mass.

Region Name

The AWS region where your Bedrock resources are hosted (for example, us-east-1 or us-west-2).

Endpoint Url

A custom endpoint URL for AWS Bedrock, if required. Leave blank to use the default public AWS endpoint.
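The optional fields map directly onto the parameters of a Bedrock runtime call. The following sketch shows that mapping with boto3; the model ID, region, prompt, and parameter values are example assumptions, and the request-body shape shown applies to Anthropic Claude text-completion models (other model families use different body schemas):

```python
import json

def build_claude_body(prompt, max_tokens=256, temperature=0.5, top_p=0.9):
    """Build a request body for an Anthropic Claude text-completion model,
    mirroring the Max Tokens, Temperature, and Top P profile fields."""
    return json.dumps({
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": max_tokens,  # Max Tokens
        "temperature": temperature,          # Temperature
        "top_p": top_p,                      # Top P
    })

if __name__ == "__main__":
    import boto3  # AWS SDK for Python; requires configured AWS credentials

    runtime = boto3.client(
        "bedrock-runtime",
        region_name="us-east-1",        # Region Name
        # endpoint_url="...",           # Endpoint Url, only if a custom one is required
    )
    response = runtime.invoke_model(
        modelId="anthropic.claude-v2",  # Model Name
        body=build_claude_body("Hello"),
        contentType="application/json",
        accept="application/json",
    )
    print(json.loads(response["body"].read())["completion"])
```

Leaving `endpoint_url` unset uses the default public AWS endpoint for the region, matching the behavior of a blank Endpoint Url field.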