Model Parameter Values by LLM

Note:

The content in this help topic pertains to SuiteScript 2.1.

The N/llm module supports large language models (LLMs) that are provided in Oracle Cloud Infrastructure (OCI).

When calling certain N/llm methods, you can provide model parameter values to customize how you want the model to generate its response. For example, you can use the temperature model parameter to adjust the creativity of the response. Higher temperature values generally increase creativity by adding randomness and diversity to the response.
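For context, here is a minimal sketch of the options object you might pass to llm.generateText in a SuiteScript 2.1 script. The parameter names match the table below; the prompt text and the specific values chosen are placeholders for illustration.

```javascript
// Sketch only: an options object for llm.generateText, with every
// model parameter set explicitly. Values shown are examples, not defaults.
const options = {
    prompt: "Summarize the customer's latest support case.",
    modelParameters: {
        maxTokens: 1000,     // cap the length of the generated response
        temperature: 0.7,    // higher value => more creative, more random output
        topK: 50,
        topP: 0.7,
        frequencyPenalty: 0,
        presencePenalty: 0
    }
};
// In a NetSuite server script: const response = llm.generateText(options);
console.log(options.modelParameters.temperature); // 0.7
```

Any parameter you omit from modelParameters falls back to the default value listed in the table below.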

Several N/llm methods accept these parameters through the modelParameters option; for example, llm.generateText(options).

The following table lists the accepted range for each model parameter, as well as the default value used when you don't specify one explicitly. For more information about the model parameters, refer to Pretrained Foundational Models in Generative AI in the Oracle Cloud Infrastructure Documentation.

maxTokens: Number between 1 and 4,000, with a default of 2,000. A word averages about 3 tokens (roughly 4 characters per token).

frequencyPenalty: Number between 0 and 1, with a default of 0.

presencePenalty: Number between 0 and 1, with a default of 0.

prompt: For Cohere Command R and Command R+, the input token limit is 128,000. For Cohere Command A, the input token limit is 256,000.

temperature: Number between 0 and 1, with a default of 0.2.

topK: Number between 0 and 500, with a default of 500.

topP: Number between 0 and 1, with a default of 0.7.
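The ranges and defaults above can be checked before calling the module, so an out-of-range value fails fast in your script rather than at the service. The helper below is hypothetical (it is not part of the N/llm API); it fills in the documented defaults and throws a RangeError for values outside the accepted ranges.

```javascript
// Hypothetical helper (not part of N/llm): applies the documented defaults
// and range-checks each model parameter per the table above.
const MODEL_PARAM_SPECS = {
    maxTokens:        { min: 1, max: 4000, default: 2000 },
    frequencyPenalty: { min: 0, max: 1,    default: 0 },
    presencePenalty:  { min: 0, max: 1,    default: 0 },
    temperature:      { min: 0, max: 1,    default: 0.2 },
    topK:             { min: 0, max: 500,  default: 500 },
    topP:             { min: 0, max: 1,    default: 0.7 }
};

function normalizeModelParameters(params = {}) {
    const result = {};
    for (const [name, spec] of Object.entries(MODEL_PARAM_SPECS)) {
        // Fall back to the documented default when the caller omits a value.
        const value = params[name] ?? spec.default;
        if (typeof value !== 'number' || value < spec.min || value > spec.max) {
            throw new RangeError(
                name + ' must be a number between ' + spec.min +
                ' and ' + spec.max + '; got ' + value
            );
        }
        result[name] = value;
    }
    return result;
}

// Example: only temperature is overridden; the rest fall back to defaults.
const normalized = normalizeModelParameters({ temperature: 0.9 });
console.log(normalized.temperature); // 0.9
console.log(normalized.maxTokens);   // 2000
```

The normalized object can then be passed as the modelParameters option of the N/llm methods that accept it.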
