Model Parameter Values by LLM

The following table shows the model parameter values for the supported large language model families. These parameters are used by the llm.generateText(options) and llm.generateText.promise(options) methods. For more information about the model parameters, refer to the Chat Model Parameters topic in the Oracle Cloud Infrastructure Documentation.

| Parameter | Cohere Command R and Command R+ Values | Meta Llama Values |
| --- | --- | --- |
| maxTokens | Number between 1 and 4,000 with a default of 2,000. Average tokens per word is 3 (or about 4 characters per token). | Number between 1 and 8,192 with a default of 2,000. Average tokens per word is 3 (or about 4 characters per token). |
| frequencyPenalty | Number between 0 and 1 with a default of 0. Must be 0 if presencePenalty is greater than 0. | Number between -2 and 2, with a default of 0 (frequencyPenalty not used). Positive numbers encourage the model to use new tokens and negative numbers encourage it to repeat tokens. |
| presencePenalty | Number between 0 and 1 with a default of 0. Must be 0 if frequencyPenalty is greater than 0. | Number between -2 and 2, with a default of 0 (presencePenalty not used). |
| prompt | Input token limit is 128,000. | Split between input and output tokens, with a combined limit of 128,000 tokens. |
| temperature | Number between 0 and 1 with a default of 0.2. | Number between 0 and 2 with a default of 0.2. |
| topK | Number between 0 and 500 with a default of 500. | Either -1 (topK not used) or a number between 1 and 500, with a default of 500. |
| topP | Number between 0 and 1 with a default of 0.7. | Number between 0 and 1 with a default of 0.7. |
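Because out-of-range values cause llm.generateText(options) to fail at runtime, it can help to check modelParameters against the table before calling the method. The following is a minimal sketch for the Cohere Command R and Command R+ column; validateCohereParams is a hypothetical helper for illustration, not part of the N/llm module.

```javascript
// Hypothetical helper: checks a modelParameters object against the
// Cohere Command R / Command R+ ranges documented above.
// Not part of the N/llm API; adapt to your script's needs.
function validateCohereParams(p) {
  var errors = [];
  function inRange(name, value, min, max) {
    if (value !== undefined && (value < min || value > max)) {
      errors.push(name + ' must be between ' + min + ' and ' + max);
    }
  }
  inRange('maxTokens', p.maxTokens, 1, 4000);
  inRange('temperature', p.temperature, 0, 1);
  inRange('frequencyPenalty', p.frequencyPenalty, 0, 1);
  inRange('presencePenalty', p.presencePenalty, 0, 1);
  inRange('topK', p.topK, 0, 500);
  inRange('topP', p.topP, 0, 1);
  // Per the table, the two penalties cannot both be nonzero.
  if (p.frequencyPenalty > 0 && p.presencePenalty > 0) {
    errors.push('frequencyPenalty must be 0 when presencePenalty is used');
  }
  return errors;
}

// Example: one in-range and one out-of-range parameter set.
var ok = validateCohereParams({ maxTokens: 2000, temperature: 0.2, topP: 0.7 });
var bad = validateCohereParams({ maxTokens: 9000, temperature: 1.5 });
console.log(ok.length);   // 0
console.log(bad.length);  // 2
```

A script could run this check before passing modelParameters to llm.generateText(options) and log or surface the returned messages instead of letting the call error out.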
