Model Parameter Values by LLM
The following table shows the model parameter values for the supported large language model families. These parameters are used by the llm.generateText(options) and llm.generateText.promise(options) methods. For more information about the model parameters, refer to the Chat Model Parameters topic in the Oracle Cloud Infrastructure Documentation.
Parameter | Cohere Command R and Command R+ Values | Meta Llama Values
---|---|---
maxTokens | Number between 1 and 4,000 with a default of 2,000. Average tokens per word is 3 (or about 4 characters per token). | Number between 1 and 8,192 with a default of 2,000. Average tokens per word is 3 (or about 4 characters per token).
frequencyPenalty | Number between 0 and 1 with a default of 0. Must be 0 if presencePenalty is greater than 0. | Number between -2 and 2, with a default of 0 (0 disables the penalty). Positive numbers encourage the model to use new tokens and negative numbers encourage the model to repeat tokens.
presencePenalty | Number between 0 and 1 with a default of 0. Must be 0 if frequencyPenalty is greater than 0. | Number between -2 and 2, with a default of 0 (0 disables the penalty).
prompt (token limit) | Input token limit is 128,000. | Split between input and output tokens. Limit of 128,000 tokens for both combined.
temperature | Number between 0 and 1 with a default of 0.2. | Number between 0 and 2 with a default of 0.2.
topK | Number between 0 and 500 with a default of 500. | Number that is either -1 (consider all tokens) or between 1 and 500, with a default of -1.
topP | Number between 0 and 1 with a default of 0.7. | Number between 0 and 1 with a default of 0.7.
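
These values are passed in the modelParameters object of llm.generateText(options). The following SuiteScript 2.1 sketch shows one way to do this, assuming a scheduled script entry point; the prompt text, the llm.ModelFamily.COHERE_COMMAND_R enum value, and the specific parameter values are illustrative assumptions rather than requirements.

```javascript
/**
 * Minimal sketch: calling llm.generateText with explicit model parameters.
 * The values below stay inside the Cohere Command R ranges documented in
 * the table above.
 *
 * @NApiVersion 2.1
 * @NScriptType ScheduledScript
 */
define(['N/llm'], (llm) => {
    const execute = () => {
        const response = llm.generateText({
            // Hypothetical prompt, for illustration only.
            prompt: 'Summarize the benefits of cloud-based ERP in two sentences.',
            // Assumption: targeting Cohere Command R. Use the Meta Llama ranges
            // from the table above if you target that model family instead.
            modelFamily: llm.ModelFamily.COHERE_COMMAND_R,
            modelParameters: {
                maxTokens: 1000,       // 1 to 4,000 for Cohere (default 2,000)
                temperature: 0.2,      // 0 to 1 for Cohere (default 0.2)
                topK: 3,               // 0 to 500 for Cohere (default 500)
                topP: 0.7,             // 0 to 1 (default 0.7)
                frequencyPenalty: 0.1, // 0 to 1 for Cohere (default 0)
                presencePenalty: 0     // kept at 0 because frequencyPenalty is set
            }
        });

        // Log the generated text; log is the standard SuiteScript 2.x global.
        log.debug({ title: 'LLM response', details: response.text });
    };

    return { execute };
});
```

If you use llm.generateText.promise(options) instead, the same modelParameters object applies; the call simply returns a promise that resolves to the response.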