llm.generateText(options)

Note:

The content in this help topic pertains to SuiteScript 2.1.

Method Description

Returns the response from the LLM for a given prompt.

Returns

llm.Response

Supported Script Types

Server scripts

For more information, see SuiteScript 2.x Script Types.

Governance

100

Module

N/llm Module

Since

2024.1

Parameters

Parameter

Type

Required / Optional

Description

Since

options.prompt

string

required

Prompt for the LLM.

2024.1

options.chatHistory

llm.ChatMessage[]

optional

Chat history to be taken into consideration. A usage sketch follows this parameters table.

2024.1

options.modelFamily

enum

optional

Specifies the LLM to use. Use llm.ModelFamily to set the value. If not specified, the Cohere Command R LLM is used.

Note:

JavaScript does not include an enumeration type. The SuiteScript 2.x documentation uses the term enumeration (or enum) to describe a plain JavaScript object with a flat, map-like structure. In this object, each key points to a read-only string value.

2024.2

options.modelParameters

Object

optional

Parameters of the model. For more information about the model parameters, refer to the Chat Model Parameters topic in the Oracle Cloud Infrastructure Documentation.

2024.1

options.modelParameters.frequencyPenalty

number

optional

A penalty that is assigned to a token when that token appears frequently. The higher the value, the stronger a penalty is applied to previously present tokens, proportional to how many times they have already appeared in the prompt or prior generation. See Model Parameter Values by LLM for valid values.

2024.1

options.modelParameters.maxTokens

number

optional

The maximum number of tokens the LLM is allowed to generate. The average number of tokens per word is 3. See Model Parameter Values by LLM for valid values.

2024.1

options.modelParameters.presencePenalty

number

optional

A penalty that is assigned to each token when it appears in the output to encourage generating outputs with tokens that haven't been used. Similar to frequencyPenalty, except that this penalty is applied equally to all tokens that have already appeared, regardless of their exact frequencies. See Model Parameter Values by LLM for valid values.

2024.1

options.modelParameters.temperature

number

optional

Defines a range of randomness for the response. A lower temperature will lean toward the highest probability tokens and expected answers, while a higher temperature will deviate toward random and unconventional responses. A lower value works best for responses that must be more factual or accurate, and a higher value works best for getting more creative responses. See Model Parameter Values by LLM for valid values.

2024.1

options.modelParameters.topK

number

optional

Determines how many tokens are considered for generation at each step. See Model Parameter Values by LLM for valid values.

2024.1

options.modelParameters.topP

number

optional

Sets the probability threshold p so that only the most likely tokens with a total probability mass of p are considered for generation at each step. If both topK and topP are set, topP acts after topK. See Model Parameter Values by LLM for valid values.

2024.1

options.ociConfig

Object

optional

Configuration needed for unlimited usage through OCI Generative AI Service. Required only when accessing the LLM through an Oracle Cloud Account and the OCI Generative AI Service. SuiteApps installed to target accounts are prevented from using the free usage pool for N/llm and must use the OCI configuration.

2024.1

options.ociConfig.compartmentId

string

optional

Compartment OCID. For more information, refer to Managing Compartments in the Oracle Cloud Infrastructure Documentation.

2024.1

options.ociConfig.endpointId

string

optional

Endpoint ID. This value is needed only when a custom OCI DAC (dedicated AI cluster) is to be used. For more information, refer to Managing an Endpoint in Generative AI in the Oracle Cloud Infrastructure Documentation.

2024.1

options.ociConfig.fingerprint

string

optional

Fingerprint of the public key. Only a NetSuite secret is accepted (see Creating Secrets). For more information, refer to Required Keys and OCIDs in the Oracle Cloud Infrastructure Documentation.

2024.1

options.ociConfig.privateKey

string

optional

Private key of the OCI user. Only a NetSuite secret is accepted (see Creating Secrets). For more information, refer to Required Keys and OCIDs in the Oracle Cloud Infrastructure Documentation.

2024.1

options.ociConfig.tenancyId

string

optional

Tenancy OCID. For more information, refer to Managing the Tenancy in the Oracle Cloud Infrastructure Documentation.

2024.1

options.ociConfig.userId

string

optional

User OCID. For more information, refer to Managing Users in the Oracle Cloud Infrastructure Documentation.

2024.1

options.preamble

string

optional

Preamble override for the LLM. A preamble is the initial context or guiding message for an LLM. For more details about using a preamble, refer to About the Chat Models in Generative AI (Chat Model Parameters section) in the Oracle Cloud Infrastructure Documentation.

Note:

Only valid for the Cohere Command R model.

2024.1

options.timeout

number

optional

Timeout in milliseconds. Defaults to 30,000.

2024.1
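
The options.chatHistory parameter accepts an array of llm.ChatMessage objects so the model can take earlier turns of a conversation into account. The sketch below is illustrative only: it assumes that chat messages can be supplied as plain objects with role and text properties, uses the llm.ChatRole enum values USER and CHATBOT, and invents the prompts for the example.

// A minimal sketch of a follow-up call that passes earlier turns as chat history.
// The prompts are illustrative; llm.ChatRole identifies who produced each message.
const firstResponse = llm.generateText({
   prompt: 'Summarize the benefits of our premium support plan.'
});

const followUp = llm.generateText({
   prompt: 'Now shorten that summary to one sentence.',
   chatHistory: [
      { role: llm.ChatRole.USER, text: 'Summarize the benefits of our premium support plan.' },
      { role: llm.ChatRole.CHATBOT, text: firstResponse.text }
   ]
});

Because each call to llm.generateText consumes 100 units of governance, a two-turn exchange like this one uses 200 units.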

Errors

Error Code

Thrown If

SSS_MISSING_REQD_ARGUMENT

The options.prompt parameter is missing.

MUTUALLY_EXCLUSIVE_ARGUMENTS

Both options.modelParameters.presencePenalty and options.modelParameters.frequencyPenalty parameters are used when options.modelFamily is set to COHERE_COMMAND_R. You can use a value greater than zero for one or the other, but not both.

UNRECOGNIZED_MODEL_PARAMETERS

One or more unrecognized model parameters have been used.

UNRECOGNIZED_OCI_CONFIG_PARAMETERS

One or more unrecognized parameters for OCI configuration have been used.

ONLY_API_SECRET_IS_ACCEPTED

The options.ociConfig.privateKey or options.ociConfig.fingerprint parameter is not a NetSuite API secret.

INVALID_MODEL_FAMILY_VALUE

The options.modelFamily parameter was not set to a valid value. (For example, there may have been a typo in the value or in the enum.)

MODEL_1_DOES_NOT_ACCEPT_PREAMBLE

A parameter was used with a model that does not support the parameter. (For example, this error would be returned if the options.preamble parameter is used when the model is Meta Llama.)

INVALID_MAX_TOKENS_VALUE

The options.modelParameters.maxTokens parameter value is incorrect for the model. See Model Parameter Values by LLM for valid values.

INVALID_TEMPERATURE_VALUE

The options.modelParameters.temperature parameter value is incorrect for the model. See Model Parameter Values by LLM for valid values.

INVALID_TOP_K_VALUE

The options.modelParameters.topK parameter value is incorrect for the model. See Model Parameter Values by LLM for valid values.

INVALID_TOP_P_VALUE

The options.modelParameters.topP parameter value is incorrect for the model. See Model Parameter Values by LLM for valid values.

INVALID_FREQUENCY_PENALTY_VALUE

The options.modelParameters.frequencyPenalty parameter value is incorrect for the model. See Model Parameter Values by LLM for valid values.

INVALID_PRESENCE_PENALTY_VALUE

The options.modelParameters.presencePenalty parameter value is incorrect for the model. See Model Parameter Values by LLM for valid values.

MAXIMUM_PARALLEL_REQUESTS_LIMIT_EXCEEDED

The number of parallel requests to the LLM is greater than 5.
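
Each of these error codes is thrown as a SuiteScript error, so it can be handled with a standard try...catch around the call. The sketch below is illustrative only: it deliberately passes a temperature value assumed to be outside the valid range for the model, and it uses the log object available to server scripts.

// Illustrative only: catching a model parameter validation error.
try {
   const response = llm.generateText({
      prompt: 'Hello World!',
      modelParameters: {
         temperature: 42 // assumed to be outside the valid range for the model
      }
   });
   log.debug('LLM response', response.text);
} catch (e) {
   // e.name carries the error code, for example INVALID_TEMPERATURE_VALUE.
   log.error(e.name, e.message);
}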

Syntax

Important:

The following code sample shows the syntax for this member. It is not a functional example. For a complete script example, see N/llm Module Script Samples.

// Add additional code
...

const response = llm.generateText({
   // preamble is optional for Cohere and must not be used for Meta Llama
   preamble: "You are a successful salesperson. Answer in an enthusiastic, professional tone.", 
   prompt: "Hello World!",
   modelFamily: llm.ModelFamily.COHERE_COMMAND_R, // uses COHERE_COMMAND_R when modelFamily is omitted
   modelParameters: {
      maxTokens: 1000,
      temperature: 0.2,
      topK: 3,
      topP: 0.7,
      frequencyPenalty: 0.4,
      presencePenalty: 0
   },
   ociConfig: {
      // Replace ociConfig values with your Oracle Cloud Account values
      userId: 'ocid1.user.oc1..aaaaaaaanld….exampleuserid',
      tenancyId: 'ocid1.tenancy.oc1..aaaaaaaabt….exampletenancyid',
      compartmentId: 'ocid1.compartment.oc1..aaaaaaaaph….examplecompartmentid',
      // Replace fingerprint and privateKey with your NetSuite API secret ID values
      fingerprint: 'custsecret_oci_fingerprint',
      privateKey: 'custsecret_oci_private_key'
   }
});

...
// Add additional code

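For orientation, the call shown in the Syntax sample can be wrapped in any server script entry point. The sketch below is an assumption-laden illustration rather than part of this topic: the Suitelet script type, the prompt text, and the response handling are invented for the example. For complete, supported examples, see N/llm Module Script Samples.

/**
 * @NApiVersion 2.1
 * @NScriptType Suitelet
 */
define(['N/llm'], (llm) => {
   // Hypothetical entry point; the prompt and response handling are illustrative only.
   const onRequest = (context) => {
      const result = llm.generateText({
         prompt: 'Write a one-sentence greeting for a new customer.'
      });
      // llm.Response exposes the generated text on its text property.
      context.response.write(result.text);
   };
   return { onRequest };
});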