N/llm Module

Note:

The content in this help topic pertains to SuiteScript 2.1.

The N/llm module supports generative artificial intelligence (AI) capabilities in SuiteScript. You can use this module to send requests to the large language models (LLMs) supported by NetSuite and to receive LLM responses to use in your scripts.
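For example, the following Suitelet is a minimal sketch of sending a single prompt to the default NetSuite LLM and writing the generated text back to the response. The prompt option name and the prompt text are illustrative assumptions about a typical llm.generateText(options) call; they are not taken from the tables below.

/**
 * @NApiVersion 2.1
 * @NScriptType Suitelet
 */
define(['N/llm'], (llm) => {
    const onRequest = (context) => {
        // Send one prompt to the default model and return the generated text.
        // The 'prompt' option name is an assumption for illustration.
        const result = llm.generateText({
            prompt: 'Summarize the benefits of using SuiteScript in two sentences.'
        });
        context.response.write(result.text);
    };
    return { onRequest };
});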

If you are new to using generative AI in SuiteScript, see the help topic SuiteScript 2.x Generative AI APIs. It contains essential information about this feature.

Important:

SuiteScript Generative AI APIs (N/llm module) are available only for accounts located in certain regions. For a list of these regions, see Generative AI Availability in NetSuite.

In This Help Topic

  • N/llm Module Members
  • ChatMessage Object Members
  • Response Object Members

N/llm Module Members

Member Type | Name | Return Type / Value Type | Supported Script Types | Description
Object | llm.ChatMessage | Object | Server scripts | The chat message.
Object | llm.Response | Object | Server scripts | The response from the LLM.
Method | llm.createChatMessage(options) | Object | Server scripts | Creates a chat message based on a specified role and text.
Method | llm.generateText(options) | Object | Server scripts | Takes a prompt and parameters for the LLM and returns the response from the LLM. When unlimited usage mode is used, it also accepts the OCI configuration parameters.
Method | llm.evaluatePrompt(options) | Object | Server scripts | Takes the ID of an existing prompt and values for variables used in the prompt and returns the response from the LLM. When unlimited usage mode is used, it also accepts the OCI configuration parameters.
Method | llm.evaluatePrompt.promise(options) | void | Server scripts | Takes the ID of an existing prompt and values for variables used in the prompt and asynchronously returns the response from the LLM. When unlimited usage mode is used, it also accepts the OCI configuration parameters.
Method | llm.generateText.promise(options) | void | Server scripts | Takes a prompt and parameters for the LLM and asynchronously returns the response from the LLM. When unlimited usage mode is used, it also accepts the OCI configuration parameters.
Method | llm.getRemainingFreeUsage() | number | Server scripts | Returns the number of free requests remaining in the current month.
Method | llm.getRemainingFreeUsage.promise() | void | Server scripts | Asynchronously returns the number of free requests remaining in the current month.
Enum | llm.ChatRole | enum | Server scripts | Holds the string values for the author (role) of a chat message. Use this enum to set the value of the ChatMessage.role property.
Enum | llm.ModelFamily | enum | Server scripts | Holds the string values for the large language model to be used. Use this enum to set the model parameter in llm.generateText(options).
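The following scheduled script sketch checks the remaining free usage before calling llm.evaluatePrompt(options). The prompt ID, the variable name, and the id and variables option names are hypothetical; they only illustrate the pattern of evaluating an existing prompt with values for its variables, as described in the table above.

/**
 * @NApiVersion 2.1
 * @NScriptType ScheduledScript
 */
define(['N/llm'], (llm) => {
    const execute = () => {
        // Skip the call if no free requests remain this month
        // (unlimited usage mode is not configured in this sketch).
        if (llm.getRemainingFreeUsage() === 0) {
            log.audit('N/llm', 'No free requests remaining this month.');
            return;
        }
        // Evaluate an existing prompt. The prompt ID, the variable name, and the
        // 'id'/'variables' option names are hypothetical.
        const response = llm.evaluatePrompt({
            id: 'custprompt_order_summary',
            variables: {
                customerName: 'Acme Corporation'
            }
        });
        log.audit('LLM response', response.text);
    };
    return { execute };
});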

ChatMessage Object Members

Member Type | Name | Return Type / Value Type | Supported Script Types | Description
Property | ChatMessage.role | string | Server scripts | The author (role) of the chat message. Use the llm.ChatRole enum to set the value.
Property | ChatMessage.text | string | Server scripts | Text of the chat message. Can be either the prompt sent by the script or the response returned by the LLM.
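The sketch below shows how ChatMessage objects might be created with llm.createChatMessage(options) and passed as prior conversation turns on a follow-up request. The role and text options follow the table above; the chatHistory option name and the ChatRole.USER and ChatRole.CHATBOT member names are assumptions made for illustration.

/**
 * @NApiVersion 2.1
 * @NScriptType ScheduledScript
 */
define(['N/llm'], (llm) => {
    const execute = () => {
        // Reconstruct earlier turns of a conversation as ChatMessage objects.
        // The ChatRole member names (USER, CHATBOT) and the 'chatHistory' option are assumptions.
        const history = [
            llm.createChatMessage({
                role: llm.ChatRole.USER,
                text: 'What is a saved search?'
            }),
            llm.createChatMessage({
                role: llm.ChatRole.CHATBOT,
                text: 'A saved search is a reusable search definition in NetSuite.'
            })
        ];
        // Send a follow-up prompt along with the prior conversation.
        const response = llm.generateText({
            prompt: 'Can a saved search be scheduled to send email?',
            chatHistory: history
        });
        log.debug('Follow-up answer', response.text);
    };
    return { execute };
});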

Response Object Members

Member Type | Name | Return Type / Value Type | Supported Script Types | Description
Property | Response.chatHistory | llm.ChatMessage[] | Server scripts | List of chat messages.
Property | Response.model | string | Server scripts | Model used to produce the LLM response.
Property | Response.text | string | Server scripts | Text returned by the LLM.
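As a sketch of working with the Response object, the following scheduled script calls llm.generateText.promise(options) and logs the model, text, and chatHistory properties when the promise resolves. The prompt text is illustrative only, and the prompt option name is an assumption.

/**
 * @NApiVersion 2.1
 * @NScriptType ScheduledScript
 */
define(['N/llm'], (llm) => {
    const execute = () => {
        // Asynchronous variant: the promise resolves with an llm.Response object.
        return llm.generateText.promise({
            prompt: 'List three common uses of workflows in NetSuite.'
        }).then((response) => {
            log.audit('Model used', response.model);              // model that produced the response
            log.audit('Generated text', response.text);           // text returned by the LLM
            log.audit('Chat turns', response.chatHistory.length); // list of chat messages for the request
        }).catch((e) => {
            log.error('LLM request failed', e.message);
        });
    };
    return { execute };
});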
