N/llm Module

Note:

The content in this help topic pertains to SuiteScript 2.1.

The N/llm module supports generative artificial intelligence (AI) capabilities in SuiteScript. You can use this module to send requests to the large language models (LLMs) supported by NetSuite and receive responses to use in your scripts.

If you're new to using generative AI in SuiteScript, see SuiteScript 2.x Generative AI APIs. That topic contains essential information about this feature.

The member tables later in this help topic summarize the objects, methods, and enums that are available in the N/llm module.

Using OCI Credentials to Obtain Additional Usage

By default, NetSuite provides a free monthly usage pool of requests for the N/llm module. Successful calls to generate methods (such as llm.generateText(options) and llm.evaluatePrompt(options)) and embed methods (such as llm.embed(options)) consume usage from this pool, and the pool is refreshed each month. Usage is tracked separately for generate methods and embed methods. You can track your current monthly usage on the AI Preferences page in NetSuite. For more information, see View SuiteScript AI Usage Limit and Usage.
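For example, a script can check the remaining free usage before sending a request. The following is a minimal sketch (the Suitelet script type and the prompt text are illustrative):

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType Suitelet
 */
define(['N/llm'], (llm) => {
    const onRequest = (context) => {
        // Check how many free generate requests remain this month before
        // consuming usage from the pool.
        const remaining = llm.getRemainingFreeUsage();
        if (remaining > 0) {
            const response = llm.generateText({
                prompt: 'Summarize the benefits of cloud ERP in two sentences.'
            });
            context.response.write(response.text);
        } else {
            context.response.write('The free N/llm usage pool for this month is exhausted.');
        }
    };
    return { onRequest };
});
```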

Each SuiteApp installed in your account gets its own monthly usage pool for N/llm methods, independent of the pool used by your regular (non-SuiteApp) scripts. For example, if you install two SuiteApps whose scripts use N/llm methods, each SuiteApp draws from its own pool, so together they receive twice the usage that a single SuiteApp would. Scripts outside of SuiteApps draw from a separate pool, and SuiteApp usage doesn't count against it. This design ensures that SuiteApps can't use up your monthly allocation and block your own scripts from calling N/llm methods.

If you want more monthly usage, you can provide the Oracle Cloud Infrastructure (OCI) credentials for an Oracle Cloud account that includes the OCI Generative AI service. When you provide these credentials, usage is drawn from the provided OCI account instead of the free usage pool. For more information, see Using Your Own OCI Configuration for SuiteScript Generative AI APIs.
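The following sketch shows where the credentials fit in a call to llm.generateText(options). The ociConfig field names follow the OCI configuration options described in that topic, and every value shown is a placeholder; the fingerprint and private key must be provided as API secrets:

```javascript
require(['N/llm'], (llm) => {
    // Sketch of unlimited usage mode: usage is drawn from the OCI account
    // identified by these credentials instead of the free pool.
    const response = llm.generateText({
        prompt: 'Explain the three-way match in accounts payable.',
        ociConfig: {
            tenancyId: 'ocid1.tenancy.oc1..example',         // placeholder OCID
            userId: 'ocid1.user.oc1..example',               // placeholder OCID
            compartmentId: 'ocid1.compartment.oc1..example', // placeholder OCID
            fingerprint: 'custsecret_oci_fingerprint',       // placeholder API secret
            privateKey: 'custsecret_oci_private_key'         // placeholder API secret
        }
    });
    log.debug('LLM response', response.text);
});
```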

Important:

As you work with this module, keep the following considerations in mind:

  • Generative AI features, such as the N/llm module, use creativity in their responses. Make sure you validate the AI-generated responses for accuracy and quality. Oracle NetSuite isn't responsible or liable for the use or interpretation of AI-generated content.

  • SuiteScript Generative AI APIs (N/llm module) are available only for accounts located in certain regions. For a list of these regions, see Generative AI Availability in NetSuite.


N/llm Module Members

| Member Type | Name | Return Type / Value Type | Supported Script Types | Description |
| --- | --- | --- | --- | --- |
| Object | llm.ChatMessage | Object | Server scripts | The chat message. |
| Object | llm.Citation | Object | Server scripts | A citation returned from the LLM. |
| Object | llm.Document | Object | Server scripts | A document to be used as source content when calling the LLM. |
| Object | llm.EmbedResponse | Object | Server scripts | The embeddings response returned from the LLM. |
| Object | llm.Response | Object | Server scripts | The response returned from the LLM. |
| Object | llm.StreamedResponse | Object | Server scripts | The streamed response returned from the LLM. |
| Method | llm.createChatMessage(options) | Object | Server scripts | Creates a chat message based on a specified role and text. |
| Method | llm.createDocument(options) | Object | Server scripts | Creates a document to be used as source content when calling the LLM. |
| Method | llm.embed(options) | Object | Server scripts | Returns the embeddings from the LLM for a given input. |
| Method | llm.embed.promise(options) | void | Server scripts | Asynchronously returns the embeddings from the LLM for a given input. |
| Method | llm.evaluatePrompt(options) | Object | Server scripts | Takes the ID of an existing prompt and values for the variables used in the prompt, and returns the response from the LLM. When you're using unlimited usage mode, this method also accepts the OCI configuration parameters. |
| Method | llm.evaluatePrompt.promise(options) | void | Server scripts | Takes the ID of an existing prompt and values for the variables used in the prompt, and asynchronously returns the response from the LLM. When you're using unlimited usage mode, this method also accepts the OCI configuration parameters. |
| Method | llm.evaluatePromptStreamed(options) | Object | Server scripts | Takes the ID of an existing prompt and values for the variables used in the prompt, and returns the streamed response from the LLM. When you're using unlimited usage mode, this method also accepts the OCI configuration parameters. |
| Method | llm.evaluatePromptStreamed.promise(options) | void | Server scripts | Takes the ID of an existing prompt and values for the variables used in the prompt, and asynchronously returns the streamed response from the LLM. When you're using unlimited usage mode, this method also accepts the OCI configuration parameters. |
| Method | llm.generateText(options) | Object | Server scripts | Takes a prompt and parameters for the LLM and returns the response from the LLM. When you're using unlimited usage mode, this method also accepts the OCI configuration parameters. |
| Method | llm.generateText.promise(options) | void | Server scripts | Takes a prompt and parameters for the LLM and asynchronously returns the response from the LLM. When you're using unlimited usage mode, this method also accepts the OCI configuration parameters. |
| Method | llm.generateTextStreamed(options) | Object | Server scripts | Takes a prompt and parameters for the LLM and returns the streamed response from the LLM. When you're using unlimited usage mode, this method also accepts the OCI configuration parameters. |
| Method | llm.generateTextStreamed.promise(options) | void | Server scripts | Takes a prompt and parameters for the LLM and asynchronously returns the streamed response from the LLM. When you're using unlimited usage mode, this method also accepts the OCI configuration parameters. |
| Method | llm.getRemainingFreeUsage() | number | Server scripts | Returns the number of free requests remaining in the current month. |
| Method | llm.getRemainingFreeUsage.promise() | void | Server scripts | Asynchronously returns the number of free requests remaining in the current month. |
| Method | llm.getRemainingFreeEmbedUsage() | number | Server scripts | Returns the number of free embeddings requests remaining in the current month. |
| Method | llm.getRemainingFreeEmbedUsage.promise() | void | Server scripts | Asynchronously returns the number of free embeddings requests remaining in the current month. |
| Enum | llm.ChatRole | enum | Server scripts | Holds the string values for the author (role) of a chat message. Use this enum to set the value of the options.role parameter in llm.createChatMessage(options). |
| Enum | llm.EmbedModelFamily | enum | Server scripts | Holds the string values for the large language model used to generate embeddings. Use this enum to set the value of the options.embedModelFamily parameter in llm.embed(options). |
| Enum | llm.ModelFamily | enum | Server scripts | Holds the string values for the large language model to be used. Use this enum to set the value of the options.model parameter in llm.generateText(options). |
| Enum | llm.Truncate | enum | Server scripts | Holds the string values for the truncation method to use when the embeddings input exceeds 512 tokens. Use this enum to set the value of the options.truncate parameter in llm.embed(options). |
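For example, the following debugger-style sketch evaluates an existing prompt by its ID. The prompt ID and variable name are hypothetical and must match a prompt defined in your account:

```javascript
require(['N/llm'], (llm) => {
    const response = llm.evaluatePrompt({
        id: 1,                        // hypothetical internal ID of an existing prompt
        variables: {
            customerName: 'Acme Corp' // hypothetical variable used by the prompt
        }
    });
    // response is an llm.Response object; see the Response Object Members table.
    log.debug('LLM response', response.text);
});
```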

ChatMessage Object Members

| Member Type | Name | Return Type / Value Type | Supported Script Types | Description |
| --- | --- | --- | --- | --- |
| Property | ChatMessage.role | string | Server scripts | The author (role) of the chat message. |
| Property | ChatMessage.text | string | Server scripts | Text of the chat message. This text can be either the prompt sent by the script or the response returned by the LLM. |
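Chat messages let you carry earlier exchanges into a new request. The following sketch builds a short chat history and passes it to llm.generateText(options) through the chatHistory option:

```javascript
require(['N/llm'], (llm) => {
    // Record an earlier exchange as chat messages so the model can use it
    // as context for the follow-up prompt.
    const userMessage = llm.createChatMessage({
        role: llm.ChatRole.USER,
        text: 'What is SuiteScript?'
    });
    const botMessage = llm.createChatMessage({
        role: llm.ChatRole.CHATBOT,
        text: 'SuiteScript is the NetSuite platform for scripting and customization.'
    });
    const response = llm.generateText({
        prompt: 'Can you give an example of a script type?',
        chatHistory: [userMessage, botMessage]
    });
    log.debug('Follow-up answer', response.text);
});
```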

Citation Object Members

| Member Type | Name | Return Type / Value Type | Supported Script Types | Description |
| --- | --- | --- | --- | --- |
| Property | Citation.documentIds | string[] | Server scripts | The IDs of the documents where the cited text is located. |
| Property | Citation.end | number | Server scripts | The ending position of the cited text. |
| Property | Citation.start | number | Server scripts | The starting position of the cited text. |
| Property | Citation.text | string | Server scripts | The cited text from the documents. |

Document Object Members

| Member Type | Name | Return Type / Value Type | Supported Script Types | Description |
| --- | --- | --- | --- | --- |
| Property | Document.data | string | Server scripts | The content of the document. |
| Property | Document.id | string | Server scripts | The ID of the document. |
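Documents and citations work together: you pass documents as source content, and the citations in the response point back to them. A sketch, with a hypothetical document ID and content:

```javascript
require(['N/llm'], (llm) => {
    // Provide a document as source content and inspect the citations the
    // LLM returns for the grounded parts of its answer.
    const doc = llm.createDocument({
        id: 'returnPolicy',  // hypothetical document ID
        data: 'Items may be returned within 30 days of purchase with a receipt.'
    });
    const response = llm.generateText({
        prompt: 'How long do customers have to return an item?',
        documents: [doc]
    });
    response.citations.forEach((citation) => {
        log.debug('Cited text', citation.text);
        log.debug('From documents', citation.documentIds.join(', '));
    });
});
```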

EmbedResponse Object Members

| Member Type | Name | Return Type / Value Type | Supported Script Types | Description |
| --- | --- | --- | --- | --- |
| Property | EmbedResponse.embeddings | number[] | Server scripts | The embeddings returned from the LLM. |
| Property | EmbedResponse.inputs | string[] | Server scripts | The list of inputs used to generate the embeddings response. |
| Property | EmbedResponse.model | string | Server scripts | The model used to generate the embeddings response. |
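The following sketch generates embeddings for a single input string. The inputs option name follows the llm.embed(options) reference, and the truncation value is illustrative:

```javascript
require(['N/llm'], (llm) => {
    // Generate an embedding vector for one input. See llm.EmbedModelFamily
    // and llm.Truncate for the supported enum values.
    const embedResponse = llm.embed({
        inputs: ['Suede armchair, green, mid-century'],
        truncate: llm.Truncate.END
    });
    log.debug('Model used', embedResponse.model);
    log.debug('Embeddings length', embedResponse.embeddings.length);
});
```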

Response Object Members

| Member Type | Name | Return Type / Value Type | Supported Script Types | Description |
| --- | --- | --- | --- | --- |
| Property | Response.chatHistory | llm.ChatMessage[] | Server scripts | List of chat messages. |
| Property | Response.citations | llm.Citation[] | Server scripts | List of citations used to generate the response. |
| Property | Response.documents | llm.Document[] | Server scripts | List of documents used to generate the response. |
| Property | Response.model | string | Server scripts | Model used to produce the LLM response. |
| Property | Response.text | string | Server scripts | Text returned by the LLM. |
| Property | Response.usage | llm.Usage | Server scripts | Token usage for a request to the LLM. |
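A sketch of reading the response members after a call to llm.generateText(options):

```javascript
require(['N/llm'], (llm) => {
    const response = llm.generateText({
        prompt: 'List two uses for saved searches.'
    });
    // The response bundles the generated text, the model that produced it,
    // and the prompt/response pair as chat messages.
    log.debug('Model', response.model);
    log.debug('Text', response.text);
    response.chatHistory.forEach((message) => {
        log.debug(message.role, message.text);
    });
});
```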

StreamedResponse Object Members

| Member Type | Name | Return Type / Value Type | Supported Script Types | Description |
| --- | --- | --- | --- | --- |
| Property | StreamedResponse.chatHistory | llm.ChatMessage[] | Server scripts | List of chat messages. |
| Property | StreamedResponse.citations | llm.Citation[] | Server scripts | List of citations used to generate the streamed response. |
| Property | StreamedResponse.documents | llm.Document[] | Server scripts | List of documents used to generate the streamed response. |
| Property | StreamedResponse.model | string | Server scripts | Model used to produce the streamed response. |
| Property | StreamedResponse.text | string | Server scripts | Text returned by the LLM. |
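A minimal sketch of requesting a streamed response. How the partial chunks are consumed as they arrive is covered in the llm.StreamedResponse reference, so this sketch reads only the properties listed above:

```javascript
require(['N/llm'], (llm) => {
    // Request a streamed response and read its documented properties.
    const streamed = llm.generateTextStreamed({
        prompt: 'Draft a short welcome email for a new customer.'
    });
    log.debug('Model', streamed.model);
    log.debug('Text received', streamed.text);
});
```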

Usage Object Members

| Member Type | Name | Return Type / Value Type | Supported Script Types | Description |
| --- | --- | --- | --- | --- |
| Property | Usage.completionTokens | number | Server scripts | The number of tokens in the response from the LLM. |
| Property | Usage.promptTokens | number | Server scripts | The number of tokens in the request to the LLM. |
| Property | Usage.totalTokens | number | Server scripts | The total number of tokens in the request and response (the sum of promptTokens and completionTokens). |
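A sketch of inspecting token usage through Response.usage:

```javascript
require(['N/llm'], (llm) => {
    const response = llm.generateText({
        prompt: 'Explain landed cost in one paragraph.'
    });
    // totalTokens is the sum of the prompt and completion token counts.
    const usage = response.usage;
    log.debug('Prompt tokens', usage.promptTokens);
    log.debug('Completion tokens', usage.completionTokens);
    log.debug('Total tokens', usage.totalTokens);
});
```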
