N/llm Module
The content in this help topic pertains to SuiteScript 2.1.
The N/llm module supports generative artificial intelligence (AI) capabilities in SuiteScript. You can use this module to send requests to the large language models (LLMs) supported by NetSuite and to receive LLM responses to use in your scripts.
If you are new to using generative AI in SuiteScript, see the help topic SuiteScript 2.x Generative AI APIs. It contains essential information about this feature.
SuiteScript Generative AI APIs (N/llm module) are available only for accounts located in certain regions. For a list of these regions, see Generative AI Availability in NetSuite.
N/llm Module Members
Member Type | Name | Return Type / Value Type | Supported Script Types | Description
---|---|---|---|---
Object | llm.ChatMessage | Object | Server scripts | The chat message.
Object | llm.Citation | Object | Server scripts | A citation returned from the LLM.
Object | llm.Document | Object | Server scripts | A document to be used as source content when calling the LLM.
Object | llm.Response | Object | Server scripts | The response from the LLM.
Method | llm.createChatMessage(options) | Object | Server scripts | Creates a chat message based on a specified role and text.
Method | llm.createDocument(options) | Object | Server scripts | Creates a document to be used as source content when calling the LLM.
Method | llm.evaluatePrompt(options) | Object | Server scripts | Takes the ID of an existing prompt and values for variables used in the prompt, and returns the response from the LLM. When unlimited usage mode is used, it also accepts the OCI configuration parameters.
Method | llm.evaluatePrompt.promise(options) | void | Server scripts | Takes the ID of an existing prompt and values for variables used in the prompt, and asynchronously returns the response from the LLM. When unlimited usage mode is used, it also accepts the OCI configuration parameters.
Method | llm.generateText(options) | Object | Server scripts | Takes a prompt and parameters for the LLM and returns the response from the LLM. When unlimited usage mode is used, it also accepts the OCI configuration parameters.
Method | llm.generateText.promise(options) | void | Server scripts | Takes a prompt and parameters for the LLM and asynchronously returns the response from the LLM. When unlimited usage mode is used, it also accepts the OCI configuration parameters.
Method | llm.getRemainingFreeUsage() | number | Server scripts | Returns the number of free requests in the current month.
Method | llm.getRemainingFreeUsage.promise() | void | Server scripts | Asynchronously returns the number of free requests in the current month.
Enum | llm.ChatRole | enum | Server scripts | Holds the string values for the author (role) of a chat message. Use this enum to set the value of the ChatMessage.role property.
Enum | llm.ModelFamily | enum | Server scripts | Holds the string values for the large language model to be used. Use this enum to set the value of the options.modelFamily parameter in llm.generateText(options).
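As a sketch of the basic call pattern, the snippet below checks the free-request quota and then calls generateText. The llm object here is a local stand-in so the snippet runs outside NetSuite; in a real server script, the module is loaded with define(['N/llm'], ...) and the stand-in is not needed. The option name prompt follows the table above; the stand-in's return values are illustrative only.

```javascript
// Stand-in for the real N/llm module, which is available only in
// NetSuite server scripts. The stub echoes the prompt so the call
// pattern can be demonstrated anywhere.
var llm = {
    generateText: function (options) {
        return { text: 'Echo: ' + options.prompt, model: 'stub-model' };
    },
    getRemainingFreeUsage: function () {
        // The real method returns this month's remaining free requests.
        return 100;
    }
};

// Check the free-request quota before spending a call.
var remaining = llm.getRemainingFreeUsage();

var response = null;
if (remaining > 0) {
    // generateText takes a prompt and returns a Response object
    // whose text property holds the LLM output.
    response = llm.generateText({
        prompt: 'Write a one-sentence greeting for a customer.'
    });
}
```

The same options object also works with llm.generateText.promise(options) when the script should not block while waiting for the LLM.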
ChatMessage Object Members
Member Type | Name | Return Type / Value Type | Supported Script Types | Description
---|---|---|---|---
Property | ChatMessage.role | string | Server scripts | The author (role) of the chat message. Use the llm.ChatRole enum to set the value.
Property | ChatMessage.text | string | Server scripts | Text of the chat message. Can be either the prompt sent by the script or the response returned by the LLM.
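A minimal sketch of building a chat history with createChatMessage and the ChatRole enum. The llm object is again a local stand-in so the snippet runs outside NetSuite, and the enum values USER and CHATBOT are assumptions about the real module; in a real script, an array like this is passed to the LLM so it can answer follow-up questions in context.

```javascript
// Stand-ins for llm.ChatRole and llm.createChatMessage; in NetSuite,
// these come from define(['N/llm'], ...). USER and CHATBOT are assumed
// enum values for illustration.
var llm = {
    ChatRole: { USER: 'USER', CHATBOT: 'CHATBOT' },
    createChatMessage: function (options) {
        // Returns a ChatMessage with the role and text properties
        // described in the table above.
        return { role: options.role, text: options.text };
    }
};

// Rebuild the conversation so far as ChatMessage objects.
var chatHistory = [
    llm.createChatMessage({
        role: llm.ChatRole.USER,
        text: 'What is our return policy?'
    }),
    llm.createChatMessage({
        role: llm.ChatRole.CHATBOT,
        text: 'Items can be returned within 30 days.'
    })
];
```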
Citation Object Members
Member Type | Name | Return Type / Value Type | Supported Script Types | Description
---|---|---|---|---
Property | Citation.documentIds | string[] | Server scripts | The IDs of the documents where the cited text is located.
Property | Citation.end | number | Server scripts | The ending position of the cited text.
Property | Citation.start | number | Server scripts | The starting position of the cited text.
Property | Citation.text | string | Server scripts | The cited text from the documents.
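The sketch below shows how the four Citation properties fit together. The response object is hand-built with illustrative values; in a real script it would come back from llm.generateText or llm.evaluatePrompt when documents were supplied with the request.

```javascript
// A hand-built Response-shaped object with one citation. The start
// and end values are positions of the cited span, and documentIds
// names the source documents that back it.
var response = {
    text: 'Returns are accepted within 30 days.',
    citations: [
        {
            documentIds: ['doc1'],
            start: 0,
            end: 36,
            text: 'Returns are accepted within 30 days.'
        }
    ]
};

// Build a human-readable line per citation, tying each cited span
// back to the documents it came from.
var report = response.citations.map(function (c) {
    return '"' + c.text + '" [' + c.start + '-' + c.end + '] from ' +
        c.documentIds.join(', ');
});
```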
Document Object Members
Member Type | Name | Return Type / Value Type | Supported Script Types | Description
---|---|---|---|---
Property | Document.data | string | Server scripts | The content of the document.
Property | Document.id | string | Server scripts | The ID of the document.
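A sketch of creating a Document and passing it as source content. The llm object is a local stand-in so the snippet runs outside NetSuite, and the documents option name on generateText is an assumption for illustration; the id set here is what later shows up in Citation.documentIds.

```javascript
// Stand-ins for llm.createDocument and llm.generateText; in NetSuite,
// both come from define(['N/llm'], ...).
var llm = {
    createDocument: function (options) {
        // Returns a Document with the id and data properties
        // described in the table above.
        return { id: options.id, data: options.data };
    },
    generateText: function (options) {
        // The real module grounds the response in the supplied
        // documents and reports them back on Response.documents.
        return { text: 'stub answer', documents: options.documents };
    }
};

// Create a document to serve as source content for the LLM.
var policy = llm.createDocument({
    id: 'returnPolicy',
    data: 'Items can be returned within 30 days of purchase.'
});

var response = llm.generateText({
    prompt: 'What is the return window?',
    documents: [policy]
});
```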
Response Object Members
Member Type | Name | Return Type / Value Type | Supported Script Types | Description
---|---|---|---|---
Property | Response.chatHistory | ChatMessage[] | Server scripts | List of chat messages.
Property | Response.citations | Citation[] | Server scripts | List of citations used to generate the response.
Property | Response.documents | Document[] | Server scripts | List of documents used to generate the response.
Property | Response.model | string | Server scripts | Model used to produce the LLM response.
Property | Response.text | string | Server scripts | Text returned by the LLM.