SuiteScript 2.x Generative AI APIs
The content in this help topic pertains to SuiteScript 2.1.
SuiteScript Generative AI APIs (N/llm Module) enable you to work with generative artificial intelligence (AI) in NetSuite using SuiteScript. You can use these APIs to send requests to large language models (LLMs) and to receive responses from them. NetSuite accesses LLMs through an integration with the Oracle Cloud Infrastructure (OCI) Generative AI service.
NetSuite provides generative AI capability in the user interface in two ways:
- Text Enhance – Provides writing assistance for supported Text Area fields on standard NetSuite pages. For more information, see Text Enhance.
- Prompt Studio – Lets you customize system prompts and Text Enhance actions to suit your business needs, as well as create new prompts and Text Enhance actions. For more information, see Prompt Studio.
SuiteScript Generative AI APIs (N/llm module) are available only for accounts located in certain regions. For a list of these regions, see Generative AI Availability in NetSuite.
How the SuiteScript Generative AI APIs Work
1. A NetSuite developer uses the N/llm module in their SuiteScript code to send a request to the LLM.
2. NetSuite accepts the prompt and any optional parameters included in the request and passes them to the OCI Generative AI service.
3. The OCI Generative AI service processes the request using the LLM specified in your code. If you do not specify the LLM in your code, it uses the Cohere Command R LLM. The data never leaves Oracle, nor is it used by third parties for model training.
   Note: When you use the N/llm module, your data may be processed globally according to the Oracle Services Privacy Policy.
4. The OCI Generative AI service sends the LLM response back to NetSuite.
5. NetSuite processes the response and returns it to the SuiteScript code.
6. The SuiteScript code uses the response as needed in the rest of the script.
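For example, a minimal scheduled script can send a prompt and read the generated text from the response. This is a sketch of the flow above using the default model and usage mode; the prompt text is illustrative.

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType ScheduledScript
 */
define(['N/llm'], (llm) => {
    const execute = () => {
        // Send a prompt to the default LLM and wait for the response.
        const response = llm.generateText({
            prompt: 'Summarize the benefits of cloud-based ERP in two sentences.'
        });

        // The response object exposes the generated text.
        log.audit('LLM response', response.text);
    };

    return { execute };
});
```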
Supported LLMs
The generateText methods have a modelFamily parameter that you can use to specify the LLM used, and a modelParameters object that enables you to send supported parameters to the LLM with your prompt. Model parameters are those supported by the Cohere Command R model (cohere.command-r-08-2024), the Cohere Command R+ model (cohere.command-r-plus-08-2024), or the Meta Llama 3.2 model (meta.llama-3.2-90b-vision-instruct) available through the OCI Generative AI service. If your code does not specify the model to use (through the modelFamily parameter), the Cohere Command R model is used. For more information, refer to the Chat Model Parameters topic in the Oracle Cloud Infrastructure Documentation.
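For example, the following sketch (inside a server-side script entry point, as in the previous example) selects the Cohere Command R+ model and passes a few model parameters. The llm.ModelFamily enum member and the parameter names shown (maxTokens, temperature, topK, topP) are assumptions used to illustrate the shape of the call; check the llm.generateText(options) reference and the Chat Model Parameters topic for the exact names and supported values.

```javascript
// Sketch: selecting a model family and passing model parameters.
// The enum member and parameter names below are assumptions; verify them
// against the N/llm module reference for your NetSuite version.
const response = llm.generateText({
    prompt: 'Draft a short product description for a stainless steel water bottle.',
    modelFamily: llm.ModelFamily.COHERE_COMMAND_R_PLUS, // assumed enum member
    modelParameters: {
        maxTokens: 300,    // cap the length of the generated text
        temperature: 0.2,  // lower values produce more deterministic output
        topK: 3,
        topP: 0.7
    }
});
log.debug('Generated text', response.text);
```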
The evaluatePrompt methods take an existing prompt, along with values for any variables that the prompt uses, and send it to the LLM. You can use Prompt Studio to specify the model family that each prompt uses.
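For example, the following sketch evaluates a prompt created in Prompt Studio, supplying values for the variables the prompt defines. The prompt ID and variable names are hypothetical; the exact options are described in the llm.evaluatePrompt(options) reference.

```javascript
// Sketch: evaluating a Prompt Studio prompt from a server-side script.
// 'custprompt_order_summary' and the variable names are hypothetical examples.
const response = llm.evaluatePrompt({
    id: 'custprompt_order_summary',
    variables: {
        customerName: 'Acme Corporation',
        orderTotal: '1,250.00'
    }
});
log.debug('Prompt result', response.text);
```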
Usage Modes
There are three usage modes for the N/llm module. The following table describes the usage modes in more detail.
| Usage Mode | Description | When to Use This Mode |
|---|---|---|
| Free | Free, limited usage through NetSuite. When using the free limited-use mode, each successful response to NetSuite from the OCI Generative AI service counts as one use. When the capacity is used completely, the LLM module returns an error for subsequent calls until the next month. | |
| On Demand | Unlimited, paid usage through your company’s own Oracle Cloud account. When using the unlimited-use model through your company’s own Oracle Cloud account, your company pays for on demand inferencing in the Oracle Generative AI service. To learn if this is the right choice for your company, refer to Paying for On Demand Inferencing in the Oracle Cloud Infrastructure Documentation. This mode requires that you complete the setup described in Using Your Own OCI Configuration for SuiteScript Generative AI APIs. You must provide your OCI credentials either in your SuiteScript code (using the ociConfig parameter) or in NetSuite as part of that setup. | |
| Dedicated AI Cluster | Unlimited, paid usage through your company’s own Oracle Cloud account. When using the unlimited-use model through your company’s own Oracle Cloud account, your company pays for dedicated AI clusters in the Oracle Generative AI service. To learn if this is the right choice for your company, refer to Paying for Dedicated AI Clusters in the Oracle Cloud Infrastructure Documentation. This mode requires that you complete the setup described in Using Your Own OCI Configuration for SuiteScript Generative AI APIs. You must provide your OCI credentials either in your SuiteScript code (using the ociConfig parameter) or in NetSuite as part of that setup. | |
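For the two paid modes, the sketch below shows OCI credentials being passed with a request. The ociConfig field names and the use of NetSuite API secrets are assumptions for illustration only; the authoritative setup steps and field names are in Using Your Own OCI Configuration for SuiteScript Generative AI APIs.

```javascript
// Sketch: calling the LLM with your company's own OCI account (On Demand mode).
// The ociConfig field names below are assumptions; confirm them in
// "Using Your Own OCI Configuration for SuiteScript Generative AI APIs".
const response = llm.generateText({
    prompt: 'Classify this support ticket as billing, shipping, or technical.',
    ociConfig: {
        tenancyId: 'ocid1.tenancy.oc1..aaaa...',        // hypothetical OCIDs
        compartmentId: 'ocid1.compartment.oc1..aaaa...',
        userId: 'ocid1.user.oc1..aaaa...',
        fingerprint: 'custsecret_oci_fingerprint',       // stored as NetSuite API secrets
        privateKey: 'custsecret_oci_private_key'
    }
});
log.debug('Generated text', response.text);
```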
SuiteScript API Governance Limits for N/llm
The N/llm module is subject to SuiteScript API governance limits. The governance limits are included in individual method topics. For more information, see SuiteScript Governance and Limits and SuiteScript 2.x API Governance.
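For example, a script that makes repeated LLM calls can guard each call by checking its remaining governance units with the N/runtime module. The unit threshold below is a hypothetical value; the actual unit cost of each N/llm method is listed in its individual method topic.

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType ScheduledScript
 */
define(['N/llm', 'N/runtime'], (llm, runtime) => {
    // Hypothetical threshold; see each N/llm method topic for its actual unit cost.
    const MIN_REMAINING_UNITS = 100;

    const execute = () => {
        const script = runtime.getCurrentScript();

        // Only call the LLM if enough governance units remain for this execution.
        if (script.getRemainingUsage() < MIN_REMAINING_UNITS) {
            log.audit('Governance guard', 'Not enough usage units remaining; skipping LLM call.');
            return;
        }

        const response = llm.generateText({
            prompt: 'Write a one-sentence summary of why governance limits matter.'
        });
        log.audit('LLM response', response.text);
    };

    return { execute };
});
```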