N/llm Module Script Samples

Note:

The content in this help topic pertains to SuiteScript 2.1.

The following script samples demonstrate how to use the features of the N/llm module.

Send a Prompt to the LLM and Receive a Response

The following sample sends a "Hello World" prompt to the default NetSuite large language model (LLM) and receives the response. It also shows the remaining free usage for the month.

For instructions about how to run a SuiteScript 2.1 code snippet in the debugger, see On-Demand Debugging of SuiteScript 2.1 Scripts. Step through the code until the line before the end of the script to see the response text returned from the LLM and the remaining free usage for the month.

Note:

This sample script uses the require function so that you can copy it into the SuiteScript Debugger and test it. You must use the define function in an entry point script (the script you attach to a script record and deploy). For more information, see SuiteScript 2.x Script Basics and SuiteScript 2.x Script Types.

/**
 * @NApiVersion 2.1
 */
// This example shows how to query the default LLM
require(['N/llm'],
    function(llm) {        
        const response = llm.generateText({
            // modelFamily is optional. When omitted, the Cohere Command R model is used.
            // To try the Meta Llama model, remove the comment delimiter from the following line
            // modelFamily: llm.ModelFamily.META_LLAMA,
            prompt: "Hello World!",
            modelParameters: {
                maxTokens: 1000,        // Maximum number of tokens the model can return
                temperature: 0.2,       // Lower values make the output less random
                topK: 3,                // Sample from only the 3 most likely next tokens
                topP: 0.7,              // Consider tokens within the top 70% of probability mass
                frequencyPenalty: 0.4,  // Penalize tokens that repeat frequently in the output
                presencePenalty: 0      // No extra penalty for tokens that have already appeared
            }
        });
        const responseText = response.text;
        const remainingUsage = llm.getRemainingFreeUsage(); // View remaining monthly free usage
    }); 
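
As the note above explains, a deployed entry point script uses define and returns its entry point functions instead of using require. The following sketch shows one way the same call might look when deployed; the scheduled script type and the log output are illustrative assumptions rather than part of the original sample.

/**
 * @NApiVersion 2.1
 * @NScriptType ScheduledScript
 */
define(['N/llm'], (llm) => {
    // Hypothetical entry point for a scheduled script deployment
    function execute(context) {
        const response = llm.generateText({
            prompt: 'Hello World!'
        });
        // Write the response text and the remaining monthly free usage to the execution log
        log.debug('LLM response', response.text);
        log.debug('Remaining free usage', llm.getRemainingFreeUsage());
    }

    return { execute: execute };
});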


Clean Up Content for Text Area Fields After Saving a Record

The following sample uses the large language model (LLM) to correct the text for the purchase description and the sales description fields of an inventory item record after the user saves the record. This sample also shows how to use the llm.generateText.promise method.

To test this script after script deployment:

  1. Go to Lists > Accounting > Items > New.

  2. Select Inventory Item.

  3. Enter an Item Name and optionally fill out any other fields.

  4. Select a value for Tax Schedule in the Accounting subtab.

  5. Enter text into the Purchase Description and Sales Description fields.

  6. Click Save.

    When you save the record, the script is triggered. The content in the Purchase Description and Sales Description fields is corrected, and the record is submitted.

Note:

This script sample uses the define function, which is required for an entry point script (a script you attach to a script record and deploy). You must use the require function if you want to copy the script into the SuiteScript Debugger and test it. For more information, see SuiteScript 2.x Global Objects.

/**
 * @NApiVersion 2.1
 * @NScriptType UserEventScript
 */
define(['N/llm'], (llm) => {
    /**
     * @param {Object} scriptContext The updated inventory item
     * record to clean up typo errors for purchase description and
     * sales description fields. The values are set before the record
     * is submitted to be saved.
     */ 
    function fixTypos(scriptContext) {
        const purchaseDescription = scriptContext.newRecord.getValue({
            fieldId: 'purchasedescription'
        })
        const salesDescription = scriptContext.newRecord.getValue({
            fieldId: 'salesdescription'
        })

        const p1 = llm.generateText.promise({
            prompt: `Please clean up typos in the following text: 
                     ${purchaseDescription} and return just the corrected text. 
                     Return the text as is if there's no typo 
                     or you don't understand the text.`
        })
        const p2 = llm.generateText.promise({
            prompt: `Please clean up typos in the following text: 
                     ${salesDescription} and return just the corrected text. 
                     Return the text as is if there's no typo 
                     or you don't understand the text.`
        })

        // When both promises resolve, each result is the llm.Response object
        // returned by the LLM; use its text property to update the record
        Promise.all([p1, p2]).then((results) => {
            scriptContext.newRecord.setValue({
                fieldId: 'purchasedescription',
                value: results[0].text
            })
            scriptContext.newRecord.setValue({
                fieldId: 'salesdescription',
                value: results[1].text
            })
        })
    }

    return { beforeSubmit: fixTypos }
}) 


Provide an LLM-based ChatBot for NetSuite Users

The following sample creates a custom NetSuite form titled Chat Bot. The user can enter a prompt for the large language model (LLM) in the Prompt field. After the user clicks Submit, NetSuite sends the request to the LLM. The LLM returns a response, which is displayed as part of the message history on the form.

The script includes code that handles the prompts and responses as a conversation between the user and the LLM. It records the prompts the user enters as USER messages (labeled You on the form) and the responses from the LLM as CHATBOT messages (labeled ChatBot on the form). The code also assembles a chat history and sends it to the LLM along with each new prompt. Without the chat history, the LLM would treat each prompt as an unrelated request. For example, if your first prompt asks a question about Las Vegas and your next prompt asks, “What are the top 5 activities here?”, the chat history tells the LLM that “here” means Las Vegas and may also help it avoid repeating information it has already provided.
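
In isolation, the chat history mechanism looks like the following sketch. The hard-coded messages are hypothetical and are shown only to illustrate the chatHistory option and the llm.ChatRole values; in the full Suitelet below, the history is rebuilt from the submitted form fields instead.

/**
 * @NApiVersion 2.1
 */
require(['N/llm'], function(llm) {
    // Earlier exchanges, oldest first, in the format expected by the chatHistory option
    const chatHistory = [
        {
            role: llm.ChatRole.USER,     // a prompt previously entered by the user
            text: 'Tell me about Las Vegas.'
        },
        {
            role: llm.ChatRole.CHATBOT,  // the LLM's previous response (hypothetical text)
            text: 'Las Vegas is a city in Nevada known for entertainment and casinos.'
        }
    ];

    // Because the earlier exchange is included, the LLM can tell that "here" means Las Vegas
    const response = llm.generateText({
        prompt: 'What are the top 5 activities here?',
        chatHistory: chatHistory
    });
    const responseText = response.text;
});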

Note:

This script sample uses the define function, which is required for an entry point script (a script you attach to a script record and deploy). You must use the require function if you want to copy the script into the SuiteScript Debugger and test it. For more information, see SuiteScript 2.x Global Objects.

/**
 * @NApiVersion 2.1
 * @NScriptType Suitelet
 */

define(['N/ui/serverWidget', 'N/llm'], (serverWidget, llm) => {
  /**
   * Creates NetSuite form to communicate with LLM
   */  
  function onRequest (context) {
    const form = serverWidget.createForm({
      title: 'Chat Bot'
    })
    const fieldgroup = form.addFieldGroup({
      id: 'fieldgroupid',
      label: 'Chat'
    })
    fieldgroup.isSingleColumn = true
    const historySize = parseInt(
      context.request.parameters.custpage_num_chats || '0')
    const numChats = form.addField({
      id: 'custpage_num_chats',
      type: serverWidget.FieldType.INTEGER,
      container: 'fieldgroupid',
      label: 'History Size'
    })
    numChats.updateDisplayType({
      displayType: serverWidget.FieldDisplayType.HIDDEN
    })

    if (context.request.method === 'POST') {
      // Each POST adds one user prompt and one chatbot response to the history
      numChats.defaultValue = historySize + 2
      const chatHistory = []
      // Redisplay the previous exchanges (oldest first) and rebuild the chat
      // history that is sent to the LLM along with the new prompt
      for (let i = historySize - 2; i >= 0; i -= 2) {
        const you = form.addField({
          id: 'custpage_hist' + (i + 2),
          type: serverWidget.FieldType.TEXTAREA,
          label: 'You',
          container: 'fieldgroupid'
        })
        const yourMessage = context.request.parameters['custpage_hist' + i]
        you.defaultValue = yourMessage
        you.updateDisplayType({
          displayType: serverWidget.FieldDisplayType.INLINE
        })

        const chatbot = form.addField({
          id: 'custpage_hist' + (i + 3),
          type: serverWidget.FieldType.TEXTAREA,
          label: 'ChatBot',
          container: 'fieldgroupid'
        })
        const chatBotMessage =
          context.request.parameters['custpage_hist' + (i + 1)]
        chatbot.defaultValue = chatBotMessage
        chatbot.updateDisplayType({
          displayType: serverWidget.FieldDisplayType.INLINE
        })
        chatHistory.push({
          role: llm.ChatRole.USER,
          text: yourMessage
        })
        chatHistory.push({
          role: llm.ChatRole.CHATBOT,
          text: chatBotMessage
        })
      }

      const prompt = context.request.parameters.custpage_text
      const promptField = form.addField({
        id: 'custpage_hist0',
        type: serverWidget.FieldType.TEXTAREA,
        label: 'You',
        container: 'fieldgroupid'
      })
      promptField.defaultValue = prompt
      promptField.updateDisplayType({
        displayType: serverWidget.FieldDisplayType.INLINE
      })
      const result = form.addField({
        id: 'custpage_hist1',
        type: serverWidget.FieldType.TEXTAREA,
        label: 'ChatBot',
        container: 'fieldgroupid'
      })
      result.defaultValue = llm.generateText({
        prompt: prompt,
        chatHistory: chatHistory
      }).text
      result.updateDisplayType({
        displayType: serverWidget.FieldDisplayType.INLINE
      })
    } else {
      numChats.defaultValue = 0
    }

    form.addField({
      id: 'custpage_text',
      type: serverWidget.FieldType.TEXTAREA,
      label: 'Prompt',
      container: 'fieldgroupid'
    })

    form.addSubmitButton({
      label: 'Submit'
    })

    context.response.writePage(form)
  }

  return {
    onRequest: onRequest
  }
}) 


Evaluate an Existing Prompt and Receive a Response

The following sample evaluates an existing prompt, sends it to the default NetSuite large language model (LLM), and receives the response. The sample also shows the remaining free usage for the month.

In this sample, the llm.evaluatePrompt(options) method loads an existing prompt with an ID of stdprompt_gen_purch_desc_invt_item. This prompt applies to an inventory item record in NetSuite, and it uses several variables that represent fields on this record type, such as item ID, stock description, and vendor name. The method replaces the variables in the prompt with the values you specify, then sends it to the LLM and returns the response.

You can create and manage prompts using Prompt Studio. You can also use Prompt Studio to generate a SuiteScript example that uses the llm.evaluatePrompt(options) method and includes the variables for a prompt in the correct format. When viewing a prompt in Prompt Studio, click Show SuiteScript Example to generate SuiteScript code with all of the variables that the prompt uses. You can then use this code in your scripts and provide a value for each variable.

For instructions about how to run a SuiteScript 2.1 code snippet in the debugger, see On-Demand Debugging of SuiteScript 2.1 Scripts. Step through the code until the line before the end of the script to see the response text returned from the LLM and the remaining free usage for the month.

Note:

This sample script uses the require function so that you can copy it into the SuiteScript Debugger and test it. You must use the define function in an entry point script (the script you attach to a script record and deploy). For more information, see SuiteScript 2.x Script Basics and SuiteScript 2.x Script Types.

/**
 * @NApiVersion 2.1
 */
require(['N/llm'],
    function(llm) {        
        const response = llm.evaluatePrompt({
            id: 'stdprompt_gen_purch_desc_invt_item',
            variables: {
                "form": {
                    "itemid": "My Inventory Item",
                    "stockdescription": "This is the stock description of the item.",
                    "vendorname": "My Item Vendor Inc.",
                    "isdropshipitem": "false",
                    "isspecialorderitem": "true",
                    "displayname": "My Amazing Inventory Item"
                },
                "text": "This is the purchase description of the item."
            }
        });
        const responseText = response.text;
        const remainingUsage = llm.getRemainingFreeUsage(); // View remaining monthly free usage
    }); 


Create a Prompt and Evaluate It

The following sample creates a prompt record, populates the required record fields, and evaluates the prompt by sending it to the default NetSuite large language model (LLM).

This sample creates a prompt record using the record.create(options) method of the N/record module, and it sets the values of the following fields (which are required to save a prompt record):

  • Name

  • Prompt Type

  • Model Family

  • Template

After the prompt record is saved, llm.evaluatePrompt(options) is called, but only the ID of the created prompt is provided as a parameter. The method throws a TEMPLATE_PROCESSING_EXCEPTION error because the prompt template includes a required variable (mandatoryVariable) that was not provided when the method was called. The error is caught, and a debug message is logged.

Next, the sample calls llm.evaluatePrompt(options) again but provides values for the mandatoryVariable and optionalVariable variables. This time, the call succeeds, and a debug message is logged. Finally, the sample calls llm.evaluatePrompt.promise(options) and confirms that the call succeeds; the prompt record is then deleted.

You can create and manage prompts using Prompt Studio. You can also use Prompt Studio to generate a SuiteScript example that uses the llm.evaluatePrompt(options) method and includes the variables for a prompt in the correct format. When viewing a prompt in Prompt Studio, click Show SuiteScript Example to generate SuiteScript code with all of the variables that the prompt uses. You can then use this code in your scripts and provide a value for each variable. For more information about Prompt Studio, see Prompt Studio.

For instructions about how to run a SuiteScript 2.1 code snippet in the debugger, see On-Demand Debugging of SuiteScript 2.1 Scripts. Step through the code until the line before the end of the script to see the responses returned from the LLM and to confirm that the prompt record is created and then deleted.

Note:

This sample script uses the require function so that you can copy it into the SuiteScript Debugger and test it. You must use the define function in an entry point script (the script you attach to a script record and deploy). For more information, see SuiteScript 2.x Script Basics and SuiteScript 2.x Script Types.

/**
 * @NApiVersion 2.1
 */
require(['N/record', 'N/llm'], function(record, llm) {
    const rec = record.create({
        type: "prompt"
    });
    
    rec.setValue({
        fieldId: "name",
        value: "Test"
    });
    rec.setValue({
        fieldId: "prompttype",
        value: "CUSTOM"
    });
    rec.setValue({
        fieldId: "modelfamily",
        value: "COHERE_COMMAND_R"
    });
    // The template uses FreeMarker syntax: ${mandatoryVariable} must always be provided,
    // and ${optionalVariable} falls back to "World" when it is empty or not provided
    rec.setValue({
        fieldId: "template",
        value: "${mandatoryVariable} <#if optionalVariable?has_content>${optionalVariable}<#else>World</#if>"
    });
    
    const id = rec.save();
    
    try {
        llm.evaluatePrompt({
            id: id
        });
    }
    catch (e) {
        if (e.name === "TEMPLATE_PROCESSING_EXCEPTION")
            log.debug("Exception", "Expected exception was thrown");
    }
    
    const response = llm.evaluatePrompt({
        id: id,
        variables: {
            mandatoryVariable: "Hello",
            optionalVariable: "People"
        }
    });
    if ("Hello People" === response.chatHistory[0].text)
        log.debug("Evaluation", "Correct prompt got evaluated");
        
    llm.evaluatePrompt.promise({
        id: id,
        variables: {
            mandatoryVariable: "Hello",
            optionalVariable: "World"
        }
    }).then(function(response) {
        if ("Hello World" === response.chatHistory[0].text)
            log.debug("Evaluation", "Correct prompt got evaluated");
        record.delete({
            type: "prompt",
            id: id
        });
        debugger;
    })
}); 

