The N/llm module brings generative AI power into your NetSuite SuiteScript environment. With it, you can send requests to large language models (LLMs) supported by NetSuite, and use those AI-generated results directly in your scripts. From generating content and enhancing prompts to leveraging retrieval-augmented generation (RAG) and embeddings, the N/llm module makes NetSuite scripts smarter, faster, and more dynamic.
What Does the N/llm Module Do?
Think of the N/llm module as your AI co-pilot for SuiteScript. It connects your scripts to Oracle Cloud Infrastructure (OCI)’s generative AI services, letting you:
- Generate content on demand with llm.generateText().
- Evaluate prompts stored in Prompt Studio using llm.evaluatePrompt().
- Manage prompts and Text Enhance actions programmatically.
- Feed documents into the LLM for more accurate responses with RAG support.
- Embed text into vector embeddings for semantic search, recommendation engines, or classification.
- Stream responses in real time instead of waiting for the full output.
- Track your free monthly usage for both text and embedding calls.
Key Features of the N/llm Module
Content Generation
Use llm.generateText(options) to create AI-driven responses based on your prompts. Perfect for generating product descriptions, summaries, or automated replies.
Alias: llm.chat(options)
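As a minimal sketch, a Suitelet might call it like this (the prompt text and model parameters are illustrative; N/llm requires a server-side SuiteScript 2.1 script in an account where the module is available):

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType Suitelet
 */
define(['N/llm'], (llm) => {
    const onRequest = (context) => {
        // Ask the model for short marketing copy.
        const response = llm.generateText({
            prompt: 'Write a two-sentence product description for a stainless steel water bottle.',
            modelParameters: {
                maxTokens: 150,   // cap the length of the reply
                temperature: 0.7  // trade off creativity vs. consistency
            }
        });
        // response.text holds the generated content.
        context.response.write(response.text);
    };
    return { onRequest };
});
```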
Prompt Evaluation with Prompt Studio
If you manage reusable prompts in Prompt Studio, you can call them directly in your scripts using llm.evaluatePrompt(options). This allows you to dynamically pass variables to stored prompts while keeping model settings consistent.
Alias: llm.executePrompt(options)
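For example, assuming `llm` and `log` are loaded via define(['N/llm', 'N/log'], ...), and a prompt already exists in Prompt Studio (the ID and variable names below are hypothetical placeholders for your own prompt's):

```javascript
// 123 is a placeholder: use the ID of your own Prompt Studio prompt,
// and pass the template variables that prompt declares.
const response = llm.evaluatePrompt({
    id: 123,                        // hypothetical prompt ID
    variables: {
        customerName: 'Acme Corp',  // hypothetical template variable
        orderTotal: '1,250.00'      // hypothetical template variable
    }
});
log.debug('Prompt result', response.text);
```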
Retrieval-Augmented Generation (RAG)
Go beyond generic AI responses by supplying your own documents when calling llm.generateText(options). The model grounds its answer in the documents you provide and returns citations showing which ones it drew from.
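A rough sketch, assuming `llm` and `log` are loaded via define(['N/llm', 'N/log'], ...); the document ID and text are illustrative:

```javascript
// Wrap internal content in a Document object so the model can use and cite it.
const returnPolicy = llm.createDocument({
    id: 'doc_return_policy', // illustrative ID
    data: 'Items may be returned within 30 days of delivery with the original receipt.'
});
const response = llm.generateText({
    prompt: 'What is our return window?',
    documents: [returnPolicy]
});
// response.text is grounded in the supplied documents;
// response.citations shows which documents the answer drew on.
log.debug('Answer', response.text);
```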
Embedding Support
Use llm.embed(options) to transform text into vector embeddings. These can power advanced use cases like:
- Semantic search inside NetSuite
- Product or content recommendation engines
- Clustering and classification
Embeddings come with their own free monthly quota separate from text generation.
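A minimal sketch, assuming `llm` and `log` are loaded via define(['N/llm', 'N/log'], ...); the input strings are illustrative:

```javascript
// Embed two pieces of text in a single call (inputs is an array of strings).
const embedResponse = llm.embed({
    inputs: ['Stainless steel water bottle', 'Insulated travel mug']
});
// embedResponse.embeddings holds one numeric vector per input string;
// store the vectors and compare them (e.g., cosine similarity) for semantic search.
log.debug('Vector count', embedResponse.embeddings.length);
```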
Streaming Responses
No need to wait for the full output: llm.generateTextStreamed(options) and llm.evaluatePromptStreamed(options) let your scripts stream responses as they're generated, enabling real-time interactions.
Aliases:
- llm.chatStreamed(options)
- llm.executePromptStreamed(options)
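A sketch of the streaming pattern, assuming `llm` and `log` are loaded via define(['N/llm', 'N/log'], ...). The consumption loop below follows the shape of the SuiteScript iterator API; check the N/llm reference for the exact member names in your release:

```javascript
// Start a streamed generation; partial output arrives as the model produces it.
const streamed = llm.generateTextStreamed({
    prompt: 'Suggest three upsell items for a customer who bought a kayak.'
});
streamed.iterator().each((token) => {
    log.debug('Partial output', token.value); // handle each chunk as it arrives
    return true; // keep iterating until the stream ends
});
// After the stream completes, streamed.text contains the full response.
```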
Objects & Members
The N/llm module provides several objects you can interact with in your scripts:
- llm.ChatMessage: Chat messages exchanged with the LLM
- llm.Citation: Citations showing where responses pulled context from
- llm.Document: Documents supplied for RAG
- llm.EmbedResponse: Response from embedding calls
- llm.Response / llm.StreamedResponse: Standard and streaming outputs from the LLM
- llm.Usage: Token usage per request
Best Practices & Considerations
- Validate AI output: Generative AI is powerful but not always 100% accurate. Always validate before using AI-generated content in production.
- Regional availability: N/llm is only available in certain NetSuite regions. Check availability before planning.
- Track usage: Use llm.getRemainingFreeUsage() and llm.getRemainingFreeEmbedUsage() to monitor your free quota.
- Leverage aliases & promises: Most methods have easier-to-use aliases and promise-based async versions.
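A quick usage-tracking sketch, assuming `llm` and `log` are loaded via define(['N/llm', 'N/log'], ...):

```javascript
// Check remaining free calls before making a request.
const textCallsLeft = llm.getRemainingFreeUsage();
const embedCallsLeft = llm.getRemainingFreeEmbedUsage();
log.audit('Free N/llm usage left', 'text: ' + textCallsLeft + ', embed: ' + embedCallsLeft);
if (textCallsLeft === 0) {
    // Out of free quota: skip the call, queue it for later, or supply
    // your own OCI credentials via the ociConfig option instead.
}
```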
Example Use Cases in NetSuite
- Generate product descriptions for new SKUs on the fly
- Use RAG to answer customer service queries with internal documents
- Embed sales notes for smarter search and recommendations
- Stream AI-powered suggestions into Suitelets or custom UIs
Need help implementing AI-driven NetSuite solutions?
Talk to GURUS Solutions today. Our team can help you design, script, and integrate generative AI into your NetSuite workflows.