Generate Summary Using the Local REST Provider Ollama
Perform a text-to-summary transformation by accessing open LLMs through Ollama, a REST endpoint provider running on your local host.
Ollama is a free and open-source command-line interface tool that allows you to run open LLMs (such as Llama 3, Phi 3, Mistral, or Gemma 2) locally and privately on your Linux, Windows, or macOS systems. You can access Ollama as a service using SQL and PL/SQL commands.
Here, you call the chainable utility function UTL_TO_SUMMARY from the DBMS_VECTOR_CHAIN package.
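For illustration, a minimal sketch of such a call is shown below. It assumes that Ollama is running locally on its default endpoint (http://localhost:11434), that the llama3 model has already been pulled, and that your database user has the network access privileges needed to reach that endpoint; the parameter names and values shown are typical provider settings and may need to be adjusted for your environment and database release.

-- Sketch only: provider parameters assume a local Ollama server
-- listening on the default endpoint http://localhost:11434.
VARIABLE params CLOB;

BEGIN
  :params := '{
    "provider" : "ollama",
    "host"     : "local",
    "url"      : "http://localhost:11434/api/generate",
    "model"    : "llama3"
  }';
END;
/

-- Pass the text to summarize along with the JSON provider parameters.
SELECT DBMS_VECTOR_CHAIN.UTL_TO_SUMMARY(
         'Oracle Database is a converged database that supports relational,
          JSON, graph, spatial, and vector workloads in a single engine.',
         JSON(:params)) AS summary
FROM dual;

The query returns the generated summary as a CLOB; you can equally call UTL_TO_SUMMARY from PL/SQL and assign the result to a variable.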
WARNING:
Certain features of the database may allow you to access services offered separately by third-parties, for example, through the use of JSON specifications that facilitate your access to REST APIs.
Your use of these features is solely at your own risk, and you are solely responsible for complying with any terms and conditions related to use of any such third-party services. Notwithstanding any other terms and conditions related to the third-party services, your use of such database features constitutes your acceptance of that risk and express exclusion of Oracle's responsibility or liability for any damages resulting from such access.
To generate a concise and informative summary of a textual extract by calling a local LLM through Ollama:
Parent topic: Generate Summary