A.1 FAQs
Topics:
- How to grant access to an OML model by giving global or selective access to other users?
- How to obtain the URL for REST API for OML Services, REST API for OML4Py and OML4R embedded execution from the Autonomous AI Database?
- How to push content represented by an OML proxy object into a table in the Oracle Autonomous AI Database?
- What is the expected behavior if the embedded REST execution requests surpass the capacity of the VM?
- Who owns and manages the VMs to which embedded Python execution is offloaded?
- Where can I find the list of all the functions available in the oml Python package?
- What is the difference between the OML ECPUs and the Database ECPUs?
- What is an alternative solution to oml.connect() for Oracle Machine Learning solutions deployment?
- What other models can I host on OML besides the in-database models?
- Does Oracle Machine Learning support reading of data from OCI Streaming services or Kafka?
- How can I create a Conda environment for Python and R, install packages, and upload the environment in a single step?
A.1.1 How to grant access to an OML model by giving global or selective access to other users?
Grant SELECT access on in-database models for inferencing.
For a specific database user, use:
GRANT SELECT ON MINING MODEL <model name> TO username;
To give all database users (global) access, grant the privilege to PUBLIC:
GRANT SELECT ON MINING MODEL <model name> TO PUBLIC;
A.1.2 How to obtain the URL for REST API for OML Services, REST API for OML4Py and OML4R embedded execution from the Autonomous AI Database?
To obtain the OML URL, SQL Developer URL, Graph Studio URL, and APEX URL:
- To obtain the tenancy ID, go to your OCI Profile on the top right corner of the Oracle Cloud page and click Tenancy.
- On the Tenancy details page, click Copy to obtain the tenancy URL.
- Type the following command in your OCI command line interface:
oci db database list --compartment-id <tenancy OCID>
Here:
- compartment-id: The unique ID assigned to your compartment.
- OCID: The Oracle Cloud Identifier (OCID) for your tenancy.
- This command returns the following:
"connection-urls": {
  "apex-url": "https://<tenancy ID>-<database name>.<region>.oraclecloudapps.com/ords/apex",
  "graph-studio-url": "https://<tenancy ID>-<database name>.<region>.oraclecloudapps.com/graphstudio/",
  "machine-learning-user-management-url": "https://<tenancy ID>-<database name>.<region>-1.oraclecloudapps.com/omlusers/",
  "sql-dev-web-url": "https://<tenancy ID>-<database name>.<region>-1.oraclecloudapps.com/ords/sql-developer"
},
For more information, see Access OML User Management from Command Line
To obtain the URLs for OML User Management, REST API OML Services, REST API for OML4Py and OML4R embedded execution from Command Line or Autonomous AI Database
In the database, run the following query against v$pdbs:
select 'https://' ||
lower(REPLACE(p.name, '_', '-')) ||
'.' ||
REGEXP_REPLACE(j.PUBLIC_DOMAIN_NAME, '[^.]+', 'oraclecloudapps', 1, 3) ||
'/omlusers/' auth_token_url,
'https://' ||
lower(REPLACE(p.name, '_', '-')) ||
'.' ||
REGEXP_REPLACE(j.PUBLIC_DOMAIN_NAME, '[^.]+', 'oraclecloudapps', 1, 3) ||
'/oml/' embed_python_r_url,
'https://' ||
lower(REPLACE(p.name, '_', '-')) ||
'.' ||
REGEXP_REPLACE(j.PUBLIC_DOMAIN_NAME, '[^.]+', 'oraclecloudapps', 1, 3) ||
'/omlmod/' rest_services_url
from v$pdbs p,
json_table(p.cloud_identity, '$' columns (
PUBLIC_DOMAIN_NAME path '$.PUBLIC_DOMAIN_NAME'
)) j;
This query works for databases without a private endpoint. Users who create their own URL behind a private endpoint can modify it accordingly. The query returns:
- OML User Management URL (same as returned by OCI CLI)
- URL for OML4Py and OML4R embedded execution
- URL for OML Services REST APIs
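As a cross-check, the URL construction that the query performs can be sketched in plain Python. The function below is illustrative only, and the PDB name and domain passed to it are hypothetical placeholders; real values come from v$pdbs.

```python
def oml_urls(pdb_name: str, public_domain_name: str) -> dict:
    # Mirror lower(REPLACE(p.name, '_', '-')) from the query.
    host = pdb_name.lower().replace("_", "-")
    # Mirror REGEXP_REPLACE(..., '[^.]+', 'oraclecloudapps', 1, 3):
    # replace the third dot-separated label of the domain name.
    labels = public_domain_name.split(".")
    labels[2] = "oraclecloudapps"
    base = "https://" + host + "." + ".".join(labels)
    return {
        "auth_token_url": base + "/omlusers/",
        "embed_python_r_url": base + "/oml/",
        "rest_services_url": base + "/omlmod/",
    }

# Hypothetical PDB name and public domain name, for illustration only.
urls = oml_urls("MYTENANCY_MYADB", "adb.us-phoenix-1.oraclecloud.com")
print(urls["auth_token_url"])
# https://mytenancy-myadb.adb.us-phoenix-1.oraclecloudapps.com/omlusers/
```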
To obtain the URL for OML REST Services from the Autonomous AI Database
- On your Oracle Autonomous AI Database console, click Database actions, and then select the option View all database actions.
- Click RESTful Services and then click Oracle Machine Learning RESTful Services. The Oracle Machine Learning RESTful Services dialog opens.
- On the Oracle Machine Learning RESTful Services dialog, copy the URL for your ADB instance.
For more information, see Task 2: Get the OML Services URL to Obtain Your REST Authentication Token
A.1.3 How to push content represented by an OML proxy object into a table in the Oracle Autonomous AI Database?
In an Oracle Machine Learning notebook, I converted a table in Autonomous Data Warehouse to an 'oml.core.frame.DataFrame' and used my model to make predictions. The predicted column is now in the 'oml.core.frame.DataFrame'. How can I insert the predicted column back into a database table?
Use the materialize() function. This function pushes the contents represented by an OML proxy object into a table in the database.
proxy_object.materialize(table="MYTABLENAME")
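The materialize() call itself needs an OML connection, so it cannot be run end to end here. As a local stand-in only (sqlite3 in place of Autonomous AI Database, plain rows in place of a proxy object), the following sketch shows in-memory prediction results being pushed into a queryable table:

```python
import sqlite3

# Local stand-in for the OML flow: scored rows held in memory are
# written back to a database table. With OML, the single call
# proxy_object.materialize(table="MYTABLENAME") does this server-side.
rows = [(1, 0.91), (2, 0.13), (3, 0.77)]  # hypothetical (id, prediction) pairs

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scored (id INTEGER, prediction REAL)")
conn.executemany("INSERT INTO scored VALUES (?, ?)", rows)
conn.commit()

# The predictions are now queryable like any other table.
count = conn.execute("SELECT COUNT(*) FROM scored").fetchone()[0]
print(count)  # 3
```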
A.1.4 What is the expected behavior if the embedded REST execution requests surpass the capacity of the VM?
What is the expected behavior if the embedded REST execution requests surpass the capacity of the VM? Are they enqueued or does the request fail?
Embedded REST execution calls fail if no container resources are available to serve them; the requests are not enqueued. The following error is displayed:
Unable to reserve a STANDARD resource at this time: no STANDARD type VM is available. Please try again later
A.1.5 Who owns and manages the VMs to which embedded Python execution is offloaded?
Is the VM to which the embedded Python execution workload is offloaded, owned and managed by OML (service-owned )? Or is it under the ownership of the database itself (customer-owned )?
For Oracle Machine Learning, each tenant has its own VM managed by Autonomous AI Database; embedded execution does not run on a customer-owned VM. Multiple VMs might belong to a tenant, but no VM can be shared across different tenants. Through the service levels (Low, Medium, High, and GPU), containers of different sizes are provisioned. There is no dynamic resource adjustment at runtime.
A tenant may provision one or more databases. Oracle Autonomous AI Database Serverless maintains a pool of VMs. Each embedded execution is processed in a container, which runs on a VM. When a VM is needed to run a container, it is requested from Oracle Autonomous AI Database Serverless and released when no more containers are running. Currently, a VM serves only one database at a time, and all of that database's users share the VM. If a tenant has two databases, each database has its own VM.
Note:
Oracle Autonomous AI Database Serverless does not support dynamic resource adjustment for the containers.
A.1.6 Where can I find the list of all the functions available in the oml Python package?
To list the public methods and attributes available on an OML proxy object, run the following in a Python paragraph:
%python
import pandas as pd
import oml
# Create a simple Pandas DataFrame
df = pd.DataFrame({
'col1': [1, 2, 3],
'col2': ['a', 'b', 'c']})
# Create an OML proxy object
oml_df = oml.create(df, "df")
# List all public methods and attributes
methods = [m for m in oml_df.__dir__() if not m.startswith('_')]
methods.sort()
print(methods)
For more information, see the OML4Py API Reference Guide.
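The same introspection pattern works at the module level. As a self-contained illustration, here it is applied to the standard-library json module as a stand-in; in an OML notebook, substituting oml for json lists the oml package's top-level functions:

```python
import json

# List the public names of a module; in an OML notebook, replace json
# with oml to see the oml package's top-level functions and classes.
names = sorted(n for n in dir(json) if not n.startswith("_"))
print(names)
```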
A.1.7 What is the difference between the OML ECPUs and the Database ECPUs?
The OML ECPUs are app-level ECPUs. For example, OML Notebooks use resources allocated with the OML App ECPUs, while the in-database ML algorithms use the ADB (server-side) ECPU resources.
A.1.8 What is an alternative solution to oml.connect() for Oracle Machine Learning solutions deployment?
Our deployment code connects with oml.connect(). Instead of hard-coding credentials into the oml.connect statement, is there an alternative solution that can work across multiple database environments?
- For the Python API for embedded execution, use oml_connect=True.
- For the SQL API, use "oml_connect":1.
- For more information, see pyqTableEval Function (On-Premises Database).
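For the SQL API, the "oml_connect":1 flag is passed in the JSON parameters document of functions such as pyqTableEval. A minimal sketch of composing just that flag (any other parameters a real call needs are omitted here):

```python
import json

# Build the parameters document for a SQL API embedded-execution call.
# "oml_connect": 1 asks the framework to establish the OML connection
# inside the embedded script, so no credentials are hard-coded.
params = json.dumps({"oml_connect": 1})
print(params)  # {"oml_connect": 1}
```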
A.1.9 What other models can I host on OML besides the in-database models?
- Native Python and R models can be stored using the OML4Py and OML4R datastores. You can call these models through Python, R, SQL, and REST endpoints, optionally in parallel.
- ONNX format classification, regression, feature extraction, and clustering models can be imported and use the same SQL prediction operators as native models. You can also deploy these models to OML Services for real-time scoring through REST endpoints.
- Transformer models from Hugging Face can be converted to ONNX format, augmented through a pipeline, and loaded to the database for use with AI Vector Search.
- LLMs can be accessed through Select AI for SQL generation, RAG, and chat.
A.1.10 Does Oracle Machine Learning support reading of data from OCI Streaming services or Kafka?
Oracle Machine Learning does not read directly from OCI Streaming or Kafka; the streaming data must first be loaded into database tables. Options for doing so include:
- OCI GoldenGate for real-time sync to Oracle Autonomous AI Database
- OCI Connector Hub along with OCI Functions to process and load stream data
- OCI Functions triggered by streaming events
A typical flow is to:
- Use data from OCI Streaming
- Load data into Oracle Autonomous AI Database tables
- Use Oracle Machine Learning to build models on those tables using Oracle Machine Learning for SQL, Oracle Machine Learning for R, or Oracle Machine Learning for Python
Once the streaming data is loaded in your database, Oracle Machine Learning can work with it just like any other database tables.
A.1.11 How can I create a Conda environment for Python and R, install packages, and upload the environment in a single step?
Create a Conda Environment for Python, Install Python Packages and upload the environment to object storage
To create a Conda environment, install a Python package, and upload the environment to object storage in a single step, run the following command in a Conda paragraph:
%conda
create -n mypyenv -c conda-forge --override-channels --strict-channel-priority python=3.13.5 seaborn
upload mypyenv --overwrite -t application "OML4PY"
Note:
This example uses Python version 3.13.5. You must ensure compatibility between the third-party package version and the Python version that OML4Py uses. To check the Python version, run:
import sys
sys.version
- -n: The name of the environment. In this example, it is mypyenv.
- -c: The channel name. In this example, the conda-forge channel is used to install the Python package.
- --override-channels: Ensures that the default channels are not searched and a channel must be specified.
- --strict-channel-priority: Ensures that packages in lower-priority channels are not considered if a package with the same name appears in a higher-priority channel. In this example, priority is given to Python 3.13.5 and seaborn.
- upload: Uploads the Conda environment named mypyenv to a target location.
- --overwrite: If a file or directory with the same name already exists at the target location, it is overwritten.
- -t application: Specifies the type of the target as an application.
- "OML4PY": The name or path of the target application where the environment is uploaded.
For more information, see: Create a Conda Environment for Python, Install a Python Package and Upload it to Object Storage
Create a Conda Environment for R, Install R Packages and upload the environment to object storage
To create a Conda environment, install an R package, and upload the environment to object storage in a single step, run the following command in a Conda paragraph:
%conda
create -n myrenv -c conda-forge --override-channels --strict-channel-priority r-base=4.0.5 r-forecast
upload myrenv --overwrite -t application "OML4R"
Note:
In this example, the conda-forge channel is used to install the R packages.
- -n: The name of the environment. In this example, it is myrenv.
- -c: The channel name. In this example, it is conda-forge.
- --override-channels: Ensures that the default channels are not searched and a channel must be specified.
- --strict-channel-priority: Ensures that packages in lower-priority channels are not considered if a package with the same name appears in a higher-priority channel. In this example, priority is given to R 4.0.5 and forecast.
- upload: Uploads the Conda environment named myrenv to a target location.
- --overwrite: If a file or directory with the same name already exists at the target location, it is overwritten.
- -t application: Specifies the type of the target as an application.
- "OML4R": The name or path of the target application where the environment is uploaded.
For more information, see Create a Conda Environment for R and Install R Package and Upload it to Object Storage