Importing Data using Oracle Data Pump

You can import data from Data Pump files into your cloud database.

Note:

Before you begin, you must have an export job. To create an export job using cURL, see Create an Export Data Pump Job.

With Oracle Data Pump Import, you can load an export dump file set into a target database, or load a target database directly from a source database with no intervening files.

For more information about Data Pump import, see Data Pump Import in Oracle Database Utilities.

Requirements

You need to set up a resource principal or a cloud service credential to access Oracle Cloud Infrastructure Object Storage.

Setting Up a Resource Principal

  1. Log in as the ADMIN user in Database Actions and enable resource principal for the Autonomous Database.

    In the SQL worksheet page, enter:

    EXEC DBMS_CLOUD_ADMIN.ENABLE_RESOURCE_PRINCIPAL();

    Optional: The following step is only required if you want to grant access to the resource principal credential to a database user other than the ADMIN user. As the ADMIN user, enable resource principal for a specified database user by using the following statement:

    EXEC DBMS_CLOUD_ADMIN.ENABLE_RESOURCE_PRINCIPAL(username => 'adb_user');

    This grants the user adb_user access to the credential OCI$RESOURCE_PRINCIPAL. If you want the specified user to have privileges to enable resource principal for other users, set the grant_option parameter to TRUE.

    BEGIN
      DBMS_CLOUD_ADMIN.ENABLE_RESOURCE_PRINCIPAL(
        username     => 'adb_user',
        grant_option => TRUE);
    END;
    /

    For more information, see Using Resource Principal in Using Oracle Autonomous Database Serverless.
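
    To verify that resource principal is enabled, you can query the DBA_CREDENTIALS view; enabling the feature creates the OCI$RESOURCE_PRINCIPAL credential:

    SELECT owner, credential_name
    FROM dba_credentials
    WHERE credential_name = 'OCI$RESOURCE_PRINCIPAL';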

  2. Obtain the resource.id.

    In the Oracle Cloud Infrastructure console, select Oracle Database and then select Autonomous Databases. In the list of databases, click the Actions icon (three dots) for your database and select Copy OCID. Repeat this step for every database instance, in any compartment, that you want to include in the dynamic group.

    Copy OCID option in console
  3. Create a dynamic group.

    1. In the Oracle Cloud Infrastructure console, click Identity and Security and click Dynamic Groups.

    2. Click Create Dynamic Group and enter all the required fields. Create matching rules using the following format for all your databases:

      any { resource.id = '<OCID of your database 1>',
            resource.id = '<OCID of your database 2>' }

      Note:

      For managing dynamic groups, you must have one of the following privileges:
      • You are a member of the Administrators group.
      • You are granted the Identity Domain Administrator role or the Security Administrator role.
      • You are a member of a group that is granted manage identity-domains or manage dynamic-groups.
  4. Create a new policy.

    The ObjectStorageReadersPolicy allows the resources in the ObjectStorageReaders dynamic group to download objects from any Object Storage bucket in the tenancy. You can also narrow the scope to a specific compartment. The policy includes permissions to list the buckets, list objects in the buckets, and read existing objects in a bucket.

    1. In the Oracle Cloud Infrastructure console, click Identity, and then click Policies.

    2. Click Create Policy.

    3. For the policy name, enter ObjectStorageReadersPolicy.

    4. For the description, enter Allow ObjectStorageReaders group to read objects and buckets.

    5. From the Compartment list, select your root compartment.

    6. Add the following policy statement, which allows ObjectStorageReaders to read buckets:

      Allow dynamic-group ObjectStorageReaders to read buckets in tenancy
    7. Add a second policy statement that allows ObjectStorageReaders to read objects in a bucket:

      Allow dynamic-group ObjectStorageReaders to read objects in tenancy
    8. Click Create.

    Create Policy in OCI console

Setting Up a Cloud Service Credential

  1. In the Oracle Cloud Infrastructure console, click Identity and Security and click Domains.
  2. Under List Scope, in the Compartment field, select the root compartment from the drop-down list, and then select the default domain.
  3. In the Default domain, click Users and then click Create user. In the Create user screen, assign the user to the Administrators group so that the user has access to the buckets where the DMP files are stored.
    Create user screen
  4. After creating the user, on the left side of the screen that appears, under Resources, select API keys.
    Select API keys
  5. Click Add API key.
    Add API key
  6. In the Add API Key screen, click Download private key. After downloading, click Add. The Configuration file preview screen appears.

    Configuration file preview

    The user, fingerprint, and tenancy information is used later to create the credential in your database.
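
    The preview follows the standard OCI configuration file format. A representative example, with placeholder values:

    [DEFAULT]
    user=ocid1.user.oc1..<unique_id>
    fingerprint=<key_fingerprint>
    tenancy=ocid1.tenancy.oc1..<unique_id>
    region=us-ashburn-1
    key_file=<path to your private keyfile>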

  7. Select the database where you want to create the database user.

    Select database
  8. Create the credential by running DBMS_CLOUD.CREATE_CREDENTIAL.

    Use the information from the Configuration file preview window to generate the new credential. For the private_key parameter, open the .PEM file that you downloaded in step 6 in any text editor and copy its contents. Generate a code block as shown in the following figure:

    Code block
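
    A minimal sketch of the call, using placeholder values in place of your own OCIDs, fingerprint, and key text (the credential name MY_DMP_CRED is only an example):

    BEGIN
      DBMS_CLOUD.CREATE_CREDENTIAL(
        credential_name => 'MY_DMP_CRED',  -- any name you choose
        user_ocid       => 'ocid1.user.oc1..<unique_id>',
        tenancy_ocid    => 'ocid1.tenancy.oc1..<unique_id>',
        private_key     => '<contents of the downloaded .PEM file, without the header and footer lines>',
        fingerprint     => '<key_fingerprint>');
    END;
    /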

Importing Data

This section provides the steps for importing data using Oracle Data Pump in Database Actions.

  1. In the Data Pump page, on the top right, click Import.

    The Import wizard appears.

  2. In the Source step:

    Bucket Name Source:

    • Bucket List: Allows you to select the bucket based on a bucket list after selecting a credential and the desired compartment.

    • Manual Search: After selecting a valid credential, you must enter the exact name of a bucket to list its files.

    After selecting the mode, enter the following fields:

    1. Credential Name: Select a valid credential to access the information in the Object Storage Buckets.

    2. Compartment Name: Select a compartment at any level within the tenancy (only available for Bucket List).
    3. Bucket Name: Select the bucket that contains the dump files from the drop-down list. Selecting a bucket automatically prefills the associated dump files in the Bucket Objects field.
    4. Bucket Objects: Select a dump file from the list.
    5. Import Pattern: When you select a dump file, it is automatically entered in the Import Pattern field. You can modify the pattern, if needed. The dump files that match are displayed in the Dump Files field.
    6. Dump Files: Select the dump files to import.
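
    Optionally, before running the wizard, you can confirm that a credential can see the dump files by listing the bucket with DBMS_CLOUD.LIST_OBJECTS. The credential name, region, namespace, and bucket below are placeholders:

    SELECT object_name, bytes
    FROM DBMS_CLOUD.LIST_OBJECTS(
        'MY_DMP_CRED',
        'https://objectstorage.us-ashburn-1.oraclecloud.com/n/<namespace>/b/<bucket>/o/');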

    Click Next.

  3. In the Import step, enter the following fields:
    • Import Name: Enter a name for the import job.
    • Import Type: Select the type of import. The options are Full, Tables, Schemas, and Tablespaces.

      Note:

      If you select Full, you skip the Filter step in the wizard and directly go to the Mapping step.
    • Content: Select Data Only, DDL Only, or Data and DDL.
    • Cloud Directory Name: Select the directory to import to.
    • Encrypt: Select if the dump files are encrypted, and enter the encryption password.

    Click Next.

  4. In the Filter step, depending on the import type, all the schemas, tables, or tablespaces for the import job are listed. Select the ones that apply. Click Next.
  5. In the Mapping step, select the source schema and enter a new name for the target schema. If needed, do the same for tablespaces. Click Next.
  6. In the Options step, enter the following fields:
    • Threads: Specify the maximum number of threads of active execution operating on behalf of the import job. The default is 1.
    • Action on Table if Table Exists: Specify the action to take if a table that the import is trying to create already exists.
    • Skip Unusable indexes: Select to specify whether the import skips loading tables that have indexes that were set to the Index Unusable state.
    • Regenerate Object IDs: Select to create new object identifiers for the imported database objects.
    • Delete Master Table: Select to indicate whether the Data Pump control job table should be deleted or retained at the end of an Oracle Data Pump job that completes successfully.
    • Overwrite Existing Datafiles: Select to indicate that if a data file already exists in the destination environment, the import overwrites it.
    • Version: Select the version of database objects to import.
    • Logging: Select to create a log file. Enter the log directory and log file name.

    Click Next.

  7. The Summary step displays a summary of all the selections made in the previous steps.

    Select Show Code at the bottom to see the PL/SQL code equivalent of the form.

    Click Import.

    The start of the job execution is displayed on the Data Pump page.
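
    For reference, the code that Show Code displays is based on the DBMS_DATAPUMP API. The following is only an illustrative sketch of a schema-mode import with a schema remap, assuming a dump file that is already accessible through the DATA_PUMP_DIR directory; the code generated by the wizard reflects your actual selections:

    DECLARE
      h NUMBER;
    BEGIN
      -- Open a schema-mode import job
      h := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA');
      -- Point the job at the dump file and a log file
      DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'EXPORT.DMP',
        directory => 'DATA_PUMP_DIR',
        filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
      DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'IMPORT.LOG',
        directory => 'DATA_PUMP_DIR',
        filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
      -- Remap the source schema to the target schema (placeholder names)
      DBMS_DATAPUMP.METADATA_REMAP(handle => h, name => 'REMAP_SCHEMA',
        old_value => 'SOURCE_SCHEMA', value => 'TARGET_SCHEMA');
      -- Action to take if a table already exists
      DBMS_DATAPUMP.SET_PARAMETER(handle => h, name => 'TABLE_EXISTS_ACTION',
        value => 'SKIP');
      DBMS_DATAPUMP.START_JOB(handle => h);
      DBMS_DATAPUMP.DETACH(handle => h);
    END;
    /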