2.3.2 Database Setup

This topic provides step-by-step instructions for setting up the database.

The steps for importing the data from Object Storage are as follows:

  1. Ensure that the following credentials and configuration files are set up:
    • OCI tenancy OCID
    • User OCID
    • Compartment OCID
    • Object Storage namespace
    • API key configuration
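    These values are typically collected in an OCI CLI configuration file. A minimal sketch is shown below; every OCID, path, and fingerprint is a placeholder, not a value from this document.

    Sample ~/.oci/config

    [DEFAULT]
    user=ocid1.user.oc1..<user_ocid>
    tenancy=ocid1.tenancy.oc1..<tenancy_ocid>
    region=<oci_region>
    key_file=~/.oci/oci_api_key.pem
    fingerprint=<api_key_fingerprint>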
  2. Connect to the target ATP instance.
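    The connection can be made with SQL*Plus, for example. This is a sketch that assumes the ATP wallet has been downloaded and TNS_ADMIN points to it; the password and service name are placeholders.

    Connect

    export TNS_ADMIN=/path/to/wallet
    sqlplus admin/<replace with ADMIN password>@<replace with atp instance name service name - high>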
  3. Create a directory to store dump files containing exported data.

    Create a directory

    CREATE DIRECTORY data_export_dir AS 'data_export';

    Note:

    Ensure the necessary privileges are granted to the target ATP instance to access and read from the Object Storage bucket.
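    When the dump files are not shared through a pre-authenticated URL, access to the bucket is typically granted by creating a DBMS_CLOUD credential as the ADMIN user. The sketch below is illustrative; the credential name, user name, and auth token are assumptions, not values from this document.

    Create a credential

    BEGIN
      DBMS_CLOUD.CREATE_CREDENTIAL(
        credential_name => 'OBJ_STORE_CRED',
        username        => '<oci_user_name>',
        password        => '<auth_token>'
      );
    END;
    /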
  4. Run the Data Pump Import with the dumpfile parameter set to the list of file URLs on your Cloud Object Storage.
    • When you use a pre-authenticated URL, the credential parameter is not required; impdp ignores it.
    • When you use a pre-authenticated URL for the dumpfile, you can set the credential parameter to NULL in the next step.

    IMPDP

    impdp admin/<replace with ADMIN password>@<replace with atp instance name service name - high> \
        directory=data_export_dir \
        credential=NULL \
        dumpfile=<PRE_AUTHENTICATED_OBJECT_STORAGE_URL> \
        parallel=16 \
        ENCRYPTION_PASSWORD=\"<use the plaintext DEK generated in prerequisite step>\" \
        exclude=cluster,indextype,db_link

    Note:

    PRE_AUTHENTICATED_OBJECT_STORAGE_URL is the Seed Data PAR URL from the Data Export Status screen.
  5. Check the status of the import job and ensure that it completed successfully.
    The log file is available in the specified Object Storage bucket. You can download and review the log file to verify the import process.
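    One way to check the job status is to query the Data Pump job views from a SQL session on the target ATP instance. This is a sketch; a state of COMPLETED indicates the import finished, while EXECUTING means it is still running.

    Check import status

    SELECT owner_name, job_name, state
      FROM dba_datapump_jobs;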