8.1 Getting Started
The following sections provide information needed to get started with the
DATAPUMP command in SQLcl.
8.1.1 Prerequisites
To use DATAPUMP in SQLcl, you need to know the
following:
- When you are the logged-in user that is exporting or importing:
  - You must have the appropriate system privileges or adequate user space resource on the tablespace.
  - If you do not have the DBA or PDB_DBA role, you need access to the Oracle Directory for importing or exporting your own schema. For example:
    grant read, write on directory DATA_PUMP_DIR to dpumpnotdba;
- When using other schemas:
  - See Required Roles to learn about the required roles for data pump import and export operations.
  - You must have permissions for the tablespace you are importing into.
- If you are unable to use Data Pump, you can instead load a file as described in Loading a File.
- To connect to an Autonomous Database:
  - Download the wallet file for the Oracle Cloud connection. See Download Client Credentials.
  - For importing to Autonomous, see Cloud Premigration Advisor Tool (CPAT).
- To use object storage from Oracle Database:
  - See About the Cloud Storage Command Options and Setup and Prerequisites in the Oracle Cloud Infrastructure Documentation.
  - You must install an OCI profile configuration file.
  - The OCI profile must be set using OCI PROFILE <name> or CS OCI <profile>.
  - You can copy dump files to object storage as the last step of an export using the -copycloud option on the data pump export (see the sketch after this list).
  - You can copy dump files from object storage as the first step of an import using the -copycloud option on the data pump import.
  - You can copy dump files between cloud storage and Oracle Directories using the Cloud Storage command (put and get).
- To use object storage from Autonomous Database:
  - You must set the credential for object storage using the Cloud Storage, SET DATAPUMP, or DATAPUMP command.
  - You can export dump files directly to object storage without using a database directory.
  - You can import dump files directly from object storage without using a database directory.
  - You cannot write log files directly to object storage, but you can copy them to cloud storage using the Cloud Storage command.
- The Time Zone File version is shown during export. The dump file can only be imported into a database with the same (or later) time zone file version.
- Transparent and Dual encryption modes require Transparent Data Encryption (TDE). If TDE is not available, you can use only password encryption.
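The object storage steps above can be sketched in a few commands. The profile name, bucket URI, schema, and file names below are placeholders, so verify the exact forms with help cs and help datapump in your SQLcl release:

-- Select an OCI profile from the standard OCI configuration file
SQL> oci profile DEFAULT
-- Point the Cloud Storage command at a bucket (placeholder URI)
SQL> cs https://objectstorage.us-phoenix-1.oraclecloud.com/n/mynamespace/b/mybucket
-- Export a schema and copy the dump file to the bucket as the last step
SQL> dp export -schemas HR -dumpfile hr.dmp -copycloud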
8.1.2 Usage
The DATAPUMP command creates and submits data pump
jobs using the DBMS_DATAPUMP package.
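The command is invoked as DATAPUMP (abbreviated dp) with an export or import action followed by options. To see the full option list for your release, you can ask SQLcl itself:

SQL> help datapump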
- For importing to Autonomous, see Cloud Premigration Advisor Tool (CPAT).
- You can have different hardware, operating systems, character sets, time zones, and versions in the source and target databases.
- All object types and data types existing in Oracle Database release 11g and later versions are supported.
- You can transfer and filter data and metadata.
- You can transform schema names, tablespace names, and metadata at import time.
- You can unload and load data in an Oracle proprietary format.
- You can encrypt dump files to ensure your data is secure. Use transparent encryption if it is available on the database; otherwise, use password as the encryption mode (see the sketch after this list).
- You can filter the import or export process to limit the object types and objects included.
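For example, a password-encrypted export might look like the following. The -encryptionpassword option name is an assumption here; confirm it against help datapump in your release:

SQL> dp export -schemas HR -dumpfile hr_enc.dmp -encryptionpassword MySecret#1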
Export
Data Pump export is used to unload metadata and data into a dump file, which can be stored in an Oracle Directory or object storage.
- If a schema or list of schemas is not provided, the current schema is exported.
- If neither a directory nor object storage with a credential is provided, the dump file is stored in DATA_PUMP_DIR.
- If a job name is not provided, one is created as ESQL_<n>.
- If a dump file name is not provided, the dump file is stored as <jobname>.DMP.
- If a log file name is not provided, the log file is stored as <jobname>.LOG.
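For example, a bare export relies entirely on these defaults, while explicit options override them (the schema, file, and job names below are illustrative; option names are as listed by help datapump):

-- Current schema, job ESQL_<n>, ESQL_<n>.DMP and ESQL_<n>.LOG in DATA_PUMP_DIR
SQL> dp export
-- Explicit schema, dump file, and job name
SQL> dp export -schemas HR -dumpfile hr.dmp -jobname hr_exp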
Import
Data Pump import is used to load metadata and data from a previously exported dump file, which was stored in an Oracle Directory or object storage.
- If a schema or list of schemas is not provided, all the objects in the dump file are imported (FULL import by default). You can use a filter to limit the number of objects imported.
- If neither a database directory nor object storage with a credential is provided, the dump file is read from DATA_PUMP_DIR.
- If a job name is not provided, one is created as ISQL_<n>.
- If a dump file name is not provided, <jobname>.DMP is used.
- If a log file name is not provided, the log file is stored as <jobname>.LOG.
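A matching import might look like the following sketch: first a FULL import of everything in the dump file, then one limited to a single schema (names are illustrative; confirm options with help datapump):

-- FULL import of hr.dmp from DATA_PUMP_DIR
SQL> dp import -dumpfile hr.dmp
-- Limit the import to the HR schema
SQL> dp import -dumpfile hr.dmp -schemas HR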