8.2 Data Pump Command Syntax and Arguments

You can invoke the Data Pump command using dp or datapump.

Syntax

dp help [examples|syntax] | export [<optional-argument>,...] | import [<optional-argument>,...]

To see the help description for data pump in SQLcl, type:

dp help

To view only the syntax, without the other details, type:

dp help syntax

To view only the examples in the help, type:

dp help examples
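
With no optional arguments, the command uses the defaults described in the tables below. For example, assuming those defaults, the following exports the current connection's schema to DATA_PUMP_DIR using generated dump and log file names:

dp export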

<optional-argument>: The following tables describe the possible optional arguments along with the default value for each of them.

Table 8-1 Optional Arguments

-credential, -c
  Credential for dump file access in Oracle Object Store.
  Default: As specified in the Cloud Storage command

-directory, -d
  Default database directory for reading and writing dump and log files.
  Default: DATA_PUMP_DIR

-dumpdirectory, -dd
  Database directory for the dump file.
  Default: -directory if specified, otherwise DATA_PUMP_DIR

-dumpfile, -f <file-name>[,...]
  Dump file name(s) when using a database directory. You can specify multiple files whether or not parallelism is enabled. The number of files specified must be at least as large as the degree of parallelism.
  Default: <jobname><n>.DMP

-dumpuri, -u [<uri>[,...] | <qualified-name>[,...]]
  <uri>: Complete URI for the Oracle Object Store file if a default is not set with the Cloud Storage command.
  <qualified-name>: Name of the object, optionally qualified by the namespace and the bucket. The qualified name concatenated to the URI specified with the Cloud Storage command must fully identify the object URI.
  A credential must be set for direct read/write access to Oracle Object Store from an Autonomous Database. For -copycloud between a database directory and Oracle Object Store, an OCI profile must be set.
  You can specify multiple URIs whether or not parallelism is enabled. The number of files specified should be at least as large as the degree of parallelism.
  Default: <jobname>.DMP (object name)

-logdirectory, -ld
  Database directory for the log file.
  Default: -directory if specified, otherwise DATA_PUMP_DIR

-logfile, -lf
  Log file name in the database directory.
  Default: <jobname><n>.LOG
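
For example, a schema export that writes named dump and log files to a specific database directory might look like the following (the schema name HR, the directory MY_DUMP_DIR, and the file names are illustrative):

dp export -schemas hr -directory MY_DUMP_DIR -dumpfile hr_exp.dmp -logfile hr_exp.log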

Table 8-2 Command Arguments

-noexec, -ne [TRUE | FALSE]
  TRUE: Validate and generate the PL/SQL, but do not execute it.
  Default: FALSE

-verbose, -ve [TRUE | FALSE]
  TRUE: Show additional diagnostic output.
  Default: FALSE
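
For example, to validate an export and review the generated PL/SQL without running the job, you might combine these arguments as follows (the schema name is illustrative):

dp export -schemas hr -noexec true -verbose true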

Table 8-3 Common Arguments

-copycloud, -cc [TRUE | FALSE]
  TRUE: Copy the dump file between the database directory and Oracle Object Store.
  For export, copy the dump file from the database directory to Oracle Object Store after the data pump job completes.
  For import, copy the dump file from Oracle Object Store to the database directory before the data pump job starts.
  Set the OCI profile using the OCI command or the CLOUDSTORAGE command. Set the cloud storage URI using the CLOUDSTORAGE command, the SET DATAPUMP command, or the DATAPUMP command.
  Default: FALSE

-encryptionpassword, -enp <password>
  For export, the dump files are encrypted using the password. For import, provide the same password that was used for the export. If the password is not specified, you are prompted for it.
  Default: None

-excludeexpr, -ex <object-type-expression>
  Specify an expression identifying an object type or set of object types to exclude from the job.
  Example: -excludeexpr "IN ('GRANT','INDEX','TRIGGER')"
  Default: None

-excludelist, -el <object-type>[,...]
  Specify a comma-separated list of object types to exclude from the job.
  Example: -excludelist GRANT,INDEX,TRIGGER
  Default: None

-includeexpr, -ix <object-type-expression>
  Specify an expression identifying an object type or set of object types to include in the job. Only matching object types and their dependents are included in the job. Use -excludelist or -excludeexpr to exclude dependent objects.
  Example: -includeexpr "IN ('TABLE','VIEW')"
  Default: None

-includelist, -il <object-type>[,...]
  Specify a comma-separated list of object types to include in the job. Only matching object types and their dependents are included in the job. Use -excludelist or -excludeexpr to exclude dependent objects.
  Example: -includelist TABLE,VIEW
  Default: None

-includemetadata, -im [TRUE | FALSE]
  TRUE: Include metadata in the job.
  Default: TRUE

-includerows, -ir [TRUE | FALSE]
  TRUE: Include data in the job.
  Default: TRUE

-jobname, -j
  Name for the data pump job. The job name is appended with a data pump generated number, unless it already ends with a number. <jobname><n> is used when submitting the data pump job and as the default name for dump and log files or object names.
  Default: ESQL_<n> or ISQL_<n>, where <n> is a data pump generated number

-nameexpr, -nx {<object-type>=<name-expression>}[;...]
  For the specified object type, provide an expression identifying a set of object names to include in the job.
  Example: -nameexpr TABLE="IN ('EMPLOYEES', 'DEPARTMENTS')";PROCEDURE="IN ('ADD_JOB_HISTORY','SECURE_DML')"
  Default: None

-namelist, -nl {<object-type>=<name>[,...]}[;...]
  For the specified object type, provide a comma-separated list of objects to include in the job.
  Example: -namelist TABLE=employees,departments;PROCEDURE=add_job_history,secure_dml
  Default: None

-parallel, -p <degree-integer>
  Adjusts the degree of parallelism within a job, allowing multiple processes to run simultaneously. Specify the same number of files as the degree, or some processes may remain idle.
  Default: 1

-schemas, -s <schema>[,...]
  The schema or list of schemas to process.
  Example: -schemas schema1,schema2
  Default: For export, the schema of the current connection. For import, FULL (all objects in the dump file are imported).

-version, -v {<nn.n> | COMPATIBLE | LATEST}
  <nn.n>: A specific database version, for example, 11.0.0. When exporting from Oracle Database 11g Release 2 (11.2.0.3) or later into Oracle Database 12c Release 1 (12.1), specify a value of 12 to allow all existing database features, components, and options to be exported. This applies to a multitenant container database (CDB) or a non-CDB.
  COMPATIBLE: Uses the metadata version from the database compatibility level and the compatibility release level for the feature.
  LATEST: The version of the metadata corresponds to the database version.
  Default: COMPATIBLE

-wait, -w [TRUE | FALSE]
  TRUE: Wait for the data pump job to finish and show summary results.
  FALSE: Submit the data pump job without waiting and without showing results.
  Default: TRUE
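
For example, an export that includes only tables and views from two schemas and waits for the job to finish might look like the following (schema and job names are illustrative):

dp export -schemas hr,oe -includelist TABLE,VIEW -jobname hr_oe_exp -wait true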

Table 8-4 Export Only Arguments

-compression, -cm {ALL | DATA_ONLY | METADATA_ONLY | NONE}
  Indicates whether to compress user data and metadata.
  ALL: Compress user data and metadata.
  DATA_ONLY: Compress only user data.
  METADATA_ONLY: Compress only metadata.
  NONE: Do not compress user data or metadata.
  Default: METADATA_ONLY

-estimate, -e {BLOCKS | STATISTICS}
  Specifies the method used to estimate the size of the tables before the job starts.
  BLOCKS: The estimate is calculated using the count of blocks allocated to the user tables.
  STATISTICS: The estimate is calculated using the statistics for each table. If no statistics are available for a table, BLOCKS is used.

-encryption, -en {ALL | DATA_ONLY | ENCRYPTED_COLUMNS_ONLY | METADATA_ONLY | NONE}
  Specifying any encryption option turns on encryption, unless -encryption NONE is specified.
  Default: NONE, or ALL if any other encryption option is specified

-encryptionalgorithm, -ena {AES128 | AES192 | AES256}
  Identifies the cryptographic algorithm to use.
  Default: AES128

-encryptionmode, -enm {DUAL | PASSWORD | TRANSPARENT}
  Identifies the type of security used for encryption and decryption.
  PASSWORD: Encrypts the dump files using the provided password.
  TRANSPARENT: Enables encryption if the Oracle Encryption Wallet is available.
  DUAL: Enables import using either the Oracle Encryption Wallet or the password. When using DUAL, -encryptionpassword must be specified.
  Default: TRANSPARENT, or PASSWORD if -encryptionpassword is specified

-filesize, -fs <n>{B | KB | MB | GB | TB}
  Limit on the size of each dump file.
  Default: 500 MB

-flashbackscn, -fb [TRUE | FALSE]
  TRUE: Use consistent database content based on the system change number (SCN) at the start time of execution.
  Default: FALSE

-reusefile, -r [TRUE | FALSE]
  TRUE: Replace existing dump file(s) with a new file.
  Default: TRUE
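
For example, a compressed export split across two dump files with a degree of parallelism of 2 might look like the following (the schema name, file names, file size limit, and degree of parallelism are illustrative):

dp export -schemas hr -compression ALL -filesize 1GB -parallel 2 -dumpfile hr_exp1.dmp,hr_exp2.dmp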

Table 8-5 Import Only Arguments

-columnencryption, -ce [TRUE | FALSE]
  TRUE: Include the column encryption clause on table metadata.
  FALSE: Omit the column encryption clause.
  Default: TRUE if supported by the database

-objectid, -oid [TRUE | FALSE]
  TRUE: Assign the exported OID.
  FALSE: Assign a new OID.
  Default: TRUE

-remapschemas, -rs {<oldSchema>=<newSchema>[,...]}
  Objects in the job that belong to <oldSchema> are moved to <newSchema>.
  Example: -remapschemas oldschema1=newschema1,oldschema2=newschema2
  Default: Not applicable

-remaptablespaces, -rt {<oldTablespace>=<newTablespace>[,...]}
  Storage segments in the job are relocated from <oldTablespace> to <newTablespace>.
  Example: -remaptablespaces oldtablespace1=newtablespace1,oldtablespace2=newtablespace2
  Default: Not applicable

-segmentattributes, -sa [TRUE | FALSE]
  TRUE: Include segment attributes clauses (physical attributes, storage attributes, tablespace, logging).
  Default: TRUE

-skipunusableindexes, -sui [TRUE | FALSE]
  TRUE: Rows are inserted into tables that have unusable indexes.
  Default: TRUE

-storage, -st [TRUE | FALSE]
  TRUE: Include storage clauses.
  Default: TRUE

-tableexists, -te {APPEND | REPLACE | SKIP | TRUNCATE}
  Action to take if the table exists during import.
  APPEND: New rows are added to the existing rows in the table.
  REPLACE: The existing table is dropped before the new table is created.
  SKIP: The preexisting table is left unchanged.
  TRUNCATE: Rows are removed from the preexisting table before rows from the import are inserted.
  Default: SKIP when -includemetadata is true; otherwise, APPEND

-tablecompression, -tc [TRUE | FALSE]
  TRUE: The table compression clause is included if supported.
  FALSE: The table uses the default compression for the tablespace.
  Default: TRUE
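
For example, an import that remaps a schema and replaces any existing tables might look like the following (the schema, directory, and file names are illustrative):

dp import -directory MY_DUMP_DIR -dumpfile hr_exp.dmp -remapschemas hr=hr_dev -tableexists REPLACE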