
Saturday, 22 September 2012

Oracle 11g Data Pump Enhancements

A number of new Data Pump features are included in Oracle 11g, including the deprecation of the original EXP utility, compression of dump file sets, improved encryption, and data remapping. Other new features include table renaming during import, better control over partitioned table operations, the ability to overwrite existing dump files, the data_options parameter, and the transportable parameter. Some of these are discussed below.

Compression of Dump File Sets

Oracle 11g now supports compression of dump file sets:
expdp HRMS/*** DIRECTORY=datapump_dir DUMPFILE=ORCL3_COMP.dmp COMPRESSION=ALL
There are four options available for compression: ALL, METADATA_ONLY, DATA_ONLY, and NONE.

Data Pump Encryption Enhancements

Oracle 11g now provides the ability to encrypt Data Pump dump files. Encryption is part of a separately licensed Oracle product (the Advanced Security Option) that you must purchase; check with Oracle about current licensing requirements if and when you need this feature.
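As an illustrative sketch (the dump file name and password here are assumptions, not from the original post), a password-based encrypted export might look like this:

```shell
# Export with password-based encryption of both data and metadata.
# ENCRYPTION accepts ALL, DATA_ONLY, ENCRYPTED_COLUMNS_ONLY, METADATA_ONLY, NONE.
expdp hrms/**** DIRECTORY=datapump_dir DUMPFILE=orcl3_enc.dmp \
  ENCRYPTION=ALL ENCRYPTION_PASSWORD=mysecret \
  ENCRYPTION_ALGORITHM=AES128 ENCRYPTION_MODE=PASSWORD
```

With ENCRYPTION_MODE=PASSWORD the same password must be supplied again at import time; the TRANSPARENT and DUAL modes instead rely on the Oracle wallet.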

Data Pump Data Remapping (Obfuscation)

It is not uncommon to have a requirement that sensitive production data cannot be contained in a non-production environment. It is also a very common requirement that test environments be as production-like as possible. Oracle Data Pump helps in this regard by providing the ability to obfuscate (or remap, if you prefer) data during an export or import operation. Remapping of data in Data Pump is supported with the new remap_data parameter.
Remapping can occur during either the export or the import of the dump file. It probably is a better idea to do the remapping during the export for security reasons. That way you can be less concerned if the export file ends up being misused, since the dump file will not contain sensitive data.
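A minimal sketch of remapping during export (the package, function, and column names below are assumed for illustration): remap_data takes a column and a PL/SQL packaged function that transforms each value, and the function must accept and return the column's datatype.

```shell
# 1. First create a remapping function in the source database, e.g.:
#      CREATE OR REPLACE PACKAGE hide_data AS
#        FUNCTION mask_salary (p_in NUMBER) RETURN NUMBER;
#      END;
#    with a body that returns a scrambled value instead of the real one.
# 2. Then reference that function during the export:
expdp hrms/**** DIRECTORY=datapump_dir DUMPFILE=orcl3_masked.dmp \
  tables=hrms.pay_payment_master \
  remap_data=hrms.pay_payment_master.salary:hrms.hide_data.mask_salary
```

The dump file then contains only the masked values, so it can be handed to a test environment without exposing the real data.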

Data Pump Rename Table

In Oracle 11g you are now able to rename a table during the import process, using the remap_table parameter.
impdp HRMS/**** DIRECTORY=datapump_dir DUMPFILE=orcl3.dmp tables=hrms.pay_payment_master remap_table=hrms.pay_payment_master:copy_pay_payment_master

Data Pump and Partitioned Tables

In 11g you can now control how partitioned tables are handled during an Oracle Data Pump import by using the partition_options parameter:
  • DEPARTITION – Each partition is created as an individual, stand-alone table rather than a partition of a partitioned table.
  • MERGE – All partitions are merged into one unpartitioned table.
  • NONE – Tables are created exactly as they were at export time (the default).
impdp hrms/**** DIRECTORY=datapump_dir DUMPFILE=orcl3.dmp tables=hrms.pay_payment_master partition_options=merge

Overwrite Dump Files

When using expdp in Oracle 11g you can now overwrite any dump file that already exists by using the new reuse_dumpfiles parameter:
expdp hrms/**** DIRECTORY=datapump_dir DUMPFILE=orcl3.dmp tables=hrms.pay_payment_master reuse_dumpfiles=Y

Data Pump Data_Options Parameter

Previously, when loading records into a table with Oracle Data Pump Import, the entire load would fail because of a single duplicate record in the import file. Oracle 11g solves this problem with a new value for the data_options parameter of Data Pump Import: skip_constraint_errors. When this value is used, Data Pump Import ignores the rows that generate a constraint error (while still not loading them, of course) and continues to load the rows that do not generate an error.
C:\>impdp hrms/**** dumpfile=new_hrms.dmp directory=datapump_dir tables=pay_payment_master table_exists_action=append data_options=skip_constraint_errors
Restriction: Data Pump must use the external table access method when loading the data.

The Transportable Parameter

When the transportable parameter is used with expdp or impdp, only the metadata associated with specific tables, partitions, or subpartitions is extracted, rather than all metadata. You can then transport the associated data files as you normally would: copy the data files and the dump file set to the destination and plug them into the target database, using impdp as part of this process to import the metadata.
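For example, the table-level export step might look like this (a sketch; the dump file name is an assumption):

```shell
# Export only the metadata for the named table; its data stays in the
# tablespace's data files, which are copied to the destination separately.
expdp hrms/**** DIRECTORY=datapump_dir DUMPFILE=tts.dmp \
  tables=hrms.pay_payment_master transportable=always
```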
impdp hrms/**** DIRECTORY=datapump_dir DUMPFILE=orcl3.dmp tables=hrms.pay_payment_master remap_schema=hrms:hrms1
