
Sunday, 27 May 2012

Datapump in Oracle 10g

Oracle 10g introduces Oracle Data Pump, which provides enhanced support for logical extraction of data. Its key features include:
  • The ability to estimate the size of the files that will result from the expdp operation, without actually having to write those files. 
  • The ability to suspend and resume an export job. 
  • The ability to attach or detach from a running export job. 
  • The ability to restart many failed jobs from the point of failure. 
  • The ability to control the number of threads that will be used during the export operation. 
  • The availability of external tables and direct path access method.
  • Support for network mode operations, which allow you to load from another database, through a database link, directly. 
  • Control over the version of the objects that are exported. This lets you export from a newer database in a format compatible with a lower release. Note, however, that the target cannot be earlier than Oracle Database 10g, since Data Pump itself did not exist before 10g.
  • Three methods of database extraction: metadata only, data only, or both metadata and data together.
Oracle 10g also provides an interface into Data Pump through the DBMS_DATAPUMP PL/SQL package. DBMS_DATAPUMP makes it possible to write custom code that interfaces with the Data Pump driver. Such code can import or export Data Pump data, suspend and resume Data Pump jobs, and monitor Data Pump jobs, among other things.
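As a minimal sketch of the package interface (assuming the DATAPUMP_DIR directory object created below; the job name PLSQL_FULL_EXPORT is just an example), a full export driven from PL/SQL might look like this:

```sql
DECLARE
  h         NUMBER;
  job_state VARCHAR2(30);
BEGIN
  -- Open a full-database export job
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT',
                          job_mode  => 'FULL',
                          job_name  => 'PLSQL_FULL_EXPORT');
  -- Attach a dump file in an existing directory object
  DBMS_DATAPUMP.ADD_FILE(handle    => h,
                         filename  => 'orcl3_plsql_%U.dat',
                         directory => 'DATAPUMP_DIR');
  DBMS_DATAPUMP.START_JOB(h);
  -- Block until the job finishes, then detach from it
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, job_state);
  DBMS_DATAPUMP.DETACH(h);
END;
/
```

Because the code holds a job handle, the same session could instead detach immediately and reattach later, which is how suspend/resume and monitoring are done programmatically.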
Pitfalls:
When you run a Data Pump job, a master table is created in the schema that you log in with. This table is normally cleaned up when the job ends, but if the job ends abnormally (e.g., the database crashes in the middle of the job), the table is left behind. You will find it in the schema you logged in with, named after the JOB_NAME parameter setting. Therefore, if you logged in using the HRMS account and your job_name was set to FULL_EXPORT, the leftover table will be HRMS.FULL_EXPORT.
This leads to another small issue: if a table with the same name as the JOB_NAME already exists in the schema you log in with, the job will fail because the table already exists.
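A simple way to find and clean up such leftovers (using the HRMS/FULL_EXPORT example above; drop the master table only after confirming the job is genuinely dead) is:

```sql
-- List Data Pump jobs, including NOT RUNNING ones left by a crash
SELECT owner_name, job_name, state FROM dba_datapump_jobs;

-- Dropping the orphaned master table removes the stale job entry
DROP TABLE hrms.full_export;
```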

Before using expdp, a directory object should be created in the database that you will be exporting from.
SQL> CREATE DIRECTORY datapump_dir AS 'd:\backup\datapump_dir';
Directory created.
SQL> GRANT READ, WRITE ON DIRECTORY datapump_dir TO hrms;
Grant succeeded.
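Directory objects live in the data dictionary, so you can confirm the mapping before running an export:

```sql
SELECT directory_name, directory_path
FROM   dba_directories
WHERE  directory_name = 'DATAPUMP_DIR';
```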
Full Database Export:
C:\>expdp hrms/**** dumpfile=datapump_dir:orcl3_%U.dat filesize=200m
nologfile=y job_name=orcl3_full_export full=y
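The suspend/resume and attach capabilities listed above can be exercised against this job while it exists. A sketch of the interactive session (the ATTACH parameter and the STOP_JOB/START_JOB/STATUS commands are standard expdp features):

```
C:\>expdp hrms/**** attach=orcl3_full_export

Export> STOP_JOB=IMMEDIATE

C:\>expdp hrms/**** attach=orcl3_full_export

Export> START_JOB
Export> STATUS
```

START_JOB resumes the job from the point at which it was stopped.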
Tablespace Export:
C:\>expdp hrms/**** dumpfile=datapump_dir:orcl3_maindbf%U.dat filesize=100m nologfile=y job_name=orcl3_maindbf_export tablespaces=MAINDBF
Data Only Export:
The example below exports only the data (no metadata) for the entire database. It also creates a log file.
C:\>expdp HRMS/**** dumpfile=datapump_dir:orcl3_%U_data.dat filesize=100m content=data_only job_name=Orcl3_data_extract full=y
logfile=datapump_log:orcl3_exp.log
Specific Object Export:
Create a parameter file named exp_obj.txt:
INCLUDE=FUNCTION
INCLUDE=PROCEDURE
INCLUDE=TABLE:"LIKE 'PAY_%'"
DUMPFILE=datapump_dir:orcl3_%U_objects.dat
NOLOGFILE=Y
JOB_NAME=specific_objects
FULL=y
After creating the above parameter file, use this command:
C:\>expdp HRMS/**** parfile=D:\backup\imp\exp_obj.txt
Note: The above export extracts all database functions and procedures, plus any table whose name starts with PAY_. Because unquoted object names are stored in uppercase in the dictionary, the LIKE pattern must be uppercase to match.
Estimate the Export Size:
It can be useful to know whether there is enough space before the actual export operation; as noted above, Data Pump provides this size-estimation capability.
C:\>expdp HRMS/**** full=y estimate_only=Y estimate=statistics nologfile=y
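ESTIMATE also accepts BLOCKS, which sizes the export from allocated block counts rather than optimizer statistics; a variant of the command above:

```
C:\>expdp HRMS/**** full=y estimate_only=y estimate=blocks nologfile=y
```

BLOCKS tends to overestimate (it counts allocated, not used, space), while STATISTICS is only as accurate as the table statistics are current.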

For Data Pump import, the examples below mirror the export examples above.
Full Database Import:
C:\>impdp HRMS/**** dumpfile=datapump_dir:orcl3_%U.dat nologfile=y job_name=orcl3_full_import full=y
Tablespace Import:
C:\>impdp HRMS/**** dumpfile=datapump_dir:orcl3_maindbf%U.dat
nologfile=y job_name=orcl3_maindbf_import tablespaces=MAINDBF
table_exists_action=truncate
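impdp can also retarget objects on the way in with the standard REMAP_SCHEMA and REMAP_TABLESPACE parameters. A sketch (the HRMS_TEST schema and USERS tablespace here are hypothetical targets):

```
C:\>impdp HRMS/**** dumpfile=datapump_dir:orcl3_maindbf%U.dat nologfile=y
remap_schema=hrms:hrms_test remap_tablespace=MAINDBF:USERS
```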
Specific Table Import:
C:\>impdp HRMS/**** dumpfile=datapump_dir:orcl3_%U_objects.dat
content=data_only job_name=specific_table_import
logfile=datapump_log:orcl3_imp.log tables=hrms.pay_payment_master
Specific Object Import:
Create a parameter file named imp_obj.txt:
INCLUDE=FUNCTION
INCLUDE=PROCEDURE
INCLUDE=TABLE:"LIKE 'PAY_%'"
DUMPFILE=datapump_dir:orcl3_%U_objects.dat
NOLOGFILE=Y
JOB_NAME=specific_objects
FULL=y
After creating the above parameter file, use the following command:
C:\>impdp HRMS/**** parfile=D:\backup\exp\imp_obj.txt
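The network-mode feature listed at the top can be sketched as follows. It assumes a database link (named SRC_DB here, a hypothetical name) pointing at the source database; in this mode the data is pulled directly over the link and no dump file is written:

```
C:\>impdp HRMS/**** network_link=SRC_DB schemas=hrms nologfile=y
```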
Note: Oracle recommends that you not log in using sysdba privileges when using the Data Pump utilities, because the sysdba login grants special privileges that you will generally not need with Data Pump.
Related Views:
DBA_DATAPUMP_JOBS, DBA_DATAPUMP_SESSIONS, DATAPUMP_PATHS
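A common monitoring query joins the sessions view to V$SESSION through the session address, so you can see which database sessions belong to which Data Pump job:

```sql
SELECT s.sid, s.serial#, d.job_name
FROM   dba_datapump_sessions d, v$session s
WHERE  s.saddr = d.saddr;
```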
