Wednesday, 14 January 2015

Export Import

What Are the Export and Import Utilities?

The Export and Import utilities provide a simple way for you to transfer data objects between Oracle databases, even if they reside on platforms with different hardware and software configurations.

Oracle Data Pump (expdp and impdp) in Oracle Database 10g

Oracle Data Pump is a newer, faster and more flexible alternative to the "exp" and "imp" utilities used in previous Oracle versions. In addition to basic import and export functionality, Data Pump provides a PL/SQL API and support for external tables. The available parameters can be listed with:
expdp help=y

Getting Started

For the examples to work we must first unlock the SCOTT account and create a directory object it can access. The directory object is only a pointer to a physical directory; creating it does not actually create the physical directory on the file system of the database server.
CONN / AS SYSDBA
ALTER USER scott IDENTIFIED BY tiger ACCOUNT UNLOCK;

CREATE OR REPLACE DIRECTORY test_dir AS '/u01/app/oracle/oradata/';
GRANT READ, WRITE ON DIRECTORY test_dir TO scott;
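Since the CREATE DIRECTORY statement does not create the physical directory, make sure the OS path exists and is writable by the oracle OS user first. A minimal sketch (a scratch path stands in for the real /u01/app/oracle/oradata/):

```shell
# Scratch path used for illustration; on a real server this would be
# /u01/app/oracle/oradata/ from the CREATE DIRECTORY statement above.
DP_PATH=./oradata_demo
mkdir -p "$DP_PATH"
# The database server process (the oracle OS user) needs read/write access.
chmod 775 "$DP_PATH"
ls -ld "$DP_PATH"
```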

Table Exports/Imports

The TABLES parameter is used to specify the tables that are to be exported. The following is an example of the table export and import syntax.
expdp scott/tiger@db10g tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=expdpEMP_DEPT.log
 
impdp scott/tiger@db10g tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=impdpEMP_DEPT.log
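The same options can also be kept in a parameter file and passed with the PARFILE parameter, which avoids shell quoting issues. A sketch (the file name is illustrative):

```
# emp_dept_exp.par -- used as: expdp scott/tiger@db10g parfile=emp_dept_exp.par
tables=EMP,DEPT
directory=TEST_DIR
dumpfile=EMP_DEPT.dmp
logfile=expdpEMP_DEPT.log
```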

Schema Exports/Imports

The OWNER parameter of exp has been replaced by the SCHEMAS parameter which is used to specify the schemas to be exported. The following is an example of the schema export and import syntax.
expdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log

impdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp logfile=impdpSCOTT.log
SQL> select count(1) from dba_objects where owner='SCOTT';

  COUNT(1)
----------
       766

Database Exports/Imports

The FULL parameter indicates that a complete database export is required. The following is an example of the full database export and import syntax.
expdp system/password@db10g full=Y directory=TEST_DIR dumpfile=DB10G.dmp logfile=expdpDB10G.log

impdp system/password@db10g full=Y directory=TEST_DIR dumpfile=DB10G.dmp logfile=impdpDB10G.log
SQL> select owner,DIRECTORY_PATH,DIRECTORY_NAME from dba_directories where DIRECTORY_NAME = 'DATA_PUMP_DIR';

OWNER   DIRECTORY_NAME          DIRECTORY_PATH
--------------------------------------------------------------------------------
SYS     DATA_PUMP_DIR          /u01/oracle/database/dpdump/


Exp and Imp Commands without data
===========================
exp system/password OWNER=PUB,ORD,XXFM FILE=NoData.DMP ROWS=N STATISTICS=NONE LOG=NoData.log


imp system/password FILE=NoData.DMP LOG=NoData.log IGNORE=Y FROMUSER=pub,ord,xxfm TOUSER=pub,ord,xxfm

exp system/password TABLES=XXHL.XHL_INT_ORDREPROC_TEMP file=XXH.dmp LOG=tab.log CONSISTENT=Y

If you use the option STATISTICS=NONE, statistics are not exported or imported, which makes the job faster and produces a smaller dump file. You can recalculate the statistics after the import using the DBMS_STATS package.
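The statistics can be regathered after the import with the DBMS_STATS package; a minimal sketch, using the SCOTT schema from the earlier examples:

```sql
-- Regather optimizer statistics for the imported schema.
-- CASCADE => TRUE also gathers statistics on the schema's indexes.
EXEC DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'SCOTT', cascade => TRUE);
```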

Export one table
============
expdp system/password tables=SCOTT.DS_ADDRESS_A_TEMP directory=DATA_PUMP_DIR dumpfile=TAB.dmp logfile=TAB.log

To Import One Table
===============
impdp system/password directory=DATA_PUMP_DIR dumpfile=TAB.dmp logfile=TAB.log tables=SCOTT.DS_ADDRESS_A_TEMP table_exists_action=append

How to run export import in background
=============================
$ cat exp.sh
expdp system/****** directory=exp_ccprd dumpfile=xdrm_%U.dmp logfile=xdrm_15Sep2014.log schemas=XDRM PARALLEL=8 cluster=n

nohup sh exp.sh>exp.out &
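The same pattern works for any long-running job; a self-contained sketch with a placeholder script (the real exp.sh would contain the expdp command above):

```shell
# Placeholder standing in for exp.sh; the real script runs expdp.
cat > exp_demo.sh <<'EOF'
echo "export started"
sleep 1
echo "export finished"
EOF

# nohup keeps the job alive after logout; all output is captured in exp.out.
nohup sh exp_demo.sh > exp.out 2>&1 &
echo "background PID: $!"
wait    # in real use you would log out here and check exp.out later
cat exp.out
```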

For impdp
=========
Take the user creation script from TOAD, then drop the user with DROP USER ... CASCADE before importing.

$ cat impdp.sh
impdp system/***** directory=DATA_PUMP_DIR dumpfile=xdrm_%U.dmp logfile=xdrm_17sep2014.log schemas=XDRM table_exists_action=append

Apps tables exp imp
================
 exp apps/pwd tables=XHLHYPTPE_INV_QTY_AUG14,XHLHYPTPE_DD_AUG14_20OCT14 file=TPE_data_new.dmp log=TPE_DATA_NEW.log STATISTICS=NONE 
imp apps/pwd tables=XHLHYPTPE_INV_QTY_AUG14,XHLHYPTPE_DD_AUG14_20OCT14 file=TPE_data_new.dmp log=TPE_DATA_NEW.log fromuser=apps touser=apps statistics=none

Export from a user and import to b user.
===============================
expdp system/pwd TABLES=CORE.WC_SNOW_CHANGE_REQUEST_D,CORE.WC_SNOW_INCIDENT_D directory=DATA_PUMP_DIR  dumpfile=export13112014.dmp logfile=export13112014.log

impdp system/pwd directory=DATA_PUMP_DIR dumpfile=export13112014.dmp logfile=export13112014.log remap_schema=core:odi_stage table_exists_action=append

Schedule IMP/EXP in cron job
======================
crontab -l
25 0-9 * * * /home/oratest/proj_automate/scripts/cron_export_fcr.sh password
# before shutting down TEST database. This file
# will be imported next time to preserve schedule
#++++++++++++++++++++++++++++++++++++++++++++++++

OUTPUT_DIR=/home/oraTEST/proj_automate/export_dump
LOG_DIR=/home/oraTEST/proj_automate/logs
LOG_NAME="cron_exp_fcr.log"
datum=`date +%m%d%Y%H%M`

if test -f ${LOG_DIR}/${LOG_NAME};then
 rm ${LOG_DIR}/${LOG_NAME}
 touch ${LOG_DIR}/${LOG_NAME}
fi

if test $# -ne 1;then
 echo "Usage: export_fcr.sh <apps_pwd>" > ${LOG_DIR}/${LOG_NAME}
 exit 1
fi

# Move FCR and FCRC dumpfiles in case we need them to restore in future
if test -f ${OUTPUT_DIR}/cron_fcr.dmp;then
  cp ${OUTPUT_DIR}/cron_fcr.dmp ${LOG_DIR}/cron_fcr.dmp.${datum}
fi

if test -f ${OUTPUT_DIR}/cron_fcrc.dmp;then
  cp ${OUTPUT_DIR}/cron_fcrc.dmp ${LOG_DIR}/cron_fcrc.dmp.${datum}
fi

# Source Oracle environment

. /u01/ora/database/11.2.0/TEST/TESTA_serverdb01.env
echo $ORACLE_HOME >> ${LOG_DIR}/${LOG_NAME}
echo $ORACLE_SID >> ${LOG_DIR}/${LOG_NAME}

echo "Start export of fnd_concurrent_requests" >> ${LOG_DIR}/${LOG_NAME}

exp USERID=applsys/$1 tables=FND_CONCURRENT_REQUESTS FILE=${OUTPUT_DIR}/cron_fcr.dmp LOG=${LOG_DIR}/cron_fcr.log ROWS=Y STATISTICS=none
exp USERID=applsys/$1 tables=FND_CONC_RELEASE_CLASSES FILE=${OUTPUT_DIR}/cron_fcrc.dmp LOG=${LOG_DIR}/cron_fcrc.log ROWS=Y STATISTICS=none

STATUS=0
for i in ${LOG_DIR}/cron_fcr.log ${LOG_DIR}/cron_fcrc.log
do
 if test `grep -c "Export terminated successfully without warnings" $i` -gt 0
 then
   echo "Export of $i is successful" >> ${LOG_DIR}/${LOG_NAME}
 else
   echo "Export of $i indicated warnings" >> ${LOG_DIR}/${LOG_NAME}
   STATUS=1
 fi
done

echo "End of export" >> ${LOG_DIR}/${LOG_NAME}
exit $STATUS


One option is to create a database link and copy the tables directly. Alternatively, export from Oracle 11g with a downgraded VERSION so the dump can be imported into 10g:

 SQL> create directory my_dir as '/u01/app/dpdump';
Directory created.

 SQL> grant read, write on directory my_dir to scott;
Grant succeeded.

 SQL> conn scott/tiger
Connected.

 SQL> create table test_scott (id number, name varchar2(10));
 Table created.

 SQL> insert into test_scott values(1, 'zen');
1 row created.

 SQL> commit;
Commit complete.

expdp scott/tiger@test directory=my_dir dumpfile=test_scott.dmp tables=test_scott version=10.2

Imported to Oracle 10g
==================
SQL> create directory my_dir as '/home/oracle/dpdump';
Directory created.
SQL> grant read, write on directory my_dir to scott;
Grant succeeded.

impdp scott/tiger@test directory=my_dir dumpfile=test_scott.dmp tables=test_scott

Export more than 1 table in a dump and import one table from same dump
========================================================
expdp system/password tables=SCOTT.TABLE1,JOHN.TABLE10,CHRIS.TABLE9 directory=EXP_DUMP dumpfile=USERS001.dmp logfile=DMPLOGFILE.log

We can import one or more tables at a time.
=================================
impdp system/password tables=SCOTT.TABLE1 directory=DATA_PUMP_DIR dumpfile=USERS001.dmp logfile=USERS001.log table_exists_action=append

Moving log files
================
mv /u01/ora/expfiles/expdp_fulldb_backup/expdp*.log /u01/ora/expfiles/logs/expdp_fulldb_backup_logs
###############################################################################
#Granting permissions for Dump files
====================================
chmod -R 775 /u01/ora/expfiles/expdp_fulldb_backup/*
#################################################################################
#Compressing files
==================
gzip /u01/ora/expfiles/expdp_fulldb_backup/*.*
#################################################################################
#Removing 7 days old dump files
===============================
find /u01/ora/expfiles/expdp_fulldb_backup -mtime +6 -exec rm {} \;
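Before wiring -exec rm into cron, the age filter can be verified with -print; a self-contained sketch using a scratch directory and a backdated file (GNU touch -d assumed):

```shell
# Demonstrate the age-based cleanup on a scratch directory.
mkdir -p dump_demo
touch dump_demo/new.dmp
touch -d "10 days ago" dump_demo/old.dmp   # backdate to simulate an old dump

# -mtime +6 matches files last modified more than 6 days ago.
find dump_demo -name '*.dmp' -mtime +6 -print
find dump_demo -name '*.dmp' -mtime +6 -exec rm {} \;
ls dump_demo
```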

remap_tables
==========
impdp system/password directory=DIR_NAME dumpfile=test.dmp  tables=scott.emp remap_table=scott.emp:emp_new logfile=import.log

References
=========
Oracle documentation and Google.




