BACKUP & RECOVERY: DATA PUMP

DOYENSYS

DATA PUMP

  • New Features of Data Pump
  • How Data Pump Accesses Data
  • Accessing Data Over a Network
  • What Happens During Execution of a Data Pump Job
  • Monitoring Job Status
  • File Allocation
  • Original Export & Import vs. Data Pump Export & Import


NEW FEATURES OF DATA PUMP

  • Data Pump takes the old export and import one step further: you have total control over a running job (you can stop it, pause it, check it, and restart it), as shown in the sketch after this list.

  • The new Data Pump export and import utilities have a look and feel very similar to the original export and import utilities, but they are completely different tools.

  • Dump files generated by the original export utility cannot be imported by the Data Pump import utility.

  • Data Pump can transfer large amounts of data very quickly, using parallel streams to achieve maximum throughput; it can be 15-45% faster than the older export/import utilities.
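A minimal sketch of this job control, assuming a running export job named SYS_EXPORT_SCHEMA_01 (the job name here is hypothetical; the real name appears in the export log and in dba_datapump_jobs):

  expdp scott/tiger ATTACH=SYS_EXPORT_SCHEMA_01   (attach to the running job)
  Export> STATUS                                  (show the current job status)
  Export> STOP_JOB=IMMEDIATE                      (pause the job; the master table is retained)
  Export> START_JOB                               (resume the stopped job after re-attaching)
  Export> KILL_JOB                                (abandon the job and drop the master table)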


ADVANTAGES OF DATA PUMP

  • Ability to estimate job times and sizes before running them (see the sketch after this list)
  • Ability to restart failed jobs: every Data Pump job creates a Master Table in which the entire record of the job is maintained. The Master Table is the directory to the job, so if a job is stopped for any reason, it can be restarted at a later point in time without losing any data.
  • Fine-grained object selection
  • Monitoring of running jobs
  • Direct loading of a database from a remote instance over the network
  • Remapping capabilities
  • Improved performance through parallel execution
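A hedged illustration of the estimate capability: with the ESTIMATE_ONLY parameter, expdp reports the estimated dump file size for a job without writing any data (the schema and directory names below are only examples):

  expdp scott/tiger SCHEMAS=SCOTT DIRECTORY=dump ESTIMATE_ONLY=y ESTIMATE=BLOCKS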


DATA PUMP USES

One can use Data Pump for the following:

  • Migrating databases
  • Copying databases
  • Transferring Oracle databases between different operating systems
  • Backing up important tables before you change them
  • Moving database objects from one tablespace to another
  • Transporting tablespaces between databases
  • Reorganizing fragmented table data
  • Extracting the DDL for tables and other objects such as stored procedures and packages


COMPONENTS OF DATA PUMP

Data Pump technology consists of three major components:

  • DBMS_DATAPUMP - the main engine that drives data and metadata loading and unloading (a brief sketch follows this list)
  • DBMS_METADATA - used to extract the appropriate object metadata
  • Command-line clients - expdp and impdp, the Data Pump equivalents of the original export and import utilities
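A minimal sketch of driving a schema export directly through the DBMS_DATAPUMP package, assuming a directory object named DUMP already exists and that the schema SCOTT and the job and file names are only examples:

  DECLARE
    h NUMBER;
  BEGIN
    -- open a schema-mode export job
    h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA', job_name => 'SCOTT_API_EXP');
    -- dump file and log file written through the DUMP directory object
    DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'scott_api.dmp', directory => 'DUMP');
    DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'scott_api.log', directory => 'DUMP',
                           filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    -- restrict the job to the SCOTT schema
    DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'SCHEMA_EXPR', value => 'IN (''SCOTT'')');
    DBMS_DATAPUMP.START_JOB(h);
    DBMS_DATAPUMP.DETACH(h);
  END;
  /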


DATA ACCESS METHODS

• Direct Path - Bypasses the database buffer cache and writes beyond the high water mark; when finished, it adjusts the high water mark. No undo is generated, redo logging can be switched off as well, and the impact on other users is minimal because the SGA is not used. Triggers on the tables must be disabled before use.

• External Tables - Uses the database buffer cache. Export behaves like a SELECT statement into a dump file; during import the rows are reconstructed as INSERT statements, so the whole process works like a normal SELECT or INSERT job. Both undo and redo are generated, and a normal COMMIT is used, just as for any other DML statement.


Oracle will use the external tables method in the following cases:

  • Clustered tables
  • Active triggers on the table
  • A single partition of a table with a global index
  • Referential integrity constraints
  • Domain indexes on LOB columns
  • Tables with fine-grained access control enabled in insert mode
  • Tables with BFILE or opaque type columns


DATA PUMP FILES

You will use three types of files when working with Data Pump; all of them are created on the server.

  • Dump files - hold the exported data and metadata
  • Log files - hold the resulting output of the Data Pump command
  • SQL files - contain the DDL statements describing the objects included in the job, but do not contain any data (see the sketch after this list)
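A hedged example of producing such a SQL file: the SQLFILE parameter of impdp writes the DDL that would be executed to a file instead of performing the import (directory and file names below are only examples):

  impdp scott/tiger DIRECTORY=dump DUMPFILE=scott.dmp SQLFILE=scott_ddl.sql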


RUNNING A DATA PUMP JOB

Create a directory at the OS level where all your exported .dmp files will be stored, then create a directory object pointing to it from the SQL prompt, for example: SQL> create directory dump as '/u01/backup/datapump/dump';. Only the user who created this directory object (or users it has been granted to) has the right to write dump files into it. If another user tries to access the directory, the Oracle engine returns errors stating that it is unable to open the log file and that the file operation is invalid. With the help of the dba_directories view we can check the owner, name, and path of each directory object. Finally, check that the oracle OS user has permission to write inside that directory. A sketch of these steps follows.
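A minimal sketch of the setup, assuming the OS path /u01/backup/datapump/dump is only an example and that the oracle OS user owns it:

  $ mkdir -p /u01/backup/datapump/dump

  -- as a privileged database user, create the directory object
  SQL> create directory dump as '/u01/backup/datapump/dump';

  -- confirm the owner, name, and path of the directory object
  SQL> select owner, directory_name, directory_path from dba_directories;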


GRANTING PRIVILEGES

The user SCOTT has been granted full database export/import privileges. Granting these privileges can be done only by a SYSDBA user, as sketched below.
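A hedged sketch of the grants involved, assuming the directory object from the previous slide is named DUMP; EXP_FULL_DATABASE and IMP_FULL_DATABASE are the standard full export/import roles:

  -- as SYSDBA
  SQL> grant exp_full_database, imp_full_database to scott;
  SQL> grant read, write on directory dump to scott;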


EXPORTING

  • Exporting a full database
    expdp scott/tiger FULL=y DIRECTORY=datapump DUMPFILE=full.dmp LOGFILE=full.log

  • Exporting a schema
    expdp scott/tiger SCHEMAS=SCOTT DIRECTORY=datapump DUMPFILE=scott.dmp LOGFILE=scott.log

  • Exporting a tablespace
    expdp scott/tiger TABLESPACES=USERS DIRECTORY=datapump DUMPFILE=TSusers.dmp LOGFILE=TSusers.log

  • Exporting a table
    expdp scott/tiger TABLES=SCOTT.DEPT,SCOTT.EMP DIRECTORY=datapump DUMPFILE=table.dmp LOGFILE=table.log

IMPORTING

  • Importing a full database
    impdp hr/hr FULL=y DIRECTORY=dump DUMPFILE=full.dmp LOGFILE=full.log

  • Importing a schema
    impdp hr/hr SCHEMAS=hr,oe DIRECTORY=dpump_dir1 DUMPFILE=expdat.dmp LOGFILE=schemas.log

  • Importing a tablespace
    impdp hr/hr TABLESPACES=tbs1,tbs2,tbs3,tbs4 DIRECTORY=dump DUMPFILE=tbs.dmp LOGFILE=tbs.log

  • Importing a table
    impdp scott/scott TABLES=DEPT,EMP DIRECTORY=dump DUMPFILE=table.dmp LOGFILE=table.log
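In addition to these file-based imports, impdp can load a database directly from a remote instance over a database link using the NETWORK_LINK parameter, with no dump file involved. A hedged sketch, where the link name remote_db is hypothetical:

  impdp hr/hr SCHEMAS=hr DIRECTORY=dump NETWORK_LINK=remote_db LOGFILE=net_imp.log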


MONITORING JOB STATUS

• Monitoring Data Pump jobs
  The dba_datapump_jobs view lets you see the owner_name, job_name, job_mode, state, degree, and related columns for any export or import job that is currently active. If no jobs are active, the query returns the message "no rows selected". A minimal query follows.
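A minimal sketch of that query:

  SQL> select owner_name, job_name, operation, job_mode, state, degree
       from dba_datapump_jobs;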




• Monitoring Data Pump sessions
  The dba_datapump_sessions view helps us find which sessions are attached to a job; for further information it can be joined to the v$session view. While a job is running, this view shows the owner executing the job, the job name, the session address (saddr), and the session type.
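A hedged sketch of that join (the column list is only an example):

  SQL> select s.sid, s.serial#, s.username, d.owner_name, d.job_name
       from v$session s, dba_datapump_sessions d
       where s.saddr = d.saddr;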




• Monitoring v$session_longops
  This view helps us see how well a Data Pump export or import is progressing. Essentially it gives us a progress monitor through the message column.
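A minimal progress query, assuming the standard v$session_longops columns (opname, sofar, totalwork, units, message):

  SQL> select opname, sofar, totalwork, units, message
       from v$session_longops
       where sofar <> totalwork;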


DATA PUMP VS. EXPORT / IMPORT

If you are familiar with the original Export (exp) and Import (imp) utilities, it is important to understand that many of the concepts behind them do not apply to Data Pump Export (expdp) and Data Pump Import (impdp). In particular:



Data Pump Export and Import operate on a group of files called a dump file set rather than on a single sequential dump file.



Data Pump Export and Import access files on the server rather than on the client. This results in improved performance. It also means that directory objects are required when you specify file locations.




Data Pump Export and Import use parallel execution rather than a single stream of execution, for improved performance. This means that the order of data within dump file sets is more variable.
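A hedged illustration of this parallelism (degree and file names are only examples); the %U substitution variable lets each parallel worker write to its own dump file:

  expdp scott/tiger FULL=y DIRECTORY=dump DUMPFILE=full_%U.dmp PARALLEL=4 LOGFILE=full_par.log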



Data Pump Export and Import represent metadata in the dump file set as XML documents rather than as DDL commands. This provides improved flexibility for transforming the metadata at import time.
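A hedged example of such import-time transformation (the schema and tablespace names are hypothetical): the REMAP_SCHEMA, REMAP_TABLESPACE, and TRANSFORM parameters rewrite the stored metadata as objects are re-created:

  impdp hr/hr DIRECTORY=dump DUMPFILE=scott.dmp REMAP_SCHEMA=scott:hr REMAP_TABLESPACE=users:example TRANSFORM=segment_attributes:n LOGFILE=remap.log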



Data Pump Export and Import are self-tuning utilities. Tuning parameters that were used in original Export and Import, such as BUFFER and RECORDLENGTH, are neither required nor supported by Data Pump Export and Import.



At import time there is no option to perform interim commits during the restoration of a partition. This was provided by the COMMIT parameter in original Import.




There is no option to merge extents when you re-create tables. In original Import, this was provided by the COMPRESS parameter. Instead, extents are reallocated according to storage parameters for the target table.



Sequential media, such as tapes and pipes, are not supported.



When you are importing data into an existing table using either APPEND or TRUNCATE, if any row violates an active constraint, the load is discontinued and no data is loaded. This is different from original Import, which logs any rows that are in violation and continues with the load.
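A hedged sketch of this behavior (table and file names are only examples); the TABLE_EXISTS_ACTION parameter selects APPEND or TRUNCATE for an existing table, and the load is discontinued with no data loaded if any row violates an active constraint:

  impdp scott/tiger TABLES=EMP DIRECTORY=dump DUMPFILE=table.dmp TABLE_EXISTS_ACTION=APPEND LOGFILE=emp_append.log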

THANK YOU

SANJAY BALA A.
DOYENSYS
