
Oracle Database Utilities: Data Pump
Best Practices for Export and Import
WHITE PAPER / MARCH 6, 2019

This document describes best practices for Oracle Data Pump Export and Import.

DISCLAIMER
This document in any form, software or printed matter, contains proprietary information that is the exclusive property of Oracle. Your access to and use of this confidential material is subject to the terms and conditions of your Oracle software license and service agreement, which has been executed and with which you agree to comply. This document and information contained herein may not be disclosed, copied, reproduced or distributed to anyone outside Oracle without prior written consent of Oracle. This document is not part of your license agreement nor can it be incorporated into any contractual agreement with Oracle or its subsidiaries or affiliates.

This document is for informational purposes only and is intended solely to assist you in planning for the implementation and upgrade of the product features described. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described in this document remains at the sole discretion of Oracle. Due to the nature of the product architecture, it may not be possible to safely include all features described in this document without risking significant destabilization of the code.

TABLE OF CONTENTS

Introduction
Use a parameter file
Make a consistent Data Pump export
Always include these parameters during export
Always include these parameters during import
Use parallelism and collect statistics
Set resource utilization appropriately
Use a network link for servers on different OS and storage
Use SecureFile LOBs
Additional recommended practices for Autonomous Database
Conclusion

INTRODUCTION

There are multiple approaches for migrating on-premises Oracle databases to the Oracle Cloud. A common method is Oracle Data Pump, a feature of Oracle Database since release 10g and the successor to the original Export and Import (exp/imp) utilities of release 9i and earlier. Oracle Data Pump is useful for migrating data among schemas, between databases of different versions and on different operating systems, and from on-premises to Oracle Cloud. It should be considered for migrating an on-premises Oracle database to Oracle Cloud in the following circumstances:

- The source Oracle Database is release 10g or higher. (Older databases must use the original Export and Import utilities that came with the database.)
- Migrating data cross-endian.
- Migrating from a non-CDB Oracle database to an Oracle multitenant database.
- Migrating with changes to database structure.
- Combining a migration and a version upgrade.

Migrating data using Oracle Data Pump is a three-step process, as sketched below. Assuming your cloud instance and PDB are created:

1. Export the on-premises database using expdp.
2. Copy the dump files to the target system or to the Oracle Object Store, if required.
3. Import into the cloud PDB using impdp.
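To make the three steps concrete, here is a minimal sketch for a single-schema migration. The connect strings, directory object DP_DIR, schema name, host names, and paths are illustrative assumptions, not fixed names:

# Step 1: export on the source system (assumes directory object DP_DIR exists)
expdp system@source_db SCHEMAS=HR DIRECTORY=DP_DIR DUMPFILE=hr_%U.dmp LOGFILE=hr_exp.log

# Step 2: copy the dump files to the target host (any file-transfer tool works)
scp /u01/dp_dir/hr_*.dmp oracle@target-host:/u01/dp_dir/

# Step 3: import into the target PDB (assumes directory object DP_DIR exists there)
impdp system@target_pdb SCHEMAS=HR DIRECTORY=DP_DIR DUMPFILE=hr_%U.dmp LOGFILE=hr_imp.log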

As is often the case, the devil is in the details. This white paper describes some best practices for using Oracle Data Pump to help make the process go smoothly and successfully.

USE A PARAMETER FILE

A parameter file, also referred to as a "parfile", enables you to specify command-line parameters in a file for ease of reuse. It also helps you avoid the typographical errors that come from typing long ad hoc Data Pump commands on the command line, especially if you use parameters whose values require quotation marks. Here is an example of a parfile:

DIRECTORY=my_data_pump_dir
DUMPFILE=dumpfile.dmp
LOGFILE=logfile.log
SCHEMAS=HR
EXCLUDE=STATISTICS
LOGTIME=ALL
METRICS=YES
FLASHBACK_TIME=SYSTIMESTAMP

The command to execute the parfile looks like this:

expdp parfile=my_data_pump_parfile.par

MAKE A CONSISTENT DATA PUMP EXPORT

By default, Oracle Data Pump preserves consistency within a single database table. For example, if you export a table that has 1000 partitions, the exported table will be consistent as of the specific System Change Number (SCN) at which you started the export. When exporting multiple tables, the next table exported would then be consistent as of a different SCN. For any export of more than one table, you will probably want your export dump file to represent all of the objects as of the same SCN. This can be accomplished by using either FLASHBACK_SCN=scn or FLASHBACK_TIME=timestamp to enable the Flashback Query utility. A particularly convenient approach is to specify FLASHBACK_TIME=SYSTIMESTAMP.

Using FLASHBACK_SCN, the export operation is performed with data that is consistent up to the specified SCN. For example, this command assumes that an SCN value of 384632 exists. It exports the hr schema up to SCN 384632:

expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr_scn.dmp FLASHBACK_SCN=384632

Using FLASHBACK_TIME=timestamp, the export operation is performed with data that is consistent up to the SCN that most closely matches the specified time. For example, this setting makes the export consistent up to the SCN closest to the given timestamp:

FLASHBACK_TIME="TO_TIMESTAMP('27-10-2012 13:16:00', 'DD-MM-YYYY HH24:MI:SS')"
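If you prefer to pin an export to an exact SCN with FLASHBACK_SCN, one way to look up the current SCN on the source database (a minimal sketch, assuming you have query access to V$DATABASE) is:

SELECT CURRENT_SCN FROM V$DATABASE;

You can then pass the returned value to FLASHBACK_SCN in your parfile.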

When specifying FLASHBACK_TIME=SYSTIMESTAMP, the timestamp will be that of the current system time. Finally, you can still use the release 11.2 legacy interface, CONSISTENT=Y, which is translated directly to FLASHBACK_TIME=SYSTIMESTAMP.

NOTE: If you use FLASHBACK_TIME or FLASHBACK_SCN, Data Pump Export must retain UNDO records for the duration of the export. If the database has insufficient UNDO retention, the result will be a "snapshot too old" error as Data Pump attempts to access UNDO records that are no longer available to the database.

ALWAYS INCLUDE THESE PARAMETERS DURING EXPORT

Oracle recommends that you do not export statistics. Excluding statistics provides better performance on both export and import, even accounting for the need to gather statistics after the import. You can exclude statistics from an export operation using the EXCLUDE=STATISTICS parameter. Instead, follow the best practice of either creating fresh statistics on the target database or using a DBMS_STATS staging table for statistics.

Timestamp the messages that are displayed during an export operation using LOGTIME=ALL. This parameter is available beginning with Oracle Database release 12.1. Having timestamps on every line in the logfile helps when assessing export and import performance.

Record the number of objects processed and the elapsed time for each step of the job in the Oracle Data Pump log file using the parameter METRICS=YES. This gives an extra level of detail, such as the work performed by each process in a PARALLEL export or import.

ALWAYS INCLUDE THESE PARAMETERS DURING IMPORT

Timestamp the messages that are displayed during an import operation using LOGTIME=ALL. This parameter is available beginning with Oracle Database release 12.1.

Record the number of objects processed and the elapsed time for each step of the job in the Oracle Data Pump log file using the parameter METRICS=YES.

USE PARALLELISM AND COLLECT STATISTICS

Accomplish more work in less time using parallelism. A new Data Pump job consists of at least two background processes, a master and a worker, plus two sessions. The Data Pump PARALLEL parameter creates additional background processes and sessions during an export or import.

The PARALLEL=n parameter specifies the maximum number of processes of active execution operating on behalf of the export or import job. Typically, the value for n should be twice the number of CPU cores, but be prepared to adjust it if need be.

You should always use the %U or %L substitution variable in the DUMPFILE name when exporting with PARALLEL. Otherwise you end up with parallel writes to a single dump file, which can impact performance.

It is important to collect up-to-date statistics prior to an export operation. Beginning with release 12.2, for dump-file-based export and import only, metadata operations are performed in parallel. Concurrently with the metadata export, table sizes are estimated and ranked from largest to smallest for parallel export. The table sizes are estimated using statistics, so collect statistics using the DBMS_STATS package, with the GATHER_TABLE_STATS, GATHER_SCHEMA_STATS, or GATHER_DATABASE_STATS procedure. A parfile combining these recommendations follows this section.
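As a sketch that combines these recommendations (the schema name, directory object, and file names are illustrative), a parallel schema export parfile might look like this:

SCHEMAS=HR
DIRECTORY=my_data_pump_dir
DUMPFILE=hr_%U.dmp
LOGFILE=hr_exp.log
PARALLEL=8
EXCLUDE=STATISTICS
LOGTIME=ALL
METRICS=YES
FLASHBACK_TIME=SYSTIMESTAMP

With PARALLEL=8 and the %U substitution variable, up to eight processes can each write to their own dump file (hr_01.dmp, hr_02.dmp, and so on). Beforehand, statistics could be gathered with, for example:

EXEC DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'HR');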

Oracle Data Pump has imported package bodies in parallel for several releases. Beginning with release 12.2, Oracle Data Pump imports most metadata and most database objects in parallel. The PARALLEL parameter also determines how many indexes get created in parallel. Database metadata objects that have dependencies, such as types (due to inheritance), schemas, and procedural actions, are still imported serially.

If you are using Oracle Database release 11.2.0.4 or 12.1.0.2, you can apply the patch for bug 22273229 to enable parallel import of constraints and indexes.

SET RESOURCE UTILIZATION APPROPRIATELY

Set the initialization parameter STREAMS_POOL_SIZE to a reasonable value in the range of 64 MB to 256 MB. Oracle Data Pump uses Advanced Queuing (AQ) functionality to communicate between processes. If the SGA_TARGET initialization parameter is set, then the STREAMS_POOL_SIZE initialization parameter should be set to a reasonable minimum for database usage. See Oracle Database Reference for details on setting the STREAMS_POOL_SIZE parameter.

Set the maximum number of Data Pump jobs and the maximum parallelism for pluggable databases in a multitenant environment. If you encounter the error "ORA-00018: maximum number of sessions exceeded" or "ORA-00020: maximum number of processes (%s) exceeded", you may be allowing too many jobs or too much parallelism, respectively.

Beginning with release 19c, DBAs can restrict resource usage for Data Pump. You can set the MAX_DATAPUMP_JOBS_PER_PDB database parameter to restrict the number of jobs created, or allow the parameter to be set automatically to 50 percent of SESSIONS. This value must be the same for each RAC instance. It can be set and changed dynamically, and it is modifiable per PDB.

You can set the MAX_DATAPUMP_PARALLEL_PER_JOB database parameter to restrict the amount of parallelism in an individual Data Pump job, or allow the parameter to be set automatically to 25 percent of SESSIONS. This value can be different for each RAC instance. It can be set and changed dynamically, and it is modifiable per PDB.

USE A NETWORK LINK FOR SERVERS ON DIFFERENT OS AND STORAGE

You can start an import (impdp) from the target database over a database link. No dump file is generated, which can eliminate the need for temporary storage during a migration. Beginning with Oracle Database release 12.2, there is support for Direct Path Load over a database link, including for LONG and LONG RAW data, using the parameter ACCESS_METHOD=DIRECT_PATH.

USE SECUREFILE LOBS

It is recommended that you use SecureFile LOBs, especially in conjunction with partitioning. SecureFile LOBs are more scalable than the older BasicFile LOBs, and they allow parallel I/O into and out of tables with LOB columns. You can use the impdp parameter TRANSFORM=LOB_STORAGE:SECUREFILE to convert old LOBs to SecureFiles without having to pre-create tables with the newer LOB storage in the target database. A sketch combining these recommendations follows below.
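The following sketch ties these recommendations together; the parameter values, link name, and directory object are assumptions to adapt, not prescriptions. On a release 19c or later target, you might first set the resource controls:

ALTER SYSTEM SET STREAMS_POOL_SIZE=128M SCOPE=BOTH;
ALTER SYSTEM SET MAX_DATAPUMP_JOBS_PER_PDB=AUTO;

Then run a network-link import that loads directly from the source and converts BasicFile LOBs on the way in:

impdp system@target_pdb parfile=hr_netimp.par

where hr_netimp.par contains:

SCHEMAS=HR
NETWORK_LINK=source_db_link
ACCESS_METHOD=DIRECT_PATH
DIRECTORY=my_data_pump_dir
LOGFILE=hr_netimp.log
LOGTIME=ALL
METRICS=YES
TRANSFORM=LOB_STORAGE:SECUREFILE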

ADDITIONAL RECOMMENDED PRACTICES FOR AUTONOMOUS DATABASE

Access Method
The Data Pump Export ACCESS_METHOD parameter instructs Export to use a particular method to unload data. By default, it is set to AUTOMATIC. Beginning with release 12.2, LONG or LONG RAW types in the source database can be exported over database links using ACCESS_METHOD=DIRECT_PATH with NETWORK_LINK=dblink.

Importing into a non-partitioned table
If the source database has partitioned tables and you are migrating data into an Autonomous Data Warehouse database that does not use partitions, then use DATA_OPTIONS=GROUP_PARTITION_TABLE_DATA (see the sketch below). This allows Data Pump to use the parallel query engine to load data into the data warehouse more efficiently.

Use the AL32UTF8 database character set
Oracle recommends using AL32UTF8 as the database character set. It is the most common character set because, as the Unicode (UTF-8) character set, it is effectively a superset of all other character sets.
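A minimal sketch of an import parfile for such a migration follows. The schema name, parallelism, and credential are assumptions; on Autonomous Database, dump files are typically read from the Object Store by supplying a pre-created credential via the CREDENTIAL parameter and an object-store URL in DUMPFILE:

SCHEMAS=HR
CREDENTIAL=DEF_CRED_NAME
DUMPFILE=<object-store-URL-of-the-dump-files>
DATA_OPTIONS=GROUP_PARTITION_TABLE_DATA
PARALLEL=8
LOGTIME=ALL
METRICS=YES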

CONCLUSION

Oracle Data Pump is a mature, full-featured, and flexible tool for Oracle Database migration. As discussed in this white paper, each new release provides more options for optimizing performance and resource utilization. Following the best practices in this white paper will help your Data Pump exports and imports run as smoothly and quickly as possible.