
Oracle advanced (VI) Oracle expdp/impdp details

2022-07-07 21:57:00 InfoQ

1. Oracle Data Pump

Oracle 10g introduced a new import/export tool: Data Pump.

Oracle's official description reads:

Oracle Data Pump technology enables very high-speed movement of data and metadata from one database to another.

The key phrase is "very high-speed".

Let's start with the main features the Data Pump provides (including, but not limited to):

  • Parallel processing of import and export jobs;
  • Pausing and restarting import and export jobs;
  • Exporting or importing objects in a remote database through a Database Link;
  • Automatically changing an object's owner, data file, or tablespace during import via the Remap_schema, Remap_datafile, and Remap_tablespace parameters;
  • Very fine-grained object control on import/export: with the Include and Exclude parameters you can specify whether even a single object is included or not.

2. What Is a Directory Object

A Directory object is a feature introduced in Oracle 10g. It is a pointer to a path in the operating system. Every Directory carries two privileges, Read and Write, which can be granted to a specified user or role with the GRANT command. A user with read/write privileges can then read and write files under the operating-system path the Directory object points to.

  • Except when the network_link parameter is used, the files generated by expdp are always written on the server (at the location the Directory specifies).
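As a minimal sketch (the directory name and OS path here are invented for illustration), a DBA might create and grant such a Directory like this:

```sql
-- Create a Directory object pointing at an OS path (the path itself
-- must already exist on the database server)
CREATE OR REPLACE DIRECTORY dump_dir AS '/u01/app/oracle/dpdump';

-- Grant read/write on it to the user who will run expdp/impdp
GRANT READ, WRITE ON DIRECTORY dump_dir TO scott;
```

The Directory is then referenced by name in the DIRECTORY parameter of expdp/impdp.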

2.1 How to Call It

  • Command-line mode is the simplest invocation, but the parameters you can write there are limited; using a parameter file is recommended.
  • A parameter file is the most common method. You usually write a parameter file first, specifying all the parameters the export needs, and then invoke it.

expdp user/pwd parfile=xxx.par

Here xxx.par is the parameter file you edited. Note that other parameters may still follow on the command line — even parameters already specified in the par file.

If a parameter given on the command line duplicates one in the parameter file, the one that appears last wins. For example, with
expdp user/pwd parfile=xxx.par logfile=a.log
the logfile value on the command line takes effect; with
expdp user/pwd logfile=a.log parfile=xxx.par
the value in the parameter file wins, because parfile=xxx.par is written last on the command line.
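As a minimal sketch (directory, file names, and schema are placeholders), a typical parameter file and its invocation might look like this:

```text
# exp_scott.par -- hypothetical export parameter file
DIRECTORY = dump_dir
DUMPFILE  = exp_scott.dmp
LOGFILE   = exp_scott.log
SCHEMAS   = scott
```

which would be called with: expdp scott/tiger parfile=exp_scott.par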

  • Interactive mode. Data Pump import and export jobs support state operations such as stop and restart. For example, if a user interrupts a running import or export halfway through with Ctrl+C (or it is interrupted for some other reason), the job is not cancelled but moved to the background. You can then run expdp/impdp again with the attach parameter to reconnect to the interrupted job and choose what to do next. This is interactive mode.

2.2 What Is the attach Parameter

Every time you run an import or export, the first line of output contains something like:

Starting "BAM"."SYS_EXPORT_SCHEMA_01": bam/******** parfile=expdp_tbs.par

Here SYS_EXPORT_SCHEMA_01 is the job name — the value to pass to attach.
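Reconnecting might look like the following sketch (the job name is taken from the output line above; the interactive commands listed are the standard Data Pump ones, not commands shown in this article):

```shell
# Reattach to the interrupted job
expdp bam/password attach=SYS_EXPORT_SCHEMA_01

# At the interactive Export> prompt you can then issue commands such as:
#   STATUS      -- show job progress
#   STOP_JOB    -- stop the job (it can be restarted later)
#   START_JOB   -- resume a stopped job
#   KILL_JOB    -- cancel and remove the job
```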

3. Operation Modes

  • Full-database mode: imports or exports the entire database. Corresponds to the full parameter of impdp/expdp. Only users with the DBA role or the exp_full_database / imp_full_database privileges can run it.
  • Schema mode: exports or imports the objects owned by a schema. Corresponds to the Schemas parameter of impdp/expdp, and is the default operation mode. When run by a user with the DBA role or the exp_full_database / imp_full_database privileges, objects from multiple schemas can be exported or imported.
  • Table mode: exports the specified tables or table partitions (if any) together with the objects that depend on them (such as the tables' indexes and constraints — provided those objects are in the same schema, or the executing user has the corresponding privileges). Corresponds to the Tables parameter of impdp/expdp.
  • Tablespace mode: exports the contents of the specified tablespaces. Corresponds to the Tablespaces parameter of impdp/expdp. This mode complements table mode and schema mode.
  • Transport-tablespace mode: corresponds to the Transport_tablespaces parameter of impdp/expdp. The most significant difference from the previous modes is that the generated dump file contains no actual row data; only the metadata of the related objects is exported (that is, the object definitions — think of a table's CREATE statement). The data itself stays in the tablespace's data files, so the export requires copying both the metadata dump and the data files to the target server. This export method is very efficient; the time cost is mainly the I/O of copying the data files. To run a transport-tablespace export, the user must have the exp_full_database role or the DBA role; to import in this mode, the user must have the imp_full_database role or the DBA role.
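The five modes map onto the command line roughly as follows (a sketch — connection strings, directory, and object names are placeholders):

```shell
expdp system/pwd  full=y                       directory=dump_dir dumpfile=full.dmp   # full-database mode
expdp scott/tiger schemas=scott                directory=dump_dir dumpfile=scott.dmp  # schema mode (default)
expdp scott/tiger tables=emp,dept              directory=dump_dir dumpfile=tab.dmp    # table mode
expdp system/pwd  tablespaces=users            directory=dump_dir dumpfile=ts.dmp     # tablespace mode
expdp system/pwd  transport_tablespaces=users  directory=dump_dir dumpfile=tts.dmp    # transport-tablespace mode
```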
  • Filtering data: data filtering relies mainly on the Query and Sample parameters, of which Sample applies only to expdp exports.
  • Query: similar in function to the Query parameter of the old exp command, but in expdp it has been enhanced with finer-grained control. Query in expdp likewise specifies a where-style clause to restrict which records are exported. The syntax is as follows:

Query = [Schema.][Table_name:] Query_clause

By default, if you do not specify Schema.table_name, the Query_clause applies to every exported table; alternatively, you can specify a different Query_clause per table. For example, to export all records with id<5 from table A and all records with name='a' from table B, the Query parameter should look like this:

Query=A:"Where id<5",B:"Where name='a'"

If no schema name or table name is given before the Where condition, it applies by default to all tables being exported. For example:
Query=Where id<5

Note: it is recommended to put the Query parameter in the parameter file, to avoid the trouble of shell escaping.

  • Sample: this parameter specifies the percentage of data to export. The allowed range of values is 0.000001 to 99.999999. The syntax is as follows:

Sample=[[Schema_name.]Table_name:]sample_percent

Once this parameter is specified, expdp automatically limits how many records are exported. For example, to export 50% of the records in table A, set the Sample parameter as follows: Sample=A:50

Note: the value given for sample_percent is only a reference; expdp computes an approximation based on the amount of data.
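Following the recommendation above, a row filter is easiest to keep in a parameter file, where the quotes need no shell escaping. A sketch (all names invented for illustration):

```text
# exp_query.par -- hypothetical parameter file with a per-table Query
DIRECTORY = dump_dir
DUMPFILE  = exp_query.dmp
LOGFILE   = exp_query.log
SCHEMAS   = scott
QUERY     = emp:"WHERE sal > 1000"
```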

4. Filtering Objects

Object filtering relies mainly on the Include and Exclude parameters, which work in exactly opposite ways. With either parameter you can specify any object type you know (such as Package, Procedure, Table, and so on) or an object name (wildcards are supported).

  • Exclude — the negative rule

Specifies the object types or object names to leave out. Once this parameter is given, no object of the specified type is imported or exported. If an excluded object has dependent objects, those dependents are not imported or exported either. For example, if the Exclude parameter says not to export table objects, then not only are the tables themselves skipped, but the indexes, check constraints, and other objects attached to those tables are skipped as well.
Note: it is recommended to put the Exclude parameter in the parameter file, to avoid the trouble of shell escaping.

  • Include — the positive rule

Exactly the opposite of Exclude: specifies the object types or object names to include.

Note: because the two parameters do exactly opposite things, they cannot both be used in the same import or export command; otherwise Oracle would not know what you want.

exclude/include parameter usage:

EXCLUDE=[object_type]:[name_clause],[object_type]:[name_clause] -- exclude specific objects
INCLUDE=[object_type]:[name_clause],[object_type]:[name_clause] -- include specific objects

The object_type clause specifies the type of object, such as table, sequence, view, procedure, package, and so on.

The name_clause is a SQL expression used to filter on object names. It consists of a SQL operator and an object name (wildcards allowed) and selects specific objects within the given object type.

When no name_clause is given and only an object_type is specified, all objects of that type are included or excluded. Multiple [object_type]:[name_clause] pairs are separated by commas. Examples:

expdp <other_parameters> SCHEMAS=scott EXCLUDE=SEQUENCE,TABLE:"IN ('EMP','DEPT')"
impdp <other_parameters> SCHEMAS=scott INCLUDE=PACKAGE,FUNCTION,PROCEDURE,TABLE:"='EMP'"

Common filtering SQL expressions:

EXCLUDE=SEQUENCE,VIEW                            -- exclude all sequences and views
EXCLUDE=TABLE:"IN ('EMP','DEPT')"                -- exclude the tables EMP and DEPT
EXCLUDE=SEQUENCE,VIEW,TABLE:"IN ('EMP','DEPT')"  -- exclude all sequences, views, and the tables EMP and DEPT
EXCLUDE=INDEX:"= 'INDX_NAME'"                    -- exclude the specific index INDX_NAME
INCLUDE=PROCEDURE:"LIKE 'PROC_U%'"               -- include all stored procedures starting with PROC_U (% matches any character sequence; _ matches a single character)
INCLUDE=TABLE:"> 'E'"                            -- include all tables whose names sort after 'E'

The filter operators can be placed directly in the parameter file, as in the following example:

Parameter file: exp_scott.par
DIRECTORY = dump_scott
DUMPFILE  = exp_scott_%U.dmp
LOGFILE   = exp_scott.log
SCHEMAS   = scott
PARALLEL  = 2
EXCLUDE   = TABLE:"IN ('EMP', 'DEPT')"

Handling escape characters on the command line

Windows platform:

D:\> expdp system/manager DIRECTORY=my_dir DUMPFILE=exp_tab.dmp LOGFILE=exp_tab.log SCHEMAS=scott
INCLUDE=TABLE:\"IN ('EMP', 'DEPT')\"

On the Windows platform, the double quotes around the object clause must be escaped with the escape character \.

Unix platform:

When no parfile is used, every special symbol needs to be escaped, including parentheses, double quotes, and single quotes:

expdp system/manager DIRECTORY=my_dir DUMPFILE=exp_tab.dmp LOGFILE=exp_tab.log SCHEMAS=scott
INCLUDE=TABLE:\"IN \(\'EMP\', \'DEPT\'\)\"
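The effect of the Unix escaping can be checked without Oracle at all. In the sketch below, printf stands in for the expdp binary and simply echoes back the single argument the shell delivers after stripping the backslashes (the clause is written without internal spaces here, because an unescaped space would split it into two arguments):

```shell
# printf substitutes for expdp: it prints the one argument that
# survives the shell's escape processing.
printf '%s\n' INCLUDE=TABLE:\"IN\(\'EMP\',\'DEPT\'\)\"
# prints: INCLUDE=TABLE:"IN('EMP','DEPT')"
```

If the printed line is not exactly what expdp expects, the ORA errors listed below are the usual symptom.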

Common exclude/include mistakes

Any character that needs escaping but is not escaped, or is escaped incorrectly, produces an ORA error. Here are some common ones:

  • ORA-39001: invalid argument value
  • ORA-39071: Value for INCLUDE is badly formed.
  • ORA-00936: missing expression
  • ORA-39071: Value for EXCLUDE is badly formed.
  • ORA-00904: "DEPT": invalid identifier
  • ORA-39041: Filter "INCLUDE" either identifies all object types or no object types.
  • ORA-39041: Filter "EXCLUDE" either identifies all object types or no object types.
  • ORA-39038: Object path "USER" is not supported for TABLE jobs.

5. Advanced Filtering

When exporting or importing, we often want only the table structure, or only the data. Fortunately, the Data Pump provides this too, through the Content parameter, which takes three values:

  • ALL: export/import both object definitions and data. This is the default.
  • DATA_ONLY: export/import data only.
  • METADATA_ONLY: export/import object definitions only.

Note: if you export only the data, then at import time you must make sure the object definitions already exist on the target; otherwise the data has nowhere to go. And when the definitions do already exist, it is best to specify data_only on import as well, or an ORA-39151 error is raised because the objects already exist.
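For example (all names are placeholders), structure-only and data-only runs might look like:

```shell
expdp scott/tiger schemas=scott directory=dump_dir dumpfile=ddl.dmp  content=metadata_only  # definitions only
expdp scott/tiger schemas=scott directory=dump_dir dumpfile=rows.dmp content=data_only      # rows only
impdp scott/tiger directory=dump_dir dumpfile=rows.dmp content=data_only                    # tables must already exist
```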

5.1 Filtering Existing Data

As we know, if a table being imported already exists in the target database and no referential-integrity (RI) constraints on the target check the data, rows may end up imported twice. The Data Pump provides a new parameter, Table_exists_action, which can reduce duplicate data to some extent. It controls what happens when a table to be imported already exists. Its possible values are:

  • SKIP: skip the table and move on to the next object. This is the default. Note, however, that if the CONTENT parameter is set to Data_only, SKIP is not valid and the default becomes APPEND.
  • APPEND: add the data to the existing table.
  • TRUNCATE: truncate the current table, then add the records. Use this value with caution: unless you are sure the existing data in the table is truly useless, data may be lost.
  • REPLACE: drop and re-create the table object, then load the data into it. Note that if the CONTENT parameter is set to Data_only, REPLACE is not valid.
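A typical append-style reload using the values above might read (names are placeholders):

```shell
impdp scott/tiger directory=dump_dir dumpfile=rows.dmp \
      content=data_only table_exists_action=append   # keep existing rows, add the imported ones
```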

5.2 Redefining the Schema or Tablespace

We may also need to move user A's objects to user B, or change the tablespace the data lives in. The Data Pump does this through the Remap_schema and Remap_tablespace parameters.

  • REMAP_SCHEMA: redefines the schema that owns the objects. This parameter works much like Fromuser+Touser in the old IMP, and supports multiple schema mappings. The syntax is as follows:

REMAP_SCHEMA=Source_schema:Target_schema[,Source_schema:Target_schema]

For example, to move A's objects to user B and C's objects to user D:
Remap_schema=a:b,c:d

Note: you cannot specify remap_schema=a:b,a:c in the same impdp command.

  • REMAP_TABLESPACE: redefines the tablespace the objects live in. This parameter remaps the tablespaces of imported objects; it supports converting several tablespaces at once, separated by commas. The syntax is as follows:

REMAP_TABLESPACE=Source_tablespace:Target_tablespace[,Source_tablespace:Target_tablespace]

Note: if you use the Remap_tablespace parameter, make sure the importing user has read/write privileges on the target tablespace.
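Combined (user and tablespace names invented for illustration), a remapped import might read:

```shell
impdp system/pwd directory=dump_dir dumpfile=exp_scott.dmp \
      remap_schema=scott:hr remap_tablespace=users:hr_data
```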

5.3 Optimizing Import/Export Efficiency

With large volumes of data, we have to think about efficiency, and the Data Pump sets a high bar for itself — even the official description says:

Oracle Data Pump technology enables very high-speed movement of data and metadata from one database to another.

The "very high-speed" here depends on our use of the parallel parameter.

Every optimization has three possible outcomes: things get better, nothing changes, or things get worse. The parallel parameter is no different — setting it greater than 1 does not automatically improve performance.

  • parallel for export

For export, a dump file can only be operated on by one thread at a time (including its I/O handling). So if the output is a single dump file, then no matter how much parallelism you specify, only one worker actually runs — and an ORA-39095 error is triggered. It is therefore recommended to set this parameter less than or equal to the number of dump files generated. So how do you control the number of dump files?

The expdp command provides a FILESIZE parameter that caps the size of a single dump file. To use the parallel parameter effectively, the filesize parameter is essential.

Example: a user's objects occupy about 4 GB, and the exported dump comes to about 3 GB. We export the user with a parallelism of 4 and cap each file at 500 MB. The syntax is as follows:

expdp user/pwd directory=dump_file dumpfile=expdp_20100820_%U.dmp logfile=expdp_20100820.log filesize=500M parallel=4

  • parallel for import

For import, using the parallel parameter is much simpler, and import shows the parameter's advantage better: the value can be thought of as the number of tables being loaded into the database at the same time.

Example: a dump file set contains 200 tables, and we import it with a parallelism of 10. The syntax is as follows:

impdp user/pwd directory=dump_file dumpfile=expdp_20100820_%U.dmp logfile=impdp_20100820.log parallel=10

Copyright notice: this article was created by InfoQ; when reprinting, please include the original link. Thanks.
https://yzsam.com/2022/188/202207071423160271.html