DataPump Job
------------
A- When does a DataPump Job start ?
------------------------------
When a Datapump Export or Import session is launched, a Datapump Job is automatically started. This way, we can
--> detach from and reattach to long-running jobs without affecting the job itself.
--> monitor jobs from multiple locations
--> stop a job: suspend it temporarily so that other applications requesting more resources can
get them
--> restart it later
--> kill any Datapump job anytime
--> see the progress of the import or export operations
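The expdp and impdp clients are thin wrappers over the DBMS_DATAPUMP PL/SQL API, so a job can also be created directly from SQL*Plus. The following is only a sketch: it borrows the EXP directory object and the SCOTT schema from the examples below, uses the assumed job name SCOTTEXP2, and must be run by a user with the privileges needed to create the master table and use the directory.

```sql
-- Sketch only: EXP directory, SCOTT schema and the job name SCOTTEXP2
-- are assumptions taken from the examples in this note.
DECLARE
  h NUMBER;
BEGIN
  -- Create the job (and its master table); this is the "Datapump Job"
  -- that expdp starts automatically.
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT',
                          job_mode  => 'SCHEMA',
                          job_name  => 'SCOTTEXP2');
  DBMS_DATAPUMP.ADD_FILE(handle    => h,
                         filename  => 'scottexp2.dmp',
                         directory => 'EXP');
  DBMS_DATAPUMP.METADATA_FILTER(handle => h,
                                name   => 'SCHEMA_EXPR',
                                value  => 'IN (''SCOTT'')');
  DBMS_DATAPUMP.START_JOB(h);
  -- Detaching here leaves the job running on the server: this is the
  -- detach/reattach behaviour described above.
  DBMS_DATAPUMP.DETACH(h);
END;
/
```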
B- Which name does the DataPump Job get ?
------------------------------------
You can perform the above actions on a Datapump job as long as you know its name.
You can assign the Datapump job a name at the expdp/impdp command level or let Oracle assign an internal name.
A job name is uniquely identified by the owner name and the name of the job.
A system-generated name uses the template SYS_<operation>_<mode>_NN, which depends on the type of operation performed and its scope: for example, the generated name could be SYS_IMPORT_FULL_03.
It is important to know this name because the master table for the Data Pump job is named after the job. When the master table is not deleted (for example, when the job is aborted), it is necessary to drop the master table manually. This table contains the list of objects to be exported or imported.
You can retrieve the job name, owner name and state from the following view:
SQL> select * from dba_datapump_jobs;
OWNER_NAME JOB_NAME OPERATION JOB_MODE STATE DEGREE ATTACHED_SESSIONS
---------- --------- --------- ----------- --------- ------ -----------------
SCOTT SCOTTEXP EXPORT SCHEMA DEFINING 1 1
SYSTEM SYSEXP EXPORT FULL EXECUTING 2 3
SYSTEM SYSIMP IMPORT TABLESPACE STOPPING 2 0
SYSTEM FULLEXP EXPORT FULL NOT RUNNING 0 0
SYSTEM EXPFULL EXPORT FULL IDLING 1 1
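A job left in NOT RUNNING state with no attached sessions still owns its master table. One way to locate such leftovers (and, if the job is no longer wanted, remove them) is sketched below; the FULLEXP job is taken from the view output above.

```sql
-- Find jobs that are stopped with nobody attached; each still owns a
-- master table named after the job.
SELECT owner_name, job_name, state
FROM   dba_datapump_jobs
WHERE  state = 'NOT RUNNING'
AND    attached_sessions = 0;

-- If the job is not meant to be restarted, dropping its master table
-- removes the job definition, e.g.:
-- DROP TABLE system.fullexp;
```

Do not drop the master table of a stopped job you intend to restart: the table holds the state the restart depends on.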
C- Examples
--------
1- Set up the logical directory for dump file sets to be created locally on the server:
SQL> create directory EXP as '/tmp/dj';
Directory created.
SQL> grant read, write on directory EXP to public;
Grant succeeded.
2- Launch a Datapump export session(1): a Datapump job is started
$ expdp scott/<PASSWORD> job_name=scottexp directory=EXP
Export: Release 10.1.0.0.0 - 64bit Beta on Tue Feb 03 07:39:42 2004
(c) Copyright 2002 Oracle Corporation. All rights reserved.
Connected to: Oracle10i Enterprise Edition Release 10.1.0.0.0 - 64bit Beta
With the Partitioning, Oracle Label Security, OLAP and Oracle Data Mining options
Starting "SCOTT"."SCOTTEXP": scott/******** job_name=scottexp directory=EXP
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 256 KB
Processing object type SCHEMA_EXPORT/SE_PRE_SCHEMA_PROCOBJACT/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
...
3- Attach to the running job (session(2)) from another location:
while the above session(1) is running, run the following expdp command
from another session(2):
$ expdp scott/<PASSWORD> attach=scottexp
Export: Release 10.1.0.0.0 - 64bit Beta on Tue Feb 03 07:39:50 2004
(c) Copyright 2002 Oracle Corporation. All rights reserved.
Connected to: Oracle10i Enterprise Edition Release 10.1.0.0.0 - 64bit Beta
With the Partitioning, Oracle Label Security, OLAP and Oracle Data Mining options
Job: SCOTTEXP
Owner: SCOTT
Operation: EXPORT
Creator Privs: FALSE
GUID: D273AC7BD2C06A6EE03400306E0F6291
Start Time: Tuesday, 03 February, 2004 7:39
Mode: SCHEMA
Instance: EMR101U6
MaxDegree: 1
EXPORT Job Parameters:
Parameter Name Parameter Value:
CLIENT_COMMAND scott/******** job_name=scottexp directory=EXP
DATA_ACCESS_METHOD AUTOMATIC
ESTIMATE BLOCKS
INCLUDE_METADATA 1
KEEP_MASTER 0
LOG_FILE_DIRECTORY EXP
LOG_FILE_NAME export.log
TABLE_CONSISTENCY 0
Job: SCOTTEXP
Operation: EXPORT
Mode: SCHEMA
State: EXECUTING
Degree: 1
Job Error Count: 0
Dump file: /tmp/dj/expdat.dmp
bytes written = 2048
Worker 1 Status:
State: EXECUTING
Export>
You can see that the export job is executing, and you retrieve all the values set for its parameters.
Not only is the job running, but so is Worker 1, the single process in charge of exporting metadata and data (parallel=1).
D- Which states can the job have ?
-----------------------------
1- DEFINING : the job has just started and no worker process is executing yet.
2- EXECUTING : the job is running, as in the example above.
3- IDLING : the job has been suspended while executing and at least one
session is attached:
--> From session(2), you decided to suspend the job temporarily :
Export> stop_job
Are you sure you wish to stop this job ([y]/n): y
$
--> Session(1) is informed of the action:
$ expdp scott/<PASSWORD> job_name=scottexp directory=EXP
Export: Release 10.1.0.0.0 - 64bit Beta on Tue Feb 03 07:39:42 2004
(c) Copyright 2002 Oracle Corporation. All rights reserved.
Connected to: Oracle10i Enterprise Edition Release 10.1.0.0.0 - 64bit Beta
With the Partitioning, Oracle Label Security, OLAP and Oracle Data Mining options
Starting "SCOTT"."SCOTTEXP": scott/******** job_name=scottexp directory=EXP
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 256 KB
Processing object type SCHEMA_EXPORT/SE_PRE_SCHEMA_PROCOBJACT/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
....
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Job "SCOTT"."SCOTTEXP" stopped by user request
--> Session(3) wants to see the status of this Datapump job:
$ expdp scott/<PASSWORD> attach=scottexp
Export: Release 10.1.0.0.0 - 64bit Beta on Tue Feb 03 07:45:22 2004
(c) Copyright 2002 Oracle Corporation. All rights reserved.
Connected to: Oracle10i Enterprise Edition Release 10.1.0.0.0 - 64bit Beta
With the Partitioning, Oracle Label Security, OLAP and Oracle Data Mining options
Job: SCOTTEXP
Owner: SCOTT
Operation: EXPORT
Creator Privs: FALSE
GUID: D273AC7BD2C06A6EE03400306E0F6291
Start Time: Tuesday, 03 February, 2004 7:39
Mode: SCHEMA
Instance: EMR101U6
MaxDegree: 1
EXPORT Job Parameters:
Parameter Name Parameter Value:
CLIENT_COMMAND scott/******** job_name=scottexp directory=EXP
DATA_ACCESS_METHOD AUTOMATIC
ESTIMATE BLOCKS
INCLUDE_METADATA 1
KEEP_MASTER 0
LOG_FILE_DIRECTORY EXP
LOG_FILE_NAME export.log
TABLE_CONSISTENCY 0
Job: SCOTTEXP
Operation: EXPORT
Mode: SCHEMA
State: IDLING
Degree: 1
Job Error Count: 0
Dump file: /tmp/dj/expdat3.dmp
bytes written = 194560
Worker 1 Status:
State: UNDEFINED
Export>
Since the job is idling, the worker process is in UNDEFINED state for the moment.
--> You can see the status of the datapump job from the view:
SQL> select * from dba_datapump_jobs;
OWNER_NAME JOB_NAME OPERATION JOB_MODE STATE DEGREE ATTACHED_SESSIONS
---------- -------- --------- -------- -------- ------ -----------------
SYSTEM SYSEXP EXPORT FULL STOPPING 1 0
--> The master table still exists :
SQL> select count(*) from system.sysexp;
COUNT(*)
----------
7688
4- COMPLETING : the job has been restarted and is finishing its remaining work
--> From session(3), you resume the job to complete:
Export> start_job
Be aware:
*****************************************************
**** A TABLESPACE mode export is not restartable ****
*****************************************************
Export> start_job
ORA-39004: invalid state
ORA-39053: parameter or attribute TABLESPACE_EXPR must be defined for a TABLESPACE job
Export>
--> You can see the job completing:
Export> status
Job: SCOTTEXP
Operation: EXPORT
Mode: SCHEMA
State: COMPLETING
Bytes Processed: 58992
Percent Done: 100
Degree: 1
Job Error Count: 0
Dump file: /tmp/dj/expdat3.dmp
bytes written = 307200
Worker 1 Status:
State: WORK WAITING
Since the job is completing, the worker process is in WORK WAITING state because it has already finished its task of unloading/loading metadata and data. Now the master control process writes logging information and drops the master table.
5- NOT RUNNING : the job has been suspended or killed and no more session is attached.
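The sessions attached to a job can also be tracked from the server side. A sketch joining the Data Pump views to V$SESSION (column names as in 10g; run with a privileged account):

```sql
-- List each Data Pump job with the database sessions attached to it.
SELECT j.owner_name, j.job_name, j.state, s.sid, s.serial#
FROM   dba_datapump_jobs     j,
       dba_datapump_sessions d,
       v$session             s
WHERE  j.owner_name = d.owner_name
AND    j.job_name   = d.job_name
AND    d.saddr      = s.saddr;
```

A job in NOT RUNNING state returns no rows here, which matches the definition above: no session is attached.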
E- Which other commands can you perform via a session attached to a job ?
--------------------------------------------------------------------
-------------------
Export> | continue_client |
-------------------
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 256 KB
Processing object type SCHEMA_EXPORT/SE_PRE_SCHEMA_PROCOBJACT/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/COMMENT
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/SE_POST_SCHEMA_PROCOBJACT/PROCACT_SCHEMA
. . exported "SCOTT"."BONUS" 3.804 KB 0 rows
. . exported "SCOTT"."DEPT" 3.656 KB 4 rows
. . exported "SCOTT"."EMP" 5.820 KB 14 rows
. . exported "SCOTT"."SALGRADE" 3.640 KB 9 rows
Master table "SCOTT"."SCOTTEXP" successfully loaded/unloaded
******************************************************************************
Dump file set for SCOTT.SCOTTEXP is:
/tmp/dj/expdat.dmp
Job "SCOTT"."SCOTTEXP" successfully completed
This displays exactly the same output as if you were in the client session that launched the Datapump job.
------------
Export> | kill_job |
------------
Are you sure you wish to stop this job ([y]/n): y
$
In this case, the job is aborted, the master table dropped, and all client sessions detached.
SQL> select count(*) from system.sysexp;
select count(*) from system.sysexp
*
ERROR at line 1:
ORA-00942: table or view does not exist
-----------------------------
Export> | add_file=EXP:expdat%u.dmp |
-----------------------------
This adds more files to the dump file set.
--------------
Export> | PARALLEL=4 |
--------------
Export> status
Job: EXPTBS
Operation: EXPORT
Mode: TABLESPACE
State: EXECUTING
Bytes Processed: 0
Current Parallelism: 4
Job Error Count: 0
Dump File: /tmp/expdat.dmp
bytes written: 4,096
Worker 1 Status:
State: EXECUTING
Worker 2 Status:
State: WORK WAITING
Worker 3 Status:
State: WORK WAITING
Worker 4 Status:
State: WORK WAITING
This launches more worker processes to speed up the operation.
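Raising PARALLEL only helps if the dump file set contains enough files for all the workers to write to; the %u substitution variable (as in the add_file example above) generates one file per worker. Parallelism can also be set when the job is launched. The command below is a sketch reusing the FULLEXP job name from the view output earlier:

```shell
$ expdp system/<PASSWORD> job_name=fullexp directory=EXP full=y \
        dumpfile=expdat%u.dmp parallel=4
```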
F- Miscellaneous
--------------
F-1 You attempt to attach to a job that already completed or was aborted: the master table containing the list of objects to be exported has therefore already been dropped.
$ expdp scott/<PASSWORD> attach=scottexp
Export: Release 10.1.0.1.0 - Beta on Monday, 16 February, 2004 14:48
Copyright (c) 2003, Oracle. All rights reserved.
Connected to: Oracle10i Enterprise Edition Release 10.1.0.1.0 - Beta
With the Partitioning, OLAP and Data Mining options
ORA-31626: job does not exist
ORA-39068: invalid master table data in row with PROCESS_ORDER=-1
ORA-00942: table or view does not exist
F-2 You attach to a datapump job that has just started, and you cannot tell yet whether it will execute or not :
$ expdp scott/<PASSWORD> attach=scottexp
Job: SCOTTEXP
Owner: SCOTT
Operation: EXPORT
Creator Privs: FALSE
GUID: 6B07EFDE31014A79AACFA89DCEE48AD6
Start Time: Monday, 16 February, 2004 15:04
Mode: SCHEMA
Instance: orcl
MaxDegree: 0
EXPORT Job Parameters:
Parameter Name Parameter Value:
CLIENT_COMMAND scott/******** job_name=scottexp directory=EXP
parallel=2
LOG_FILE_DIRECTORY EXP
LOG_FILE_NAME export.log
State: DEFINING
Job Error Count: 0
It may or may not execute:
$ expdp scott/<PASSWORD> job_name=scottexp directory=EXP parallel=2
Export: Release 10.1.0.1.0 - Beta on Monday, 16 February, 2004 15:03
Copyright (c) 2003, Oracle. All rights reserved.
Connected to: Oracle10i Enterprise Edition Release 10.1.0.1.0 - Beta
With the Partitioning, OLAP and Data Mining options
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31641: unable to create dump file "/tmp/expdat.dmp"
ORA-27038: skgfrcre: file exists
OSD-04010: option specified, file already exists
In this case, the dump file set already exists and the job cannot overwrite it:
the existing file needs to be removed.
F-3 You attach to a job running with 2 worker processes, each one unloading the
data of its own tables.
Export> status
Job: SCOTTEXP
Operation: EXPORT
Mode: SCHEMA
State: EXECUTING
Degree: 2
Job Error Count: 0
Dump file: /tmp/expdat.dmp
bytes written = 2048
Worker 1 Status:
State: EXECUTING
Object Schema: SCOTT
Object Name: T1
Object Type: SCHEMA_EXPORT/TABLE/TABLE_DATA
Completed Objects: 5
Total Objects: 5
Worker 2 Status:
State: EXECUTING
Object Schema: SCOTT
Object Name: BONUS
Object Type: SCHEMA_EXPORT/TABLE/TABLE_DATA
Completed Objects: 1
Total Objects: 5
The parallelism was set to 2.