During database maintenance, one of the most critical questions is: "When will it be finished?" Whether it is an export, import, backup, recovery, or flashback, we are always hunting for that estimated completion time.
Typically, the first place we look is v$session_longops. It provides a rough estimate of a process's progress with a simple query:
select message, time_remaining/3600 hrs from v$session_longops where sofar <> totalwork;
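A slightly fuller variant of the same query also computes the percentage complete (all columns shown are standard v$session_longops columns):

```sql
-- Progress of all active long-running operations, with percent complete
select opname,
       target,
       sofar,
       totalwork,
       round(sofar / totalwork * 100, 2) as pct_done,
       time_remaining / 3600            as hrs_remaining
  from v$session_longops
 where sofar <> totalwork;
```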
However, recent tests on Oracle 11.1.0.7 (Linux 64-bit) revealed that Data Pump doesn't always play nice with this view. In many cases, the master process message column stays stuck at “0 out of … MB done” throughout the entire operation, even when the finish line is in sight.
Limitations of v$session_longops with Data Pump
While Data Pump worker processes for exports do show some useful information (such as blocks scanned out of the total), imports are a different story. I found almost no data in v$session_longops that accurately identified the progress of an import worker.
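In my tests, the Data Pump rows in v$session_longops carry the job name in the opname column, so you can isolate them with a filter like the one below (assumption: your jobs use the default SYS_EXPORT%/SYS_IMPORT% naming pattern):

```sql
-- Data Pump entries only; the job name appears in opname
select sid, serial#, opname, message, sofar, totalwork
  from v$session_longops
 where opname like 'SYS\_%' escape '\';
```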
Seven Observations from the Experimental Trenches
During experiments on a test database with a 4GB table, several oddities surfaced:
- Stagnant Byte Counts: The number of bytes written to an export file often stays at ~4k until the very end, even if the file has physically grown to several gigabytes.
- Premature Completion: The "Percent Done" for an import worker status is often set to 100% at the start and never changes.
- Conflicting Data: "Completed Bytes" might show the total size of the import immediately, while "Completed Rows" is the only metric reflecting actual row-by-row progress.
- The Rollback Trap: "Processed Rows" does not necessarily mean "Inserted Rows." If a stop_job action occurs, it rolls back the inserted rows, but the processed row count is not reset.
- Vanishing Information: If an import is stopped via stop_job and then resumed with start_job, v$session_longops loses all previous information about that import.
- Null Estimates: The time_remaining column is frequently null for the Data Pump Import master process.
- Command Shortcut Chaos: Short interactive-mode abbreviations like stat for status or star for start_job work well, and sta behaves as both simultaneously. Be careful with the letters, though: Oracle might assume you want to stop the job entirely!
Import> s
ORA-39004: invalid state
ORA-39016: Operation not supported when job is in EXECUTING state.
Are you sure you wish to stop this job ([yes]/no):
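Because "Processed Rows" can survive a rollback, one simple cross-check is to count the rows physically present in the target table (the table name here is a placeholder for your own import target, and on a large table this count can itself take a while):

```sql
-- Rows actually committed/inserted so far; BIG_TABLE is a placeholder
select count(*) from big_table;
```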
Bypassing the Master Process: A PL/SQL Solution
If v$session_longops is failing you and the segment space is already allocated, how do you get a reliable "rows completed" value via script?
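Attaching to a job requires its name and owner. DBA_DATAPUMP_JOBS lists every active job, so a quick query there gives you the values to plug in:

```sql
-- Active Data Pump jobs and their current state
select owner_name, job_name, operation, job_mode, state, degree
  from dba_datapump_jobs;
```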
By modifying the standard documentation examples, I developed a PL/SQL block that attaches to the Data Pump job and extracts the ku$_WorkerStatusList attributes directly. This allows you to see exactly what the workers are doing without relying on the interactive mode.
The Monitoring Script
DECLARE
  ind          NUMBER;                -- Loop index
  h1           NUMBER;                -- Data Pump job handle
  percent_done NUMBER;                -- Percentage of job complete
  job_state    VARCHAR2(30);          -- To keep track of job state
  js           ku$_JobStatus;         -- The job status from get_status
  ws           ku$_WorkerStatusList;  -- Worker status
  sts          ku$_Status;            -- The status object returned by get_status
BEGIN
  -- Provide your specific job name and owner here
  h1 := DBMS_DATAPUMP.attach('SYS_IMPORT_FULL_01', 'SYS');
  DBMS_DATAPUMP.get_status(h1,
                           DBMS_DATAPUMP.ku$_status_job_error +
                           DBMS_DATAPUMP.ku$_status_job_status +
                           DBMS_DATAPUMP.ku$_status_wip,
                           0, job_state, sts);
  js := sts.job_status;
  ws := js.worker_status_list;
  dbms_output.put_line('*** Job percent done = ' || to_char(js.percent_done));
  dbms_output.put_line('Restarts: ' || js.restart_count);
  ind := ws.first;
  WHILE ind IS NOT NULL LOOP
    dbms_output.put_line('Worker ' || ind || ' - Rows completed: ' ||
                         ws(ind).completed_rows);
    ind := ws.next(ind);
  END LOOP;
  DBMS_DATAPUMP.detach(h1);
END;
/
Conclusion
Using a PL/SQL approach is far more flexible than the standard Data Pump interactive mode. It allows you to not only monitor progress more accurately but also to trigger specific administrative actions based on the results.
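As a sketch of what such administrative actions can look like, the same attach handle can drive DBMS_DATAPUMP.stop_job to halt a runaway job programmatically (job name and owner are placeholders; the second parameter of 1 requests an immediate stop rather than waiting for the current work item):

```sql
DECLARE
  h1 NUMBER;
BEGIN
  -- Placeholders: substitute your own job name and owner
  h1 := DBMS_DATAPUMP.attach('SYS_IMPORT_FULL_01', 'SYS');
  -- immediate => 1 stops the worker processes without waiting;
  -- keep_master => 0 drops the master table so the job cannot be restarted
  DBMS_DATAPUMP.stop_job(h1, 1, 0);
END;
/
```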