Oracle Data Pump Overview
Oracle Data Pump is the fastest and most flexible way to move data and metadata into and out of Oracle databases. It delivers significant performance gains over the legacy exp/imp utilities through parallel processing and direct path operations.
What is Data Pump?
Data Pump is Oracle's high-performance data movement utility that includes:
- expdp - Export Data Pump utility
- impdp - Import Data Pump utility
- DBMS_DATAPUMP - PL/SQL API for programmatic access (see the sketch after this list)
- Enterprise Manager integration - GUI interface
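For scripted work inside the database, DBMS_DATAPUMP drives the same engine as the command-line clients. Below is a minimal sketch of a schema-mode export through the API; the job name, file names, and the DP_DIR directory object are illustrative.

```sql
-- Minimal sketch: schema-mode export via the DBMS_DATAPUMP API
-- (job name, file names, and DP_DIR are placeholders)
DECLARE
  h     NUMBER;
  state VARCHAR2(30);
BEGIN
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA',
                          job_name  => 'HR_API_EXPORT');
  DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'hr_api.dmp',
                         directory => 'DP_DIR');
  DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'hr_api.log',
                         directory => 'DP_DIR',
                         filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  -- Restrict the job to the HR schema
  DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'SCHEMA_EXPR',
                                value  => q'[IN ('HR')]');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, state);  -- blocks until the job completes
  DBMS_OUTPUT.PUT_LINE('Job finished with state: ' || state);  -- needs SERVEROUTPUT
END;
/
```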
Key Advantages Over Legacy exp/imp
Feature | Legacy exp/imp | Data Pump |
---|---|---|
Performance | Single-threaded | Parallel processing |
Restart Capability | None | Full restart support |
Network Mode | Limited | Direct database-to-database |
Filtering | Basic | Advanced row/metadata filtering |
Compression | None | Built-in compression |
Encryption | None | Transparent encryption |
Monitoring | Limited | Comprehensive job monitoring |
Data Pump Architecture
```
┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│   Client Tool   │    │ Master Process  │    │ Worker Processes│
│  (expdp/impdp)  │◄──►│ (Coordination)  │◄──►│  (Parallel I/O) │
└─────────────────┘    └─────────────────┘    └─────────────────┘
         │                      │                      │
         ▼                      ▼                      ▼
┌─────────────────────────────────────────────────────────────────┐
│                          Storage Layer                          │
│   ┌─────────────┐     ┌─────────────┐     ┌─────────────┐       │
│   │ Dump Files  │     │  Log Files  │     │  Database   │       │
│   │ (Data/Meta) │     │ (Progress)  │     │  (Source/   │       │
│   │             │     │             │     │   Target)   │       │
│   └─────────────┘     └─────────────┘     └─────────────┘       │
└─────────────────────────────────────────────────────────────────┘
```
Essential Concepts
1. Directory Objects
Data Pump requires Oracle directory objects for file operations:
```sql
-- Create directory object (as DBA)
CREATE OR REPLACE DIRECTORY dp_dir AS '/u01/app/oracle/datapump';

-- Grant permissions
GRANT READ, WRITE ON DIRECTORY dp_dir TO hr;

-- Check existing directories
SELECT directory_name, directory_path FROM dba_directories;
```
2. Job Management
Data Pump operations run as database jobs, which can be monitored, stopped, restarted, and attached to from another session:
```sql
-- Check running jobs
SELECT job_name, state, degree
FROM   dba_datapump_jobs
WHERE  state = 'EXECUTING';
```

```bash
# Attach to an existing job
expdp hr/password attach=HR_EXPORT_JOB
```
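Each job is backed by a master table in the job owner's schema, named after the job itself; an aborted job stays visible in DBA_DATAPUMP_JOBS until that table is dropped or the job is restarted. A query along these lines lists the master tables behind the jobs you see:

```sql
-- List the master tables backing current Data Pump jobs
SELECT o.owner, o.object_name AS master_table
FROM   dba_objects o
JOIN   dba_datapump_jobs j
  ON   o.owner = j.owner_name
 AND   o.object_name = j.job_name
WHERE  o.object_type = 'TABLE';
```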
3. Parallel Processing
```bash
# Export with 4 parallel processes
# (%U expands to a two-digit file number so each worker can write its own file)
expdp hr/password directory=dp_dir dumpfile=hr_%U.dmp parallel=4

# Import with parallel processing
# (use the same %U template; impdp does not expand shell-style * wildcards)
impdp hr/password directory=dp_dir dumpfile=hr_%U.dmp parallel=4
```
Common Export Scenarios
1. Full Database Export
```bash
#!/bin/bash
export ORACLE_SID=ORCL
export ORACLE_HOME=/u01/app/oracle/product/19.0.0/dbhome_1

# Note: compression=all compresses data as well as metadata and
# requires the Advanced Compression option
expdp system/password \
  directory=dp_dir \
  dumpfile=full_db_%U.dmp \
  logfile=full_db_export.log \
  full=y \
  parallel=4 \
  compression=all \
  job_name=FULL_DB_EXPORT
```
2. Schema Export
```bash
#!/bin/bash
expdp hr/password \
  directory=dp_dir \
  dumpfile=hr_schema_%U.dmp \
  logfile=hr_schema_export.log \
  schemas=hr \
  parallel=2 \
  compression=all \
  exclude=statistics \
  job_name=HR_SCHEMA_EXPORT
```
3. Table Export with Query Filter
```bash
#!/bin/bash
# The quoting below is for POSIX shells; putting QUERY in a parameter
# file (parfile=...) avoids the escaping entirely
expdp hr/password \
  directory=dp_dir \
  dumpfile=employees_recent.dmp \
  logfile=employees_export.log \
  tables=hr.employees \
  query='hr.employees:"WHERE hire_date >= DATE'\''2020-01-01'\''"' \
  job_name=EMPLOYEES_FILTERED_EXPORT
```
Common Import Scenarios
1. Full Database Import
```bash
#!/bin/bash
# %U matches the numbered files produced by the parallel export
impdp system/password \
  directory=dp_dir \
  dumpfile=full_db_%U.dmp \
  logfile=full_db_import.log \
  full=y \
  parallel=4 \
  job_name=FULL_DB_IMPORT
```
2. Schema Import with Remap
```bash
#!/bin/bash
impdp system/password \
  directory=dp_dir \
  dumpfile=hr_schema_%U.dmp \
  logfile=hr_import.log \
  schemas=hr \
  remap_schema=hr:hr_test \
  remap_tablespace=users:test_users \
  parallel=2 \
  job_name=HR_SCHEMA_IMPORT
```
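After a remap, a quick object-count check against the target schema (HR_TEST follows the remap above) catches anything that failed to import:

```sql
-- Post-import sanity check on the remapped schema
SELECT object_type, COUNT(*) AS objects
FROM   dba_objects
WHERE  owner = 'HR_TEST'
GROUP  BY object_type
ORDER  BY object_type;
```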
3. Table Import with Transform
```bash
#!/bin/bash
# remap_table takes a bare table name as the target (the table stays in
# the same schema); transform=segment_attributes:n drops storage and
# tablespace clauses so the object picks up the target defaults
impdp hr/password \
  directory=dp_dir \
  dumpfile=employees_recent.dmp \
  logfile=employees_import.log \
  tables=hr.employees \
  remap_table=hr.employees:employees_backup \
  transform=segment_attributes:n \
  job_name=EMPLOYEES_IMPORT
```
Advanced Features
1. Network Mode (Database Link)
```bash
# Direct database-to-database transfer: expdp over NETWORK_LINK reads
# from the database named by the link and writes the dump file locally
# (impdp with NETWORK_LINK and no dumpfile loads with no dump file at all)
expdp hr/password \
  network_link=source_db_link \
  directory=dp_dir \
  dumpfile=network_export.dmp \
  schemas=hr
```
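The link named by network_link must already exist in the database where the job runs; the connection details below are placeholders:

```sql
-- Create the database link that NETWORK_LINK refers to
-- (user, password, and TNS alias are placeholders)
CREATE DATABASE LINK source_db_link
  CONNECT TO hr IDENTIFIED BY password
  USING 'SOURCE_TNS_ALIAS';
```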
2. Encryption
```bash
# Password-based encryption
# (Data Pump encryption requires the Advanced Security option)
expdp hr/password \
  directory=dp_dir \
  dumpfile=encrypted_export.dmp \
  encryption=all \
  encryption_algorithm=AES256 \
  encryption_password=SecurePass123
```
3. Flashback Export
```bash
# Export data as of a specific SCN
expdp hr/password \
  directory=dp_dir \
  dumpfile=flashback_export.dmp \
  schemas=hr \
  flashback_scn=1234567890

# Export data as of a specific time
expdp hr/password \
  directory=dp_dir \
  dumpfile=point_in_time_export.dmp \
  schemas=hr \
  flashback_time="TO_TIMESTAMP('2024-01-01 12:00:00', 'YYYY-MM-DD HH24:MI:SS')"
```
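To pin an export to "now", capture the current SCN first; either query below returns it:

```sql
-- Current SCN, to feed into FLASHBACK_SCN
SELECT current_scn FROM v$database;

-- Equivalent, via the DBMS_FLASHBACK package
SELECT dbms_flashback.get_system_change_number FROM dual;
```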
Monitoring and Management
1. Job Status Monitoring
```sql
-- Check job progress
SELECT job_name, operation, job_mode, state, degree, attached_sessions
FROM   dba_datapump_jobs;

-- Detailed job information
SELECT job_name, owner_name, operation, job_mode,
       state, degree, attached_sessions, datapump_sessions
FROM   dba_datapump_jobs
WHERE  job_name = 'HR_EXPORT_JOB';
```
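For a rough percent-complete figure while a job runs, Data Pump also registers its work in V$SESSION_LONGOPS under the job name:

```sql
-- Approximate progress of active Data Pump work
SELECT opname, sofar, totalwork,
       ROUND(100 * sofar / totalwork, 1) AS pct_done
FROM   v$session_longops
WHERE  opname LIKE '%EXPORT%'   -- OPNAME carries the Data Pump job name
AND    totalwork > 0
AND    sofar <> totalwork;
```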
2. Interactive Job Control
```bash
# Attach to a running job
expdp hr/password attach=HR_EXPORT_JOB

# In interactive mode:
Export> status                  # Check status
Export> parallel=6              # Change parallelism
Export> stop_job=immediate      # Stop job (keeps the master table; restartable)
Export> start_job               # Resume a stopped job
Export> kill_job                # Terminate job and drop the master table
```
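The same controls are exposed through the API; here is a sketch of stopping a job programmatically (job name and owner are illustrative):

```sql
-- Attach to a running job and stop it immediately,
-- keeping the master table so it can be restarted later
DECLARE
  h NUMBER;
BEGIN
  h := DBMS_DATAPUMP.ATTACH(job_name => 'HR_EXPORT_JOB', job_owner => 'HR');
  DBMS_DATAPUMP.STOP_JOB(handle => h, immediate => 1, keep_master => 1);
END;
/
```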
3. Performance Monitoring
```sql
-- Check Data Pump session throughput
-- (master processes appear as DMnn, workers as DWnn in PROGRAM)
SELECT s.sid, s.serial#, s.username, s.program,
       st.value/1024/1024 AS mb_processed
FROM   v$session s, v$sesstat st, v$statname sn
WHERE  s.sid = st.sid
AND    st.statistic# = sn.statistic#
AND    sn.name = 'bytes sent via SQL*Net to client'
AND    (s.program LIKE '%(DM%' OR s.program LIKE '%(DW%')
ORDER  BY st.value DESC;
```
Best Practices
1. Performance Optimization
```bash
# Starting points for PARALLEL (tune from here; they are not hard rules):
#   export: roughly the number of CPU cores
#   import: roughly half the number of CPU cores

# Size dump files so the parallel workers can spread across them
expdp hr/password \
  directory=dp_dir \
  dumpfile=hr_%U.dmp \
  filesize=2G \
  parallel=4
```
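One database-side setting worth checking: the master and worker processes coordinate through Advanced Queuing, which allocates from the streams pool, so a pool that is fixed at zero can stall job startup. A quick look:

```sql
-- Streams pool size (used by Data Pump's queue-based coordination)
SELECT component, current_size/1024/1024 AS mb
FROM   v$sga_dynamic_components
WHERE  component = 'streams pool';
```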
2. Space Management
```sql
-- Estimate export size for one table (assumes the default 8 KB block size)
SELECT table_name, num_rows,
       blocks * 8192 AS bytes,
       ROUND(blocks * 8192 / 1024 / 1024, 2) AS mb
FROM   user_tables
WHERE  table_name = 'EMPLOYEES';
```
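For a whole schema, total allocated segment space gives an upper bound (expdp can also estimate for you with estimate_only=y):

```sql
-- Upper-bound size estimate for a schema export
SELECT owner, ROUND(SUM(bytes)/1024/1024/1024, 2) AS gb
FROM   dba_segments
WHERE  owner = 'HR'
GROUP  BY owner;
```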
3. Error Handling
```bash
#!/bin/bash
LOG_FILE="/tmp/datapump_export.log"
EMAIL="dba@example.com"   # notification address (placeholder)

expdp hr/password \
  directory=dp_dir \
  dumpfile=hr_export.dmp \
  logfile=hr_export.log \
  schemas=hr \
  parallel=2 2>&1 | tee "$LOG_FILE"

# Check for errors
if grep -qi "error\|ora-" "$LOG_FILE"; then
  echo "Data Pump export encountered errors" | \
    mail -s "Data Pump Export Failed" "$EMAIL"
  exit 1
fi
```
Security Considerations
1. Directory Permissions
```bash
# Secure directory permissions at the OS level
chmod 750 /u01/app/oracle/datapump
chown oracle:oinstall /u01/app/oracle/datapump
```

```sql
-- Limit directory object access inside the database
GRANT READ, WRITE ON DIRECTORY dp_dir TO hr;
REVOKE READ, WRITE ON DIRECTORY dp_dir FROM public;
```
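Auditing who holds directory privileges is cheap and worth doing periodically:

```sql
-- Who can read or write through each directory object?
SELECT grantee, table_name AS directory_name, privilege
FROM   dba_tab_privs
WHERE  table_name IN (SELECT directory_name FROM dba_directories)
ORDER  BY table_name, grantee;
```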
2. Encryption Best Practices
```bash
# Use strong encryption for sensitive data; note that a password given
# on the command line is visible in the process list, so prefer an
# interactive prompt or a parameter file in production
expdp hr/password \
  directory=dp_dir \
  dumpfile=sensitive_data.dmp \
  encryption=all \
  encryption_algorithm=AES256 \
  encryption_password=ComplexPassword123!
```
Common Parameters Reference
Export Parameters
Parameter | Description | Example |
---|---|---|
directory | Directory object name | directory=dp_dir |
dumpfile | Dump file name(s) | dumpfile=export_%U.dmp |
logfile | Log file name | logfile=export.log |
parallel | Parallel processes | parallel=4 |
compression | Compression type | compression=all |
encryption | Encryption level | encryption=all |
exclude | Exclude objects | exclude=statistics |
include | Include objects | include=table |
query | Row filtering | query="WHERE rownum <= 1000" |
Import Parameters
Parameter | Description | Example |
---|---|---|
remap_schema | Schema remapping | remap_schema=old:new |
remap_tablespace | Tablespace remapping | remap_tablespace=old:new |
remap_table | Table remapping | remap_table=old:new |
transform | Object transformations | transform=segment_attributes:n |
table_exists_action | Action for existing tables | table_exists_action=replace |
Troubleshooting Common Issues
1. ORA-39002: Invalid operation
```sql
-- Check directory permissions
SELECT privilege
FROM   dba_tab_privs
WHERE  grantee = 'HR'
AND    table_name = 'DP_DIR';

-- Grant missing permissions
GRANT READ, WRITE ON DIRECTORY dp_dir TO hr;
```
2. ORA-39070: Unable to open the log file
```bash
# Check directory space and permissions
df -h /u01/app/oracle/datapump
ls -la /u01/app/oracle/datapump
```
3. Performance Issues
```sql
-- Check for blocking sessions
SELECT blocking_session, sid, serial#, event
FROM   v$session
WHERE  blocking_session IS NOT NULL;
```
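To see what the job's own sessions are waiting on, map them to V$SESSION through DBA_DATAPUMP_SESSIONS:

```sql
-- Map Data Pump job sessions to server sessions and their current waits
SELECT j.job_name, ds.session_type, s.sid, s.serial#, s.event
FROM   dba_datapump_jobs j
JOIN   dba_datapump_sessions ds
  ON   ds.owner_name = j.owner_name
 AND   ds.job_name   = j.job_name
JOIN   v$session s
  ON   s.saddr = ds.saddr;
```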
Integration Examples
1. Automated Daily Schema Backup
```bash
#!/bin/bash
DATE=$(date +%Y%m%d)
SCHEMA="HR"

# A rerun on the same day fails on the existing dump file
# unless reuse_dumpfiles=y is added
expdp system/password \
  directory=dp_dir \
  dumpfile=${SCHEMA}_${DATE}.dmp \
  logfile=${SCHEMA}_${DATE}.log \
  schemas=$SCHEMA \
  compression=all \
  job_name=${SCHEMA}_BACKUP_${DATE}

# Cleanup old backups (keep 7 days)
find /u01/app/oracle/datapump -name "${SCHEMA}_*.dmp" -mtime +7 -delete
```
2. Cross-Platform Migration
```bash
#!/bin/bash
# Export from the source; dump files are platform-independent, and
# VERSION makes the file readable by an older target release
expdp system/password@source \
  directory=dp_dir \
  dumpfile=migration.dmp \
  full=y \
  version=12.2 \
  compression=all

# Import into the target
impdp system/password@target \
  directory=dp_dir \
  dumpfile=migration.dmp \
  full=y \
  transform=segment_attributes:n
```
Next Steps
- Export Operations - Detailed export procedures and examples
- Import Operations - Comprehensive import techniques
- Data Pump Scripts - Production-ready automation scripts
- Performance Tuning - Optimization strategies
Quick Reference Commands
```bash
# Basic export
expdp user/password directory=dp_dir dumpfile=export.dmp schemas=user

# Basic import
impdp user/password directory=dp_dir dumpfile=export.dmp

# Attach to a job
expdp user/password attach=JOB_NAME
```

```sql
-- Check job status
SELECT job_name, state FROM dba_datapump_jobs;
```
This overview provides the foundation for using Oracle Data Pump effectively in your environment. Each section links to detailed procedures and examples for specific use cases.