Data Pump Export (expdp) Guide

This guide provides detailed instructions for using Oracle Data Pump Export (expdp) to extract data and metadata from Oracle databases efficiently.
Export Fundamentals

Basic Export Syntax

```bash
expdp [username/password[@connect_identifier]] [parameter=value]...
```
Essential Parameters

| Parameter | Description | Example |
|---|---|---|
| directory | Directory object for dump files (must exist; see the sketch below) | directory=dp_dir |
| dumpfile | Output dump file name(s) | dumpfile=export_%U.dmp |
| logfile | Export log file | logfile=export.log |
| schemas | Schema(s) to export | schemas=hr,scott |
| tables | Specific tables to export | tables=hr.employees |
| full | Full database export | full=y |
| parallel | Parallel worker processes | parallel=4 |
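The directory object referenced by directory=dp_dir must exist in the database before any export runs. A minimal sketch of creating it, assuming the OS path /u01/exports and a grant to the HR schema (both are placeholder values):

```bash
#!/bin/bash
# Create the directory object used throughout this guide (run as a DBA).
# /u01/exports and the grantee HR are placeholders; adjust for your environment.
sqlplus -s / as sysdba <<EOF
CREATE OR REPLACE DIRECTORY dp_dir AS '/u01/exports';
GRANT READ, WRITE ON DIRECTORY dp_dir TO hr;
EOF
```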
Export Scenarios

1. Full Database Export

```bash
#!/bin/bash
# Complete database export with compression and encryption

export ORACLE_SID=PROD
export DATE=$(date +%Y%m%d_%H%M%S)

expdp system/password \
    directory=dp_dir \
    dumpfile=full_db_${DATE}_%U.dmp \
    logfile=full_db_${DATE}.log \
    full=y \
    parallel=8 \
    compression=all \
    encryption=all \
    encryption_password=SecurePass123 \
    cluster=n \
    filesize=10G \
    job_name=FULL_DB_EXPORT_${DATE}
```
2. Schema-Level Export

```bash
#!/bin/bash
# Export multiple schemas with metadata filtering

expdp system/password \
    directory=dp_dir \
    dumpfile=schemas_export_%U.dmp \
    logfile=schemas_export.log \
    schemas=hr,scott,sh \
    parallel=4 \
    compression=metadata_only \
    exclude=statistics \
    exclude=index:"IN ('EMP_EMAIL_UK')" \
    content=all \
    job_name=SCHEMA_EXPORT
```
3. Table-Level Export

```bash
#!/bin/bash
# Export specific tables with query filtering

expdp hr/password \
    directory=dp_dir \
    dumpfile=emp_tables.dmp \
    logfile=emp_tables.log \
    tables=employees,departments,job_history \
    query=employees:"WHERE department_id IN (10,20,30)" \
    query=job_history:"WHERE end_date >= DATE'2020-01-01'" \
    compression=all \
    job_name=EMP_TABLE_EXPORT
```
4. Partition Export

```bash
#!/bin/bash
# Export specific partitions from partitioned tables

expdp sales/password \
    directory=dp_dir \
    dumpfile=sales_2024.dmp \
    logfile=sales_2024.log \
    tables=sales_data:sales_2024_q1,sales_data:sales_2024_q2 \
    parallel=2 \
    job_name=PARTITION_EXPORT
```
Advanced Export Features

1. Consistent Export with Flashback

```bash
#!/bin/bash
# Export data as of a specific point in time

# Get current SCN
CURRENT_SCN=$(sqlplus -s system/password <<EOF
set heading off feedback off
select current_scn from v\$database;
EOF
)

expdp system/password \
    directory=dp_dir \
    dumpfile=consistent_export.dmp \
    logfile=consistent_export.log \
    schemas=hr,finance \
    flashback_scn=$CURRENT_SCN \
    parallel=4 \
    job_name=CONSISTENT_EXPORT
```
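If a wall-clock time is more convenient than an SCN, the FLASHBACK_TIME parameter accepts a timestamp expression. Because the expression contains quotes and spaces, it is easiest to pass through a parameter file; a minimal sketch with an illustrative schema and timestamp:

```bash
#!/bin/bash
# Consistent export keyed to a timestamp instead of an SCN.
# The schema and the timestamp value are illustrative only.
cat > flashback_time.par <<'EOF'
directory=dp_dir
dumpfile=consistent_time.dmp
logfile=consistent_time.log
schemas=hr
flashback_time="TO_TIMESTAMP('2024-06-01 02:00:00','YYYY-MM-DD HH24:MI:SS')"
EOF

expdp system/password parfile=flashback_time.par
```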
2. Network Export (No Dump File)

```bash
#!/bin/bash
# Transfer a schema directly between databases without writing a dump file.
# Note: expdp with NETWORK_LINK still writes a dump file on the local server;
# a truly file-less transfer is run as impdp on the target database, pulling
# from the source over a database link.

impdp system/password@target_db \
    network_link=source_db_link \
    schemas=hr \
    parallel=4 \
    job_name=NETWORK_TRANSFER
```
3. Metadata-Only Export

```bash
#!/bin/bash
# Export database structure without data

expdp system/password \
    directory=dp_dir \
    dumpfile=metadata_only.dmp \
    logfile=metadata_only.log \
    full=y \
    content=metadata_only \
    compression=all \
    job_name=METADATA_EXPORT
```
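A metadata-only dump is often used to regenerate DDL. Data Pump Import's SQLFILE parameter writes the DDL to a script without executing it; a brief sketch assuming the dump created above:

```bash
#!/bin/bash
# Extract DDL from the metadata-only dump; nothing is imported into the database.
impdp system/password \
    directory=dp_dir \
    dumpfile=metadata_only.dmp \
    sqlfile=metadata_ddl.sql \
    job_name=DDL_EXTRACT
```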
Filtering and Exclusions

1. Object Type Filtering

```bash
#!/bin/bash
# Export with complex filtering rules
# Note: combining INCLUDE and EXCLUDE in a single job requires Oracle Database
# 21c or later; earlier releases reject the combination.

expdp system/password \
    directory=dp_dir \
    dumpfile=filtered_export.dmp \
    schemas=hr \
    exclude=table:"LIKE 'TMP_%'" \
    exclude=table:"IN ('AUDIT_LOG','ERROR_LOG')" \
    exclude=index:"LIKE '%_TMP'" \
    exclude=constraint \
    exclude=ref_constraint \
    exclude=statistics \
    include=table:"LIKE 'EMP%'" \
    job_name=FILTERED_EXPORT
```
2. Data Filtering with Subqueries

```bash
#!/bin/bash
# Export with complex WHERE conditions

expdp hr/password \
    directory=dp_dir \
    dumpfile=active_employees.dmp \
    tables=employees \
    query=employees:"WHERE employee_id IN (
        SELECT employee_id
        FROM   employees e
        WHERE  e.department_id IN (
                   SELECT department_id
                   FROM   departments
                   WHERE  location_id IN (1700,1800)
               )
        AND    e.hire_date >= DATE'2020-01-01'
    )" \
    job_name=ACTIVE_EMP_EXPORT
```
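Long QUERY clauses like this are fragile on the command line because the shell consumes the quotes before expdp sees them. Moving the filter into a parameter file avoids the escaping entirely; a sketch using a simplified version of the same condition:

```bash
#!/bin/bash
# Same idea as above, with the QUERY clause kept in a parameter file
# so no shell escaping is needed. The WHERE clause is simplified for brevity.
cat > active_employees.par <<'EOF'
directory=dp_dir
dumpfile=active_employees.dmp
logfile=active_employees.log
tables=employees
query=employees:"WHERE department_id IN (SELECT department_id FROM departments WHERE location_id IN (1700,1800))"
job_name=ACTIVE_EMP_EXPORT
EOF

expdp hr/password parfile=active_employees.par
```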
Performance Optimization

1. Parallel Export Configuration

```bash
#!/bin/bash
# Configure parallel export for maximum performance

# Calculate optimal parallel degree
PARALLEL_DEGREE=$(nproc)

expdp system/password \
    directory=dp_dir \
    dumpfile=optimized_%U.dmp \
    logfile=optimized_export.log \
    full=y \
    parallel=$PARALLEL_DEGREE \
    cluster=n \
    compression=all \
    metrics=y \
    logtime=all \
    job_name=OPTIMIZED_EXPORT
```
2. Large Table Optimization

```bash
#!/bin/bash
# Export large tables efficiently

expdp system/password \
    directory=dp_dir \
    dumpfile=large_table_%U.dmp \
    tables=transactions.large_table \
    parallel=16 \
    compression=all \
    cluster=n \
    access_method=direct_path \
    job_name=LARGE_TABLE_EXPORT
```
Export Monitoring

1. Real-Time Progress Monitoring

```sql
-- monitor_export.sql
-- Monitor running Data Pump export jobs

-- Job overview
SELECT owner_name, job_name, operation, job_mode, state,
       degree, attached_sessions
FROM   dba_datapump_jobs
WHERE  operation = 'EXPORT'
ORDER  BY job_name;

-- Detailed progress
SELECT sid, serial#, username, opname, target_desc,
       sofar, totalwork,
       ROUND(sofar/totalwork*100, 2) "% Complete",
       time_remaining/60 "Minutes Remaining"
FROM   v$session_longops
WHERE  opname LIKE 'Data Pump%'
AND    sofar <> totalwork
ORDER  BY start_time;

-- Worker process status
SELECT owner_name, job_name, operation, job_mode, state,
       degree, attached_sessions, datapump_sessions
FROM   dba_datapump_jobs
WHERE  state = 'EXECUTING';
```
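For unattended monitoring, the same view can be polled from a shell loop. A minimal sketch; the DAILY_EXPORT job-name pattern is an assumption and should match your own job_name value:

```bash
#!/bin/bash
# Poll the Data Pump job state every 30 seconds until the job disappears
# from dba_datapump_jobs. The job name pattern is a placeholder.
while true; do
    STATE=$(sqlplus -s system/password <<EOF
set pagesize 0 heading off feedback off
select state from dba_datapump_jobs where job_name like 'DAILY_EXPORT%';
EOF
)
    if [ -z "$STATE" ]; then
        echo "No matching Data Pump job found; exiting."
        break
    fi
    echo "$(date '+%H:%M:%S') job state: $STATE"
    sleep 30
done
```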
2. Export Job Control

```bash
#!/bin/bash
# Attach to and control a running export job

# Attach to running job
expdp system/password attach=FULL_DB_EXPORT

# Interactive commands:
# Export> status                          # Check current status
# Export> parallel=8                      # Change parallel degree
# Export> add_file=extra_01.dmp size=5G   # Add dump file
# Export> stop_job=immediate              # Stop job immediately
# Export> start_job                       # Resume stopped job
# Export> kill_job                        # Terminate and remove job
```
Security and Encryption

1. Transparent Data Encryption Export

```bash
#!/bin/bash
# Export with password-based encryption
# (encryption_mode=transparent or dual would additionally require a TDE wallet)

expdp system/password \
    directory=dp_dir \
    dumpfile=encrypted_data.dmp \
    schemas=sensitive_data \
    encryption=all \
    encryption_algorithm=AES256 \
    encryption_mode=password \
    encryption_password=ComplexPass123! \
    job_name=ENCRYPTED_EXPORT
```
2. Secure Parameter File

```bash
# Create parameter file with sensitive information
cat > export_params.par << EOF
userid=system/password
directory=dp_dir
dumpfile=secure_export_%U.dmp
logfile=secure_export.log
schemas=hr,finance
encryption_password=SecurePassword123
compression=all
parallel=4
exclude=statistics
EOF

# Set restrictive permissions
chmod 600 export_params.par

# Run export with parameter file
expdp parfile=export_params.par
```
Automation Scripts

1. Automated Daily Export

```bash
#!/bin/bash
# Automated export with retention management

# Configuration
EXPORT_DIR="/u01/exports"
RETENTION_DAYS=7
SCHEMAS="hr,finance,sales"
LOG_DIR="/u01/logs"
EMAIL="dba@example.com"    # notification recipient (placeholder; adjust)

# Set date variables
DATE=$(date +%Y%m%d)
TIMESTAMP=$(date +%Y%m%d_%H%M%S)

# Create log file
LOG_FILE="${LOG_DIR}/export_${TIMESTAMP}.log"
exec 1> >(tee -a "$LOG_FILE")
exec 2>&1

echo "Starting automated export at $(date)"

# Export schemas
expdp system/password \
    directory=dp_dir \
    dumpfile=daily_${DATE}_%U.dmp \
    logfile=daily_${DATE}.log \
    schemas=$SCHEMAS \
    compression=all \
    parallel=4 \
    job_name=DAILY_EXPORT_${DATE}

# Check export status
if [ $? -eq 0 ]; then
    echo "Export completed successfully"

    # Clean up old exports
    find $EXPORT_DIR -name "daily_*.dmp" -mtime +$RETENTION_DAYS -delete
    find $EXPORT_DIR -name "daily_*.log" -mtime +$RETENTION_DAYS -delete

    # Send success notification
    echo "Daily export completed successfully" | \
        mail -s "Export Success - ${DATE}" $EMAIL
else
    echo "Export failed with error"

    # Send failure notification
    tail -100 $LOG_FILE | \
        mail -s "Export Failed - ${DATE}" $EMAIL
fi
```
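To schedule the script, a cron entry under the oracle OS user is the usual approach. A sketch assuming the script is saved as /u01/scripts/daily_export.sh (a placeholder path) and that ORACLE_HOME and ORACLE_SID are set inside the script, since cron provides a minimal environment:

```bash
# crontab -e as the oracle user: run the daily export at 02:00.
# The script path is a placeholder; cron does not load the oracle user's profile.
0 2 * * * /u01/scripts/daily_export.sh >> /u01/logs/daily_export_cron.log 2>&1
```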
2. Schema Refresh Export

```bash
#!/bin/bash
# Export for test environment refresh

# Parameters
SOURCE_SCHEMA=$1
TARGET_ENV=$2
SCRAMBLE_DATA=${3:-false}

if [ -z "$SOURCE_SCHEMA" ] || [ -z "$TARGET_ENV" ]; then
    echo "Usage: $0 <schema_name> <target_env> [scramble_data]"
    exit 1
fi

DATE=$(date +%Y%m%d_%H%M%S)

# Base export
expdp system/password \
    directory=dp_dir \
    dumpfile=${SOURCE_SCHEMA}_${TARGET_ENV}_${DATE}.dmp \
    logfile=${SOURCE_SCHEMA}_${TARGET_ENV}_${DATE}.log \
    schemas=$SOURCE_SCHEMA \
    compression=all \
    exclude=statistics \
    job_name=REFRESH_${SOURCE_SCHEMA}_${DATE}

# If data scrambling is required
if [ "$SCRAMBLE_DATA" = "true" ]; then
    echo "Note: Data scrambling must be performed during import"
    echo "Use the REMAP_DATA parameter during import for sensitive columns"
fi

echo "Export completed: ${SOURCE_SCHEMA}_${TARGET_ENV}_${DATE}.dmp"
```
Best Practices

1. Pre-Export Checklist

```sql
-- pre_export_checks.sql
-- Run before major exports

-- Check database size
SELECT SUM(bytes)/1024/1024/1024 size_gb
FROM   dba_segments;

-- Check directory space (host command from SQL*Plus)
!df -h /u01/exports

-- Check running jobs
SELECT job_name, state FROM dba_datapump_jobs;

-- Verify directory permissions
SELECT directory_name, directory_path
FROM   dba_directories
WHERE  directory_name = 'DP_DIR';

-- Check Data Pump roles granted to the session
SELECT * FROM session_roles
WHERE  role LIKE 'DATAPUMP%';
```
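Before committing to a large export window, ESTIMATE_ONLY reports the expected dump size without writing any data; a minimal sketch for the schemas used in the daily export:

```bash
#!/bin/bash
# Size the export first: with estimate_only=y no dump file is written,
# so only a logfile (no dumpfile) is specified.
expdp system/password \
    directory=dp_dir \
    logfile=export_estimate.log \
    schemas=hr,finance \
    estimate_only=y \
    estimate=blocks \
    job_name=EXPORT_ESTIMATE
```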
2. Export Validation Script

```bash
#!/bin/bash
# Validate export completion and integrity

DUMPFILE=$1
LOGFILE=$2

if [ -z "$DUMPFILE" ] || [ -z "$LOGFILE" ]; then
    echo "Usage: $0 <dumpfile> <logfile>"
    exit 1
fi

# Check if files exist
if [ ! -f "$DUMPFILE" ] || [ ! -f "$LOGFILE" ]; then
    echo "ERROR: Export files not found"
    exit 1
fi

# Check for errors in log
ERROR_COUNT=$(grep -c "ORA-" "$LOGFILE")
if [ $ERROR_COUNT -gt 0 ]; then
    echo "WARNING: Found $ERROR_COUNT errors in log file"
    grep "ORA-" "$LOGFILE"
fi

# Check completion status
if grep -q "successfully completed" "$LOGFILE"; then
    echo "Export completed successfully"

    # Get export statistics
    grep -A5 "Master table" "$LOGFILE"
    grep "exported" "$LOGFILE" | tail -10
else
    echo "ERROR: Export did not complete successfully"
    exit 1
fi
```
Troubleshooting

Common Export Issues

| Error | Cause | Solution |
|---|---|---|
| ORA-39001 | Invalid argument | Check parameter syntax |
| ORA-39002 | Invalid operation | Verify privileges |
| ORA-39070 | Unable to open file | Check directory object and OS permissions (see the check below) |
| ORA-39095 | Dump file space exhausted | Increase filesize or add dump files |
| ORA-39126 | Worker process error | Check the export log and database alert log |
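For ORA-39070 specifically, confirm both the directory object path and the OS permissions of the database owner. A hedged check assuming the DP_DIR object used throughout this guide:

```bash
#!/bin/bash
# Resolve the DP_DIR object's OS path, then verify the oracle user can write to it.
DIR_PATH=$(sqlplus -s system/password <<EOF
set pagesize 0 heading off feedback off
select directory_path from dba_directories where directory_name = 'DP_DIR';
EOF
)
ls -ld "$DIR_PATH"
touch "${DIR_PATH}/dp_write_test" && rm "${DIR_PATH}/dp_write_test" && echo "Directory is writable"
```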
Debug Mode Export

```bash
# Enable detailed tracing
expdp system/password \
    directory=dp_dir \
    dumpfile=debug_export.dmp \
    schemas=hr \
    trace=480300 \
    metrics=y \
    logtime=all \
    job_name=DEBUG_EXPORT
```
This comprehensive guide provides everything needed for successful Data Pump export operations in production environments.