Using the Data Pump SQLFILE Parameter to Generate Index-Creation DDL Statements
Author | JiekeXu
Source | WeChat official account JiekeXu DBA之路 (ID: JiekeXu_IT)
For reprint authorization, please contact the author (personal WeChat ID: JiekeXu_DBA)
Hello everyone, I'm JiekeXu, and I'm glad to see you again. Today we will learn how to use the Data Pump SQLFILE parameter to generate the DDL statements that create indexes.
Oracle introduced Data Pump in 10g, and export and import have been faster and more convenient ever since. A simple help=y shows the many useful parameters. As long as you can log in to the database server, exporting and importing are straightforward and efficient: you can interact with Oracle Data Pump through the command line, a parameter file, or interactive-command mode.
Choose among these three options:
Command-line interface: lets you specify most export parameters directly on the command line.
Parameter-file interface: lets you specify command-line parameters in a parameter file. The only exception is the PARFILE parameter, because parameter files cannot be nested. If the value of a parameter requires quotation marks, Oracle recommends that you use a parameter file.
Interactive-command interface: stops logging to the terminal and displays the Export prompt, at which you can enter various commands, some of which are specific to interactive-command mode. You can enable this mode by pressing Ctrl+C during an export operation started with the command-line or parameter-file interface. Interactive-command mode is also enabled when you attach to an executing or stopped job.
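As a minimal sketch of the parameter-file interface, you could put the parameters in a parfile and pass it with PARFILE. Every name below (the exp_scott.par file, the dmpdir directory object, the EXCLUDE filter) is an assumption for illustration, not taken from the article:

```shell
# Write the export parameters to a parameter file (hypothetical names).
# A parfile is recommended whenever a value needs quotation marks,
# such as the EXCLUDE filter below.
cat > exp_scott.par <<'EOF'
DIRECTORY=dmpdir
DUMPFILE=scott.dmp
LOGFILE=scott.log
SCHEMAS=scott
EXCLUDE=TABLE:"IN ('BONUS')"
EOF

# The export itself would then be started with (requires a reachable database):
#   expdp scott/tiger PARFILE=exp_scott.par
grep -c '=' exp_scott.par   # each parfile line is a KEYWORD=value pair
```

The parfile keeps the shell from mangling the quoted EXCLUDE value, which is exactly the case where Oracle recommends this interface.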
Initialization Parameters That Affect Data Pump Performance
The settings of certain Oracle Database initialization parameters affect the performance of Data Pump Export and Import.
In particular, you can try the following settings to improve performance, although the effect may not be the same on all platforms:
DISK_ASYNCH_IO=TRUE
DB_BLOCK_CHECKING=FALSE
DB_BLOCK_CHECKSUM=FALSE
The values of the following initialization parameters must be set high enough to allow maximum parallelism:
PROCESSES
SESSIONS
PARALLEL_MAX_SERVERS
In addition, the SHARED_POOL_SIZE and UNDO_TABLESPACE initialization parameters should be sized generously; the exact values depend on the size of the database.
When multiple users run Data Pump jobs in the same database environment, you can use the MAX_DATAPUMP_JOBS_PER_PDB and MAX_DATAPUMP_PARALLEL_PER_JOB initialization parameters to gain more control over resource utilization. MAX_DATAPUMP_JOBS_PER_PDB determines the maximum number of concurrent Oracle Data Pump jobs per pluggable database (PDB). In Oracle Database 19c and later, you can set this parameter to AUTO, which means Oracle Data Pump derives the actual value of MAX_DATAPUMP_JOBS_PER_PDB as 50% of the value of the SESSIONS initialization parameter. If you do not set the value to AUTO, the default is 100, and you can set a value from 0 to 250.
Oracle Database 19c and later also include the initialization parameter MAX_DATAPUMP_PARALLEL_PER_JOB, which likewise gives you more control over resource utilization when multiple users run Data Pump jobs concurrently. MAX_DATAPUMP_PARALLEL_PER_JOB specifies the maximum number of parallel processes each Oracle Data Pump job can use. You can either specify an explicit maximum, from 1 to 1024 (the default is 1024), or specify AUTO, in which case Oracle Data Pump derives the actual value as 25% of the value of the SESSIONS initialization parameter.
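To see where these limits currently stand on an instance, one approach is to collect the SHOW PARAMETER commands in a small SQL*Plus script. The sketch below only writes that script; the file name check_dp_params.sql is made up, and actually running it requires sqlplus, DBA privileges, and (for the two Data Pump parameters) a 19c or later instance:

```shell
# Build a small SQL*Plus script that reports the limits discussed above
# (hypothetical file name; run it as a DBA user).
cat > check_dp_params.sql <<'EOF'
show parameter processes
show parameter sessions
show parameter parallel_max_servers
show parameter max_datapump_jobs_per_pdb
show parameter max_datapump_parallel_per_job
EOF

# With a running instance you would execute it like this:
#   sqlplus / as sysdba @check_dp_params.sql
wc -l < check_dp_params.sql   # five parameters are checked
```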
Below is the expdp/impdp help=y output; with this help information you can make good use of Data Pump. The output shown here is from release 11.2.0.4.
[oracle@JiekeXu ~]$ expdp help=y
Export: Release 11.2.0.4.0 - Production on Thu Nov 4 18:48:36 2021

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

The Data Pump export utility provides a mechanism for transferring data objects
between Oracle databases. The utility is invoked with the following command:

   Example: expdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp

You can control how Export runs by entering the 'expdp' command followed
by various parameters. To specify parameters, you use keywords:

   Format:  expdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
   Example: expdp scott/tiger DUMPFILE=scott.dmp DIRECTORY=dmpdir SCHEMAS=scott
               or TABLES=(T1:P1,T1:P2), if T1 is partitioned table

USERID must be the first parameter on the command line.

------------------------------------------------------------------------------

The available keywords and their descriptions follow. Default values are listed within square brackets.
ATTACH
Attach to an existing job.
For example, ATTACH=job_name.
CLUSTER
Utilize cluster resources and distribute workers across the Oracle RAC.
Valid keyword values are: [Y] and N.
COMPRESSION
Reduce the size of a dump file.
Valid keyword values are: ALL, DATA_ONLY, [METADATA_ONLY] and NONE.
CONTENT
Specifies data to unload.
Valid keyword values are: [ALL], DATA_ONLY and METADATA_ONLY.
DATA_OPTIONS
Data layer option flags.
Valid keyword values are: XML_CLOBS.
DIRECTORY
Directory object to be used for dump and log files.
DUMPFILE
Specify list of destination dump file names [expdat.dmp].
For example, DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
ENCRYPTION
Encrypt part or all of a dump file.
Valid keyword values are: ALL, DATA_ONLY, ENCRYPTED_COLUMNS_ONLY, METADATA_ONLY and NONE.
ENCRYPTION_ALGORITHM
Specify how encryption should be done.
Valid keyword values are: [AES128], AES192 and AES256.
ENCRYPTION_MODE
Method of generating encryption key.
Valid keyword values are: DUAL, PASSWORD and [TRANSPARENT].
ENCRYPTION_PASSWORD
Password key for creating encrypted data within a dump file.
ESTIMATE
Calculate job estimates.
Valid keyword values are: [BLOCKS] and STATISTICS.
ESTIMATE_ONLY
Calculate job estimates without performing the export.
EXCLUDE
Exclude specific object types.
For example, EXCLUDE=SCHEMA:"='HR'".
FILESIZE
Specify the size of each dump file in units of bytes.
FLASHBACK_SCN
SCN used to reset session snapshot.
FLASHBACK_TIME
Time used to find the closest corresponding SCN value.
FULL
Export entire database [N].
HELP
Display Help messages [N].
INCLUDE
Include specific object types.
For example, INCLUDE=TABLE_DATA.
JOB_NAME
Name of export job to create.
LOGFILE
Specify log file name [export.log].
NETWORK_LINK
Name of remote database link to the source system.
NOLOGFILE
Do not write log file [N].
PARALLEL
Change the number of active workers for current job.
PARFILE
Specify parameter file name.
QUERY
Predicate clause used to export a subset of a table.
For example, QUERY=employees:"WHERE department_id > 10".
REMAP_DATA
Specify a data conversion function.
For example, REMAP_DATA=EMP.EMPNO:REMAPPKG.EMPNO.
REUSE_DUMPFILES
Overwrite destination dump file if it exists [N].
SAMPLE
Percentage of data to be exported.
SCHEMAS
List of schemas to export [login schema].
SERVICE_NAME
Name of an active Service and associated resource group to constrain Oracle RAC resources.
SOURCE_EDITION
Edition to be used for extracting metadata.
STATUS
Frequency (secs) job status is to be monitored where
the default [0] will show new status when available.
TABLES
Identifies a list of tables to export.
For example, TABLES=HR.EMPLOYEES,SH.SALES:SALES_1995.
TABLESPACES
Identifies a list of tablespaces to export.
TRANSPORTABLE
Specify whether transportable method can be used.
Valid keyword values are: ALWAYS and [NEVER].
TRANSPORT_FULL_CHECK
Verify storage segments of all tables [N].
TRANSPORT_TABLESPACES
List of tablespaces from which metadata will be unloaded.
VERSION
Version of objects to export.
Valid keyword values are: [COMPATIBLE], LATEST or any valid database version.

------------------------------------------------------------------------------

The following commands are valid while in interactive mode.
Note: abbreviations are allowed.

ADD_FILE
Add dumpfile to dumpfile set.
CONTINUE_CLIENT
Return to logging mode. Job will be restarted if idle.
EXIT_CLIENT
Quit client session and leave job running.
FILESIZE
Default filesize (bytes) for subsequent ADD_FILE commands.
HELP
Summarize interactive commands.
KILL_JOB
Detach and delete job.
PARALLEL
Change the number of active workers for current job.
REUSE_DUMPFILES
Overwrite destination dump file if it exists [N].
START_JOB
Start or resume current job.
Valid keyword values are: SKIP_CURRENT.
STATUS
Frequency (secs) job status is to be monitored where
the default [0] will show new status when available.
STOP_JOB
Orderly shutdown of job execution and exits the client.
Valid keyword values are: IMMEDIATE.
[oracle@JiekeXu ~]$ impdp help=y
Import: Release 11.2.0.4.0 - Production on Thu Nov 4 18:50:07 2021

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

The Data Pump Import utility provides a mechanism for transferring data objects
between Oracle databases. The utility is invoked with the following command:

   Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp

You can control how Import runs by entering the 'impdp' command followed
by various parameters. To specify parameters, you use keywords:

   Format:  impdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
   Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp

USERID must be the first parameter on the command line.

------------------------------------------------------------------------------

The available keywords and their descriptions follow. Default values are listed within square brackets.
ATTACH
Attach to an existing job.
For example, ATTACH=job_name.
CLUSTER
Utilize cluster resources and distribute workers across the Oracle RAC.
Valid keyword values are: [Y] and N.
CONTENT
Specifies data to load.
Valid keywords are: [ALL], DATA_ONLY and METADATA_ONLY.
DATA_OPTIONS
Data layer option flags.
Valid keywords are: SKIP_CONSTRAINT_ERRORS.
DIRECTORY
Directory object to be used for dump, log and SQL files.
DUMPFILE
List of dump files to import from [expdat.dmp].
For example, DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
ENCRYPTION_PASSWORD
Password key for accessing encrypted data within a dump file.
Not valid for network import jobs.
ESTIMATE
Calculate job estimates.
Valid keywords are: [BLOCKS] and STATISTICS.
EXCLUDE
Exclude specific object types.
For example, EXCLUDE=SCHEMA:"='HR'".
FLASHBACK_SCN
SCN used to reset session snapshot.
FLASHBACK_TIME
Time used to find the closest corresponding SCN value.
FULL
Import everything from source [Y].
HELP
Display help messages [N].
INCLUDE
Include specific object types.
For example, INCLUDE=TABLE_DATA.
JOB_NAME
Name of import job to create.
LOGFILE
Log file name [import.log].
NETWORK_LINK
Name of remote database link to the source system.
NOLOGFILE
Do not write log file [N].
PARALLEL
Change the number of active workers for current job.
PARFILE
Specify parameter file.
PARTITION_OPTIONS
Specify how partitions should be transformed.
Valid keywords are: DEPARTITION, MERGE and [NONE].
QUERY
Predicate clause used to import a subset of a table.
For example, QUERY=employees:"WHERE department_id > 10".
REMAP_DATA
Specify a data conversion function.
For example, REMAP_DATA=EMP.EMPNO:REMAPPKG.EMPNO.
REMAP_DATAFILE
Redefine data file references in all DDL statements.
REMAP_SCHEMA
Objects from one schema are loaded into another schema.
REMAP_TABLE
Table names are remapped to another table.
For example, REMAP_TABLE=HR.EMPLOYEES:EMPS.
REMAP_TABLESPACE
Tablespace objects are remapped to another tablespace.
REUSE_DATAFILES
Tablespace will be initialized if it already exists [N].
SCHEMAS
List of schemas to import.
SERVICE_NAME
Name of an active Service and associated resource group to constrain Oracle RAC resources.
SKIP_UNUSABLE_INDEXES
Skip indexes that were set to the Index Unusable state.
SOURCE_EDITION
Edition to be used for extracting metadata.
SQLFILE
Write all the SQL DDL to a specified file.
STATUS
Frequency (secs) job status is to be monitored where
the default [0] will show new status when available.
STREAMS_CONFIGURATION
Enable the loading of Streams metadata
TABLE_EXISTS_ACTION
Action to take if imported object already exists.
Valid keywords are: APPEND, REPLACE, [SKIP] and TRUNCATE.
TABLES
Identifies a list of tables to import.
For example, TABLES=HR.EMPLOYEES,SH.SALES:SALES_1995.
TABLESPACES
Identifies a list of tablespaces to import.
TARGET_EDITION
Edition to be used for loading metadata.
TRANSFORM
Metadata transform to apply to applicable objects.
Valid keywords are: OID, PCTSPACE, SEGMENT_ATTRIBUTES and STORAGE.
TRANSPORTABLE
Options for choosing transportable data movement.
Valid keywords are: ALWAYS and [NEVER].
Only valid in NETWORK_LINK mode import operations.
TRANSPORT_DATAFILES
List of data files to be imported by transportable mode.
TRANSPORT_FULL_CHECK
Verify storage segments of all tables [N].
TRANSPORT_TABLESPACES
List of tablespaces from which metadata will be loaded.
Only valid in NETWORK_LINK mode import operations.
VERSION
Version of objects to import.
Valid keywords are: [COMPATIBLE], LATEST or any valid database version.
Only valid for NETWORK_LINK and SQLFILE.

------------------------------------------------------------------------------

The following commands are valid while in interactive mode.
Note: abbreviations are allowed.

CONTINUE_CLIENT
Return to logging mode. Job will be restarted if idle.
EXIT_CLIENT
Quit client session and leave job running.
HELP
Summarize interactive commands.
KILL_JOB
Detach and delete job.
PARALLEL
Change the number of active workers for current job.
START_JOB
Start or resume current job.
Valid keywords are: SKIP_CURRENT.
STATUS
Frequency (secs) job status is to be monitored where
the default [0] will show new status when available.
STOP_JOB
Orderly shutdown of job execution and exits the client.
Valid keywords are: IMMEDIATE.
[oracle@JiekeXu ~]$ impdp help=y | grep SQLFILE -A 3
SQLFILE
Write all the SQL DDL to a specified file.
STATUS
--
Only valid for NETWORK_LINK and SQLFILE.
VIEWS_AS_TABLES
Identifies one or more views to be imported as tables.
As shown above, SQLFILE appears among the impdp parameters; its description reads "Write all the SQL DDL to a specified file."
SQLFILE is an impdp parameter. When you run an import with this parameter, the import is not actually performed; instead, the DDL statements contained in the dump file are written out. This makes it a convenient way to generate the SQL statements for creating tablespaces, users, tables, indexes, and so on.
Let's take index creation as an example:
-- Use the DBMS_METADATA package to obtain index-creation statements
set long 9999 line 9999 pages 9999
SELECT dbms_lob.substr(dbms_metadata.get_ddl('INDEX',INDEX_NAME,'SCOTT'))||';' from dba_indexes where owner='SCOTT';
-- You can call GET_DDL directly to obtain the creation SQL for an object; of course, you can also use the sqlfile parameter, which is suitable for generating the creation statements for many objects at once, such as indexes and tables.
select dbms_metadata.get_ddl('TABLE','TEST','SYS') from dual;
-- As the application user
conn jieke/jiekexu123
select dbms_metadata.get_ddl('INDEX',u.object_name) from user_objects u where object_type='INDEX';
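The GET_DDL queries above can be wrapped in a spool script so that all of a schema's index DDL lands in one file. The sketch below only writes that script; the file names gen_index_ddl.sql and scott_index_ddl.sql are assumptions, and executing it requires sqlplus, DBA privileges, and a reachable database:

```shell
# Generate a SQL*Plus script that spools index DDL for one schema
# (file names and the SCOTT owner are illustrative).
cat > gen_index_ddl.sql <<'EOF'
set long 999999 linesize 999 pagesize 0 heading off feedback off
exec dbms_metadata.set_transform_param(dbms_metadata.session_transform, 'SQLTERMINATOR', true);
spool scott_index_ddl.sql
select dbms_metadata.get_ddl('INDEX', index_name, owner)
  from dba_indexes
 where owner = 'SCOTT';
spool off
EOF

# Run it (requires a database connection):
#   sqlplus / as sysdba @gen_index_ddl.sql
grep -c 'spool' gen_index_ddl.sql   # the script opens and closes one spool file
```

Setting SQLTERMINATOR avoids having to append the trailing semicolons by hand, as the dbms_lob.substr query above does with ||';'.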
Generating index-creation statements with the impdp sqlfile parameter
The example below uses the scott schema; a full-database export works as well.
expdp \'/ as sysdba \' directory=exp_dir dumpfile=index.dmp schemas=scott logfile=out_index.log cluster=n include=INDEX
impdp \'/ as sysdba \' directory=exp_dir dumpfile=index.dmp logfile=index.log cluster=n sqlfile=cre_index.sql include=INDEX
impdp hr/hr123 DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp SQLFILE=dpump_dir2:expfull.sql
-- A SQL file named expfull.sql is written to the dpump_dir2 directory.
The generated SQL file contains all the DDL statements in the dump file. The syntax is:
SQLFILE=[directory_object:]file_name
Notes:
1. directory_object does not have to match the impdp DIRECTORY parameter; if they are the same, directory_object can be omitted.
2. The SQLFILE file must be written to a file system on disk; it cannot be written into ASM.
3. SQLFILE conflicts with the QUERY parameter; they cannot be used together.
4. Passwords are not included in the SQL file. For example, if a CONNECT statement is part of the DDL that was run, it is replaced with a comment showing only the schema name. In the following example, the double hyphen (--) indicates that a comment follows; the hr schema name is shown, but not the password.
-- CONNECT hr
Therefore, before you can run the SQL file, you must edit it: remove the double hyphen that marks the comment and add the password for the hr schema.
5. Oracle Data Pump places all ALTER SESSION statements at the top of the SQL file that the import creates. If the import operation involves different connect statements, you must manually copy each ALTER SESSION statement and paste it after the appropriate CONNECT statement. For some Oracle Database options, anonymous PL/SQL blocks can appear in the SQLFILE output; do not run those PL/SQL blocks directly.
6. If SQLFILE is specified, the CONTENT parameter is ignored if it is set to ALL or DATA_ONLY.
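The manual edit described in note 4 can be scripted. The sketch below fabricates a two-line sample of what a SQLFILE fragment looks like and uses sed to turn the commented CONNECT back into an executable statement; the sample file name and the <password> placeholder are assumptions, and you would still substitute the real hr password and review the file before running it against a database:

```shell
# Simulate a fragment of a Data Pump SQLFILE (file name is illustrative).
cat > cre_index_sample.sql <<'EOF'
-- CONNECT hr
CREATE INDEX "HR"."EMP_NAME_IX" ON "HR"."EMPLOYEES" ("LAST_NAME", "FIRST_NAME");
EOF

# Uncomment the CONNECT line and append a password placeholder;
# replace <password> with the real hr password before running the file.
sed -i 's|^-- CONNECT \(.*\)$|CONNECT \1/<password>|' cre_index_sample.sql

head -1 cre_index_sample.sql   # → CONNECT hr/<password>
```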
For more details, see the official documentation:
https://docs.oracle.com/en/database/oracle/oracle-database/21/sutil/oracle-data-pump.html#GUID-501A9908-BCC5-434C-8853-9A6096766B5A
That's all. I hope this helps you; if you found it useful, feel free to share it with your friends and colleagues so we can learn and improve together.
❤️ Welcome to follow my official account and learn new things together!
————————————————————————————
Official account: JiekeXu DBA之路
Modb: https://www.modb.pro/u/4347
CSDN: https://blog.csdn.net/JiekeXu
Tencent Cloud: https://cloud.tencent.com/developer/user/5645107
————————————————————————————