The dbexport and dbimport utilities
The dbexport and dbimport utilities export and import a database and its schema to and from disk or tape.
The dbexport utility unloads an entire database into text files and creates a schema file. You can unload the database and its schema file either to disk or tape. If you prefer, you can unload the schema file to disk and unload the data to tape. You can use the schema file with the dbimport utility to re-create the database schema in another HCL Informix® environment, and you can edit the schema file to modify the database that dbimport creates.
The dbimport utility creates a database and loads it with data from text files on tape or disk. The input files consist of a schema file that is used to re-create the database and data files that contain the database data. Normally, you generate the input files with the dbexport utility, but you can use any properly formatted input files.
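For example, a minimal round trip might look like the following sketch. The database name, output directory, and dbspace are placeholders, and the options shown are only one common combination.

```
# Unload the stores_demo database to disk; dbexport creates a stores_demo.exp
# directory under /work/exports that holds the schema (.sql) and data (.unl) files.
dbexport -o /work/exports stores_demo

# Re-create and load the database on the target server from those files,
# placing it in the dbspace named dbspace1 and enabling logging.
dbimport -d dbspace1 -l -i /work/exports stores_demo
```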
When you import a database, use the same environment variable settings and configuration parameter settings that were used when the database was created.
- If any environment variables or configuration parameters that affect fragmentation, constraints, triggers, or user-defined routines are set differently than they were when these database objects were created originally, the database that is created by the dbimport utility might not be an accurate reproduction of the original.
- Incompatible settings are likely to occur if you move data from an earlier version of the database server to a newer version. Over time, some configuration parameters or environment variables are deprecated, or their default values are changed. For example, assume that attached indexes were created by default in the original database. In the current version of the database server, detached indexes are created by default. If you want to maintain the original behavior, you can set the DEFAULT_ATTACH environment variable to 1 before you run the dbimport utility.
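For example, to preserve the original attached-index behavior described in the previous item, you might set DEFAULT_ATTACH before you run the import. This is a sketch only; the database name and input directory are placeholders.

```
# Keep the original attached-index behavior when dbimport re-creates the database.
export DEFAULT_ATTACH=1
dbimport -i /work/exports stores_demo
```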
Also, the dbimport operation might fail when you attempt to import from a higher server version to a lower server version if the database schema changed between versions. For example, the am_expr_pushdown column was added to the sysams system catalog table in Informix® 11.70. The dbimport operation fails if you attempt to import a database that was exported from Informix® 12.10, and therefore contains the am_expr_pushdown column, into an Informix® 11.50 instance, which lacks that column. In that case, you must review the messages in the dbimport.out file, which is in your current directory. After you address the issues that caused the dbimport operation to fail, run the dbimport command again.
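If the import does fail, the messages in dbimport.out usually identify the cause. A minimal sketch, with placeholder names, that shows the messages on failure:

```
# Run the import; if it fails, show the messages that dbimport wrote to
# dbimport.out in the current directory, then fix the cause and rerun.
dbimport -i /work/exports stores_demo || cat dbimport.out
```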
Requirements or limitations apply in the following cases:
- Compressed data
The dbexport utility uncompresses compressed data. You must recompress the data after you use the dbimport utility to import the data; a recompression sketch follows this list.
- Date values
Use four-digit years for date values. The date context for an object includes the date when the object was created, the values of the DBCENTURY and GL_DATE environment variables, and some other environment variables. If the date context during import is not the same as when these objects were created, you might get errors, you might not be able to find your data, or a check constraint might not work as expected; some of these problems do not generate errors. A sketch of exporting with explicit date settings follows this list.
Tip: By default, the dbexport utility exports dates with four-digit years unless environment variables are set to values that override that format.
- High-availability clusters
You cannot use the dbexport utility on HDR secondary servers or shared disk (SD) secondary servers.
The dbexport utility is supported on a remote standalone (RS) secondary server only if the server is set to stop applying log files. Use the STOP_APPLY configuration parameter to stop application of log files.
The dbimport utility is supported on all updatable secondary servers.
- Label-based access control (LBAC)
When you export data that is LBAC-protected, the data that is exported is limited to the data that your LBAC credentials allow you to read. If your LBAC credentials do not allow you to read a row, that row is not exported, but no error is returned. To export all the rows, you must be able to see all the rows.
- NLSCASE mode
Whether the NLSCASE mode of your source database is SENSITIVE or INSENSITIVE, you can reduce the risk of case-sensitivity issues by always migrating to a target database that has the same NLSCASE mode as the source database. For tables that include columns with case-variant NCHAR and NVARCHAR data values (for example, 'IBM', 'ibm', 'Ibm'), you might encounter the following differences after migration:
- ORDER BY and sorting operations can produce a different ordering of qualifying rows in query results, compared to the result of the same query before migration.
- Unique indexes and referential constraints with which the data were compliant before the migration might have integrity violations in the new database, if any index or constraint key column contains case-variant forms of the same character string.
- Queries with predicates that apply conditional operators to NCHAR or NVARCHAR values might return different results from the same data after migration.
- Nondefault database locales
If the database uses a nondefault locale and the GL_DATETIME environment variable has a nondefault setting, you must set the USE_DTENV environment variable to 1 so that localized DATETIME values are processed correctly by the dbexport and dbimport utilities. A sketch follows this list.
- SELECT triggers on tables
You must disable SELECT triggers before you export a database with the dbexport utility. Because the dbexport utility runs SELECT statements during the export, any SELECT triggers could fire and modify the database content. A sketch of disabling a trigger follows this list.
- Virtual tables for the Informix® MQ extension
The MQCreateVtiRead(), MQCreateVtiReceive(), and MQCreateVtiWrite() functions create virtual tables and map them to the appropriate WebSphere® MQ message queue. When the dbexport utility unloads data, it removes the messages from WebSphere® MQ queues. Before you use the dbexport utility, drop any MQ virtual tables; a sketch follows this list. After you load the database with the dbimport utility, you can re-create the tables in the target database by using the appropriate functions.
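For the compressed-data case, one way to recompress after the import is the SQL administration API. This is a hedged sketch: the table name (customer), database name (stores_demo), and owner (informix) are placeholders, and the command is run against the sysadmin database.

```
# Recompress (and optionally repack and shrink) a table after dbimport completes.
echo "EXECUTE FUNCTION task('table compress repack shrink', 'customer', 'stores_demo', 'informix');" \
  | dbaccess sysadmin -
```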
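For the date-values case, the safest approach is to run the utilities with the same date-related settings that were in effect when the objects were created. The specific values below are examples only.

```
# Use the same date context for export and import as when the objects were created;
# the DBCENTURY and GL_DATE values shown here are placeholders for your original settings.
export DBCENTURY=C
export GL_DATE="%m/%d/%Y"
dbexport -o /work/exports stores_demo
```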
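For the nondefault-locale case, set USE_DTENV before you run either utility so that the nondefault GL_DATETIME setting is honored. The GL_DATETIME format string below is only an example.

```
# Process localized DATETIME values correctly during export and import.
export USE_DTENV=1
export GL_DATETIME="%d.%m.%Y %H:%M:%S"
dbexport -o /work/exports stores_demo
```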
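For the SELECT-trigger case, one way to disable a trigger for the duration of the export is the SET TRIGGERS statement; the trigger and database names here are placeholders.

```
# Disable a hypothetical SELECT trigger before the export, then re-enable it afterward.
echo "SET TRIGGERS sel_audit_trig DISABLED;" | dbaccess stores_demo -
dbexport -o /work/exports stores_demo
echo "SET TRIGGERS sel_audit_trig ENABLED;" | dbaccess stores_demo -
```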
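For the MQ case, dropping the virtual tables before the export prevents dbexport from draining the message queues. The table name is a placeholder; after the import, re-create the table in the target database with MQCreateVtiRead() (or the receive or write variant) and your own service and policy names.

```
# Drop a hypothetical MQ virtual table before running dbexport.
echo "DROP TABLE mq_orders_read;" | dbaccess stores_demo -
```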