Creating data mart definitions manually by using workload analysis
Workload analysis involves two main steps: gathering information about your query workload, which is known as query probing, and analyzing the query probing data.
Before you begin
- You must have an existing database with a star or snowflake schema on the database server that you are connected to.
- You must have warehousing queries that match the acceleration criteria.
- A default sbspace must exist.
Procedure
- Connect to the database that contains the data warehouse tables. In these instructions, this database is referred to as the warehouse database.
- Update database statistics with the LOW option to generate a minimum amount of statistics for the database by running the UPDATE STATISTICS LOW statement.
Query probing is required to determine the fact table of the queries being analyzed. In many cases, the fact table is defined by the query (ANSI JOIN syntax), by star join optimization, or by optimizer directives that are provided by the user. In other cases, the table that contains the most rows is defined as the fact table. Running the UPDATE STATISTICS LOW statement ensures that database statistics contain the correct number of rows per table.
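For example, assuming the warehouse database is the current database, a single statement refreshes the minimal statistics, including the row counts, for all of its tables:

```sql
-- Generate minimal statistics (including row counts per table)
-- for all tables in the current database:
UPDATE STATISTICS LOW;
```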
- Remove existing probing results for the current database by running the following statement:
SET ENVIRONMENT use_dwa 'probe cleanup';
- Enable query probing for the current user session by running the following statement:
SET ENVIRONMENT use_dwa 'probe start';
- Optional: If you want to customize your data mart definition to include only the probing results of individual queries from your workload, enable SQL tracing by running the following statement as user informix, connected to the sysadmin database:
EXECUTE FUNCTION task("set sql tracing on","1000","4","low","global");
- Optional: To run the query probing more quickly by simulating the query workload instead of executing it, run the following statement:
SET EXPLAIN ON AVOID_EXECUTE;
Because the queries are not actually executed, simulating the query workload does not return query results.
- Run your warehousing queries in the current user session.
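As an illustration, a typical warehousing query joins a fact table to dimension tables and aggregates the result. The following query uses a hypothetical star schema; the table and column names are placeholders, not part of the procedure:

```sql
-- Example star-join query over a hypothetical sales fact table
-- and two dimension tables:
SELECT s.store_name, d.year, SUM(f.revenue) AS total_revenue
FROM sales f
JOIN store s ON f.store_id = s.store_id
JOIN calendar d ON f.date_id = d.date_id
WHERE d.year = 2024
GROUP BY s.store_name, d.year;
```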
- Disable query probing for the current user session by running the following statement:
SET ENVIRONMENT use_dwa 'probe stop';
- Optional: If you enabled SQL tracing, view the SQL trace information about the workload either by running the onstat -g his command or by querying the syssqltrace table in the sysmaster database.
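For example, to look up the SQL IDs of the traced queries, you can query the sql_id and sql_statement columns of the syssqltrace table; the LIKE filter shown here is only one way to narrow the output:

```sql
-- List traced SELECT statements and their SQL IDs
-- from the sysmaster database:
SELECT sql_id, sql_statement
FROM sysmaster:syssqltrace
WHERE sql_statement LIKE 'SELECT%';
```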
- If required, create a new logged database to store the data mart definition.
The data mart definition is stored in a set of tables called the data mart schema tables. These tables must reside in a logged database and are created automatically if they do not exist. Any logged database can contain them, including the warehouse database if it is logged. To separate the data mart schema tables from other user tables, create a separate database for them. This database is referred to as the schema database in the steps that follow.
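For example, a dedicated schema database could be created as follows; the name schemadb is a placeholder:

```sql
-- Create a logged database to hold the data mart schema tables:
CREATE DATABASE schemadb WITH LOG;
```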
- Convert the probing data into a data mart definition by running the ifx_probe2Mart() procedure in the schema database.
- To create a data mart definition from all of the probing data, run the following statement:
EXECUTE PROCEDURE ifx_probe2mart('warehouse database', 'data mart name');
- To create a data mart definition, or to extend an existing data mart definition, by using the probing data of a single query, run the following statement:
EXECUTE PROCEDURE ifx_probe2mart('warehouse database', 'data mart name', sql id);
To refer to a specific query by its SQL ID, SQL tracing must be enabled, as described earlier in this procedure.
- Optional: Remove the probing data from memory before you start any new query probing sessions by running the following command in your warehouse database:
SET ENVIRONMENT use_dwa 'probe cleanup';
- Optional: You can view the data mart definition by running the ifx_genMartDef() function in the schema database. This function returns a CLOB that contains the data mart definition in XML format. You can store the CLOB in an operating system file on the client computer by using the lotofile() function:
EXECUTE FUNCTION lotofile(ifx_genmartdef('data mart name'), 'file name', 'client');
- Deploy the data mart to your accelerator by using one of the following methods:
- The ifx_createMart() function in your warehouse database, by running the following statement:
EXECUTE FUNCTION ifx_createMart('accelerator name', 'data mart name', 'schema database name');
- The XML file that you created with the lotofile() function, by running the following statement:
EXECUTE FUNCTION ifx_createMart('accelerator name', filetoclob('file name', 'client'));
- Load the data mart on your accelerator by using the ifx_loadMart() function in your warehouse database, by running the following statement:
EXECUTE FUNCTION ifx_loadMart('accelerator name', 'data mart name', 'locking mode');
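Taken together, the core statements of the procedure can be sketched as a single session script. The names warehousedb, schemadb, myaccel, and mymart are placeholders, and 'locking mode' stands for whatever locking mode you choose; your own databases, accelerator, and queries go in their place:

```sql
-- Run in the warehouse database:
UPDATE STATISTICS LOW;
SET ENVIRONMENT use_dwa 'probe cleanup';
SET ENVIRONMENT use_dwa 'probe start';
-- ... run the warehousing queries here ...
SET ENVIRONMENT use_dwa 'probe stop';

-- Run in the schema database:
EXECUTE PROCEDURE ifx_probe2mart('warehousedb', 'mymart');

-- Run in the warehouse database:
EXECUTE FUNCTION ifx_createMart('myaccel', 'mymart', 'schemadb');
EXECUTE FUNCTION ifx_loadMart('myaccel', 'mymart', 'locking mode');
```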