Performance tips for time series
You can improve the performance of loading and querying time series data with the following guidelines.
Run routines faster after server start
Load the time series extension into memory when the server starts instead of when you first run a time series routine. Set the PRELOAD_DLL_FILE configuration parameter to the name of the time series extension shared library file, TimeSeries.bld. For example, add the following line to your onconfig file:
PRELOAD_DLL_FILE $INFORMIXDIR/extend/TimeSeries.version/TimeSeries.bld
In the path, version is the version number of the extension. To find the installed version number, run the TimeSeriesRelease function. The version number of the time series extension can change in any fix pack or release, so after you upgrade, update the value of the PRELOAD_DLL_FILE configuration parameter if the version number changed.
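For example, assuming the TimeSeries extension is already registered in your database, you can check the installed version from any SQL session:

```sql
-- Returns the release information for the TimeSeries extension;
-- use the version number it reports to build the PRELOAD_DLL_FILE path
EXECUTE FUNCTION TimeSeriesRelease();
```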
Load data faster
To load data faster, follow these tips:
- Load the data in ascending order by time. Data that is out of order takes longer to load.
- Load data from files with a loader program. The TSL_Put function loads data directly from files.
- Reduce logging while loading data. Include the TSOPEN_REDUCED_LOG option when you run the PutElem, PutElemNoDups, PutNthElem, InsElem, BulkLoad, PutTimeSeries, or TSL_Flush function. Include the reduced_log flag in the TSVTMode parameter when you create a virtual table with the TSCreateVirtualTab procedure.
- Lock containers so that only one session can write to a container at a time; data loads faster when a single session writes to each container. To lock a container, run the TSContainerLock procedure.
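As a sketch of how these tips combine, the following session locks a container, loads a row of data with the TSL loader functions, and flushes it to disk. The table, column, container, and data values (meter_readings, readings, raw_container, and so on) are hypothetical examples, and the exact TSL function signatures can vary by release, so check them against your version's documentation:

```sql
-- Allow only this session to write to the (hypothetical) container
EXECUTE PROCEDURE TSContainerLock('raw_container');

-- Initialize a loader session for the table and time series column
EXECUTE FUNCTION TSL_Init('meter_readings|readings');

-- Load one time-stamped element for one meter (example data)
EXECUTE FUNCTION TSL_Put('meter_readings|readings',
    'meter_1|2024-01-01 00:15:00.00000|4.5');

-- Flush the loaded data to disk; include the TSOPEN_REDUCED_LOG
-- flag here, per your version's TSL_Flush signature, to reduce logging
EXECUTE FUNCTION TSL_Flush('meter_readings|readings');

-- Release the container lock when the load is finished
EXECUTE PROCEDURE TSContainerUnlock('raw_container');
```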
Run queries faster
To run queries faster, follow these tips:
- Fragment your time series tables so that time series routines run in parallel.
- Fragment virtual tables that are based on fragmented time series tables so that queries on the virtual table are run in parallel.
- Aggregate an interval of time series data faster by including start and end dates in the TSRollup function.
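To illustrate the first tip, a time series base table can be fragmented by its primary key so that routines run in parallel across fragments. In this sketch, the table, column, and dbspace names are hypothetical, and meter_row is assumed to be a previously created row type that defines the shape of each time series element:

```sql
-- Hypothetical schema: rows are fragmented across two dbspaces by
-- meter ID so that time series routines can run in parallel
CREATE TABLE meter_readings (
    meter_id  INTEGER NOT NULL PRIMARY KEY,
    readings  TimeSeries(meter_row)
)
FRAGMENT BY EXPRESSION
    (meter_id <= 5000) IN dbspace1,
    (meter_id >  5000) IN dbspace2;
```

A virtual table that you create on this base table with the TSCreateVirtualTab procedure can then be fragmented the same way, so that queries on the virtual table also run in parallel.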