| Custom Report Format
Files |
Use this field to specify IDs of reports that you want to
export in place of the default reports.
You can provide multiple report IDs separated by a comma. You
can navigate to Preferences of Test Performance (), and then select the Show Report IDs checkbox to
view the report IDs.
You must use the Custom Report Format
Files field along with the Exported Statistical Report Data
File or Exportstatshtml field.
For example, you can provide
http as a value in the Custom Report
Format Files field to export a Performance Report.
|
| Dataset Override |
Use this field to replace the dataset values during a
test or schedule
run.
You must ensure that both the original and the new
datasets are in the same workspace and have the same column names. When you
enter a value for the Dataset Override field, you must also
include the path to the dataset. You must provide the values for the
Dataset Override field in the following format:
/project_name/ds_path/original_ds.csv:/project_name/ds_path/new_ds.csv
For example,
/proj1/Datasets/ds1.csv:/proj1023/Datasets/new_ds1.csv
You can override multiple datasets, including datasets that are saved in different
projects, by adding multiple dataset paths separated by a semicolon. For a
test or schedule, the
default value is the dataset specified in the test editor or schedule editor.
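The override value is plain text and can be assembled with a short script. The following is a minimal Python sketch; the helper function and the dataset paths are hypothetical and only illustrate the original:new pair syntax:

```python
# Build a Dataset Override value from (original, replacement) dataset path pairs.
# Each pair becomes "original:replacement"; multiple pairs are joined with ";".
# The function name and paths are illustrative, not part of the product.

def build_dataset_override(pairs):
    """pairs: list of (original_path, new_path) tuples."""
    return ";".join(f"{original}:{new}" for original, new in pairs)

value = build_dataset_override([
    ("/proj1/Datasets/ds1.csv", "/proj1023/Datasets/new_ds1.csv"),
])
print(value)
# /proj1/Datasets/ds1.csv:/proj1023/Datasets/new_ds1.csv
```

The resulting string is what you would paste into the Dataset Override field.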
|
| Duration |
Use this field to change the duration of the stages in the rate schedule.
For example, Stage1=10s, Stage2=3m
The stage number specified must exist in the rate schedule.
Note: The Duration field creates a new
copy of the rate schedule that contains the specified stage durations.
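As a sketch of how the Stage1=10s, Stage2=3m syntax is shaped, the following hypothetical Python parser splits the value into stage numbers and durations. The accepted units (s, m, h) are an assumption for illustration; the parser is not part of the product:

```python
import re

# Parse a Duration field value such as "Stage1=10s, Stage2=3m" into a
# {stage_number: (amount, unit)} mapping. Units s, m, and h are assumed
# here purely for illustration.

def parse_stage_durations(value):
    stages = {}
    for part in value.split(","):
        m = re.fullmatch(r"\s*Stage(\d+)=(\d+)([smh])\s*", part)
        if not m:
            raise ValueError(f"bad stage duration: {part!r}")
        stages[int(m.group(1))] = (int(m.group(2)), m.group(3))
    return stages

print(parse_stage_durations("Stage1=10s, Stage2=3m"))
# {1: (10, 's'), 2: (3, 'm')}
```

Remember that each stage number in the value must exist in the rate schedule.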
|
| Exported HTTP Test log
file |
Use this field to specify the complete path to the file in which to store the exported HTTP
test log in .txt format.
For example, C:/Users/Documents/tests/logexp.txt
|
| Exported Statistical Report Data
File |
Use this field to provide the complete path to a directory that
you can use to store exported reports in a comma-separated values (CSV) format.
For example,
C:/Users/Documents/tests
Note: If you do not specify a value for the Custom
Report Format Files field, then the reports that you selected in
Preferences of Test Performance () are exported.
|
| Exportstatsformat |
Use this field to specify a format for the report that you
want to export.
You must use at least one of the following options
along with the Exported Statistical Report Data File
field:
-
simple.csv
-
full.csv
-
simple.json
-
full.json
-
csv
-
json
For example, json.
You can add multiple formats for the report
separated by a comma. To export both the simple and the full report in
a JSON or CSV format, specify
json or csv as the value of the field.
The reports are saved to the location specified in the
Exported Statistical Report Data File field.
Note: The values provided in the
Exportstatsformat field always take precedence over the
Export Reports options set in the Preferences of Test Performance ().
|
| Exportstatshtml |
Use this field to provide the complete path to a directory
that you can use to export web analytic results.
The results are exported to the specified directory. You can
analyze the results in a web browser without using Test
Performance.
For example,
C:/Users/Documents/Reports
|
| History |
Use this field when you want to view a record of all events that occurred during a
test or schedule run.
You can use any of the following options:
-
jaeger: To send test logs to the Jaeger UI during the
test or schedule
run. Note: You must set the
JAEGER_AGENT_HOST property as an environment
variable by using the command line before you use the
jaeger option to send test logs to the Jaeger
UI.
-
testlog: To send test logs as traditional test logs in
Test Performance during the
test or schedule
run.
-
null: To send no test logs either to the Jaeger UI or
Test Performance during the
test or schedule
run.
For example, jaeger
You can add multiple options separated by a comma to send test logs
during the test or
schedule run to Test Performance and the Jaeger UI.
For example, jaeger,testlog
For more information about how to view test logs in the
Jaeger UI and Test Performance, see the
Related information section.
|
| IMShared Location |
Enter the complete path to the IMShared directory, if it is
not in the default location.
For example, D:\Testtool\HCL\HCLIMShared
The default location of the IMShared
directory is as follows:
| Operating system |
The default path to the directory |
| Windows® |
C:\Program Files\HCL\HCLIMShared |
| Linux™ |
/opt/HCL/HCLIMShared |
| Mac |
/Application/HCL/HCLIMShared |
|
| Labels |
Use this field to add labels to test results when the test run is complete.
For example, label1, label2
You can add multiple labels to a test result separated by a comma.
When you run test assets, the labels that you added are displayed on the
Performance Report in Test Performance.
The Results page of Test Hub displays the labels that
you added in the Labels field for the specific test asset.
Note: When you run tests with double
quotation marks ("") in the Labels field, the labels in
the test result do not include the double quotation marks. For example, if you provide
the value of the Labels field as "100"
users, then the label is displayed as 100
users in the test result. To work around this problem, you
must create a command-line config file, and then run the test by using the
Config File field.
|
| Number of Virtual
Users |
Use this field to override the default number of virtual users in the
test or schedule
run.
For a schedule, the default is the number
of users specified in the schedule editor and for a
test, the default is one user.
Note: The Number of Virtual Users field
creates a new copy of the schedule that contains the
specified number of users.
|
| Overwrite Results file |
Select or clear this field to specify whether a result
file with the same name must be overwritten.
By default, this field is selected. Therefore, the
file is overwritten and retains the same file name.
|
| Publish |
Use this field to publish test results to Test Hub from HCL® Launch.
Remember: Before you use the
Publish field, you must provide the offline user token of
Test Hub by using any of the
following methods:
-
Set the value of the OFFLINE_TOKEN environment variable to the
offline user token of Test Hub by using the
command-line interface.
-
Provide the offline user token of Test Hub in the preferences
of Test Performance ().
You must use one of the following values in the
Publish field:
-
serverURL#project.name=name_of_the_project&teamspace.name=name_of_the_teamspace
-
serverURL#project.name=name_of_the_project&teamspace.alias=name_of_the_teamspace_alias
- no
The no option is
useful if the product preferences are set to publish the results, but you do
not want to publish them.
For example, URL_OF_Test Hub#project.name=test&teamspace.name=ts1
Where:
-
URL_OF_Test Hub is the URL of Test Hub.
-
test is the name of the project in Test Hub.
-
ts1 is the name of the team space.
Notes:
-
If you have a project with the same name in different team spaces, then you
must append either the
&teamspace.name=name_of_the_teamspace or the
&teamspace.alias=name_of_the_teamspace_alias
option.
-
If the name of the project or team space contains a special character, then
you must replace it with
%<Hexvalue_of_special_character>.
For example, if the name of the team space is Initial Team
Space, then you must provide it as
Initial%20Team%20Space, where
%20 is the hex value of the
space character.
-
The values provided in the Publish field always take
precedence over the Results options set in the product
preferences ().
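The percent-encoding that the note describes is standard URL encoding, so the escaped value can be computed rather than built by hand. A minimal Python sketch:

```python
from urllib.parse import quote

# Percent-encode a team space name for use in the Publish field value.
# quote() replaces special characters with %<hex> escapes; safe="" ensures
# that characters such as "/" are encoded as well.
name = "Initial Team Space"
encoded = quote(name, safe="")
print(encoded)
# Initial%20Team%20Space
```

The same call works for project names that contain special characters.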
The Reports
information section in the
log file displays the names of the
reports along with their corresponding URLs in the following
conditions:
- When you configured the URL of Test Hub in
Preferences of Test Performance ().
- When you set Publish result after execution as
Always or Prompt in the
Preferences of Test Performance ().
- When you used the Publish field or the
Publish field along with the
Publishreports field.
|
| Publish_for |
Use this field to publish test results to Test Hub from HCL®
Launch based on the completion status of the tests.
You must use the Publish_for field along
with the Publish field. You can add multiple options
separated by a comma. The following are the available options that you can use for
the Publish_for field:
-
ALL: You can use this option irrespective of the status of
the test.
-
PASS: You can use this option to publish test results
for the tests that have passed.
-
FAIL: You can use this option to publish test results
for the tests that have failed.
-
ERROR: You can use this option to publish test results
for the tests that included errors.
-
INCONCLUSIVE: You can use this option to publish test
results for the inconclusive tests.
For example, FAIL,ERROR
|
| Publishreports |
Use this field to publish specific test results to Test Hub.
The options that you can use with Publishreports are as
follows:
For example, STATS
You must use the Publishreports field along with the
Publish field. You can prefix the value of
Publishreports with ! to publish all the reports except the
specified one.
For example, !STATS
|
| Rate |
Use this field to change the rate of the rate runner group.
For example, Rate Runner Group1=1/s, 3/m
Where Rate Runner Group1 is the
name of the Rate Runner group that has two stages. The desired rate for the first
stage is one iteration per second, and the rate for the second stage is three
iterations per minute.
Notes:
- The name of the Rate Runner group must match the name in the rate
schedule.
- The Rate field creates a new copy of the rate
schedule that contains the specified rates.
|
| Resource Monitoring Labels
Override |
Use this field to perform any of the following actions:
-
Enable the Resource Monitoring from Service option for
a performance schedule if the Resource
Monitoring from Service option is not enabled from the schedule
editor in Test Performance.
-
Ignore the Resource Monitoring sources that were set in the performance schedule
and switch to a label-matching mode.
-
Replace an existing set of Resource Monitoring labels that were set in the
performance schedule and run the schedule with a new set of Resource
Monitoring labels.
If you have added a label in Test Hub for a Resource Monitoring
source, for example rm1, then you can provide the value as shown in the following
example to collect data from the source:
rm1
If your Resource Monitoring label contains a comma (,), then
you must replace the comma with two commas when you provide the value.
For example, if you added the label rm1,test to a Resource Monitoring source,
then you must provide the value as shown in the following
example to collect data from the source:
rm1,,test
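The comma-doubling rule can be applied mechanically when the field value is built by a script. A minimal Python sketch; the helper function is hypothetical, not part of the product:

```python
# Escape Resource Monitoring labels for the override field: a literal comma
# inside a label is doubled, and the escaped labels are then joined with a
# single comma, which acts as the separator between labels.

def escape_rm_labels(labels):
    return ",".join(label.replace(",", ",,") for label in labels)

print(escape_rm_labels(["rm1,test", "rm2"]))
# rm1,,test,rm2
```

Here the double comma in rm1,,test is the escaped label rm1,test, and the single comma before rm2 separates the two labels.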
Notes:
-
You can use this field only when you want to run a Rate schedule or VU
schedule.
-
You can add multiple Resource Monitoring labels separated by a comma.
-
You must add the Resource Monitoring labels to the Resource Monitoring
sources on the Resource Monitoring page in the Test Hub project.
|
| Results File |
Use this field to provide a different name for the results file.
The results file is stored in the Results directory. The
default name of the results file is the name of the test
or schedule with a timestamp appended.
|
| User Comments |
Use this field to add text that you want to display in the user comments row of the
report.
For example, test run with dataset
Note: When you run tests with double quotation marks
("") in the User Comments field, the user comments row
of the report does not contain the double quotation marks. For example, if you provide
the value of the User Comments field as test
run with "dataset", then the user comments row of the report
displays the value as test run with dataset. To
work around this problem, you must create a command-line config file, and then run
the test by using the Config File field.
|
| Var File |
Use this field to provide a complete path to an XML file that contains the variable
name and value pairs.
|
| VM Args |
Use this field to specify the maximum heap size for the Java process.
You can add multiple VM arguments separated by a comma.
|