Output properties and data transfer between workflows

The output of a workflow can be defined as key-value pairs. You can export these output properties and use them as variables in other workflows. This type of data transfer is facilitated through subflow tasks that contain nested workflows.

You can transfer data either from the main workflow to subflow tasks (nested workflows) by using variables, or between subflow tasks by using variables and output properties. To transfer data between workflows, they must be referenced within subflow tasks that are nested within a single workflow. For more information, see Managing a subflow task from the OCLI. Configure the main workflow so that the subflow tasks do not run simultaneously: create an internal dependency so that the subflow task with output properties runs first. You can then transfer its output to a successor task through output properties and variables. If both tasks run simultaneously, the predecessor task's output property is not yet available, causing the successor task to fail because it depends on that output to resolve its task definition.

Data can also be exported between other task types in a single workflow using the syntax ${jobs.<job_name>.<property_name>}. For more information, see Passing properties from one task to another task in the same workflow instance.
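As a minimal sketch of this syntax (the EXTRACT and REPORT task names and the FILE_PATH property are hypothetical), a successor task in the same workflow can reference a property produced by a predecessor:

```yaml
jobs:
- name: EXTRACT
  workstation: /CLOUD
  # ... task definition that produces the FILE_PATH output property ...
- name: REPORT
  workstation: /CLOUD
  jobDefinition:
    description: Consumes the predecessor's output
    # ${jobs.EXTRACT.FILE_PATH} resolves to the FILE_PATH property
    # of the EXTRACT task when REPORT runs, for example as a task
    # parameter value: "${jobs.EXTRACT.FILE_PATH}"
  internalPredecessors:
  - jobName: EXTRACT
```

The internal dependency ensures that EXTRACT completes, and its property exists, before REPORT resolves its definition.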

Setting output properties for a workflow

In HCL Universal Orchestrator, the task output is combined into a single JSON file. You can query and export information from a task output using JSONata4Java. For more information, see Data exchange and transformation.
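As an illustration of such a query (the output structure shown here is invented), a JSONata expression selects a value from the combined JSON output by path:

```
/* Given a task output such as:
   {"totals": {"revenue": 1200, "expenditure": 800}}
   the expression below selects the revenue value: */
totals.revenue
```

The resolved value can then be exposed as an output property of the workflow.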

The output properties defined for a workflow definition are evaluated only when it ends in a final status such as SUCC, ABEND, or SUPPR. For any other status, the output properties are not evaluated.

You can define the output of a workflow as key-value pairs. For each output property, specify a name and a value, which is a JSONata expression that is resolved while the workflow runs. You can optionally provide a description for each key-value pair. To create a workflow definition with output properties, complete the workflow definition template as follows:
kind: JobStream
def:
  folder: Folder_name
  name: workflow_name
  workstation: Workstation_name
  outputProperties:
  - name:
    value:
    description:
  - name:
    value:
    description:
  - name:
    value:
    description:
  jobs:
  - name: task_name

Data transfer from main workflow to subflow tasks

You can transfer data from the main workflow to subflow tasks by using variables. Variables are optional attributes, key-value pairs, that you can add to each subflow task in a workflow. These key-value pairs are resolved when the task runs. Any terms in the subflow task definition that match keys in the key-value pairs are replaced with their corresponding values, effectively resolving the subflow task definition. To create a subflow task with variables, see Managing a subflow task from the OCLI.
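A minimal sketch of such a subflow task (the MONTHLY_REPORT workflow and REPORT_MONTH variable are hypothetical names), following the same structure as the examples below:

```yaml
jobs:
- name: RUN_REPORT
  workstation: /CLOUD
  jobDefinition:
    type: subflow
    subType: nestWorkflow
    task:
      subflow:
        Action:
          nestWorkflow:
            workflow: /CLOUD#/REPORTING/MONTHLY_REPORT
            variables:
              # Any term in the nested workflow definition that matches
              # the REPORT_MONTH key is replaced with this value at run time
              REPORT_MONTH: "2024-01"
```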

Data transfer between subflow tasks in a workflow

You can export the output properties of a subflow task to resolve a successive subflow task definition, leveraging dynamic output data to complete subsequent tasks. If output properties are defined for a nested workflow, they are automatically mapped as output properties for the subflow task that contains the nested workflow.
To export the data, configure the output properties of a subflow task as variables (key-value pairs) in a successor subflow task. Specify a key, and as the value add the output property name preceded by the subflow task name. When the predecessor task completes, the output properties are passed to the successor task through these variables. Any terms in the successor task definition that match the keys are replaced with their corresponding values (the output property values), effectively resolving the subflow task definition. Thus the dynamic output properties of a workflow are passed to a successive workflow through variables to resolve its definition.

Examples

  1. The following example shows the definition for the QUERY_FOR_STATISTICS workflow, which contains the STATISTICS_EXTRACTOR task. The task generates the total revenue and total expenditure as output properties:

    ---
    kind: JobStream
    def:
      folder: /REPORTING/
      name: QUERY_FOR_STATISTICS
      workstation: /CLOUD
      description: Queries downstream systems to retrieve report information
      outputProperties:
      - name: TOTAL_REVENUE
        value: "${jobs.STATISTICS_EXTRACTOR.TOTAL_REVENUE}"
        description: The total revenue for the selected month
      - name: TOTAL_EXPENDITURE
        value: "${jobs.STATISTICS_EXTRACTOR.TOTAL_EXPENDITURE}"
        description: The total expenditure for the selected month
      saturdayIsFree: true
      sundayIsFree: true
      priority: 10
      asap: false
      perJobLatestStart: false
      matchingCriteria:
        type: previous
      resourceDependencies: []
      jobs:
      - name: STATISTICS_EXTRACTOR
        workstation: /CLOUD
        jobDefinition:
          description: Extracts mocked information
          type: jsonata
          subType: queryInfo
          task:
            jsonata:
              Action:
                queryInfo:
                  format: json
                  document: "{}"
                  variablelist:
                    TOTAL_REVENUE: $random()*1000
                    TOTAL_EXPENDITURE: $random()*1000
          recovery:
            action: STOP
            repeatAffinity: false
        priority: 10
        asap: false
        resourceDependencies: []
  2. The following example shows the SALES_MONTHLY_REPORT workflow with two subflow tasks, RETRIEVE_STATISTICS and SEND_NOTIFICATIONS. The RETRIEVE_STATISTICS task nests the QUERY_FOR_STATISTICS workflow, whose definition with output properties is given in example 1.

    The output properties of the RETRIEVE_STATISTICS task are reused as variables in the SEND_NOTIFICATIONS task. An internal dependency is also specified so that the two subflow tasks do not run simultaneously: the RETRIEVE_STATISTICS task runs first, and its generated output is then reused in the SEND_NOTIFICATIONS task to resolve the definition.

    ---
    kind: JobStream
    def:
      folder: /MAIN/
      name: SALES_MONTHLY_REPORT
      workstation: /CLOUD
      description: Executes the monthly reports for the e-commerce. Sends notifications
        and alerts if needed.
      saturdayIsFree: true
      sundayIsFree: true
      priority: 10
      asap: false
      perJobLatestStart: false
      matchingCriteria:
        type: previous
      resourceDependencies: []
      jobs:
      - name: RETRIEVE_STATISTICS
        workstation: /CLOUD
        jobDefinition:
          description: Queries downstream systems to retrieve the monthly report information.
          type: subflow
          subType: nestWorkflow
          task:
            subflow:
              Action:
                nestWorkflow:
                  workflow: /CLOUD#/REPORTING/QUERY_FOR_STATISTICS
                  variables: {}
          recovery: {}
        priority: 10
        asap: false
        resourceDependencies: []
      - name: SEND_NOTIFICATIONS
        workstation: /CLOUD
        jobDefinition:
          description: Sends notifications based on the results of the report
          type: subflow
          subType: nestWorkflow
          task:
            subflow:
              Action:
                nestWorkflow:
                  workflow: /CLOUD#/COMPLIANCE/SEND_NOTIFICATIONS
                  variables:
                    TOTAL_REVENUE: "${jobs.RETRIEVE_STATISTICS.TOTAL_REVENUE}"
                    TOTAL_EXPENDITURE: "${jobs.RETRIEVE_STATISTICS.TOTAL_EXPENDITURE}"
          recovery: {}
        priority: 10
        asap: false
        internalPredecessors:
        - jobName: RETRIEVE_STATISTICS
        resourceDependencies: []