Unica Journey Configuration

This page explains how to configure HCL Detect for integration with Unica Journey. It covers the setup of journey endpoints, event mappings, and authentication parameters to enable real-time event-based customer engagement through Unica Journey workflows.

Unica Journey

Downstream Kafka is a distributed event store and stream-processing platform that captures events and sends them to Unica Journey. You can configure the drive.json file with the Kafka parameters that are necessary to map event details and ensure they are sent to the right destination. The configured parameter values are then displayed in the drop-down lists of the event configuration screens.

In the drive.json file, go to the eventConsumers section and set the Unica Journey parameters that control how event details are sent. A sample configuration of downstream Kafka publishing events to Unica Journey is shown below:

"eventConsumers": [
  {
    "className": "com.hcl.drive.communication.UnicaJourneyEventConsumer",
    "name": "UnicaJourneyEventConsumer",
    "parameters": [
      {
        "name": "payloadEntrySourceCode",
        "stringValue": "ES-TEST002"
      },
      {
        "name": "databaseServerIp",
        "stringValue": "127.0.0.1"
      },
      {
        "name": "databaseServerUser",
        "stringValue": "drive"
      },
      {
        "name": "databaseServerPassword",
        "stringValue": "6F2FA7F6B3CD1570F19F0A0B2636E033"
      },
      {
        "int32Value": 3306,
        "name": "databaseServerPort"
      },
      {
        "name": "databaseName",
        "stringValue": "drive_acme_core"
      },
      {
        "name": "dateTimeAttributeName",
        "stringValue": "ts"
      },
      {
        "name": "producerConfigFilePath",
        "stringValue": "${HCL_HOME}/../${HCL_SOLUTION}/etc/producer_config.properties"
      },
      {
        "name": "producerKrb5ConfigFilePath",
        "stringValue": "/etc/krb5.conf"
      },
      {
        "name": "producerTopic",
        "stringValue": "JOURNEY_QUALIFIED_TRANS_TOPIC"
      }
    ]
  }
]
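The configuration above can be inspected programmatically. The following is a minimal sketch (not part of the product) that loads a drive.json-style document and flattens a consumer's parameters list into a dictionary, handling both stringValue and int32Value entries; the embedded JSON is an abbreviated copy of the sample configuration.

```python
import json

# Abbreviated copy of the sample drive.json eventConsumers section.
DRIVE_JSON = """
{
  "eventConsumers": [
    {
      "className": "com.hcl.drive.communication.UnicaJourneyEventConsumer",
      "name": "UnicaJourneyEventConsumer",
      "parameters": [
        {"name": "payloadEntrySourceCode", "stringValue": "ES-TEST002"},
        {"int32Value": 3306, "name": "databaseServerPort"},
        {"name": "producerTopic", "stringValue": "JOURNEY_QUALIFIED_TRANS_TOPIC"}
      ]
    }
  ]
}
"""

def consumer_params(config_text, consumer_name):
    """Return the named consumer's parameters as a flat dict,
    reading stringValue when present, otherwise int32Value."""
    config = json.loads(config_text)
    for consumer in config["eventConsumers"]:
        if consumer["name"] == consumer_name:
            return {
                p["name"]: p.get("stringValue", p.get("int32Value"))
                for p in consumer["parameters"]
            }
    raise KeyError(consumer_name)

params = consumer_params(DRIVE_JSON, "UnicaJourneyEventConsumer")
print(params["producerTopic"])       # JOURNEY_QUALIFIED_TRANS_TOPIC
print(params["databaseServerPort"])  # 3306
```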
  1. eventConsumers.className: Name of the Java class that writes the events in the pre-defined format of the downstream system. For Unica Journey, the className must be set to "com.hcl.drive.communication.UnicaJourneyEventConsumer".
  2. eventConsumers.name: Name assigned to the downstream system.
  3. Parameters:
    1. Database related:
      1. databaseServerIp: Server IP address of the database to publish the downstream events.
      2. databaseServerUser: Username of the database.
      3. databaseServerPassword: Encrypted password of the database.
      4. databaseServerPort: Port number of the database.
      5. databaseName: Name of the database.
    2. Kafka related:
      1. producerConfigFilePath: Path to the producer_config.properties file, which contains the properties for connecting to Kafka.
      2. producerKrb5ConfigFilePath: Path to the krb5.conf file; required if the Kafka connection must be authenticated with Kerberos.
      3. producerTopic: Name of the Kafka topic to which events are published and from which Unica Journey reads them.
    3. payloadEntrySourceCode: Value of the "entrySourceCode" to be populated when pushing the events to Unica Journey.
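The producerConfigFilePath parameter points to a standard Kafka producer properties file. A minimal sketch is shown below; the broker host names are placeholders, and the property names are the standard Kafka producer client configuration keys, not product-specific settings.

```properties
# Hypothetical producer_config.properties; host names are placeholders.
bootstrap.servers=kafka-broker1:9092,kafka-broker2:9092
acks=all
# Uncomment for a Kerberos-secured (SASL) cluster; the krb5.conf path
# is supplied separately via producerKrb5ConfigFilePath.
#security.protocol=SASL_PLAINTEXT
#sasl.kerberos.service.name=kafka
```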
Sample Message

The screenshot below shows an event configured in Detect, where the attributes are selected and an event identifier is populated to push to the downstream Kafka topic for Unica Journey to ingest.

A sample message published to the Kafka topic, based on the configuration above:

{
  "entrySourceCode": "ES-00000363", # Value from payloadEntrySourceCode
  "data": [
    {
      "eventID": "EVNT00001",        # Value configured from UI
      "country": "India",            # Profile Attribute
      "creditScore": "700",          # Profile Attribute
      "MSISDN": "9881153226",        # Tuple Attribute
      "city": "Mumbai",              # Profile Attribute
      "transactionAmount": "600.0",  # Tuple Attribute
      "dataSource": "Recharge"       # Tuple Attribute
    }
  ]
}
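The payload above can also be assembled programmatically before publishing. The following is a minimal Python sketch (not part of the product); the attribute values are illustrative, and in practice they come from the event's selected profile and tuple attributes.

```python
import json

def build_journey_payload(entry_source_code, events):
    """Assemble the Unica Journey ingestion payload: a top-level
    entrySourceCode plus a data array of event attribute maps."""
    return {"entrySourceCode": entry_source_code, "data": events}

# Illustrative attribute values, mirroring the sample message above.
event = {
    "eventID": "EVNT00001",        # event identifier configured in the UI
    "country": "India",            # profile attribute
    "transactionAmount": "600.0",  # tuple attribute
}

payload = build_journey_payload("ES-00000363", [event])
print(json.dumps(payload, indent=2))
```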