Campaign Actuator
This section describes how to configure the core components of the Campaign Actuator, including data ingestion from multiple feeds, trigger-based data processing, and event execution aligned with contact policies.
"campaignActuator": {
"contactPolicyCheckerSettings": {
"batchSize": 1,
"perUserContactPolicy": {
"countBasedPolicies": [
{
"countThreshold": 50,
"timeUnit": "Days",
"windowSize": 7
}
]
}
},
"deduplicatorFsyncBatchSize": 1000,
"feedDataKafkaSourceSettings": {
"batchSize": 1000
},
"logLevel": "INFO",
"mergerSettings": {
"inputBufferSize": 1000,
"maxBlockingTimeInMillis": 500
},
"name": "Campaign Actuator",
"numParallelChannels": 2,
"responseMessageKafkaSourceSettings": {
"batchSize": 1000
},
"triggerEvaluatorSettings": {
"batchSize": 1,
"maxAllowedTimeLagInSeconds": 86400,
"stateExpirationCheckIntervalInMillis": 5000,
"logConfig": {
"keyValues": [ "KEY1" , "KEY2" ],
"triggerNames": [ "TRIGGER1", "TRIGGER2" ]
},
},
"useKafkaToProcessActuation": false
},
contactPolicyCheckerSettings
Configure the batch size and control the number of events to be sent to a user within a specific period.
batchSize: define the batch size of the events to be executed based
on the configuration in the contact policy. The recommended batch size is 100
for a production environment and 1 for a test environment.
"countBasedPolicies": [ { "countThreshold": 50, "timeUnit": "Days",
"windowSize": 7}]: in the perUserContactPolicy section, you can set the
maximum number of campaign events that can be sent to a user in a specific time
interval. The timeUnit can be Days or
Months. In the above example, the system will send up to 50 events
for a specific user within seven days.
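The rolling-window check described above can be sketched as follows. This is a minimal illustration, not the product's actual implementation; the function and variable names are hypothetical, and the constants mirror the example policy (50 events per 7 days).

```python
from collections import deque
from datetime import datetime, timedelta

# Example policy from the sample config: countThreshold 50, windowSize 7 Days.
WINDOW = timedelta(days=7)
COUNT_THRESHOLD = 50

# Per-user history of send timestamps (hypothetical in-memory store).
sent_events: dict[str, deque] = {}

def may_send(user_id: str, now: datetime) -> bool:
    """Return True (and record the send) if one more event keeps the
    user under the count-based contact policy cap."""
    history = sent_events.setdefault(user_id, deque())
    # Drop timestamps that have aged out of the rolling window.
    while history and now - history[0] > WINDOW:
        history.popleft()
    if len(history) >= COUNT_THRESHOLD:
        return False
    history.append(now)
    return True
```

Once a user has received 50 events inside the window, further sends are blocked until older timestamps age out.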
deduplicatorFsyncBatchSize: set the number of records to be
processed in a single batch during data synchronization, which helps avoid
syncing duplicate data.
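A rough sketch of the kind of batched, deduplicated flushing this parameter controls (hypothetical helper names; the real deduplicator is not shown in this document):

```python
# Batch size from the sample config: deduplicatorFsyncBatchSize: 1000.
FSYNC_BATCH_SIZE = 1000

seen: set[str] = set()       # record IDs already accepted
pending: list[str] = []      # records waiting for the next durable flush

def process(record_id: str, flush) -> None:
    """Skip duplicates; persist accepted records one batch at a time."""
    if record_id in seen:
        return  # duplicate -> not synced again
    seen.add(record_id)
    pending.append(record_id)
    if len(pending) >= FSYNC_BATCH_SIZE:
        flush(list(pending))  # durable write (fsync point) per batch
        pending.clear()
```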
feedDataKafkaSourceSettings: set the number of records the Campaign
Actuator reads from feed applications in each batch.
logLevel: configure the level of log information required from the
Campaign Actuator execution. You can set any one of the following options:
- CRITICAL: logs only critical information.
- DEBUG: logs all information, including critical, error, and warning messages.
- ERROR: logs only error messages.
- INFO: logs basic execution information.
- WARNING: logs only warning messages.
mergerSettings
In mergerSettings, configure feed data processing parameters including batch size, collection time, and the number of parallel data channels to optimize data aggregation efficiency.
inputBufferSize: set the number of feed-data records to read from the
various channels for the merge process.
maxBlockingTimeInMillis: configure the maximum duration for
gathering feed data before initiating the merge process.
numParallelChannels: configure the number of concurrent data streams
used to process feed data. In Kubernetes, this can dynamically scale up to 10
channels, while on virtual machines a fixed number of channels is started
simultaneously.
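The interplay of inputBufferSize and maxBlockingTimeInMillis can be sketched as a bounded gather step: collect up to the buffer size, but never wait longer than the blocking budget before handing whatever arrived to the merge. This is an illustrative sketch only; the function name and queue-based channel are assumptions, with constants taken from the sample config.

```python
import queue
import time

INPUT_BUFFER_SIZE = 1000        # inputBufferSize from the sample config
MAX_BLOCKING_TIME_MS = 500      # maxBlockingTimeInMillis from the sample config

def gather_batch(channel: "queue.Queue[str]") -> list[str]:
    """Collect up to INPUT_BUFFER_SIZE records, blocking at most
    MAX_BLOCKING_TIME_MS overall, then return the (possibly partial) batch."""
    deadline = time.monotonic() + MAX_BLOCKING_TIME_MS / 1000.0
    batch: list[str] = []
    while len(batch) < INPUT_BUFFER_SIZE:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            break  # blocking budget exhausted; merge a partial batch
        try:
            batch.append(channel.get(timeout=remaining))
        except queue.Empty:
            break
    return batch
```

The deadline ensures slow channels cannot stall the merge beyond the configured blocking time.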
responseMessageKafkaSourceSettings: set the number of messages
processed in each batch as Tomcat consumes data from Kafka topics and sends
acknowledgment messages to the Campaign Actuator.
triggerEvaluatorSettings
Note: logConfig takes effect only when the logLevel parameter is set to
DEBUG. If logLevel is set to any other level, logConfig is ignored.
batchSize: define the batch size of the events to be executed based
on the configuration in the contact policy. The recommended batch size is 100
for a production environment and 1 for a test environment.
maxAllowedTimeLagInSeconds: set the maximum acceptable age of data
to be processed for trigger evaluation. If the data is older than this
threshold, the trigger is halted. For instance, a value of 86400 seconds (one
day) prevents processing data older than one day.
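The freshness check this parameter describes amounts to a simple age comparison. A minimal sketch, with a hypothetical function name and the one-day threshold from the example:

```python
# maxAllowedTimeLagInSeconds from the sample config: one day.
MAX_ALLOWED_TIME_LAG_SECONDS = 86400

def is_fresh(event_timestamp: float, now: float) -> bool:
    """Return True if the record is recent enough for trigger evaluation;
    records older than the allowed lag are not processed."""
    return (now - event_timestamp) <= MAX_ALLOWED_TIME_LAG_SECONDS
```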
stateExpirationCheckIntervalInMillis: configure the frequency of
checking reminder-based trigger intervals. This setting controls how often the
system evaluates if a reminder-based trigger should be activated.
logConfig: controls conditional logging of data based on a specific
key and trigger value combination. A record is logged only when a specified
key value, trigger ID, or both match. This setting is useful for filtering
logs to focus on specific data or events during debugging.
- keyValues: the key values to evaluate, such as the Key parameter in the data (e.g., MSISDN).
- triggerNames: the trigger IDs for the events. You can get the trigger ID through the URL (https://<DETECT_DOMAIN_URL>/drive/rest/trigger_manager/current_trigger/all). Note: the trigger name usually starts with the letter T, for example T1884.
Key and trigger value behavior:
- Only Key values are set: Logs are recorded only for the matching key values.
- Only Trigger IDs are set: Logs are recorded only for the specified trigger IDs.
- Both Key and Trigger set: Logs are recorded only when both the key value and trigger ID match.
- No values set: All debug logs are recorded without filtering. This is the default behavior in DEBUG mode.
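The four matching rules above can be expressed compactly: an empty filter list matches everything, and both filters must agree. This is an illustrative sketch with hypothetical names, not the actuator's actual logging code.

```python
def should_log(record_key: str, record_trigger: str,
               key_values: list[str], trigger_names: list[str]) -> bool:
    """Apply the logConfig filtering rules: an empty list means
    'no filter on that dimension'; set filters must match."""
    key_ok = (not key_values) or record_key in key_values
    trigger_ok = (not trigger_names) or record_trigger in trigger_names
    return key_ok and trigger_ok
```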
useKafkaToProcessActuation: by default, this is set to true for
production. For development or test activities, this field can be set to
false.