Campaign Actuator
This section describes how to configure the core components of the Campaign Actuator, including data ingestion from multiple feeds, trigger-based data processing, and event execution aligned with contact policies.
"campaignActuator": {
"contactPolicyCheckerSettings": {
"batchSize": 1,
"perUserContactPolicy": {
"countBasedPolicies": [
{
"countThreshold": 50,
"timeUnit": "Days",
"windowSize": 7
}
]
}
},
"deduplicatorFsyncBatchSize": 1000,
"feedDataKafkaSourceSettings": {
"batchSize": 1000
},
"logLevel": "INFO",
"mergerSettings": {
"inputBufferSize": 1000,
"maxBlockingTimeInMillis": 500
},
"name": "Campaign Actuator",
"numParallelChannels": 2,
"responseMessageKafkaSourceSettings": {
"batchSize": 1000
},
"triggerEvaluatorSettings": {
"batchSize": 1,
"maxAllowedTimeLagInSeconds": 86400,
"stateExpirationCheckIntervalInMillis": 5000
},
"useKafkaToProcessActuation": false
},
contactPolicyCheckerSettings
Configure the batch size and control the number of events to be sent to a user within a specific period.
batchSize: define the batch size of the events to be executed based
on the configuration in the contact policy. The preferred batch size is 100 for
production environments and 1 for test environments.
"countBasedPolicies": [ { "countThreshold": 50, "timeUnit": "Days",
"windowSize": 7}]: in the perUserContactPolicy section, you can set the
maximum number of campaign events that can be sent to a user in a specific time
interval. The timeUnit can be Days or
Months. In the above example, the system will send up to 50 events
for a specific user within seven days.
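The rolling-window behavior of a count-based policy can be sketched as follows. This is a minimal illustration only; the class and method names are hypothetical and not the product's API:

```python
from collections import deque
import time

class CountBasedPolicy:
    """Allow at most count_threshold sends per user in a rolling window."""

    def __init__(self, count_threshold=50, window_days=7):
        self.count_threshold = count_threshold
        self.window_seconds = window_days * 86400
        self.sent = {}  # user_id -> deque of send timestamps

    def allow(self, user_id, now=None):
        now = time.time() if now is None else now
        history = self.sent.setdefault(user_id, deque())
        # Drop sends that have fallen out of the rolling window.
        while history and now - history[0] > self.window_seconds:
            history.popleft()
        if len(history) >= self.count_threshold:
            return False  # user already hit the threshold in this window
        history.append(now)
        return True
```

With the sample configuration (countThreshold 50, windowSize 7 days), the 51st send attempt for the same user inside seven days would be rejected.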
deduplicatorFsyncBatchSize: set the number of records to be processed
in a single batch during data synchronization to avoid syncing duplicate data.
feedDataKafkaSourceSettings: set the number of records the Campaign
Actuator reads from feed applications in each batch.
logLevel: configure the level of log information required from the
Campaign Actuator execution. You can set any one of the following options:
- CRITICAL: logs only critical information.
- DEBUG: logs all information, including critical, error, and warning messages.
- ERROR: logs only error messages.
- INFO: logs basic execution information.
- WARNING: logs only warning messages.
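The batch-and-flush behavior described for deduplicatorFsyncBatchSize can be sketched as below. The function and flush callback are hypothetical illustrations, not the product's implementation:

```python
def deduplicate_and_flush(records, fsync_batch_size=1000,
                          flush=lambda batch: None):
    """Drop duplicate records, then persist the rest in fixed-size batches."""
    seen = set()
    batch = []
    flushed = 0
    for record_id, payload in records:
        if record_id in seen:
            continue  # duplicate: never reaches the sync batch
        seen.add(record_id)
        batch.append((record_id, payload))
        if len(batch) >= fsync_batch_size:
            flush(batch)  # persist and fsync one full batch at a time
            flushed += len(batch)
            batch = []
    if batch:  # flush the final partial batch
        flush(batch)
        flushed += len(batch)
    return flushed
```

A larger fsync batch size means fewer, larger disk syncs; the sample value of 1000 trades a small durability window for throughput.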
mergerSettings
In mergerSettings, configure feed data processing parameters including batch size, collection time, and the number of parallel data channels to optimize data aggregation efficiency.
inputBufferSize: set the number of feed data records to read from the
various channels for the merge process.
maxBlockingTimeInMillis: configure the maximum duration for
gathering feed data before initiating the merge process.
numParallelChannels: configure the number of concurrent data streams
used to process feed data. In Kubernetes, this can dynamically scale up to 10
channels; on virtual machines, a fixed number of channels is started
simultaneously.
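The interaction between inputBufferSize and maxBlockingTimeInMillis can be sketched as a bounded collection loop: the merger gathers up to the buffer size, but hands whatever it has to the merge step once the blocking budget runs out. Names here are illustrative, not the product's API:

```python
import queue
import time

def collect_for_merge(channel, input_buffer_size=1000,
                      max_blocking_time_ms=500):
    """Collect up to input_buffer_size records from a channel queue,
    but never block longer than max_blocking_time_ms in total."""
    deadline = time.monotonic() + max_blocking_time_ms / 1000.0
    buffered = []
    while len(buffered) < input_buffer_size:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            break  # blocking budget exhausted; merge what we have
        try:
            buffered.append(channel.get(timeout=remaining))
        except queue.Empty:
            break  # channel drained before the deadline
    return buffered
```

This keeps merge latency bounded: a slow feed can delay a batch by at most maxBlockingTimeInMillis rather than stalling the merger indefinitely.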
responseMessageKafkaSourceSettings: set the number of messages
processed in each batch as Tomcat consumes data from Kafka topics and sends
acknowledgment messages to the Campaign Actuator.
triggerEvaluatorSettings
In the triggerEvaluatorSettings section, you can define the maximum allowed time lag for processing data and the interval at which the scheduler checks trigger state.
batchSize: define the batch size of the events to be evaluated against
the trigger configuration. The preferred batch size is 100 for production
environments and 1 for test environments.
maxAllowedTimeLagInSeconds: set the maximum acceptable age of data
to be processed for trigger evaluation. If the data is older than this
threshold, the trigger is halted. For example, a value of 86400 seconds
(one day) prevents the system from processing data older than one day.
stateExpirationCheckIntervalInMillis: configure the frequency of
checking reminder-based trigger intervals. This setting controls how often the
system evaluates if a reminder-based trigger should be activated.
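The age gate that maxAllowedTimeLagInSeconds describes can be sketched in a few lines. The function name is hypothetical; it only illustrates the freshness check applied before trigger evaluation:

```python
import time

MAX_ALLOWED_TIME_LAG_SECONDS = 86400  # one day, as in the sample config

def is_fresh(event_timestamp, now=None, max_lag=MAX_ALLOWED_TIME_LAG_SECONDS):
    """Return True if the event is recent enough to be evaluated,
    False if it exceeds the allowed time lag and should be skipped."""
    now = time.time() if now is None else now
    return (now - event_timestamp) <= max_lag
```

Events failing this check are skipped rather than evaluated, so stale feed data cannot fire day-old triggers.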
useKafkaToProcessActuation: by default, this is set to true for
production. For development or test activity, this field can be set to
false.