Unica Campaign integration with Kafka

Kafka instance configuration template - Affinium|Campaign|partitions|partition[n]|Kafka|(KafkaTemplate)

This template is provided to add different Kafka instances for Campaign under the selected partition.

Affinium|Campaign|partitions|partition[n]|Kafka|Campaign is the default configuration created for Campaign during installation.

Affinium|Campaign|partitions|partition[n]|Kafka|Journey is the default configuration created for Journey during installation.

Below are the configuration parameters for KafkaTemplate:

KafkaBrokerURL URL of the Kafka instance that the user wants to point to and produce data on; this is the Kafka server used with the Journeys application. Example: IP-0A862D46:9092
CommunicationMechanism

Specify the connection mechanism used to connect to the Kafka server; a client-level sketch follows the list of possible values below.

Possible values:

SASL_PLAINTEXT_SSL - Use this to connect to Kafka with a username/password and SSL enabled.

This is also used for connecting to cloud-based Kafka services such as Confluent Cloud and Amazon MSK.

NO_SASL_PLAINTEXT_SSL - Use this to connect to Kafka with no authentication and no SSL.

SASL_PLAINTEXT - Use this to connect to Kafka with a username and password only.

SSL - Use this to connect to Kafka without a username/password but with SSL.
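
The four values appear to correspond to the standard Kafka client security.protocol settings (sasl_ssl, plaintext, sasl_plaintext, ssl). The following C sketch shows that assumed mapping for a librdkafka-based client; it is an illustration of the standard client API, not Campaign's documented internals:

    #include <string.h>
    #include <librdkafka/rdkafka.h>

    /* Sketch: map the Unica CommunicationMechanism value to the standard
     * librdkafka security.protocol property (assumed mapping). */
    static rd_kafka_conf_t *make_conf(const char *mechanism) {
        char errstr[512];
        rd_kafka_conf_t *conf = rd_kafka_conf_new();
        rd_kafka_conf_set(conf, "bootstrap.servers", "IP-0A862D46:9092",
                          errstr, sizeof(errstr));
        const char *proto =
            !strcmp(mechanism, "SASL_PLAINTEXT_SSL")    ? "sasl_ssl"       :
            !strcmp(mechanism, "NO_SASL_PLAINTEXT_SSL") ? "plaintext"      :
            !strcmp(mechanism, "SASL_PLAINTEXT")        ? "sasl_plaintext" :
                                                          "ssl";
        rd_kafka_conf_set(conf, "security.protocol", proto,
                          errstr, sizeof(errstr));
        return conf;
    }
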
SaslMechanism

SASL mechanism used for client connections (see the sketch after the possible values below).

Possible values:

PLAIN: This is the default value. Use this if the client connection does not use Kerberos authentication.

Choose this option when connecting to Confluent Kafka.

GSSAPI: Use this if the client connection uses Kerberos authentication.

SCRAM-SHA-512: Use this if the client connection uses SCRAM-SHA-512 as the authentication mechanism.
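
At the client level, the chosen value maps to the standard sasl.mechanism property; continuing the librdkafka sketch from above:

    /* Sketch: select the SASL mechanism on the client (conf and errstr
     * as in the earlier sketch). */
    rd_kafka_conf_set(conf, "sasl.mechanism", "PLAIN", errstr, sizeof(errstr));
    /* or "SCRAM-SHA-512"; for Kerberos, use "GSSAPI" plus the Kerberos
     * settings described at the end of this section. */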

KafkaKeyFile Specify the client key file if the connection mechanism uses SSL. Example: /opt/Unica/Kafkakeys/client_key.pem
KafkaCertificateFile Specify the certificate file if the connection mechanism uses SSL. Example: /opt/Unica/Kafkakeys/client_cert.pem
CertificateAuthorityFile

This is the signed certificate of the Kafka server; it is required when the connection mechanism uses SSL. Example: /opt/Unica/Kafkakeys/ca-cer

For Kafka services like Confluent Cloud or Amazon MSK, brokers use TLS certificates issued by trusted public Certificate Authorities (CAs). These CAs are included in the default system certificate stores on most operating systems, making the probe setting effective for automatic certificate retrieval.

When using SASL_PLAINTEXT_SSL for communication, the SSL certificate is verified using these default system certificates. By setting CertificateAuthorityFile=probe in the Kafka client's configuration, the client can automatically locate and use the trusted public CA certificates from the default system paths, simplifying setup and ensuring secure communication without manual CA configuration.

If the default system CA certificates are not sufficient for establishing secure communication with a Kafka service, you can obtain the specific CA certificate directly from the Kafka service provider.
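
The three file settings above correspond to the ssl.key.location, ssl.certificate.location, and ssl.ca.location properties of a librdkafka-based client; a sketch under that assumption, using the example paths from above:

    /* Sketch: SSL file configuration (conf and errstr as in the earlier
     * sketch; the Unica-to-librdkafka mapping is an assumption). */
    rd_kafka_conf_set(conf, "ssl.key.location",
                      "/opt/Unica/Kafkakeys/client_key.pem", errstr, sizeof(errstr));
    rd_kafka_conf_set(conf, "ssl.certificate.location",
                      "/opt/Unica/Kafkakeys/client_cert.pem", errstr, sizeof(errstr));
    /* "probe" tells the client to look for trusted CA certificates in the
     * default system paths, as described above. */
    rd_kafka_conf_set(conf, "ssl.ca.location", "probe", errstr, sizeof(errstr));
    /* If the key file is password protected (see SslKeyPasswordDataSource
     * below), the password would be supplied via ssl.key.password. */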

UserForKafkaDataSource The Marketing Platform user that contains the datasource credentials for Kafka when connecting with a username/password. For Confluent Kafka as well, create a datasource under the Marketing Platform user.
KafkaDataSource The datasource containing the Kafka user credentials. When setting up this Kafka datasource for Confluent Kafka, use the key and secret as the username and password.
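
At the client level, the credentials stored in this datasource would typically be passed as the standard sasl.username and sasl.password properties; a sketch with placeholder values (apiKey/apiSecret are illustrative, not real credentials):

    /* Sketch: supply the datasource credentials to the client. */
    rd_kafka_conf_set(conf, "sasl.username", "apiKey",    errstr, sizeof(errstr));
    rd_kafka_conf_set(conf, "sasl.password", "apiSecret", errstr, sizeof(errstr));
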
TopicName CAMPAIGN_PB - Default value for the Campaign Kafka instance. The user can change this default topic name while configuring the process box.

STREAMING_IMPORT - Default value for the Journey Kafka instance; this is the topic designated by Journeys for Campaign to push data to Journey. Required value: STREAMING_IMPORT. Do not change this, as doing so would send data to a Kafka topic that Journey does not consume.

For KafkaTemplate, the TopicName field is blank. The user can set this default topic name while configuring the Kafka instance.
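
For illustration, producing a record to the Journey topic with librdkafka (again an assumption about the underlying client; the payload is a placeholder) could look like this:

    /* Sketch: produce one record to STREAMING_IMPORT. Assumes rk is an
     * rd_kafka_t producer handle created from the configuration above,
     * e.g. rd_kafka_new(RD_KAFKA_PRODUCER, conf, errstr, sizeof(errstr)). */
    const char *payload = "{\"example\":\"record\"}";  /* placeholder */
    rd_kafka_producev(rk,
                      RD_KAFKA_V_TOPIC("STREAMING_IMPORT"),
                      RD_KAFKA_V_VALUE((void *)payload, strlen(payload)),
                      RD_KAFKA_V_END);
    rd_kafka_flush(rk, 10 * 1000);  /* wait up to 10 s for delivery */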

NumberOfPartitions The number of partitions Kafka uses to hold user-exported data. For Confluent Kafka, the partitions for the topics being created are predefined at the broker level. Obtain this number from your Confluent Kafka service provider.
NumberOfReplicas Each partition is replicated across a configurable number of servers for fault tolerance. For Confluent Kafka, the replication factor for the topics being created is predefined at the broker level. Obtain this number from your Confluent Kafka service provider.
RetentionPeriodInSeconds The maximum time Kafka retains messages exported over the topic. Once the retention period is over, Kafka clears all eligible exported messages.
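
To see how these three values fit together at the Kafka level, here is a topic-creation sketch using the librdkafka Admin API (the numbers are arbitrary examples; RetentionPeriodInSeconds is converted to the retention.ms topic config):

    /* Sketch: create a topic with explicit partitions, replicas, and
     * retention (86400 s * 1000 = 86400000 ms). Uses rk from above. */
    char errstr2[512];
    rd_kafka_NewTopic_t *nt = rd_kafka_NewTopic_new(
        "CAMPAIGN_PB", 3 /* NumberOfPartitions */, 2 /* NumberOfReplicas */,
        errstr2, sizeof(errstr2));
    rd_kafka_NewTopic_set_config(nt, "retention.ms", "86400000");
    rd_kafka_queue_t *rkqu = rd_kafka_queue_new(rk);
    rd_kafka_CreateTopics(rk, &nt, 1, NULL /* default options */, rkqu);
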
SslKeyPasswordDataSource If the KafkaKeyFile is password protected, create a separate datasource that contains that password. The username is not used, so it can be anything. Specify that datasource name as the value of this field.
SaslKerberosServiceName

When using GSSAPI (Kerberos) authentication, configure a service name that matches the primary name of the brokers configured in the broker JAAS file.

Example: kafka
SaslKerberosKeytabPath When using GSSAPI (Kerberos) authentication, set the path of the keytab file created for the Kafka client.
SaslKerberosPrincipal When using GSSAPI (Kerberos) authentication, set the Kerberos principal created for the Kafka client.
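
Under the same librdkafka assumption, these three Kerberos settings correspond to the client properties below (the keytab path and principal are illustrative placeholders):

    /* Sketch: Kerberos (GSSAPI) client settings, continuing the earlier
     * conf/errstr sketch. */
    rd_kafka_conf_set(conf, "sasl.mechanism", "GSSAPI", errstr, sizeof(errstr));
    rd_kafka_conf_set(conf, "sasl.kerberos.service.name", "kafka",
                      errstr, sizeof(errstr));
    rd_kafka_conf_set(conf, "sasl.kerberos.keytab",
                      "/opt/Unica/Kafkakeys/client.keytab", errstr, sizeof(errstr));
    rd_kafka_conf_set(conf, "sasl.kerberos.principal",
                      "kafkaclient@EXAMPLE.COM", errstr, sizeof(errstr));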