External Kafka Environment Configuration
Detect supports integration with externally hosted Kafka environments. This topic describes how to configure Detect to connect to external Kafka using consumer and producer properties.
In some environments, Detect services must connect to a Kafka infrastructure that is not
managed by Detect. In such cases, Detect reads Kafka connection details from external
property files. These property files, consumer.properties and
producer.properties, must be configured and referenced correctly
within the drive.json file.
Kafka Topics
Ensure that the required Kafka topics are created and available.
Prerequisites
Before you begin, ensure the following:
- Kafka server host and port information is available and accessible.
- Create the required topics (see the example commands after this list):
  - Standard topics:
    - INTERNAL_EVENT_COMMUNICATION_TOPIC
    - INTERNAL_PROCESS_ACTUATION_KAFKA_TOPIC
    - SYSTEM-RESPONSE_MESSAGES
  - Feed-specific topics: Each feed-specific Kafka topic must follow the format APPLICATION_<FEED_NAME>.
    Note:
    - All feed topics must begin with the prefix APPLICATION_.
    - Replace <FEED_NAME> with the name of a configured feed from the drive.feedApplications section of the drive.json file.
    - Use uppercase letters with underscores to separate words in <FEED_NAME>.
- Required Kafka consumer groups must be defined before starting Detect services.
  - Standard consumer groups: The following consumer groups must be created as-is:
    - campaign_actuator_source.kafka_group
    - application_tuple_collector-all_partitions
    - EventConsumerManager
    - CampaignCoreBackend
  - Feed-specific consumer groups: Each feed-specific consumer group must follow this format: campaign_actuator_source.<feed_name>.kafka_group.
    Note:
    - Replace <feed_name> with the feed name defined in the drive.feedApplications section of the drive.json file.
    - Use lowercase letters with underscores to separate words in <feed_name>.
- Kafka consumer group to topic mapping:

  Consumer Group                                     Kafka Topic
  EventConsumerManager                               INTERNAL_EVENT_COMMUNICATION_TOPIC
  CampaignCoreBackend                                INTERNAL_PROCESS_ACTUATION_KAFKA_TOPIC
  campaign_actuator_source.kafka_group               SYSTEM-RESPONSE_MESSAGES
  campaign_actuator_source.<feed_name>.kafka_group   APPLICATION_<FEED_NAME>
  application_tuple_collector-all_partitions         APPLICATION_<FEED_NAME>

  Note: <feed_name> and <FEED_NAME> are placeholders. Use the actual feed name in lowercase and uppercase respectively, using underscores for multi-word names.

- Required consumer.properties configuration:

  bootstrap.servers=<kafka-server>:<port>
  # Required Detect Kafka consumer groups
  detect.event_consumer.group.id=EventConsumerManager
  detect.process_actuation.group.id=CampaignCoreBackend
  detect.application_tuple_collector.group.id=application_tuple_collector-all_partitions
  detect.campaign_actuator.group.id=campaign_actuator_source.kafka_group
  # Feed-specific group ID
  detect.campaign_actuator.<feed_name>.group.id=campaign_actuator_source.<feed_name>.kafka_group
  # Kafka Python unsupported properties (comma-separated)
  detect.python.kafka.config.ignore.properties=

- Required producer.properties configuration:

  bootstrap.servers=<kafka-server>:<port>

- In the $HCL_HOME/etc/drive.json file, under the internalKafkaConfiguration section, include:

  "internalKafkaConfiguration": {
    "externallyHostedApplicationInternalKafkaConfiguration": {
      "consumerPropertyFilenameWithPath": "/path/to/consumer.properties",
      "producerPropertyFilenameWithPath": "/path/to/producer.properties"
    },
    "kafkaPort": 41240,
    "zooKeeperPort": 41241
  }

  Note: Replace /path/to/consumer.properties and /path/to/producer.properties with the actual file paths.
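If you create the topics with the standard Kafka command-line tools, the commands look like the following sketch. The partition and replication-factor values are placeholders; size them for your environment.

bin/kafka-topics.sh --create --bootstrap-server <kafka-server>:<port> --topic INTERNAL_EVENT_COMMUNICATION_TOPIC --partitions 1 --replication-factor 1
bin/kafka-topics.sh --create --bootstrap-server <kafka-server>:<port> --topic INTERNAL_PROCESS_ACTUATION_KAFKA_TOPIC --partitions 1 --replication-factor 1
bin/kafka-topics.sh --create --bootstrap-server <kafka-server>:<port> --topic SYSTEM-RESPONSE_MESSAGES --partitions 1 --replication-factor 1
# Feed-specific topic; MY_FEED is a hypothetical feed name
bin/kafka-topics.sh --create --bootstrap-server <kafka-server>:<port> --topic APPLICATION_MY_FEED --partitions 1 --replication-factor 1
# In stock Kafka, consumer groups are created when a consumer first connects with a
# given group.id; the group IDs themselves are defined in consumer.properties above.
# After the Detect services start, you can confirm the groups exist with:
bin/kafka-consumer-groups.sh --bootstrap-server <kafka-server>:<port> --list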
Authenticated Kafka Setup
Detect supports connections to the following types of authenticated Kafka environments:
- SASL
- SSL
- Kerberos
To enable communication with an authenticated Kafka environment, you must configure the following files with the required authentication properties:
- drive.json: Update this file with authentication properties if the source feed connects to an authenticated Kafka environment.
- consumer.properties: Include the necessary Kafka consumer authentication settings.
- producer.properties: Include the necessary Kafka producer authentication settings.
- producer_config.properties: If the downstream event consumer is hosted in an authenticated Kafka environment, configure this file with the appropriate properties.
Each authentication type (SASL, SSL, or Kerberos) requires a different set of properties in the above files. Refer to the respective authentication setup section for detailed configuration instructions.
Setting Up SASL-Authenticated Kafka Environment
This section describes the configuration required to enable SASL authentication for connecting Detect to a Kafka environment.
Kafka Server Configuration
Update the Kafka broker's server.properties file with the
following settings:
# Listener configuration
listeners=SASL_PLAINTEXT://<server-ip>:<port>
advertised.listeners=SASL_PLAINTEXT://<server-ip>:<port>
listener.security.protocol.map=SASL_PLAINTEXT:SASL_PLAINTEXT
inter.broker.listener.name=SASL_PLAINTEXT
# SASL authentication properties
sasl.enabled.mechanisms=PLAIN
sasl.mechanism.inter.broker.protocol=PLAIN
listener.name.sasl_plaintext.plain.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
username="admin" \
password="admin-secret" \
user_admin="admin-secret" \
user_kafkabroker1="kafkabroker1-secret";
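A quick way to confirm that the broker accepts SASL/PLAIN client connections is the broker-api-versions tool shipped with Kafka, pointed at a client properties file containing the SASL settings shown in the consumer.properties section below (the path is a placeholder):

bin/kafka-broker-api-versions.sh --bootstrap-server <server-ip>:<port> --command-config /path/to/consumer.properties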
consumer.properties
In addition to the required base properties, add the following SASL authentication configuration:
# SASL authentication
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
username="admin" \
password="admin-secret";
sasl.username=admin
sasl.password=admin-secret
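Before starting Detect, you can verify these consumer settings with the standard Kafka console consumer, reusing the same properties file (the topic shown is illustrative):

bin/kafka-console-consumer.sh --bootstrap-server <kafka-server>:<port> --topic SYSTEM-RESPONSE_MESSAGES --consumer.config /path/to/consumer.properties --from-beginning --max-messages 1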
producer.properties
Add the following SASL authentication properties:
# SASL authentication
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
username="admin" \
password="admin-secret";
sasl.username=admin
sasl.password=admin-secret
producer_config.properties
If publishing messages to downstream topics requires SASL authentication, include the following configuration:
# Configure SASL_SSL if TLS/SSL encryption is enabled, otherwise configure SASL_PLAINTEXT
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
username="admin" \
password="admin-secret";
drive.json Configuration
In the $HCL_HOME/etc/drive.json file, update the
internalKafkaConfiguration section:
"internalKafkaConfiguration": { "externallyHostedApplicationInternalKafkaConfiguration": { "consumerPropertyFilenameWithPath": "/path/to/consumer.properties", "producerPropertyFilenameWithPath": "/path/to/producer.properties" }, "kafkaPort": 41240, "zooKeeperPort": 41241 }
Replace /path/to/... with the actual file paths used in your
environment.
Feed-Level Authentication Properties
In the feedApplication configuration section, include the
following authentication properties:
{
"name": "saslMechanism",
"stringValue": "PLAIN"
},
{
"name": "securityProtocol",
"stringValue": "SASL_PLAINTEXT"
},
{
"name": "saslUsername",
"stringValue": "admin"
},
{
"name": "saslPassword",
"stringValue": "admin-secret"
}
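For orientation, the following sketch shows where these entries sit within a feed definition in drive.json. The surrounding structure, including the name of the list that holds the property objects, is an assumption; match it to your existing drive.feedApplications entries:

"feedApplications": [
  {
    "name": "<feed_name>",
    "properties": [
      { "name": "saslMechanism", "stringValue": "PLAIN" },
      { "name": "securityProtocol", "stringValue": "SASL_PLAINTEXT" },
      { "name": "saslUsername", "stringValue": "admin" },
      { "name": "saslPassword", "stringValue": "admin-secret" }
    ]
  }
]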
"admin" and
"admin-secret" with the actual credentials for your Kafka
deployment.Setting Up SSL-Authenticated Kafka Environment
This section explains how to configure Detect to connect to a Kafka environment using SSL authentication.
Prerequisites
Ensure that the required SSL certificates are created and available before you begin.
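If the certificates still need to be generated, the standard Kafka approach uses keytool and openssl. A minimal sketch, assuming a self-signed CA is acceptable in your environment (aliases, validity periods, and file names are placeholders):

# Create a CA
openssl req -new -x509 -keyout ca-key -out ca-cert -days 365
# Create the broker keystore with a new key pair
keytool -keystore kafka.server.keystore.jks -alias localhost -genkey -keyalg RSA -validity 365
# Export a certificate signing request and sign it with the CA
keytool -keystore kafka.server.keystore.jks -alias localhost -certreq -file cert-file
openssl x509 -req -CA ca-cert -CAkey ca-key -in cert-file -out cert-signed -days 365 -CAcreateserial
# Import the CA certificate and the signed broker certificate into the keystore
keytool -keystore kafka.server.keystore.jks -alias CARoot -import -file ca-cert
keytool -keystore kafka.server.keystore.jks -alias localhost -import -file cert-signed
# Add the CA certificate to the truststore so clients trust the broker
keytool -keystore kafka.server.truststore.jks -alias CARoot -import -file ca-cert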
Kafka Server Configuration
Update the Kafka broker’s server.properties file with the
following SSL-related settings:
# Listener configuration
listeners=SSL://<kafka-host>:9092
advertised.listeners=SSL://<kafka-host>:9092
# SSL authentication settings
security.protocol=SSL
ssl.keystore.location=<cert-path>/kafka.server.keystore.jks
ssl.keystore.password=<keystore-password>
ssl.key.password=<key-password>
ssl.truststore.location=<cert-path>/kafka.server.truststore.jks
ssl.truststore.password=<truststore-password>
ssl.endpoint.identification.algorithm=
security.inter.broker.protocol=SSL
ssl.client.auth=required
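After restarting the broker, you can confirm that the SSL listener presents its certificate; a successful handshake prints the broker's certificate chain:

openssl s_client -connect <kafka-host>:9092 </dev/null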
consumer.properties
Add the following SSL configuration settings to your
consumer.properties file, in addition to the required base
properties:
security.protocol=SSL
ssl.keystore.location=<cert-path>/kafka.server.keystore.jks
ssl.keystore.password=<keystore-password>
ssl.key.password=<key-password>
ssl.truststore.location=<cert-path>/kafka.server.truststore.jks
ssl.truststore.password=<truststore-password>
ssl.endpoint.identification.algorithm=https
ssl.ca.location=/tmp/kafka-SSL-certificates/ca-cert
ssl.certificate.location=/opt/Kafka-Server/kafka_2.13-3.4.0/KAFKA_STORE/CERT/client-certificate.pem
ssl.key.location=/opt/Kafka-Server/kafka_2.13-3.4.0/KAFKA_STORE/CERT/key.pem
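To sanity-check the keystore and truststore paths and passwords referenced above before starting Detect, keytool can list their contents:

keytool -list -keystore <cert-path>/kafka.server.keystore.jks
keytool -list -keystore <cert-path>/kafka.server.truststore.jks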
producer.properties
Add the same SSL configuration properties used in
consumer.properties:
security.protocol=SSL
ssl.keystore.location=<cert-path>/kafka.server.keystore.jks
ssl.keystore.password=<keystore-password>
ssl.key.password=<key-password>
ssl.truststore.location=<cert-path>/kafka.server.truststore.jks
ssl.truststore.password=<truststore-password>
ssl.endpoint.identification.algorithm=https
ssl.ca.location=/tmp/kafka-SSL-certificates/ca-cert
ssl.certificate.location=/opt/Kafka-Server/kafka_2.13-3.4.0/KAFKA_STORE/CERT/client-certificate.pem
ssl.key.location=/opt/Kafka-Server/kafka_2.13-3.4.0/KAFKA_STORE/CERT/key.pem
producer_config.properties
If you are publishing messages to a downstream system on an SSL-enabled Kafka environment, update this file as follows:
security.protocol=SSL
ssl.keystore.location=<cert-path>/kafka.server.keystore.jks
ssl.keystore.password=<keystore-password>
ssl.key.password=<key-password>
ssl.truststore.location=<cert-path>/kafka.server.truststore.jks
ssl.truststore.password=<truststore-password>
ssl.endpoint.identification.algorithm=https
drive.json Configuration
In the $HCL_HOME/etc/drive.json file, update the Kafka
configuration block:
"internalKafkaConfiguration": {
"externallyHostedApplicationInternalKafkaConfiguration": {
"consumerPropertyFilenameWithPath": "/path/to/consumer.properties",
"producerPropertyFilenameWithPath": "/path/to/producer.properties"
},
"kafkaPort": 41240,
"zooKeeperPort": 41241
}
Note: Replace /path/to/... with the actual file locations in your environment.
Feed-Level Authentication Properties
Set the following properties in the feedApplication section to
enable SSL authentication for specific feeds:
{
"name": "securityProtocol",
"stringValue": "SSL"
},
{
"name": "sslKeystoreLocation",
"stringValue": "/path/to/kafka.server.keystore.jks"
},
{
"name": "sslKeystorePassword",
"stringValue": "<keystore-password>"
},
{
"name": "sslKeyPassword",
"stringValue": "<key-password>"
},
{
"name": "sslTruststoreLocation",
"stringValue": "/path/to/kafka.server.truststore.jks"
},
{
"name": "sslTruststorePassword",
"stringValue": "<truststore-password>"
}
Setting Up Kerberos-Authenticated Kafka Environment
This section explains how to configure Detect to connect to a Kafka environment that uses Kerberos authentication.
Detect Environment Setup (Client Side)
Perform these steps on the system where Detect is installed.
- Install the following Kerberos packages:

  yum install cyrus-sasl-gssapi
  yum install cyrus-sasl

- Edit the Kerberos configuration file (see the sample krb5.conf after these steps):

  vi /etc/krb5.conf

  Update the file with the following changes:
  - Uncomment all commented lines.
  - Replace EXAMPLE.COM with your actual domain name in uppercase.
  - Replace example.com with your domain name in lowercase.
  - Under the [realms] section, define:
    kdc = <KAFKA SERVER HOST NAME>
    admin_server = <KAFKA SERVER HOST NAME>

  Save and close the file.

- Copy the Kafka client keytab file from the Kafka server to the Detect listener server:

  scp hcluser@<kafka-server>:/var/kerberos/krb5kdc/kfkclient.keytab /var/kerberos/krb5kdc/

- Run the following commands to validate Kerberos authentication:

  $> kinit -k -t /var/kerberos/krb5kdc/kfkclient.keytab kafkaclient@<DOMAIN NAME>.COM -r 7d
  $> klist

  You should see a valid ticket for the Kerberos principal, such as:

  Ticket cache: KEYRING:persistent:0:0
  Default principal: kafkaclient@<DOMAIN NAME>.COM

  Valid starting       Expires              Service principal
  09/06/2022 05:04:56  09/07/2022 05:04:56  krbtgt/<DOMAIN NAME>.COM@<DOMAIN NAME>.COM
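For reference, a minimal krb5.conf sketch consistent with the steps above (realm and host names are placeholders; keep any additional settings your environment requires):

[libdefaults]
 default_realm = EXAMPLE.COM

[realms]
 EXAMPLE.COM = {
  kdc = <KAFKA SERVER HOST NAME>
  admin_server = <KAFKA SERVER HOST NAME>
 }

[domain_realm]
 .example.com = EXAMPLE.COM
 example.com = EXAMPLE.COM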
Kafka Client Configuration
consumer.properties
In addition to the standard properties, include the following:
bootstrap.servers=<kerberos-kafka-server>:<port>
sasl.mechanism=GSSAPI
security.protocol=SASL_PLAINTEXT
sasl.kerberos.service.name=kafka
sasl.kerberos.keytab=/var/kerberos/krb5kdc/kfkclient.keytab
sasl.kerberos.principal=kafkaclient@<DOMAIN NAME>.COM
sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
useKeyTab=true storeKey=true \
keyTab="/var/kerberos/krb5kdc/kfkclient.keytab" \
principal="kafkaclient@<DOMAIN NAME>.COM";
Note: The consumer.properties file must reside on the Detect server and its path must be set in the drive.json configuration file.
producer.properties
Add the following Kerberos configuration:
bootstrap.servers=<kerberos-kafka-server>:<port>
sasl.mechanism=GSSAPI
security.protocol=SASL_PLAINTEXT
sasl.kerberos.service.name=kafka
sasl.kerberos.keytab=/var/kerberos/krb5kdc/kfkclient.keytab
sasl.kerberos.principal=kafkaclient@<DOMAIN NAME>.COM
sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
useKeyTab=true storeKey=true \
keyTab="/var/kerberos/krb5kdc/kfkclient.keytab" \
principal="kafkaclient@<DOMAIN NAME>.COM";
Note: The producer.properties file must reside on the Detect server and its path must be set in the drive.json configuration file.
producer_config.properties
If you are sending data to a downstream Kafka system that uses Kerberos, include the following:
bootstrap.servers=<kerberos-kafka-server>:<port>
sasl.mechanism=GSSAPI
security.protocol=SASL_PLAINTEXT
sasl.kerberos.service.name=kafka
sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
useKeyTab=true storeKey=true \
keyTab="/var/kerberos/krb5kdc/kfkclient.keytab" \
principal="kafkaclient@<DOMAIN NAME>.COM";