Downstream Kafka-Based Custom Event Consumer Configuration
Kafka is a distributed event store and stream-processing platform that captures events and sends them to a custom event application. Using the Kafka-based configuration, customers can route their detected events to a downstream system over Kafka.
To publish events to downstream Kafka, which is read by the downstream system, follow the steps below:
- In the acme/etc/producer_config.properties file, update the IP address of the Kafka server (a sample properties sketch follows this list).
- In the drive.json file, update the corresponding parameters in the
eventConsumers section:
- the className parameter with the name of the Java class.
- the parameters > name parameter with the name of the downstream system.
- the producerTopic value with the downstream system Kafka topic name.
- Update the following database parameter values in the drive.json file:
- databaseServerIp: update it with the database server IP address.
- databaseServerPort: update it with the database server port.
- databaseServerUser: update it with the database server username.
- databaseServerPassword: update it with the database server password.
- Add any other downstream-specific parameters to this sample JSON.
- In the eventEndpoints parameters, add the endpoint name. Note: The value of APPLICATION_ENDPOINT_NAME should match the event consumer class name.
- After updating the values, ensure Maven is installed on the server.
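For reference, a minimal producer_config.properties might look like the sketch below. The property names are standard Apache Kafka producer settings; the host, port, and serializers are illustrative assumptions, not values shipped with the product.
# Illustrative values only; replace bootstrap.servers with the actual Kafka server address.
bootstrap.servers=192.0.2.10:9092
acks=all
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer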
To build the jar for the custom_event_consumer, follow the steps below:
- Navigate to the HCL Detect installation folder.
- Change directory to custom_event_consumer using the following command:
cd custom_event_consumer/
- By default, the custom_event_consumer folder contains the following makefile, which can be used to build the custom event consumer jar file and move it to the installation path "/tomcat/webapps/drive/WEB-INF/lib/".
.PHONY: build move clean all

all: build move clean

build:
	mvn -f pom-custom.xml clean install

move:
	cp target/custom_event_consumer-*.jar $(HCL_INSTANCE_HOME)/tomcat/webapps/drive/WEB-INF/lib/

clean:
	rm -rf target
- To execute the makefile, use the following command:
make
- On successful execution of the makefile, the custom_event_consumer jar is moved to Tomcat.
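As an optional sanity check (not part of the product procedure), you can confirm that the jar was copied by listing the Tomcat lib folder:
ls $HCL_INSTANCE_HOME/tomcat/webapps/drive/WEB-INF/lib/custom_event_consumer-*.jar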
A sample configuration for publishing events to downstream Kafka is shown below:
"eventConsumers": [
{
"className": "com.detect.SampleKafkaBasedEventConsumer",
"name": "SampleKafkaBasedEventConsumer",
"parameters": [
{
"name": "payloadEntrySourceCode",
"stringValue": "ES-TEST002"
},
{
"name": "databaseServerIp",
"stringValue": "127.0.0.1"
},
{
"name": "databaseServerUser",
"stringValue": "drive"
},
{
"name": "databaseServerPassword",
"stringValue": "6F2FA7F6B3CD1570F19F0A0B2636E033"
},
{
"name": "databaseServerPort",
"int32Value": 3306
},
{
"name": "databaseName",
"stringValue": "drive_acme_core"
},
{
"name": "producerConfigFilePath",
"stringValue": "${HCL_HOME}/../${HCL_SOLUTION}/etc/producer_config.properties"
},
{
"name": "producerKrb5ConfigFilePath",
"stringValue": "/etc/krb5.conf"
},
{
"name": "producerTopic",
"stringValue": "CUSTOM_QUALIFIED_TRANS_TOPIC"
}
]
}
]
- eventConsumers.className: the name of the Java class responsible for writing events in the pre-defined format of the downstream system.
- eventConsumers.name: the name you intend to assign to the downstream system.
- Parameters:
- Database related:
- databaseServerIp: Server IP of the database to publish downstream events to.
- databaseServerUser: Username of the database credentials.
- databaseServerPassword: Encrypted password of the database credentials.
- databaseServerPort: Port of the database.
- databaseName: Name of the database.
- Kafka related:
- producerConfigFilePath: Path to the producer_config.properties file. This file contains the configuration/properties for connecting to Kafka.
- producerKrb5ConfigFilePath: Path to the krb5.conf file, required if the Kafka connection needs to be Kerberos-authenticated.
- producerTopic: Name of the Kafka topic to which events are published and from which the downstream system reads.
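If the connection must be Kerberos-authenticated, producer_config.properties would additionally carry the standard Kafka GSSAPI client settings. The values below are illustrative assumptions (the principal, keytab, and service name depend on your cluster), not values shipped with the product:
# Assumed Kerberos (GSSAPI) settings; adjust to your cluster.
security.protocol=SASL_PLAINTEXT
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
    useKeyTab=true keyTab="/etc/security/keytabs/producer.keytab" principal="producer@EXAMPLE.COM";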
The Java class that writes events in the pre-defined format of the downstream system implements the EventConsumer interface and its methods:
- initialize() - creates the database and downstream Kafka connections.
- consumeEvent() - receives the event object.
- consumeEventImpl() - builds the database message and the downstream Kafka message, and writes each to the database and the downstream Kafka topic, respectively.
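A minimal sketch of such a class is shown below. The real EventConsumer interface and event type ship with HCL Detect, so the method signatures, the parameter-lookup style, and the MySQL JDBC URL are assumptions here; only the overall flow follows the description above.

package com.detect;

// Illustrative sketch only: the real EventConsumer interface and event type
// ship with HCL Detect, so the signatures below are assumptions.
import java.io.FileInputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SampleKafkaBasedEventConsumer /* implements EventConsumer */ {

    private Connection dbConnection;
    private KafkaProducer<String, String> producer;
    private String producerTopic;

    // initialize(): creates the database and downstream Kafka connections.
    public void initialize(Properties params) throws Exception {
        String url = "jdbc:mysql://" + params.getProperty("databaseServerIp")
                + ":" + params.getProperty("databaseServerPort")
                + "/" + params.getProperty("databaseName");
        // databaseServerPassword is stored encrypted; a real implementation decrypts it first.
        dbConnection = DriverManager.getConnection(url,
                params.getProperty("databaseServerUser"),
                params.getProperty("databaseServerPassword"));

        Properties kafkaProps = new Properties();
        try (FileInputStream in = new FileInputStream(params.getProperty("producerConfigFilePath"))) {
            kafkaProps.load(in);
        }
        producer = new KafkaProducer<>(kafkaProps);
        producerTopic = params.getProperty("producerTopic");
    }

    // consumeEvent(): receives the event object and delegates to the implementation method.
    public void consumeEvent(Object event) {
        consumeEventImpl(event);
    }

    // consumeEventImpl(): builds the database and Kafka messages and writes each one out.
    private void consumeEventImpl(Object event) {
        String message = event.toString(); // format per the downstream system's requirements
        // ... INSERT the message into the database via dbConnection ...
        producer.send(new ProducerRecord<>(producerTopic, message));
    }
}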