kafka-observe

This library is a wrapper around kafkajs. kafka-observe turns kafka topics into rxjs Subjects, so you can listen to events on kafka by subscribing to a topic instead of setting up the stream handling yourself.

This library is very opinionated and reflects our way of working at Bothive. However, it works great for us and is quite versatile, so it might be the way you prefer working as well!

Note: We have introduced breaking changes in 2.1.0. 2.1.0 is NOT compatible with previous versions because payloads look different. Make sure to upgrade all your kafka services at once if you upgrade from a previous version.

usage

This library exports a function that takes a single options parameter to initialise the library.

Note: We have used the singleton pattern to make sure you don't create multiple connections and waste CPU and RAM on them. Only the first options passed will be applied; later calls ignore their options and return the existing instance.

Require the library as follows:

const kafka = require("kafka-observe")(options);
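
Because of the singleton, a second require in the same process returns the already-initialised instance. A minimal sketch (the second set of options below is purely illustrative and will be ignored):

// first call: creates the connections
const kafka = require("kafka-observe")(options);

// any later call: reuses the same instance, its options are ignored
const sameKafka = require("kafka-observe")({ kafkaHost: "other-host:9092" });
// sameKafka and kafka refer to the same instance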

The options object takes the following parameters:

const options = {
    kafkaHost: "kafka host and port", // example: "localhost:9092"
    topicsToFollow: [
        {
            topic: "name of the kafka topic", // example: "users"
            strategy: "All|LoadBalanced|Both", // the listening strategies enabled for this topic
        },
        ...
    ],
    // enable required components
    enableConsumer: true, // set to true if listening strategy All or Both is used (disable if you only use LoadBalanced mode to save some CPU and RAM)
    enableConsumerGroup: true, // set to true if listening strategy LoadBalanced or Both is used (disable if you only use All mode to save some CPU and RAM)
    enableProducer: true, // set to true if you want to produce messages to kafka in this project (most likely you will want to set this to true)

    consumerGroupId: "consumer group name",
    consumerId: "consumer name",

    // liveness ping: it is best practice to have a liveness topic in your kafka. That way you can implement easy live checks of your components (only works if the producer is enabled)
    liveness: {
        topic: "liveness", // name of the topic to send liveness ticks to
        interval: 2000, // frequency of the ticks
        event: "ALIVE_TICK", // event name of the tick
    },
    logAllEvents: false, // log all kafka events to the console (can come in handy for debugging)
    ssl, // see the kafkajs docs, we just pass this variable through
    sasl, // see the kafkajs docs, we just pass this variable through
};
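
For reference, a filled-in example of the options (the host, topic names and ids below are illustrative, not defaults of the library):

const options = {
    kafkaHost: "localhost:9092",
    topicsToFollow: [
        { topic: "users", strategy: "Both" },
        { topic: "mailing", strategy: "LoadBalanced" },
    ],
    enableConsumer: true,
    enableConsumerGroup: true,
    enableProducer: true,
    consumerGroupId: "user-service",
    consumerId: "user-service-1",
    liveness: {
        topic: "liveness",
        interval: 2000,
        event: "ALIVE_TICK",
    },
    logAllEvents: false,
};

const kafka = require("kafka-observe")(options);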

Passing these options will return an object set up to communicate with your kafka cluster. After initialisation the kafka object will have these properties and functions:

consumer: a connected consumer instance
consumerGroup: a connected consumer group instance
producer: a connected producer instance
getTopicSubject: a function that returns a topic subject to subscribe to with the correct strategy
eventCallback: a function that sends out an event and listens for a response

getTopicSubject (recommended method)

Returns a subject you can subscribe to in order to listen to events on a certain topic.

const { getTopicSubject } = require("kafka-observe")(options);

getTopicSubject({
    topic: "topic name",
    loadBalanced: true, // true for the LoadBalanced strategy, false for the All strategy
}).subscribe((event) => {
    // do something with the event
});
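
Because the returned subject is a regular rxjs Subject, subscribe also gives you back an rxjs Subscription you can clean up when you stop listening. A small sketch (the topic name and the SIGTERM hook are just examples):

const subscription = getTopicSubject({
    topic: "users",
    loadBalanced: false,
}).subscribe((event) => {
    // do something with the event
});

// stop listening when the service shuts down
process.on("SIGTERM", () => subscription.unsubscribe());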

eventCallback

Sends out an event on a topic and returns a promise that resolves with the response event.

const { eventCallback } = require("kafka-observe")(options);

eventCallback({
    topic: "topic name",
    sender: "event",
    listeners: ["listen_for_event"],
    payload: {} // An object you want to send with the event (needs to be JSON convertible)
}).then((event) => {
	// do something with the event
});
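
Since eventCallback returns a promise, it also works with async/await. The topic, event names and payload below are illustrative:

// inside an async function
const response = await eventCallback({
    topic: "users",
    sender: "GET_USER", // the event that is sent out
    listeners: ["GET_USER_COMPLETED", "GET_USER_FAILED"], // the events the callback waits for
    payload: { userId: "123" },
});
// do something with the response event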

writeStream

Produces a large object over a topic as a stream of smaller messages.

const { writeStream } = require("kafka-observe")(options);

writeStream({
    topic: "topic name",
    headers: {},
    event: "listen_for_event",
    limit: 2000000, // The max data size in bytes each stream can contain
    payload: {} // An object you want to send with the event (needs to be JSON convertible)
}).then((event) => {
	// do something with the event
});

readStream

Listens to a write stream and resolves with the complete object once the stream has finished.

const { readStream } = require("kafka-observe")(options);

readStream({
    topic: "topic name",
    events: ["listen_for_event"],
    guard: () => true,
    onStatus: (status) => console.log(status) // A callback to receive status updates about the stream
}).then((event) => {
	// do something with the event
});
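
writeStream and readStream are the two ends of the same mechanism. The sketch below assumes the event name passed to writeStream is the one the reading side lists in events (both examples above use the same placeholder); the topic, event name and payload are illustrative:

// producing side (inside an async function): send a large object in chunks
await writeStream({
    topic: "exports",
    headers: {},
    event: "EXPORT_READY",
    limit: 2000000, // max data size in bytes per stream message
    payload: largeReportObject,
});

// consuming side (inside an async function): wait for the chunks and get the complete object back
const fullObject = await readStream({
    topic: "exports",
    events: ["EXPORT_READY"],
    guard: () => true,
    onStatus: (status) => console.log(status),
});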

streamReader

Helper to convert consumed stream data into a JSON object. It needs to be initialised first and returns a function that can be used to read the consumed data parts.

const { streamReader } = require("kafka-observe")(options);
const _streamReader = streamReader();

_streamReader({
    event: {},
    id: "uniqueId",
    loadBalanced: true,
    topic: "topic_name",
    onStatus: (status) => console.log(status) // A callback to receive status updates about the stream
}).then((event) => {
	// do something with the event
});

streamCallback

Sends out an event and listens for the corresponding stream event.

const { streamCallback } = require("kafka-observe")(options);

streamCallback({
    topic: "topic name",
    sender: "event",
    listeners: ["listen_for_event"],
    payload: {}, // An object you want to send with the event (needs to be JSON convertible)
    onStatus: (status) => console.log(status) // A callback to receive status updates about the stream
}).then((event) => {
	// do something with the event
});

producer

Use this exported instance to produce messages to kafka on any topic you want (there is no need to put the topic in topicsToFollow if you only want to produce messages to it).

const { producer } = require("kafka-observe")(options);

producer.sendMessage({
    topic: "name of the topic",
    payload: {}, // An object you want to publish to kafka (needs to be JSON convertible)
    headers: {}, // will be injected into the payload
})

consumer & consumerGroup

If you want, you can also get the topic subject directly from the consumer or consumerGroup instances.

The consumer handles all events for the All and Both strategies.
The consumer group handles all LoadBalanced and Both strategy events.

const { consumer, consumerGroup } = require("kafka-observe")(options);

consumer.getTopicSubject("topic name")
    .subscribe((event) => {
	// do something with the event
});

OR

consumerGroup.getTopicSubject("topic name")
    .subscribe((event) => {
	// do something with the event
});

creators

This project is created and used by the Bothive team to streamline our kafka process. We used to have the exact same code in all our services, so we created this library to unify our kafka code in one independent package. Feel free to use and modify it as you like. We will continue to update this library as long as we use it ourselves.

Install

npm i kafka-observe

License

ISC
