Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/nodefluent/node-sinek
:tophat: Most advanced high level Node.js Kafka client
- Host: GitHub
- URL: https://github.com/nodefluent/node-sinek
- Owner: nodefluent
- License: mit
- Created: 2017-02-21T21:57:44.000Z (almost 8 years ago)
- Default Branch: master
- Last Pushed: 2022-12-10T16:55:18.000Z (about 2 years ago)
- Last Synced: 2025-01-29T12:07:07.065Z (13 days ago)
- Topics: backpressure, consumer, easy, kafka, kafka-client, kerberos, nodejs, producer, sasl, ssl
- Language: TypeScript
- Homepage:
- Size: 1.17 MB
- Stars: 290
- Watchers: 12
- Forks: 52
- Open Issues: 24
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- License: LICENSE
Awesome Lists containing this project
- awesome-kafka - Node
README
# High Level Node.js Kafka Client
[![Build Status](https://travis-ci.org/nodefluent/node-sinek.svg?branch=master)](https://travis-ci.org/nodefluent/node-sinek)
[![npm version](https://badge.fury.io/js/sinek.svg)](https://badge.fury.io/js/sinek)

The most advanced Kafka Client.
## Features
* easy promise based API
* a lot of Kafka pitfalls already taken care of
* backpressure and stream consume modes
* secure committing in backpressure (1:n, batch) mode
* plain Javascript implementation based on `kafka-node` and a super fast native implementation based on `node-rdkafka`
* SSL, SASL & Kerberos support
* auto reconnects
* auto partition recognition and deterministic spreading for producers
* **intelligent health-checks** and **analytic events** for consumers and producers

## You might also like
* check out :goberserk: [node-kafka-streams](https://github.com/nodefluent/kafka-streams) for a stream processing Kafka API
* check out :fire: [node-kafka-connect](https://github.com/nodefluent/kafka-connect) for an easy datastore <-> Kafka transfer

## Latest Changes
Can be found [here](CHANGELOG.md)
## Install
```shell
npm install --save sinek
```

## Usage
### Usage - JS Client (based on kafkajs)
```javascript
const {
  JSConsumer,
  JSProducer
} = require("sinek");

const jsProducerConfig = {
  clientId: "my-app",
  brokers: ["kafka1:9092"]
};

// NOTE: the original snippet used jsConsumerConfig without defining it;
// an analogous config (the groupId value is illustrative) would look like this:
const jsConsumerConfig = {
  clientId: "my-app",
  brokers: ["kafka1:9092"],
  groupId: "my-group"
};

(async () => {

  const topic = "my-topic";
  const producer = new JSProducer(jsProducerConfig);
  const consumer = new JSConsumer(topic, jsConsumerConfig);

  producer.on("error", (error) => console.error(error));
  consumer.on("error", (error) => console.error(error));

  await consumer.connect();

  // consume from a topic
  consumer.consume(async (messages) => {
    messages.forEach((message) => {
      console.log(message);
    });
  });

  // produce messages to a topic
  await producer.connect();
  producer.send(topic, "a message");

})().catch(console.error);
```
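The backpressure consume mode listed in the features can be illustrated with a standalone sketch (plain Node.js, no sinek API involved; `makeBatchFetcher` and `processMessage` are hypothetical stand-ins, not library calls): the consumer only fetches the next batch after every message in the current batch has been processed, so slow downstream work throttles consumption instead of piling up in memory.

```javascript
// Conceptual sketch of backpressure consumption; not sinek's implementation.

// Simulates a broker handing out one batch per fetch, null when drained.
function makeBatchFetcher(batches) {
  let cursor = 0;
  return async () => (cursor < batches.length ? batches[cursor++] : null);
}

// Stand-in for downstream work (e.g. a database write); the await on
// this call is what creates the backpressure.
async function processMessage(message) {
  return message * 10;
}

async function consumeWithBackpressure(fetchBatch) {
  const results = [];
  let batch;
  while ((batch = await fetchBatch()) !== null) {
    for (const message of batch) {
      results.push(await processMessage(message)); // finish before next fetch
    }
    // a real client would commit the batch's offsets here before fetching again
  }
  return results;
}

consumeWithBackpressure(makeBatchFetcher([[1, 2], [3], [4, 5]]))
  .then((results) => console.log(results)); // [ 10, 20, 30, 40, 50 ]
```

In stream mode, by contrast, batches keep arriving regardless of how far processing has fallen behind, which is faster but requires the handler to keep up.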
# Further Docs
* [Best-practice example](examples/best-practice-example)
* [SSL example](examples/ssl-example/)
* [SASL+SSL example](examples/sasl-ssl-example/)
* [Alpine based docker example](kafka-setup/alpine.Dockerfile)
* [Debian based docker example](kafka-setup/debian.Dockerfile)

> make it about them, not about you
> - Simon Sinek