Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
NestJS integration with KafkaJS
https://github.com/rob3000/nestjs-kafka
- Host: GitHub
- URL: https://github.com/rob3000/nestjs-kafka
- Owner: rob3000
- License: unlicense
- Created: 2020-09-13T12:14:40.000Z (about 4 years ago)
- Default Branch: master
- Last Pushed: 2023-03-05T13:18:54.000Z (over 1 year ago)
- Last Synced: 2024-04-14T08:52:37.210Z (7 months ago)
- Language: TypeScript
- Size: 892 KB
- Stars: 117
- Watchers: 5
- Forks: 44
- Open Issues: 23
Metadata Files:
- Readme: README.md
- License: LICENSE.md
Awesome Lists containing this project
README
# NestJS + KafkaJS
Integration of KafkaJS with NestJS to build event-driven microservices.
## Setup
Import and add the `KafkaModule` to the imports array of the module for which you would like to use Kafka.
### Synchronous Module Initialization
Register the `KafkaModule` synchronously with the `register()` method:
```javascript
@Module({
  imports: [
    KafkaModule.register([
      {
        name: 'HERO_SERVICE',
        options: {
          client: {
            clientId: 'hero',
            brokers: ['localhost:9092'],
          },
          consumer: {
            groupId: 'hero-consumer'
          }
        }
      },
    ]),
  ]
  ...
})
```
### Asynchronous Module Initialization
Register the `KafkaModule` asynchronously with the `registerAsync()` method:
```javascript
import { ConfigModule, ConfigService } from '@nestjs/config';

@Module({
  imports: [
    ConfigModule.forRoot(),
    KafkaModule.registerAsync(['HERO_SERVICE'], {
      useFactory: async (configService: ConfigService) => {
        const broker = configService.get('broker');
        return [
          {
            name: 'HERO_SERVICE',
            options: {
              client: {
                clientId: 'hero',
                brokers: [broker],
              },
              consumer: {
                groupId: 'hero-consumer'
              }
            }
          },
        ];
      },
      inject: [ConfigService]
    })
  ]
  ...
})
```
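The factory reads a `broker` key via `ConfigService`, so the application's configuration has to supply it. A minimal sketch of one way to do that with a custom configuration factory loaded by `ConfigModule.forRoot()` (the `KAFKA_BROKER` environment variable name is an assumption, not part of this library):
```javascript
// config/configuration.ts (hypothetical file): maps environment variables to the
// keys that the factory reads with configService.get('broker').
export default () => ({
  broker: process.env.KAFKA_BROKER || 'localhost:9092',
});

// and in the module above:
//   ConfigModule.forRoot({ load: [configuration] }),
```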
Full settings for each option can be found in the KafkaJS documentation:
| Config | Options |
| ------ | ------- |
| client | https://kafka.js.org/docs/configuration |
| consumer | https://kafka.js.org/docs/consuming#options |
| producer | https://kafka.js.org/docs/producing#options |
| serializer | Custom serializer instance (see *Schema Registry support* below) |
| deserializer | Custom deserializer instance (see *Schema Registry support* below) |
| consumeFromBeginning | true/false |
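As a combined sketch, a registration that also tunes the producer and starts consuming from the beginning of each topic could look like this (the `producer` block simply forwards KafkaJS producer options; its placement and that of `consumeFromBeginning` inside `options` are assumptions based on the table above):
```javascript
// Sketch only: producer options and consumeFromBeginning placement are assumptions.
KafkaModule.register([
  {
    name: 'HERO_SERVICE',
    options: {
      client: {
        clientId: 'hero',
        brokers: ['localhost:9092'],
      },
      consumer: {
        groupId: 'hero-consumer'
      },
      producer: {
        allowAutoTopicCreation: true   // any KafkaJS producer option
      },
      consumeFromBeginning: true
    }
  },
])
```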
### Subscribing
Subscribe to a topic to accept messages.
```javascript
export class Consumer {
  constructor(
    @Inject('HERO_SERVICE') private client: KafkaService
  ) {}

  onModuleInit(): void {
    this.client.subscribeToResponseOf('hero.kill.dragon', this)
  }

  @SubscribeTo('hero.kill.dragon')
  async getWorld(data: any, key: any, offset: number, timestamp: number, partition: number, headers: IHeaders): Promise<void> {
    ...
  }
}
```
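`onModuleInit()` is the standard NestJS lifecycle hook, so the class can declare `implements OnModuleInit` to make that explicit. The sketch below also assumes that additional topics simply repeat the `subscribeToResponseOf()` / `@SubscribeTo()` pairing; the second topic name and the package import path are assumptions:
```javascript
import { Inject, OnModuleInit } from '@nestjs/common';
// Import path assumed for KafkaService and SubscribeTo.
import { KafkaService, SubscribeTo } from '@rob3000/nestjs-kafka';

export class Consumer implements OnModuleInit {
  constructor(
    @Inject('HERO_SERVICE') private client: KafkaService
  ) {}

  onModuleInit(): void {
    // Every handled topic is registered here and matched by a decorator below.
    this.client.subscribeToResponseOf('hero.kill.dragon', this)
    this.client.subscribeToResponseOf('hero.recruit.ally', this) // hypothetical second topic
  }

  @SubscribeTo('hero.kill.dragon')
  async onDragonKilled(data: any): Promise<void> {
    // handle the already-deserialized payload; the extra positional arguments
    // (key, offset, timestamp, partition, headers) can be accepted when needed
  }

  @SubscribeTo('hero.recruit.ally')
  async onAllyRecruited(data: any): Promise<void> {
    // ...
  }
}
```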
### Producing
Send messages back to Kafka.
```javascript
const TOPIC_NAME = 'hero.kill.dragon';

export class Producer {
  constructor(
    @Inject('HERO_SERVICE') private client: KafkaService
  ) {}

  async post(message: string = 'Hello world'): Promise<any> {
    const result = await this.client.send({
      topic: TOPIC_NAME,
      messages: [
        {
          key: '1',
          value: message
        }
      ]
    });

    return result;
  }
}
```
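The object passed to `send()` follows the shape of a KafkaJS producer record, so batching several keyed messages, or attaching headers, should work as it does in plain KafkaJS. A sketch of an extra method on the `Producer` above, under that assumption (the payload fields and header name are illustrative):
```javascript
// Sketch: assumes send() forwards the full KafkaJS producer record.
async postBatch(): Promise<any> {
  return this.client.send({
    topic: TOPIC_NAME,
    messages: [
      { key: '1', value: JSON.stringify({ heroId: 1, dragonId: 7 }), headers: { 'correlation-id': 'abc-123' } },
      { key: '2', value: JSON.stringify({ heroId: 2, dragonId: 9 }), headers: { 'correlation-id': 'def-456' } }
    ]
  });
}
```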
### Schema Registry support
By default, messages are converted to JSON objects where possible. If you're using
AVRO you can add the `SchemaRegistry` deserializer to convert the messages. This uses the [KafkaJS Schema-registry module](https://github.com/kafkajs/confluent-schema-registry).

In your `module.ts`:
```javascript
@Module({
  imports: [
    KafkaModule.register([
      {
        name: 'HERO_SERVICE',
        options: {
          client: {
            clientId: 'hero',
            brokers: ['localhost:9092'],
          },
          consumer: {
            groupId: 'hero-consumer'
          }
        },
        deserializer: new KafkaAvroResponseDeserializer({
          host: 'http://localhost:8081'
        }),
        serializer: new KafkaAvroRequestSerializer({
          config: {
            host: 'http://localhost:8081/'
          },
          schemas: [
            {
              topic: 'test.topic',
              key: join(__dirname, 'key-schema.avsc'),
              value: join(__dirname, 'value-schema.avsc')
            }
          ],
        }),
      },
    ]),
  ]
  ...
})
```
See the [e2e test](https://github.com/rob3000/nestjs-kafka/tree/master/test/e2e/app) for an example.
## TODO
* Tests
PRs Welcome :heart: