# dafka-producer

Dockerized Kafka producer.

## Overview
Dafka-producer is a dockerized Kafka producer that abstracts away producing messages to a Kafka topic.

Producing a message is as simple as sending a POST request from your service to dafka-producer; the topic, value, key, and headers in the request become the Kafka message.
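
A minimal sketch of such a request (the endpoint path and exact JSON field names below are assumptions for illustration; check the service's API for the actual contract):

```bash
# Produce one message to the "foo" topic via dafka-producer.
# NOTE: the /produce path and the payload field names are assumptions,
# not a documented contract.
curl -X POST http://localhost:6000/produce \
  -H 'Content-Type: application/json' \
  -d '{
        "topic": "foo",
        "key": "order-1234",
        "value": { "orderId": "1234", "status": "created" },
        "headers": { "x-request-id": "abc-123" }
      }'
```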

## Motivation
Why use this over just a Kafka client?
* Abstracts away the messaging layer, so it could be swapped for RabbitMQ or any other producer.
* Separates configuration: everything related to Kafka is encapsulated in Dafka rather than in the service itself.
* When testing your service, you test only your service's logic and not the messaging layer's implementation details.

## Design Diagram
*(Design diagram image available in the original repository.)*

## Usage & Examples

### docker-compose
```yaml
version: '3.9'

services:
  producer:
    image: osskit/dafka-producer
    ports:
      - 6000:6000
    environment:
      - PORT=6000
      - KAFKA_BROKER=kafka:9092
    depends_on:
      - kafka

  # Generic Kafka Setup Containers
  zookeeper:
    image: wurstmeister/zookeeper
  kafka:
    image: wurstmeister/kafka:2.12-2.2.0
    ports:
      - '9092:9092'
    environment:
      - KAFKA_ADVERTISED_HOST_NAME=kafka
      - KAFKA_CREATE_TOPICS=foo:1:1
      - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
    depends_on:
      - zookeeper
```
In conjunction with [`dafka-consumer`](https://github.com/osskit/dafka-consumer):
```yaml
version: '3.9'

services:
  producer:
    image: osskit/dafka-producer
    ports:
      - 6000:6000
    environment:
      - PORT=6000
      - KAFKA_BROKER=kafka:9092
    depends_on:
      - kafka

  consumer:
    image: osskit/dafka-consumer
    ports:
      - 4001:4001
    environment:
      - KAFKA_BROKER=kafka:9092
      - GROUP_ID=consumer_1
      - TARGET_BASE_URL=http://target:8080
      - TOPICS_ROUTES=foo:/consume
      - MONITORING_SERVER_PORT=4001
    depends_on:
      - kafka

  # Generic Kafka Setup Containers
  zookeeper:
    image: wurstmeister/zookeeper
  kafka:
    image: wurstmeister/kafka:2.12-2.2.0
    ports:
      - '9092:9092'
    environment:
      - KAFKA_ADVERTISED_HOST_NAME=kafka
      - KAFKA_CREATE_TOPICS=foo:1:1
      - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
    depends_on:
      - zookeeper
```

### Kubernetes
You can use the provided [Helm Chart](https://github.com/osskit/dafka-producer-helm-chart), which gives you a `Deployment` separate from your service's `Pod`.

It's also possible to run it as a sidecar container, as sketched below.
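
A minimal sketch of the sidecar approach (the application container, names, and ports other than the producer's are placeholders; only the `osskit/dafka-producer` image and its `PORT`/`KAFKA_BROKER` parameters come from this README):

```yaml
# Hypothetical Deployment running dafka-producer as a sidecar next to your service.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-service                # placeholder name
spec:
  replicas: 1
  selector:
    matchLabels:
      app: my-service
  template:
    metadata:
      labels:
        app: my-service
    spec:
      containers:
        - name: my-service        # your application container (placeholder)
          image: my-service:latest
          ports:
            - containerPort: 8080
        - name: dafka-producer    # sidecar; your service POSTs to localhost:6000
          image: osskit/dafka-producer
          env:
            - name: PORT
              value: "6000"
            - name: KAFKA_BROKER
              value: "kafka:9092"
          ports:
            - containerPort: 6000
```
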
## Parameters

Container images are configured using parameters passed at runtime.

| Parameter | Default Value | Description |
| :----: | --- | ---- |
| `PORT` | `required` | Port for incoming requests |
| `KAFKA_BROKER` | `required` | URL of the Kafka broker |
| `READINESS_TOPIC` | `null` | Producing to this topic provides a health check for the producer container |
| `LINGER_TIME_MS` | `0` | [Kafka producer `linger.ms` configuration](https://docs.confluent.io/platform/current/installation/configuration/producer-configs.html#producerconfigs_linger.ms) |
| `COMPRESSION_TYPE` | `"none"` | [Kafka producer `compression.type` configuration](https://docs.confluent.io/platform/current/installation/configuration/producer-configs.html#producerconfigs_compression.type) |
| `USE_SASL_AUTH` | `false` | Use SASL authentication |
| `SASL_USERNAME` | `required` if `USE_SASL_AUTH=true` | SASL username for authentication |
| `SASL_PASSWORD` | `required` if `USE_SASL_AUTH=true` | SASL password for authentication |
| `TRUSTSTORE_FILE_PATH` | `null` | Truststore certificate file path |
| `TRUSTSTORE_PASSWORD` | `required` if `TRUSTSTORE_FILE_PATH != null` | Truststore password |
| `USE_PROMETHEUS` | `false` | Export metrics to Prometheus |
| `PROMETHEUS_BUCKETS` | `0.003,0.03,0.1,0.3,1.5,10` | A list of Prometheus buckets to use |
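
For example, when running the container directly with Docker, these parameters are passed as environment variables (the broker address and SASL credentials below are placeholder values):

```bash
# Run dafka-producer with SASL authentication and Prometheus metrics enabled.
# Broker address and credentials are placeholders.
docker run -p 6000:6000 \
  -e PORT=6000 \
  -e KAFKA_BROKER=my-broker:9092 \
  -e COMPRESSION_TYPE=gzip \
  -e USE_SASL_AUTH=true \
  -e SASL_USERNAME=my-user \
  -e SASL_PASSWORD=my-password \
  -e USE_PROMETHEUS=true \
  osskit/dafka-producer
```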

## License
MIT License