{"id":23020989,"url":"https://github.com/osskit/dafka-consumer","last_synced_at":"2025-10-11T10:03:15.369Z","repository":{"id":37858297,"uuid":"382016166","full_name":"osskit/dafka-consumer","owner":"osskit","description":"Dockerized Kafka consumer","archived":false,"fork":false,"pushed_at":"2024-12-04T11:15:40.000Z","size":16520,"stargazers_count":8,"open_issues_count":0,"forks_count":2,"subscribers_count":2,"default_branch":"main","last_synced_at":"2025-10-04T03:49:28.585Z","etag":null,"topics":["consumer","dockerized","kafka","messaging"],"latest_commit_sha":null,"homepage":"","language":"Java","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/osskit.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":"CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2021-07-01T11:52:43.000Z","updated_at":"2024-12-04T11:15:43.000Z","dependencies_parsed_at":"2023-12-03T09:26:48.575Z","dependency_job_id":"a68dd1cb-ffcb-4a09-932e-9a62c2eddd38","html_url":"https://github.com/osskit/dafka-consumer","commit_stats":null,"previous_names":[],"tags_count":152,"template":false,"template_full_name":null,"purl":"pkg:github/osskit/dafka-consumer","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/osskit%2Fdafka-consumer","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/osskit%2Fdafka-consumer/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/osskit%2Fdafka-consumer/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/osskit%2Fdafka-consumer/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/osskit","download_url":"https://codeload.github.com/osskit/dafka-consumer/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/osskit%2Fdafka-consumer/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":279006808,"owners_count":26084203,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-10-11T02:00:06.511Z","response_time":55,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["consumer","dockerized","kafka","messaging"],"created_at":"2024-12-15T12:16:02.527Z","updated_at":"2025-10-11T10:03:15.351Z","avatar_url":"https://github.com/osskit.png","language":"Java","readme":"\u003cp align=\"center\"\u003e\r\n  \u003cimg width=\"300\" height=\"200\" src=\"https://user-images.githubusercontent.com/15312980/175078334-f284f44e-0366-4e24-8f09-5301b098ea64.svg\"/\u003e\r\n\r\n  \u003c/p\u003e\r\n \r\n\u003cdiv align=\"center\"\u003e\r\nDockerized kafka consumer\r\n  \r\n\u003c/div\u003e\r\n\r\n## Overview\r\nDafka-consumer is a dockerized Kafka consumer used to abstract consuming messages from a kafka topic.\r\n\r\nUsing dafka-consumer, consuming messages is as simple as getting a POST request to your service, with the body of the request being the kafka message.\r\n\r\n### Motivation\r\nWhy use this over just a Kafka client?\r\n* Abstracts away the messaging layer, could be replaced with RabbitMQ or any other consumer.\r\n* Separates configuration, everything that's related to Kafka is encapsulated in Dafka and not the service itself.\r\n* When testing your service you only test your service's logic and not the messaging layer implementation details.\r\n\r\n\u003cimg width=\"754\" alt=\"image\" src=\"https://user-images.githubusercontent.com/15312980/175814180-7ca374ac-da3b-4ea4-a482-9396bfbe11c4.png\"\u003e\r\n\r\n\r\n## Usage \u0026 Examples\r\n\r\n### docker-compose\r\n```\r\nversion: '3.9'\r\n\r\nservices:\r\n    consumer:\r\n        image: osskit/dafka-consumer\r\n        ports:\r\n            - 4001:4001\r\n        environment:\r\n            - KAFKA_BROKER=kafka:9092\r\n            - TARGET_BASE_URL=http://target:2000/\r\n            - TOPICS_ROUTES=foo:consume,bar:consume,^([^.]+).bar:consume\r\n            - DEAD_LETTER_TOPIC=dead-letter\r\n            - GROUP_ID=consumer_1\r\n            - MONITORING_SERVER_PORT=4001\r\n```\r\n\r\nIn joint with [`dafka-producer`](https://github.com/osskit/dafka-producer):\r\n\r\n```\r\nversion: '3.9'\r\n\r\nservices:\r\n    consumer:\r\n        image: osskit/dafka-consumer\r\n        ports:\r\n            - 4001:4001\r\n        environment:\r\n            - KAFKA_BROKER=kafka:9092\r\n            - TARGET_BASE_URL=http://target:2000/\r\n            - TOPICS_ROUTES=foo:consume,bar:consume,^([^.]+).bar:consume\r\n            - DEAD_LETTER_TOPIC=dead-letter # optional\r\n            - GROUP_ID=consumer_1\r\n\r\n    producer:\r\n        image: osskit/dafka-producer\r\n        ports:\r\n            - 6000:6000\r\n        environment:\r\n            - PORT=6000\r\n            - KAFKA_BROKER=kafka:9092\r\n```\r\n\r\n### Kubernetes\r\nYou can use the provided [Helm Chart](https://github.com/osskit/dafka-consumer-helm-chart), this gives you a `Deployment` separated from your service's `Pod`.\r\n\r\nIt's also possible to use this as a `Sidecar`.\r\n\r\n## Parameters\r\n\r\nContainer images are configured using parameters passed at runtime.\r\n\r\n| Parameter | Default Values | Description |\r\n| :----: | - | - |\r\n| `KAFKA_BROKER` | `required` | URL for the Kafka Broker |\r\n| `TARGET_BASE_URL` | `required` | The target's HTTP POST endpoint |\r\n| `GROUP_ID` |  `required` | A unique id for the consumer group | \r\n| `TOPICS_ROUTES` | `required` | A map between topics and their endpoint routes (e.g `topic:/consume`) |\r\n| `TARGET_TIMEOUT_MS` | `298000` | Timeout for the target's response. Must be lower then RETRY_POLICY_MAX_DURATION_MS |\r\n| `RETRY_POLICY_MAX_DURATION_MS` | `299000` | Maximum duration of all retry attempts. Must be lower then KAFKA_POLL_INTERVAL_MS | \r\n| `KAFKA_POLL_INTERVAL_MS` | `300000` (5 min) | The maximum delay between invocations of poll() when using consumer group management. See [max.poll.interval.ms](https://kafka.apache.org/documentation/#consumerconfigs_max.poll.interval.ms)  for more details.\r\n| `POLL_TIMEOUT` | `1000` | [Description of POLL_TIMEOUT](https://docs.confluent.io/platform/current/installation/configuration/consumer-configs.html#consumerconfigs_max.poll.records) |\r\n| `MAX_POLL_RECORDS` | `50` | Number of records to process in a single batch |\r\n| `SESSION_TIMEOUT` | `10000` | [Description of SESSION_TIMEOUT](https://docs.confluent.io/platform/current/installation/configuration/consumer-configs.html#consumerconfigs_session.timeout.ms) |\r\n| `RETRY_PROCESS_WHEN_STATUS_CODE_MATCH` | `5[0-9][0-9]` | Retry to process the record if the returning status code matches the regex |\r\n| `PRODUCE_TO_DEAD_LETTER_TOPIC_WHEN_STATUS_CODE_MATCH` | `^(?!2\\\\d\\\\d$)\\\\d{3}$` | Produce to dead letter topic when matching status code regex |\r\n| `RETRY_POLICY_EXPONENTIAL_BACKOFF` | `50,5000,10` | A list that represents the `[delay, maxDelay, delayFactor]` in retrying message processing |\r\n| `DEAD_LETTER_TOPIC` | `null` | Dead letter topic name |\r\n| `MONITORING_SERVER_PORT` | `0` | Consumer monitoring and healthcheck service port | \r\n| `TARGET_HEALTHCHECK` | `null` | Target's healthcheck endpoint to verify it's alive | \r\n| `USE_SASL_AUTH=false` | `false` | use SASL authentication |\r\n| `SASL_USERNAME` | `required` if `USE_SASL_AUTH=true` | SASL username to authenticate | \r\n| `SASL_PASSWORD` | `required` if `USE_SASL_AUTH=true` | SASL password to authenticate | \r\n| `TRUSTSTORE_FILE_PATH` | `null` | Truststore certificate file path |\r\n| `TRUSTSTORE_PASSWORD` | `required` if `TRUSTSTORE_FILE_PATH != null` | Truststore's password | \r\n| `USE_PROMETHEUS` | `false` | Export metrics to Prometheus |\r\n| `PROMETHEUS_BUCKETS` | `0.003,0.03,0.1,0.3,1.5,10` | A list of Prometheus buckets to use |\r\n\r\n\r\n\r\n## License\r\nMIT License\r\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fosskit%2Fdafka-consumer","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fosskit%2Fdafka-consumer","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fosskit%2Fdafka-consumer/lists"}