{"id":19882124,"url":"https://github.com/codeclimate/kafka","last_synced_at":"2025-03-01T03:21:50.374Z","repository":{"id":66245013,"uuid":"38078742","full_name":"codeclimate/kafka","owner":"codeclimate","description":"Centralized Kafka client gem (temporary name)","archived":false,"fork":false,"pushed_at":"2017-01-05T23:31:18.000Z","size":109,"stargazers_count":2,"open_issues_count":0,"forks_count":1,"subscribers_count":18,"default_branch":"master","last_synced_at":"2025-01-11T18:18:34.408Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"Ruby","has_issues":false,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/codeclimate.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2015-06-25T22:57:40.000Z","updated_at":"2019-08-18T16:47:06.000Z","dependencies_parsed_at":"2023-04-11T19:02:24.929Z","dependency_job_id":null,"html_url":"https://github.com/codeclimate/kafka","commit_stats":null,"previous_names":[],"tags_count":19,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/codeclimate%2Fkafka","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/codeclimate%2Fkafka/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/codeclimate%2Fkafka/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/codeclimate%2Fkafka/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/codeclimate","download_url":"https://codeload.github.com/codeclimate/kafka/tar.gz/refs/heads/master","
host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":241310527,"owners_count":19941972,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-11-12T17:16:29.315Z","updated_at":"2025-03-01T03:21:50.368Z","avatar_url":"https://github.com/codeclimate.png","language":"Ruby","funding_links":[],"categories":[],"sub_categories":[],"readme":"# Code Climate Kafka\n\nA generic Kafka client tuned for our own usage.\n\nFeatures:\n\n- Messages are expected to be hashes and are `BSON`-serialized\n- Connections will be properly closed on exceptions\n- Consumer will stop gracefully on `SIGTERM`\n- *At most once* consumer semantics are used\n- Production via an HTTP proxy (including SSL)\n\n## Usage\n\n```rb\nrequire \"cc/kafka\"\n```\n\n### Producer\n\n```rb\nproducer = CC::Kafka::Producer.new(\"kafka://host:1234/topic\", \"client-id\")\nproducer.send_message(foo: :bar, baz: :bat)\nproducer.close\n```\n\n### Consumer\n\n```rb\nconsumer = CC::Kafka::Consumer.new(\"client-id\", [\"kafka://host:1234\", \"...\"], \"topic\", 0)\nconsumer.on_message do |message|\n  # Given the producer above, message will be\n  #\n  #   {\n  #     \"foo\" =\u003e :bar,\n  #     \"baz\" =\u003e :bat,\n  #     CC::Kafka::MESSAGE_OFFSET_KEY =\u003e \"topic-0-1\",\n  #   }\n  #\nend\n\nconsumer.start\n```\n\nNote: the value for the `MESSAGE_OFFSET_KEY` identifies the message's offset\nwithin the given topic and partition as `\u003ctopic\u003e-\u003cpartition\u003e-\u003coffset\u003e`. 
It can\nbe used by consumers to tie created data to the message that led to it and\nprevent duplicate processing.\n\n## Configuration\n\n- `CC::Kafka.offset_model`\n\n  Must respond to `find_or_create!(attributes)` and return an object that\n  responds to `set(attributes)`.\n\n  The `attributes` used are `topic`, `partition`, and `current`, and the object\n  returned from `find_or_create!` must expose methods for each of these.\n\n  A [`Minidoc`][minidoc]-based module is provided that can be included in\n  client code for an offset model implementation that works for many clients.\n\n  [minidoc]: https://github.com/brynary/minidoc\n\n  ```rb\n  class KafkaOffset \u003c Minidoc\n    include CC::Kafka::OffsetStorage::Minidoc\n  end\n\n  CC::Kafka.offset_model = KafkaOffset\n  ```\n\n  *Note*: This is only necessary if using `Consumer`.\n\n- `CC::Kafka.logger`\n\n  This is optional and defaults to `Logger.new(STDOUT)`. The configured object\n  must have the same interface as the standard Ruby logger.\n\n  Example:\n\n  ```rb\n  CC::Kafka.logger = Rails.logger\n  ```\n\n- `CC::Kafka.statsd`\n\n  This is optional and defaults to a null object. 
The configured object should\n  represent a [statsd][] client and respond to the usual methods, such as\n  `increment` and `time`.\n\n  [statsd]: https://github.com/reinh/statsd\n\n- `CC::Kafka.ssl_ca_file`\n\n  Path to a custom SSL Certificate Authority file.\n\n  Will result in:\n\n  ```rb\n  http.ca_file = CC::Kafka.ssl_ca_file\n  ```\n\n- `CC::Kafka.ssl_pem_file`\n\n  Path to a custom SSL certificate (and key) in concatenated PEM format.\n\n  Will result in:\n\n  ```rb\n  pem = File.read(CC::Kafka.ssl_pem_file)\n\n  http.cert = OpenSSL::X509::Certificate.new(pem)\n  http.key = OpenSSL::PKey::RSA.new(pem)\n  ```\n\n## Copyright\n\nSee [LICENSE](LICENSE)\n","project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fcodeclimate%2Fkafka","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fcodeclimate%2Fkafka","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fcodeclimate%2Fkafka/lists"}