# Frafka

[![Travis Build Status](https://img.shields.io/travis/qntfy/frafka.svg?branch=master)](https://travis-ci.org/qntfy/frafka)
[![Coverage Status](https://coveralls.io/repos/github/qntfy/frafka/badge.svg?branch=master)](https://coveralls.io/github/qntfy/frafka?branch=master)
[![MIT licensed](https://img.shields.io/badge/license-MIT-blue.svg)](./LICENSE)
[![GitHub release](https://img.shields.io/github/release/qntfy/frafka.svg?maxAge=3600)](https://github.com/qntfy/frafka/releases/latest)
[![Go Report Card](https://goreportcard.com/badge/github.com/qntfy/frafka)](https://goreportcard.com/report/github.com/qntfy/frafka)
[![GoDoc](https://godoc.org/github.com/qntfy/frafka?status.svg)](http://godoc.org/github.com/qntfy/frafka)

Frafka is a Kafka implementation for [Frizzle](https://github.com/qntfy/frizzle) based on [confluent-kafka-go](https://github.com/confluentinc/confluent-kafka-go).

Frizzle is a magic message (`Msg`) bus designed for parallel processing with many goroutines.

* `Receive()` messages from a configured `Source`
* Do your processing, possibly `Send()` each `Msg` on to one or more `Sink` destinations
* `Ack()` (or `Fail()`) the `Msg` to notify the `Source` that processing is complete

## Prereqs / Build instructions

### Install librdkafka

The underlying Kafka library, [confluent-kafka-go](https://github.com/confluentinc/confluent-kafka-go#installing-librdkafka), has some particularly important nuances:

* alpine builds (e.g. `FROM golang-1.14-alpine`) should run all go commands with `-tags musl`
  * e.g. `go test -tags musl ./...`
* all builds producing an executable should run with `CGO_ENABLED=1`
  * this is not necessary for libraries, however

Otherwise, you should be good to go with:

```sh
go get github.com/qntfy/frafka
cd frafka
go build
```

## Basic API usage

### Sink

Create a new sink with `NewSink`:

```golang
// error omitted - handle it in real code
sink, _ := frafka.NewSink("broker1:15151,broker2:15151", 16*1024)
```

## Running the tests

Frafka has integration tests which require a Kafka broker to test against; tests read the broker address from the `KAFKA_BROKERS` environment variable.
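The Receive → process → Send → Ack flow described under Basic API usage can be sketched end to end without a broker. The `stubSource` and `stubSink` types below are hypothetical in-memory stand-ins, not frafka types; only the method shapes mirror the Frizzle `Source`/`Sink` interfaces described above, and a real pipeline would use frafka's constructors instead.

```go
package main

import "fmt"

// Msg is a pared-down, illustrative stand-in for a Frizzle message.
type Msg struct{ Data []byte }

// stubSource mimics the shape of a Frizzle Source: messages arrive on a
// channel and each must be Ack'd (or Fail'd) after processing.
type stubSource struct {
	msgs  chan Msg
	acked int
}

func (s *stubSource) Receive() <-chan Msg { return s.msgs }
func (s *stubSource) Ack(Msg)             { s.acked++ }

// stubSink mimics the shape of a Frizzle Sink.
type stubSink struct{ sent []Msg }

func (s *stubSink) Send(m Msg, topic string) error {
	s.sent = append(s.sent, m)
	return nil
}

// pump drains the source, forwards each Msg to the sink, and Acks it,
// mirroring the Receive/Send/Ack loop from the intro.
func pump(src *stubSource, snk *stubSink, topic string) {
	for m := range src.Receive() {
		if err := snk.Send(m, topic); err != nil {
			continue // a real consumer would Fail(m) here
		}
		src.Ack(m)
	}
}

func main() {
	src := &stubSource{msgs: make(chan Msg, 2)}
	snk := &stubSink{}
	src.msgs <- Msg{Data: []byte("a")}
	src.msgs <- Msg{Data: []byte("b")}
	close(src.msgs)

	pump(src, snk, "out-topic")
	fmt.Println(len(snk.sent), src.acked) // 2 2
}
```

The loop ends when the source channel closes; with the real frafka types, the same loop typically runs in several goroutines in parallel.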
[simplesteph/kafka-stack-docker-compose](https://github.com/simplesteph/kafka-stack-docker-compose) provides a simple docker-compose setup that frafka CI currently uses:

```sh
curl --silent -L -o kafka.yml https://raw.githubusercontent.com/simplesteph/kafka-stack-docker-compose/v5.1.0/zk-single-kafka-single.yml
DOCKER_HOST_IP=127.0.0.1 docker-compose -f kafka.yml up -d
# takes a while to initialize; a tool like wait-for-it.sh helps in scripting
export KAFKA_BROKERS=127.0.0.1:9092
go test -v --cover ./...
```

## Configuration

Frafka Sources and Sinks are configured using [Viper](https://godoc.org/github.com/spf13/viper).

```golang
func InitSink(config *viper.Viper) (*Sink, error)

func InitSource(config *viper.Viper) (*Source, error)
```

We typically initialize Viper through environment variables, but the client can populate the configured Viper object however it likes, as long as the relevant values are set. The application might use a prefix before the values below.

| Variable | Required | Description | Default |
|----------|:--------:|-------------|:-------:|
| KAFKA_BROKERS | required | address(es) of kafka brokers, space separated | |
| KAFKA_TOPICS | source | topic(s) to read from | |
| KAFKA_CONSUMER_GROUP | source | consumer group value for coordinating multiple clients | |
| KAFKA_CONFIG | optional | additional librdkafka client config, format `key1=value1 key2=value2 ...` | |
| KAFKA_CONFIG_FILE | optional | relative or absolute path to a file with librdkafka client config (see notes) | |

### Kafka Client Configuration Notes

* `KAFKA_CONFIG` allows setting arbitrary
[librdkafka configuration](https://github.com/edenhill/librdkafka/blob/v1.4.2/CONFIGURATION.md)
such as `retries=10 max.in.flight=1000 delivery.report.only.error=true`.
* `KAFKA_CONFIG_FILE` is another route to arbitrary config; `KAFKA_CONFIG` takes priority
over `KAFKA_CONFIG_FILE`. The specified file is parsed with [viper](https://github.com/spf13/viper), which supports a range of
config file formats; for simplicity we recommend YAML, similar to the provided example file (used in tests).
* Required config set via the environment variables listed above (e.g. `KAFKA_BROKERS`) always takes priority over
optional values: if `bootstrap.servers` is set to a different value in `KAFKA_CONFIG`, it is ignored.
* Sensible defaults are set for several additional config values; see the variables in `source.go` and `sink.go` for specifics.
* An earlier version of frafka also supported setting specific optional Kafka configs, such as
compression, via dedicated environment variables. This functionality has been removed to simplify config logic and reduce confusion
when values are set in multiple places.

#### Suggested Kafka Config

Some values that we commonly set, particularly in a memory-constrained environment (e.g.
running a producer/consumer service against a 9-partition topic with average message size under 10 KB and less than 200 MB of memory available):

* `queued.max.messages.kbytes`: 2048 (up to 16384)
* `auto.offset.reset`: (latest|earliest)
* `receive.message.max.bytes`: 2000000
* `fetch.max.bytes`: 1000000
* `compression.type`: snappy (and possibly a `linger.ms` value, depending on throughput/latency requirements) to reduce network traffic and disk usage on brokers

## Async Error Handling

Since records are sent in batches, Kafka may report errors or other information asynchronously.
Events can be recovered via the channels returned by the `Sink.Events()` and `Source.Events()` methods.
Partition changes and EOF are reported as non-error Events; other errors conform to the `error` interface.
Where possible, Events retain their underlying type from [confluent-kafka-go](https://github.com/confluentinc/confluent-kafka-go)
in case more information is desired.

## Contributing

Contributions welcome! Take a look at open issues.