{"id":17874426,"url":"https://github.com/marcosschroh/faust-docker-compose-example","last_synced_at":"2025-03-21T22:31:54.288Z","repository":{"id":80586285,"uuid":"176002885","full_name":"marcosschroh/faust-docker-compose-example","owner":"marcosschroh","description":"Faust dockerized application","archived":false,"fork":false,"pushed_at":"2022-12-01T12:29:43.000Z","size":54,"stargazers_count":68,"open_issues_count":0,"forks_count":12,"subscribers_count":5,"default_branch":"master","last_synced_at":"2025-03-18T05:43:59.222Z","etag":null,"topics":["data-stream-processing","docker","faust","kafka","schema-registry","zookeeper"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/marcosschroh.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":null,"funding":".github/FUNDING.yml","license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null},"funding":null},"created_at":"2019-03-16T17:28:49.000Z","updated_at":"2024-05-26T10:38:44.000Z","dependencies_parsed_at":null,"dependency_job_id":"9e794f9f-460b-48dc-856b-c29e928c5fbb","html_url":"https://github.com/marcosschroh/faust-docker-compose-example","commit_stats":null,"previous_names":[],"tags_count":11,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/marcosschroh%2Ffaust-docker-compose-example","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/marcosschroh%2Ffaust-docker-compose-example/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/marcosschroh%2Ffaust-docker-compose-example/releases","manifests_ur
l":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/marcosschroh%2Ffaust-docker-compose-example/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/marcosschroh","download_url":"https://codeload.github.com/marcosschroh/faust-docker-compose-example/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":244880351,"owners_count":20525507,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["data-stream-processing","docker","faust","kafka","schema-registry","zookeeper"],"created_at":"2024-10-28T11:08:53.355Z","updated_at":"2025-03-21T22:31:54.272Z","avatar_url":"https://github.com/marcosschroh.png","language":"Python","readme":"# Faust-Docker-Compose\n\n[![Build Status](https://travis-ci.org/marcosschroh/faust-docker-compose-example.svg?branch=master)](https://travis-ci.org/marcosschroh/faust-docker-compose-example)\n[![License](https://img.shields.io/github/license/marcosschroh/faust-docker-compose-example.svg?logo=MIT)](https://github.com/marcosschroh/faust-docker-compose-example/blob/master/LICENSE)\n\nAn example to show how to include a `faust` project as a service using `docker compose`, with [Kafka](https://kafka.apache.org/), [Zookeeper](https://zookeeper.apache.org/) and [Schema Registry](https://docs.confluent.io/current/schema-registry/docs/index.html)\n\nNotice that everything runs using `docker-compose`, including the faust example application. 
For\nlocal development it is preferable to run the `kafka` cluster separately from the `faust` app.\n\nIf you want to generate a `faust` project from scratch, please use [cookiecutter-faust](https://github.com/marcosschroh/cookiecutter-faust).\n\nRead more about Faust here: https://github.com/robinhood/faust\n\n## Project\n\nThe project skeleton is defined as a medium/large project according to the [faust layout](https://faust.readthedocs.io/en/latest/userguide/application.html#projects-and-directory-layout).\n\nThe `setup.py` has the entrypoint to resolve the [entrypoint problem](https://faust.readthedocs.io/en/latest/userguide/application.html#problem-entrypoint).\n\n## Applications\n\n* *Page Views*: This application corresponds to [Tutorial: Count page views](https://faust.readthedocs.io/en/latest/playbooks/pageviews.html)\n* *Leader Election*: This application corresponds to [Tutorial: Leader Election](https://faust.readthedocs.io/en/latest/playbooks/leaderelection.html)\n* *Users*: This is a custom application that demonstrates how to integrate `Faust` with `Avro Schema`.\n\n## Faust Project Dockerfile\n\nThe `Dockerfile` is based on `python:3.7-slim`. 
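One thing the container has to deal with is that `kafka` may not be accepting connections yet when the worker starts. As a hypothetical, standard-library-only sketch of such a readiness probe (the function name, host and port below are illustrative; the project's actual entrypoint script is what really performs the wait):

```python
import socket
import time


def wait_for_broker(host: str, port: int, timeout: float = 30.0) -> bool:
    """Poll a TCP port until it accepts connections or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # A successful TCP connect means the broker port is open.
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            # Broker not reachable yet; back off briefly and retry.
            time.sleep(0.5)
    return False
```

Inside the container this could be called as `wait_for_broker("kafka", 9092)` before launching the worker.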
The most important part here is that the [`entrypoint`]() will wait for `kafka` to be ready and then execute the script [`run.sh`]().\n\n## Docker compose\n\n`docker-compose.yaml` includes `zookeeper`, `kafka` and `schema-registry` services based on the `confluentinc` images.\nFor more information you can go to [confluentinc](https://docs.confluent.io/current/installation/docker/docs/index.html) and see the docker compose example [here](https://github.com/confluentinc/cp-docker-images/blob/master/examples/cp-all-in-one/docker-compose.yml#L23-L48).\n\nUseful environment variables that you may change:\n\n| Variable | Description | Example |\n|----------|-------------|---------|\n| WORKER | Entrypoint in setup.py | `example` |\n| WORKER_PORT | Worker port | `6066` |\n| KAFKA_BOOSTRAP_SERVER | Kafka servers | `kafka://kafka:9092` |\n| KAFKA_BOOSTRAP_SERVER_NAME | Kafka server name | `kafka` |\n| KAFKA_BOOSTRAP_SERVER_PORT | Kafka server port | `9092` |\n| SCHEMA_REGISTRY_SERVER | Schema Registry server name | `schema-registry` |\n| SCHEMA_REGISTRY_SERVER_PORT | Schema Registry server port | `8081` |\n| SCHEMA_REGISTRY_URL | Schema Registry server URL | `http://schema-registry:8081` |\n\n## Commands\n\n* Start application: `make run-dev`. This command starts both the *Page Views* and *Leader Election* applications.\n* Stop and remove containers: `make clean`\n* List topics: `make list-topics`\n* Send events to page_view topic/agent: `make send-page-view-event payload='{\"id\": \"foo\", \"user\": \"bar\"}'`\n\n## Avro Schemas, Custom Codecs and Serializers\n\nBecause we want to be sure that the messages we encode are valid, we use [Avro Schemas](https://docs.oracle.com/database/nosql-12.1.3.1/GettingStartedGuide/avroschemas.html).\nAvro is used to define the data schema for a record's value. 
This schema describes the fields allowed in the value, along with their data types.\n\nFor our demonstration in the `Users` application we are using the following schema:\n\n```json\n{\n    \"type\": \"record\",\n    \"namespace\": \"com.example\",\n    \"name\": \"AvroUsers\",\n    \"fields\": [\n        {\"name\": \"first_name\", \"type\": \"string\"},\n        {\"name\": \"last_name\", \"type\": \"string\"}\n    ]\n}\n```\n\nIn order to use `avro schemas` with `Faust` we need to define a custom codec and a custom serializer, and be able to talk to the `schema-registry`.\nYou can find the custom codec called `avro_users` registered using the [codec registration](https://faust.readthedocs.io/en/latest/userguide/models.html#codec-registry) approach described by faust.\nThe [AvroSerializer](https://github.com/marcosschroh/faust-docker-compose-example/blob/fix/replace-helpers-with-schemaregistry-library/faust-project/example/codecs/serializers.py#L8) is in charge of encoding and decoding messages using the [schema registry client](https://github.com/marcosschroh/python-schema-registry-client).\n\nNow the final step is to integrate the faust model with the `AvroSerializer`:\n\n```python\n# users.models\nimport faust\n\n\nclass UserModel(faust.Record, serializer='avro_users'):\n    first_name: str\n    last_name: str\n```\n\nNow our application is able to send and receive messages using avro schemas! :-)\n\n## Tests\n\nRun tests with `tox`. 
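For reference, a minimal `tox.ini` for a project laid out like this one could look as follows (a hypothetical sketch; the environment list and test runner are assumptions, not the repository's actual configuration):

```ini
[tox]
envlist = py37

[testenv]
deps =
    pytest
commands =
    pytest
```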
Make sure that you have it installed.\n\n```bash\ntox\n```\n\n## Achievements\n\n* [x] Application examples\n* [x] Integration with Schema Registry\n* [x] Schema Registry Client\n* [x] Custom codecs\n* [x] Custom serializers\n* [x] Avro Schemas\n* [x] Make Schema Registry Client and Serializers a Python package\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmarcosschroh%2Ffaust-docker-compose-example","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fmarcosschroh%2Ffaust-docker-compose-example","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmarcosschroh%2Ffaust-docker-compose-example/lists"}