https://github.com/aiven-open/kio
Python data types for the Apache Kafka® Protocol.
- Host: GitHub
- URL: https://github.com/aiven-open/kio
- Owner: Aiven-Open
- License: apache-2.0
- Created: 2023-01-19T10:51:08.000Z (about 3 years ago)
- Default Branch: main
- Last Pushed: 2025-06-10T08:54:25.000Z (8 months ago)
- Last Synced: 2025-06-10T09:40:58.433Z (8 months ago)
- Topics: binary-serialization, kafka, python
- Language: Python
- Homepage: https://aiven-open.github.io/kio/
- Size: 3.66 MB
- Stars: 22
- Watchers: 11
- Forks: 5
- Open Issues: 6
- Metadata Files:
  - Readme: README.md
  - Contributing: .github/CONTRIBUTING.md
  - License: LICENSE
  - Code of conduct: .github/CODE_OF_CONDUCT.md
  - Codeowners: .github/CODEOWNERS
  - Security: .github/SECURITY.md
README
kio
Python data types for the Apache Kafka® Protocol.
Check out the complete documentation →
## Features
- Exposes immutable dataclass entities for all protocol messages, generated from the
[same source][schema-source] as used internally in Apache Kafka®.
- Message classes are simple, lightweight data containers that do not inherit from
  anything or expose any methods beyond those of a regular dataclass. Encoding and
  decoding are enabled by making all the necessary details about Kafka encoding
  introspectable.
- Widely compatible encoding and decoding of messages through the `IO[bytes]`
  interface (see the sketch after this list).
- A test suite focused on roundtrip property tests using Hypothesis, including
  compatibility testing against the internals of upstream Apache Kafka®.
[schema-source]:
https://github.com/apache/kafka/tree/trunk/clients/src/main/resources/common/message
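A minimal sketch of what working with these entities can look like. The module path, field names, and the `entity_reader`/`entity_writer` helpers below are assumptions based on the feature list, so check the documentation for the exact API.
```python
# Sketch only: entity path, field names, and serial helpers are assumptions.
import io

from kio.schema.api_versions.v3.request import ApiVersionsRequest
from kio.serial import entity_reader, entity_writer

# Messages are plain, immutable dataclasses.
request = ApiVersionsRequest(
    client_software_name="kio-example",
    client_software_version="0.0.1",
)

# Readers and writers work against any IO[bytes], e.g. an in-memory buffer
# or a socket file object.
write_request = entity_writer(ApiVersionsRequest)
read_request = entity_reader(ApiVersionsRequest)

buffer = io.BytesIO()
write_request(buffer, request)
buffer.seek(0)

assert read_request(buffer) == request
```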
## Installation
```shell
$ pip install --require-virtualenv kio
```
## Development
Install development requirements.
```shell
$ pip install --require-virtualenv -e .[all]
```
The test suite contains integration tests that expect to be able to connect to an
Apache Kafka® instance running on `127.0.0.1:9092`. There is a Docker Compose file in
`container/compose.yml` that you can use to conveniently start an instance.
```shell
$ docker compose up -d kafka
```
Run tests.
```shell
$ python3 -X dev -m pytest --cov
```
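The roundtrip property tests mentioned in the features above pair Hypothesis with the entity readers and writers. A minimal sketch of such a test, using the same assumed entity path and helpers as in the earlier example:
```python
# Sketch of a roundtrip property test; entity path and serial helpers are
# assumptions, adjust to the real API.
import io

from hypothesis import given
from hypothesis import strategies as st

from kio.schema.api_versions.v3.request import ApiVersionsRequest
from kio.serial import entity_reader, entity_writer

write_request = entity_writer(ApiVersionsRequest)
read_request = entity_reader(ApiVersionsRequest)


@given(st.from_type(ApiVersionsRequest))
def test_roundtrip(instance: ApiVersionsRequest) -> None:
    # Whatever Hypothesis generates should survive encode -> decode unchanged.
    buffer = io.BytesIO()
    write_request(buffer, instance)
    buffer.seek(0)
    assert read_request(buffer) == instance
```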
Set up pre-commit to run on push.
```shell
$ pre-commit install -t pre-push
```
> [!WARNING]\
> Building the schema deletes and recreates the `src/kio/schema` directory, so
> everything under it is wiped. Make sure not to put unrelated files there, or you may
> accidentally lose your own work.
Fetch, generate, and format the schema.
```shell
$ make build-schema
```