Confluent Kafka template for the AsyncAPI Generator

https://github.com/ekozynin/asyncapi-kafka-template

![Confluent Kafka logo](./docos/confluent-kafka.png)

# Confluent Kafka generator

This template generates Python scripts that create a Kafka topology in an existing Kafka cluster, based on your AsyncApi document.

The following Confluent Kafka components can be defined using the AsyncApi specification:
- topics
- schema definitions
- [TODO] connectors
- [TODO] KSql

__Table of Contents__

- [Usage](#usage)
- [Run it](#run-it)
- [Special Considerations](#special-considerations)
  * [Servers](#servers)
  * [Security](#security)
    + [Environment variables](#environment-variables)
- [AsyncApi Extensions](#asyncapi-extensions)
- [Contributors](#contributors)

## Usage

```bash
Usage: ag [options] @ekozynin/asyncapi-kafka-template

Options:
  --force-write  force writing of the generated files to given directory (defaults to false)
  -o, --output   directory where to put the generated files (defaults to current directory)
```
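
For example, assuming an `asyncapi.yaml` file in the current directory, an invocation might look like the following (the file name and output directory are placeholders, not part of the template):

```shell
# Hypothetical invocation; asyncapi.yaml and ./generated are example names.
OUT_DIR=./generated
ag asyncapi.yaml @ekozynin/asyncapi-kafka-template -o "$OUT_DIR" --force-write
```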

## Run it

The generated scripts were tested with Python 3.9.7.

Go to the root folder of the generated code and run this command (Python is required):
```bash
Usage: python main.py -e <environment>

Options:
  -e <environment>  one of the "servers" definitions in your asyncapi file
```

The first time, you may need to install additional Python dependencies. From the root folder of the generated code, run:

```bash
pip install -U -r python-requirements.txt
```

## Special Considerations

### Servers
A Kafka cluster usually has several components that Kafka topology is deployed to: brokers, schema registry, connectors, KSql.

To name the servers, follow these rules:
- the server name should start with the environment name (such as local, dev, prod)
- followed by a dash '-'
- followed by the component type, one of: 'broker', 'schemaRegistry'
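
For instance, an asyncapi file targeting a 'dev' environment might define its servers like this (the URLs below are placeholders, not real endpoints):

```yaml
servers:
  dev-broker:
    url: my-cluster.confluent.cloud:9092          # placeholder broker URL
    protocol: kafka
  dev-schemaRegistry:
    url: https://my-registry.confluent.cloud      # placeholder schema registry URL
    protocol: kafka
```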

### Security

To connect to a Confluent Cloud cluster, the servers above should have the 'security' attribute defined. Security schemes must be defined exactly as below. [Refer to the full example.](./examples)

```yaml
components:
  securitySchemes:
    confluentBroker:
      type: userPassword
      x-configs:
        security.protocol: sasl_ssl
        sasl.mechanisms: PLAIN
        sasl.username: '{{ CLUSTER_API_KEY }}'
        sasl.password: '{{ CLUSTER_API_SECRET }}'

    confluentSchemaRegistry:
      type: userPassword
      x-configs:
        basic.auth.user.info: '{{ SCHEMA_REGISTRY_API_KEY }}:{{ SCHEMA_REGISTRY_API_SECRET }}'
```

#### Environment variables
Use environment variables to pass security details when deploying the Kafka topology.

The expected environment variable names are:
- CLUSTER_API_KEY
- CLUSTER_API_SECRET
- SCHEMA_REGISTRY_API_KEY
- SCHEMA_REGISTRY_API_SECRET
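
For example, the variables can be exported in the shell before running the deployment (the values below are placeholders, not real credentials):

```shell
# Placeholder credentials for illustration only; substitute your own.
export CLUSTER_API_KEY="XXXXXXXXXXXXXXXX"
export CLUSTER_API_SECRET="cluster-secret"
export SCHEMA_REGISTRY_API_KEY="XXXXXXXXXXXXXXXX"
export SCHEMA_REGISTRY_API_SECRET="registry-secret"

# Then deploy, e.g.: python main.py -e <environment>
```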

## AsyncApi Extensions
[Additional extensions to the AsyncApi specification that this generator understands.](./EXTENSIONS.md)

## Contributors

- Eugen Kozynin: 💻 📖 🎨 🤔 🚧
- Jake Bayer: 💻 👀 🤔