Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/terascope/kafka-assets
teraslice asset for kafka operations
Last synced: 2 months ago
- Host: GitHub
- URL: https://github.com/terascope/kafka-assets
- Owner: terascope
- License: MIT
- Created: 2018-06-27T16:38:49.000Z (over 6 years ago)
- Default Branch: master
- Last Pushed: 2024-04-22T11:22:43.000Z (9 months ago)
- Last Synced: 2024-04-22T12:39:49.333Z (9 months ago)
- Language: TypeScript
- Size: 7.41 MB
- Stars: 1
- Watchers: 6
- Forks: 1
- Open Issues: 11
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Kafka Asset Bundle
[![Build Status](https://travis-ci.org/terascope/kafka-assets.svg?branch=master)](https://travis-ci.org/terascope/kafka-assets)
[![codecov](https://codecov.io/gh/terascope/kafka-assets/branch/master/graph/badge.svg)](https://codecov.io/gh/terascope/kafka-assets)

> A bundle of [Kafka](https://kafka.apache.org/) operations and APIs for [Teraslice](https://github.com/terascope/teraslice).
- [Kafka Asset Bundle](#kafka-asset-bundle)
  - [Releases](#releases)
  - [Getting Started](#getting-started)
  - [Connectors](#connectors)
    - [Kafka Connector](#kafka-connector)
  - [Development](#development)
    - [Tests](#tests)
    - [Build](#build)
  - [Contributing](#contributing)
  - [License](#license)

## Releases
You can find a list of releases, changes, and pre-built asset bundles [here](https://github.com/terascope/kafka-assets/releases).
## Getting Started
This asset bundle requires a running Teraslice cluster; you can find the documentation [here](https://github.com/terascope/teraslice/blob/master/README.md).
```bash
# Step 1: make sure you have teraslice-cli installed
yarn global add teraslice-cli

# Step 2:
# FIXME: this should be accurate
teraslice-cli asset deploy ...
```

**IMPORTANT:** Additionally, make sure you have installed the required [connectors](#connectors).
## Connectors
### Kafka Connector

> Terafoundation connector for Kafka producer and consumer clients.
To install, run the following from the root of your terafoundation-based service:
```bash
npm install terafoundation_kafka_connector
```

**Configuration:**
The terafoundation level configuration is as follows:
| Configuration | Description | Type | Notes |
| --------- | -------- | ------ | ------ |
| brokers | List of seed brokers for the kafka environment | String[] | optional, defaults to `["localhost:9092"]` |
| security_protocol | Protocol used to communicate with brokers, may be set to `plaintext` or `ssl` | String | optional, defaults to `plaintext` |
| ssl_ca_location | File or directory path to CA certificate(s) for verifying the broker's key | String | only used when `security_protocol` is set to `ssl` |
| ssl_certificate_location | Path to client's public key (PEM) used for authentication | String | only used when `security_protocol` is set to `ssl` |
| ssl_crl_location | Path to CRL for verifying broker's certificate validity | String | only used when `security_protocol` is set to `ssl` |
| ssl_key_location | Path to client's private key (PEM) used for authentication | String | only used when `security_protocol` is set to `ssl` |
| ssl_key_password | Private key passphrase | String | only used when `security_protocol` is set to `ssl` |

When used in code, this connector exposes two different client implementations: one for producers (`type: producer`) and one for consumers (`type: consumer`).
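As a rough illustration of the defaults documented in the tables below, here is a hypothetical helper (`withDefaults` is illustrative only, not part of the connector API):

```javascript
// Hypothetical helper -- not part of the connector API -- showing how the
// documented defaults could be applied to a settings object.
function withDefaults(settings = {}) {
    // `type` defaults to 'consumer'
    const options = { type: 'consumer', ...settings.options };
    if (options.type === 'producer' && options.poll_interval === undefined) {
        // producer connections are polled every 100ms by default
        options.poll_interval = 100;
    }
    return {
        options,
        // both option bags default to empty objects
        topic_options: settings.topic_options || {},
        rdkafka_options: settings.rdkafka_options || {},
    };
}
```

For example, `withDefaults({ options: { type: 'producer' } })` yields `poll_interval: 100` with empty `topic_options` and `rdkafka_options`.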
| Configuration | Description | Type | Notes |
| --------- | -------- | ------ | ------ |
| options | Consumer or Producer specific options | Object | required, see below |
| topic_options | [librdkafka defined settings](https://github.com/edenhill/librdkafka/blob/v0.11.5/CONFIGURATION.md) that apply per topic | Object | optional, defaults to `{}` |
| rdkafka_options | [librdkafka defined settings](https://github.com/edenhill/librdkafka/blob/v0.11.5/CONFIGURATION.md) that are not subscription specific | Object | optional, defaults to `{}` |

The `options` object enables setting a few properties:
| Configuration | Description | Type | Notes |
| --------- | -------- | ------ | ------ |
| type | What type of connector is required, either `consumer` or `producer`. | String | required, defaults to `consumer` |
| group | For type 'consumer', what consumer group to use | String | optional |
| poll_interval | For type 'producer', how often (in milliseconds) the producer connection is polled to keep it alive. | Number | optional, defaults to `100` |

**Consumer connector configuration example:**
```js
{
options: {
type: 'consumer',
group: 'example-group'
},
topic_options: {
'enable.auto.commit': false
},
rdkafka_options: {
'fetch.min.bytes': 100000
}
}
```

**Producer connector configuration example:**
```js
{
options: {
type: 'producer',
poll_interval: 1000,
},
topic_options: {},
rdkafka_options: {
'compression.codec': 'gzip',
'topic.metadata.refresh.interval.ms': -1,
'log.connection.close': false,
}
}
```

**Terafoundation configuration example:**
```yaml
terafoundation:
  connectors:
    kafka:
      default:
        brokers: "localhost:9092"
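        # Hypothetical additions for a TLS-secured cluster; the paths below
        # are placeholders (see the ssl_* options in the connector table above):
        # security_protocol: ssl
        # ssl_ca_location: /etc/ssl/certs/ca.pem
        # ssl_certificate_location: /etc/ssl/certs/client.pem
        # ssl_key_location: /etc/ssl/private/client-key.pem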
```

## Development
### Tests
Run the Kafka tests:
**Requirements:**
- `kafka` - A running instance of Kafka
**Environment:**
- `KAFKA_BROKERS` - Defaults to `localhost:9092`
```bash
yarn test
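# The broker list can be overridden via the environment, e.g.
# (the hostname below is a placeholder):
# KAFKA_BROKERS=kafka1:9092 yarn test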
```

### Build
Build a compiled asset bundle to deploy to a teraslice cluster.
**Install Teraslice CLI**
```bash
yarn global add teraslice-cli
```

```bash
teraslice-cli assets build
```

## Contributing
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.
Please make sure to update tests as appropriate.
## License
[MIT](./LICENSE) licensed.