# Kafka Asset Bundle

[![Build Status](https://travis-ci.org/terascope/kafka-assets.svg?branch=master)](https://travis-ci.org/terascope/kafka-assets)
[![codecov](https://codecov.io/gh/terascope/kafka-assets/branch/master/graph/badge.svg)](https://codecov.io/gh/terascope/kafka-assets)

> A bundle of [Kafka](https://kafka.apache.org/) operations and apis for [Teraslice](https://github.com/terascope/teraslice).

- [Kafka Asset Bundle](#kafka-asset-bundle)
  - [Releases](#releases)
  - [Getting Started](#getting-started)
  - [Connectors](#connectors)
    - [Kafka Connector](#kafka-connector)
  - [Development](#development)
    - [Tests](#tests)
    - [Build](#build)
  - [Contributing](#contributing)
  - [License](#license)

## Releases

You can find a list of releases, changes, and pre-built asset bundles [here](https://github.com/terascope/kafka-assets/releases).

## Getting Started

This asset bundle requires a running Teraslice cluster; you can find the Teraslice documentation [here](https://github.com/terascope/teraslice/blob/master/README.md).

```bash
# Step 1: make sure you have teraslice-cli installed
yarn global add teraslice-cli

# Step 2:
# FIXME: this should be accurate
teraslice-cli asset deploy ...
```

**IMPORTANT:** Additionally, make sure you have installed the required [connectors](#connectors).

## Connectors
### Kafka Connector

> Terafoundation connector for Kafka producer and consumer clients.

To install, run the following from the root of your Terafoundation-based service:

```bash
npm install terafoundation_kafka_connector
```

**Configuration:**

The Terafoundation-level configuration is as follows:

| Configuration | Description | Type | Notes |
| --------- | -------- | ------ | ------ |
| brokers | List of seed brokers for the kafka environment | String[] | optional, defaults to `["localhost:9092"]` |
| security_protocol | Protocol used to communicate with brokers, may be set to `plaintext` or `ssl` | String | optional, defaults to `plaintext` |
| ssl_ca_location | File or directory path to CA certificate(s) for verifying the broker's key | String | only used when `security_protocol` is set to `ssl` |
| ssl_certificate_location | Path to client's public key (PEM) used for authentication | String | only used when `security_protocol` is set to `ssl` |
| ssl_crl_location | Path to CRL for verifying broker's certificate validity | String | only used when `security_protocol` is set to `ssl` |
| ssl_key_location | Path to client's private key (PEM) used for authentication | String | only used when `security_protocol` is set to `ssl` |
| ssl_key_password | Private key passphrase | String | only used when `security_protocol` is set to `ssl` |
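As a hedged sketch, the SSL-related settings above might be combined in a Terafoundation config like this (the broker address and all file paths are placeholders, not values from this repository):

```yaml
terafoundation:
    connectors:
        kafka:
            default:
                brokers: ["kafka-1:9092"]
                security_protocol: ssl
                ssl_ca_location: /etc/ssl/certs/ca.pem
                ssl_certificate_location: /etc/ssl/certs/client.pem
                ssl_key_location: /etc/ssl/private/client.key
```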

When used in code, this connector exposes two different client implementations: one for producers (`type: producer`) and one for consumers (`type: consumer`).

| Configuration | Description | Type | Notes |
| --------- | -------- | ------ | ------ |
| options | Consumer or Producer specific options | Object | required, see below |
| topic_options | [librdkafka defined settings](https://github.com/edenhill/librdkafka/blob/v0.11.5/CONFIGURATION.md) that apply per topic | Object | optional, defaults to `{}` |
| rdkafka_options | [librdkafka defined settings](https://github.com/edenhill/librdkafka/blob/v0.11.5/CONFIGURATION.md) that are not subscription specific | Object | optional, defaults to `{}` |

The `options` object supports the following properties:

| Configuration | Description | Type | Notes |
| --------- | -------- | ------ | ------ |
| type | Which client type to create, either `consumer` or `producer`. | String | defaults to `consumer` |
| group | For type 'consumer', what consumer group to use | String | optional |
| poll_interval | For type 'producer', how often (in milliseconds) the producer connection is polled to keep it alive. | Number | optional, defaults to `100` |
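To make the table concrete, here is a minimal sketch of how the documented defaults would apply to an `options` object (the `withDefaults` helper is hypothetical, written only to illustrate the table above):

```js
// Hypothetical helper: applies the defaults from the `options` table.
function withDefaults(options = {}) {
    const type = options.type || 'consumer'; // defaults to `consumer`
    if (type === 'consumer') {
        // `group` is optional and only meaningful for consumers
        return { type, ...(options.group ? { group: options.group } : {}) };
    }
    // `poll_interval` is only meaningful for producers, defaults to 100ms
    return { type, poll_interval: options.poll_interval ?? 100 };
}
```

For example, `withDefaults({})` yields a consumer, and `withDefaults({ type: 'producer' })` yields a producer polled every 100 milliseconds.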

**Consumer connector configuration example:**

```js
{
    options: {
        type: 'consumer',
        group: 'example-group'
    },
    topic_options: {
        'enable.auto.commit': false
    },
    rdkafka_options: {
        'fetch.min.bytes': 100000
    }
}
```

**Producer connector configuration example:**

```js
{
    options: {
        type: 'producer',
        poll_interval: 1000,
    },
    topic_options: {},
    rdkafka_options: {
        'compression.codec': 'gzip',
        'topic.metadata.refresh.interval.ms': -1,
        'log.connection.close': false,
    }
}
```

**Terafoundation configuration example:**

```yaml
terafoundation:
    connectors:
        kafka:
            default:
                brokers: ["localhost:9092"]
```
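For orientation, the lookup that resolves this config can be sketched as follows. The `getConnection` stand-in below is hypothetical and only mirrors the `type`/`endpoint` convention; in a real Terafoundation service, the context hands the resolved settings to the connector, which constructs an actual Kafka client rather than returning the merged settings:

```js
// Hypothetical stand-in: resolves connector settings by type and endpoint.
function getConnection(config, { type, endpoint, options }) {
    const connector = config.terafoundation.connectors[type][endpoint];
    // A real context would pass this to the connector's factory; here we
    // just return the connector settings merged with the client options.
    return { ...connector, ...options };
}

const config = {
    terafoundation: {
        connectors: {
            kafka: { default: { brokers: ['localhost:9092'] } }
        }
    }
};

const client = getConnection(config, {
    type: 'kafka',
    endpoint: 'default',
    options: { type: 'consumer', group: 'example-group' }
});
```

Here `client` ends up with the `brokers` from the Terafoundation config plus the consumer options supplied at the call site.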

## Development

### Tests

Run the Kafka tests:

**Requirements:**

- `kafka` - A running instance of Kafka

**Environment:**

- `KAFKA_BROKERS` - Defaults to `localhost:9092`

```bash
yarn test
```

### Build

Build a compiled asset bundle to deploy to a Teraslice cluster.

**Install Teraslice CLI**

```bash
yarn global add teraslice-cli
```

```bash
teraslice-cli assets build
```

## Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

Please make sure to update tests as appropriate.

## License

[MIT](./LICENSE) licensed.