# Kafka CLI tool

Command line tool to work with Kafka efficiently and easily

[![NPM version][npm-image]][npm-url]
[![Downloads][downloads-image]][npm-url]

## Table of Contents

- [Features](#features)
- [Installing](#installing)
- [Examples](#examples)
- [Consumer](#consumer)
- [Producer](#producer)
- [Formatters](#formatters)
- [Environment](#environment)
- [License](#license)

## Features

- Producer
- Consumer groups with seek and timeout
- Built-in message encoders/decoders with types: json, js, raw
- Custom message encoders/decoders as a js module
- Message headers
- GZIP compression
- Plain, SSL and SASL_SSL implementations
- Admin client
- TypeScript support

## Installing

```sh
npm install -g kafka-console
```

## Examples

### Common options
```
-b, --brokers            bootstrap server host (default: "localhost:9092")
-l, --log-level          log level
-t, --timeout            set a timeout of operation (default: "0")
-p, --pretty             pretty print (default: false)
--ssl                    enable ssl (default: false)
--mechanism              sasl mechanism
--username               sasl username
--password               sasl password
--auth-id                sasl aws authorization identity
--access-key-id          sasl aws access key id
--secret-access-key      sasl aws secret access key
--session-token          sasl aws session token
--oauth-bearer           sasl oauth bearer token
-V, --version            output the version number
-h, --help               display help for command
```

### Commands
```
consume [options]            Consume kafka topic events
produce [options]            Produce kafka topic events
metadata                     Displays kafka server metadata
list|ls [options]            Lists kafka topics
config [options]             Describes config for specific resource
topic:create                 Creates kafka topic
topic:delete                 Deletes kafka topic
topic:offsets [timestamp]    Shows kafka topic offsets
help [command]               display help for command
```

### Consumer

`kcli consume <topic> [options]`

#### Options
```
-g, --group          consumer group name (default: "kafka-console-consumer-TIMESTAMP")
-d, --data-format    messages data-format: json, js, raw (default: "json")
-o, --output         write output to specified filename
-f, --from           read messages from the specific timestamp in milliseconds or ISO 8601 format. Set 0 to read from the beginning
-c, --count          a number of messages to read (default: null)
-s, --skip           a number of messages to skip (default: 0)
-h, --help           display help for command
```

General usage with authentication
```sh
kcli consume $KAFKA_TOPIC -g $KAFKA_TOPIC_GROUP -b $KAFKA_BROKERS --ssl --mechanism plain --username $KAFKA_USERNAME --password $KAFKA_PASSWORD
```

Consume from a timestamp and pipe stdout to `jq`
```sh
kcli consume $KAFKA_TOPIC --from '1970-01-01T00:00:00.000Z' | jq .value
```

Custom data formatter example
```sh
kcli consume $KAFKA_TOPIC --data-format ./formatter/avro.js | jq
```

### Producer

`kcli produce <topic> [options]`

#### Options
```
-d, --data-format    messages data-format: json, js, raw (default: "json")
-i, --input          input filename
-w, --wait           wait the time in ms after sending a message (default: 0)
-h, --header         set a static header (default: [])
--help               display help for command
```

General usage
```sh
kcli produce $KAFKA_TOPIC -b $KAFKA_BROKERS --ssl --mechanism plain --username $KAFKA_USERNAME --password $KAFKA_PASSWORD
```

Produce JSON data from stdin with a custom formatter
```sh
cat payload.txt|kcli produce $KAFKA_TOPIC --data-format ./formatter/avro.js
```

Produce JSON data from stdin
```sh
node payloadGenerator.js|kcli produce $KAFKA_TOPIC
```

Produce a JSON array from stdin
```sh
cat payload.json|jq -r -c .[]|kcli produce $KAFKA_TOPIC
```

Single-message payload input interface
```typescript
interface Payload {
  key?: string; // kafka message key
  value: any;
  headers?: { [key: string]: string };
}
```
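For illustration, a generator script like the `payloadGenerator.js` referenced above could emit one JSON-encoded payload per line on stdout. This is a hypothetical sketch, not part of the package; the only assumption is that each line piped into `kcli produce` matches the Payload interface:

```typescript
// payloadGenerator.ts (hypothetical sketch) — emits newline-delimited JSON
// payloads matching the Payload interface above; pipe stdout into `kcli produce`.
interface Payload {
  key?: string;
  value: unknown;
  headers?: { [key: string]: string };
}

for (let i = 0; i < 10; i++) {
  const payload: Payload = {
    key: `key-${i}`,
    value: { id: i, createdAt: new Date().toISOString() },
    headers: { source: 'payload-generator' },
  };
  // One payload per line, as expected by the stdin-based produce examples above.
  process.stdout.write(JSON.stringify(payload) + '\n');
}
```

Run it with, for example, `npx ts-node payloadGenerator.ts | kcli produce $KAFKA_TOPIC`.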

### Formatters

```typescript
export interface Encoder<T> {
  (value: T): Promise<string | Buffer> | string | Buffer;
}

export interface Decoder<T> {
  (value: Buffer): Promise<T> | T;
}

export interface Formatter<T> {
  encode: Encoder<T>;
  decode: Decoder<T>;
}
```
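As a sketch of what a custom formatter module (such as the `./formatter/avro.js` used in the examples) might look like, the following assumes the module exports `encode` and `decode` functions matching the interfaces above; the exact export shape expected by `--data-format` may differ, so treat this as an illustration rather than a drop-in file:

```typescript
// formatter/json.ts (hypothetical sketch) — a trivial custom formatter.
// Compile to JS before passing its path to --data-format.

// Encoder: serialize an outgoing message value to a string or Buffer.
export const encode = async (value: unknown): Promise<string | Buffer> =>
  JSON.stringify(value);

// Decoder: parse an incoming Kafka message Buffer back into a value.
export const decode = async (value: Buffer): Promise<unknown> =>
  JSON.parse(value.toString('utf8'));
```

An Avro formatter would keep the same shape, delegating the actual serialization inside `encode`/`decode` to a schema library such as `avsc`.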

## Environment

- KAFKA_BROKERS
- KAFKA_TIMEOUT
- KAFKA_MECHANISM
- KAFKA_USERNAME
- KAFKA_PASSWORD
- KAFKA_AUTH_ID
- KAFKA_ACCESS_KEY_ID
- KAFKA_SECRET_ACCESS_KEY
- KAFKA_SESSION_TOKEN
- KAFKA_OAUTH_BEARER

## License
Licensed under [The MIT License](http://opensource.org/licenses/MIT)
Copyright (c) 2024 Ivan Zakharchanka

[npm-url]: https://www.npmjs.com/package/kafka-console
[downloads-image]: https://img.shields.io/npm/dw/kafka-console.svg?maxAge=43200
[npm-image]: https://img.shields.io/npm/v/kafka-console.svg?maxAge=43200