Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/nodefluent/gcloud-pubsub-kafka-connect
Last synced: 3 months ago
- Host: GitHub
- URL: https://github.com/nodefluent/gcloud-pubsub-kafka-connect
- Owner: nodefluent
- License: mit
- Created: 2017-07-10T14:27:52.000Z (over 7 years ago)
- Default Branch: master
- Last Pushed: 2023-01-12T07:08:29.000Z (about 2 years ago)
- Last Synced: 2024-04-14T07:54:11.576Z (10 months ago)
- Language: JavaScript
- Size: 1.49 MB
- Stars: 4
- Watchers: 3
- Forks: 1
- Open Issues: 26
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-kafka - GCloud pubsub
README
# gcloud-pubsub-kafka-connect
[![Greenkeeper badge](https://badges.greenkeeper.io/nodefluent/gcloud-pubsub-kafka-connect.svg)](https://greenkeeper.io/)
Kafka Connect connector for Google Cloud Pub/Sub.

[![Build Status](https://travis-ci.org/nodefluent/gcloud-pubsub-kafka-connect.svg?branch=master)](https://travis-ci.org/nodefluent/gcloud-pubsub-kafka-connect)
[![Coverage Status](https://coveralls.io/repos/github/nodefluent/gcloud-pubsub-kafka-connect/badge.svg?branch=master)](https://coveralls.io/github/nodefluent/gcloud-pubsub-kafka-connect?branch=master)
## Use API
```
npm install --save gcloud-pubsub-kafka-connect
```

### pubsub -> kafka
```es6
const { runSourceConnector } = require("gcloud-pubsub-kafka-connect");

runSourceConnector(config, [], onError).then(config => {
    // runs forever until: config.stop();
});
```

### kafka -> pubsub
```es6
const { runSinkConnector } = require("gcloud-pubsub-kafka-connect");

runSinkConnector(config, [], onError).then(config => {
    // runs forever until: config.stop();
});
```

### kafka -> pubsub (with a custom topic, no source-task topic)
```es6
const { runSinkConnector, ConverterFactory } = require("gcloud-pubsub-kafka-connect");

const etlFunc = (messageValue, callback) => {
    return callback(null, {
        id: messageValue.payload.id,
        name: messageValue.payload.name,
        info: messageValue.payload.info
    });
};

const converter = ConverterFactory.createSinkSchemaConverter({}, etlFunc);

runSinkConnector(config, [converter], onError).then(config => {
    // runs forever until: config.stop();
});

/*
    This example can store Kafka message values that look like the
    following (completely unrelated to messages created by a default SourceTask):
    {
        payload: {
            id: 1,
            name: "first item",
            info: "some info"
        },
        type: "publish"
    }
*/
```

## Use CLI
Note: in BETA :seedling:

```
npm install -g gcloud-pubsub-kafka-connect
```

```
# run source etl: pubsub -> kafka
nkc-pubsub-source --help
```

```
# run sink etl: kafka -> pubsub
nkc-pubsub-sink --help
```

## Config(uration)
```es6
const config = {
    kafka: {
        zkConStr: "localhost:2181/",
        logger: null,
        groupId: "kc-pubsub-test",
        clientName: "kc-pubsub-test-name",
        workerPerPartition: 1,
        options: {
            sessionTimeout: 8000,
            protocol: ["roundrobin"],
            fromOffset: "earliest", // or "latest"
            fetchMaxBytes: 1024 * 100,
            fetchMinBytes: 1,
            fetchMaxWaitMs: 10,
            heartbeatInterval: 250,
            retryMinTimeout: 250,
            requireAcks: 1,
            // ackTimeoutMs: 100,
            // partitionerType: 3
        }
    },
    topic: "pubsub-test-topic",
    partitions: 1,
    maxTasks: 1,
    pollInterval: 2000,
    produceKeyed: true,
    produceCompressionType: 0,
    connector: {
        batchSize: 500,
        maxPollCount: 500,
        projectId: "bq-project-id",
        subscription: "ps_subscription",
        topic: "ps_topic",
        idColumn: "id"
    },
    http: {
        port: 3149,
        middlewares: []
    },
    enableMetrics: true
};
```
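The `etlFunc` handed to `ConverterFactory.createSinkSchemaConverter` in the sink example above is a plain Node-style callback transform, so its behavior can be sketched without Kafka, Pub/Sub, or this package installed. A minimal standalone sketch (the sample message shape is taken from the comment in that example; `sampleValue` is just an illustrative name):

```es6
// Same transform as in the sink-converter example: pick fields off the payload.
const etlFunc = (messageValue, callback) => {
    return callback(null, {
        id: messageValue.payload.id,
        name: messageValue.payload.name,
        info: messageValue.payload.info
    });
};

// A Kafka message value in the shape the example's comment describes.
const sampleValue = {
    payload: { id: 1, name: "first item", info: "some info" },
    type: "publish"
};

// The callback receives (error, transformedValue); fields outside
// `payload` (like `type`) are dropped by this particular transform.
etlFunc(sampleValue, (error, transformed) => {
    console.log(error, transformed);
    // -> null { id: 1, name: 'first item', info: 'some info' }
});
```

Since the transform only reads `messageValue.payload`, any envelope fields your producers add are ignored, which is what lets the sink accept messages unrelated to a default SourceTask.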