Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/invadergir/kafka-streams-interactive-queries
Example kafka streams app to help test storage and recovery of state.
- Host: GitHub
- URL: https://github.com/invadergir/kafka-streams-interactive-queries
- Owner: invadergir
- License: apache-2.0
- Created: 2017-12-15T04:33:07.000Z (almost 7 years ago)
- Default Branch: master
- Last Pushed: 2017-12-20T03:36:38.000Z (almost 7 years ago)
- Last Synced: 2024-06-22T08:53:55.540Z (5 months ago)
- Topics: interactive-queries, kafka-streams, scala
- Language: Scala
- Size: 14.6 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE.txt
README
Roughly following:
https://docs.confluent.io/current/streams/developer-guide/interactive-queries.html#streams-developer-guide-interactive-queries-discovery

Code example:
https://github.com/confluentinc/kafka-streams-examples/tree/4.0.x/src/main/java/io/confluent/examples/streams/interactivequeries

# Purpose
You can use this in combination with kafka-key-value-producer to locally test some availability and scalability scenarios. Keys 'a' through 'j' are printed when you query the state store. Spin up more than one instance, send it some data, and see where the keys go. Your input topic ('input-topic') must have more than one partition. Once you've sent some data, kill one instance or the other and see what happens. It's a valuable exercise for understanding how Kafka Streams stores data with partitioning.

# To create input & output topics:
See bin/create-topics.sh (the default number of topics is 2; a different number can be specified).
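If you'd rather create the topics by hand, the commands below sketch what bin/create-topics.sh presumably does. The topic names come from this README; the partition count and the exact kafka-topics.sh flags are assumptions (brokers from this repo's era take --zookeeper instead of --bootstrap-server):

```shell
# Assumed equivalent of bin/create-topics.sh; adjust flags for your Kafka version.
# 'input-topic' needs more than one partition for the scaling exercise to be interesting.
kafka-topics.sh --bootstrap-server localhost:9092 --create \
  --topic input-topic --partitions 2 --replication-factor 1
kafka-topics.sh --bootstrap-server localhost:9092 --create \
  --topic output-topic --partitions 2 --replication-factor 1
```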
# To run console producer:
kafka-console-producer.sh --broker-list localhost:9092 --topic input-topic

# To run console consumer to see output:
## For WordCount: (when testing word counts - see Main.scala to select active stream processor)
kafka-console-consumer.sh --bootstrap-server localhost:9092 \
--topic output-topic \
--from-beginning \
--formatter kafka.tools.DefaultMessageFormatter \
--property print.key=true \
--property print.value=true \
--property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
--property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer

## For KVStreamProcessor: (when testing the generic String-String store - see Main.scala)
kafka-console-consumer.sh --bootstrap-server localhost:9092 \
--topic output-topic \
--from-beginning \
--formatter kafka.tools.DefaultMessageFormatter \
--property print.key=true \
--property print.value=true \
--property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
--property value.deserializer=org.apache.kafka.common.serialization.StringDeserializer

# To create jar file:
sbt assembly
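To predict where keys 'a' through 'j' land in the partitioning exercise above, here's a self-contained sketch of how Kafka's default partitioner assigns a keyed record to a partition. It reimplements the murmur2 hash from Kafka's published `Utils.murmur2` for illustration; the two-partition count mirrors the setup above, and the object and method names are made up for this sketch:

```scala
// Sketch of Kafka's default keyed partitioning: murmur2(keyBytes) mod numPartitions.
// The hash constants follow Kafka's Utils.murmur2; treat this as an illustration,
// not a drop-in replacement for the real partitioner.
object PartitionSketch {
  def murmur2(data: Array[Byte]): Int = {
    val m = 0x5bd1e995
    var h = 0x9747b28c ^ data.length
    var i = 0
    // mix 4 bytes at a time
    while (i + 4 <= data.length) {
      var k = (data(i) & 0xff) | ((data(i + 1) & 0xff) << 8) |
              ((data(i + 2) & 0xff) << 16) | ((data(i + 3) & 0xff) << 24)
      k *= m; k ^= k >>> 24; k *= m
      h *= m; h ^= k
      i += 4
    }
    // mix the remaining tail bytes (fall-through, as in the Java original)
    val tail = data.length & ~3
    val rem = data.length % 4
    if (rem >= 3) h ^= (data(tail + 2) & 0xff) << 16
    if (rem >= 2) h ^= (data(tail + 1) & 0xff) << 8
    if (rem >= 1) { h ^= data(tail) & 0xff; h *= m }
    h ^= h >>> 13; h *= m; h ^= h >>> 15
    h
  }

  // Default partitioner behavior for a non-null key.
  def partitionFor(key: String, numPartitions: Int): Int =
    (murmur2(key.getBytes("UTF-8")) & 0x7fffffff) % numPartitions

  def main(args: Array[String]): Unit =
    for (k <- 'a' to 'j')
      println(s"key '$k' -> partition ${partitionFor(k.toString, 2)}")
}
```

Running this locally shows which of the two instances should own each key's state, which you can then confirm by querying the state stores.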