https://github.com/lbaiyat/kinesis-producer-consumer-demo

This repo contains a utility to manage Kinesis Data Streams in AWS, as well as a sample stream producer and consumer.

aws aws-sample boto3 cloudformation kinesis-stream python python3

Kinesis Producer and Consumer Demo




Table of Contents

Overview
About AWS Kinesis Data Streams
Instructions
Cleanup

Overview:



This demo programmatically manages a Kinesis Data Stream in AWS and includes a sample stream producer and consumer. The data sent by the producer can be modified to send different payloads, and the consumer logic can be adapted to process the incoming records in different ways.



About AWS Kinesis Data Streams:



AWS Kinesis Data Streams is a scalable, real-time data streaming service that can ingest large volumes of data per second, enabling real-time processing and analytics. It allows applications to continuously ingest and process streaming data, such as log and event data, from many different sources.
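
As a quick illustration of how a stream is organized, the boto3 snippet below reads back a stream's summary (status, shard count, retention). This is a minimal sketch; the stream name is a placeholder, not necessarily the one created by this repo.

import boto3

# Hypothetical stream name; substitute an existing stream in your account.
STREAM_NAME = "demo-stream"

kinesis = boto3.client("kinesis")

# A stream is made up of shards, each providing a fixed unit of read/write throughput.
summary = kinesis.describe_stream_summary(StreamName=STREAM_NAME)["StreamDescriptionSummary"]
print(summary["StreamStatus"], summary["OpenShardCount"], summary["RetentionPeriodHours"])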



Instructions:


Getting the Data Stream up and running requires a few steps outlined below.


Set up the virtual environment




python3 -m venv venv;
source venv/bin/activate;
pip install -r requirements.txt;


Export your AWS Access Key and Secret Key as environment variables




export AWS_ACCESS_KEY_ID="{ACCESS_KEY_ID_STRING}";
export AWS_SECRET_ACCESS_KEY="{AWS_SECRET_ACCESS_KEY_STRING}";


Create the Kinesis Data Stream with the Stream Manager Script



Note: The 'create' argument after the script name is required. Creating this Data Stream will incur AWS charges.




python3 kinesis_stream_manager.py create;
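
The repo's kinesis_stream_manager.py is not reproduced on this page; the sketch below only shows one way such a script could provision the stream through CloudFormation (suggested by the repo's cloudformation topic). The stack name, stream name, and shard count are assumptions, not the repo's actual values.

import sys

import boto3

# Assumed names; the actual script may use different ones.
STACK_NAME = "kinesis-demo-stack"

# Minimal CloudFormation template defining a single-shard Kinesis Data Stream.
TEMPLATE_BODY = """
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  DemoStream:
    Type: AWS::Kinesis::Stream
    Properties:
      Name: demo-stream
      ShardCount: 1
"""

def create():
    cfn = boto3.client("cloudformation")
    cfn.create_stack(StackName=STACK_NAME, TemplateBody=TEMPLATE_BODY)
    # Block until the stack (and therefore the stream) is ready.
    cfn.get_waiter("stack_create_complete").wait(StackName=STACK_NAME)
    print(f"Stack {STACK_NAME} created")

if __name__ == "__main__":
    if len(sys.argv) > 1 and sys.argv[1] == "create":
        create()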


Run the stream producer




python3 producer.py;


Once the stream producer is running, you should see logs of the data being produced to the stream.
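
producer.py itself is not shown on this page; below is a minimal sketch of what a Kinesis producer can look like with boto3. The stream name and payload shape are assumptions, and they are the parts you would modify to send different data.

import json
import time
import uuid

import boto3

STREAM_NAME = "demo-stream"  # assumed; match the stream created above

kinesis = boto3.client("kinesis")

while True:
    # Example payload; change these fields to send different data.
    payload = {"id": str(uuid.uuid4()), "timestamp": time.time(), "value": 42}

    # Records are raw bytes; the partition key controls shard assignment.
    kinesis.put_record(
        StreamName=STREAM_NAME,
        Data=json.dumps(payload).encode("utf-8"),
        PartitionKey=payload["id"],
    )
    print(f"Produced: {payload}")
    time.sleep(1)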



Run the stream consumer




python3 consumer.py;


Once the stream consumer is running, you should see logs of the data being consumed from the stream.
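
Likewise, the sketch below shows one way a simple consumer can poll the stream using boto3's low-level shard APIs. It reads only the first shard and is an assumption about the demo's approach, not a copy of consumer.py.

import json
import time

import boto3

STREAM_NAME = "demo-stream"  # assumed; match the stream used by the producer

kinesis = boto3.client("kinesis")

# Read from the first shard only; a real consumer would iterate over all shards.
shard_id = kinesis.list_shards(StreamName=STREAM_NAME)["Shards"][0]["ShardId"]
iterator = kinesis.get_shard_iterator(
    StreamName=STREAM_NAME,
    ShardId=shard_id,
    ShardIteratorType="TRIM_HORIZON",  # start from the oldest available record
)["ShardIterator"]

while True:
    response = kinesis.get_records(ShardIterator=iterator, Limit=100)
    for record in response["Records"]:
        print(f"Consumed: {json.loads(record['Data'])}")
    # Continue from where the last batch left off.
    iterator = response["NextShardIterator"]
    time.sleep(1)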



Cleanup:



Once you are done with the Kinesis Data Stream, you should delete it to stop incurring further charges.


Delete the Kinesis Data Stream with the Stream Manager Script



Note: The 'delete' argument after the script name is required. This deletes the CloudFormation stack along with the Kinesis Data Stream it defines.




python3 kinesis_stream_manager.py delete;
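
For completeness, the delete path of such a stream manager could mirror the create path by tearing down the CloudFormation stack. This is only a sketch; the stack name below is the same assumption used in the create sketch above.

import boto3

STACK_NAME = "kinesis-demo-stack"  # assumed; must match the stack created earlier

def delete():
    cfn = boto3.client("cloudformation")
    cfn.delete_stack(StackName=STACK_NAME)
    # Wait until the stack and the stream it defines are fully removed.
    cfn.get_waiter("stack_delete_complete").wait(StackName=STACK_NAME)
    print(f"Stack {STACK_NAME} deleted")

if __name__ == "__main__":
    delete()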