https://github.com/hanydief/Kafka_Producer_Consumer_to_SQL_Project
Serverless Data Platform: Creating a Kafka producer.py, consumer.py & PgAdmin SQL DB
- Host: GitHub
- URL: https://github.com/hanydief/Kafka_Producer_Consumer_to_SQL_Project
- Owner: hanydief
- Created: 2023-06-10T22:07:49.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2023-06-10T22:42:06.000Z (over 2 years ago)
- Last Synced: 2024-10-24T06:27:47.525Z (over 1 year ago)
- Topics: gitbash, pgadmin4, upstash, visual-studio
- Language: Python
- Homepage:
- Size: 800 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Kafka_Producer_Consumer_to_SQL_Project
Creating a Kafka producer.py, consumer.py & PgAdmin SQL DB
# Required installations (if not already present in the environment):
- pip install kafka-python (or: conda install -c conda-forge kafka-python)
- conda install -c anaconda sqlalchemy
- conda install -c anaconda psycopg2
# What the project is about:
- Created Upstash account
- Created New single replica Cluster
- producer.py Python base code

- In Visual Studio, created a producer.py that publishes a random message to Kafka every 5 seconds

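The producer step above might be sketched like this with kafka-python. The cluster address, credentials, and topic name below are placeholders, not the repo's actual values; Upstash Kafka clusters typically authenticate over SASL_SSL with SCRAM-SHA-256:

```python
import json
import random
import time


def make_message():
    """Build a small, JSON-serializable random payload."""
    return {"value": random.randint(0, 100)}


def serialize(payload):
    """Encode a payload as UTF-8 JSON bytes for Kafka."""
    return json.dumps(payload).encode("utf-8")


def run(bootstrap, username, password, topic="demo-topic"):
    # Deferred import so the helpers above load even where
    # kafka-python is not installed.
    from kafka import KafkaProducer

    # All connection values are placeholders for your Upstash cluster.
    producer = KafkaProducer(
        bootstrap_servers=bootstrap,
        security_protocol="SASL_SSL",
        sasl_mechanism="SCRAM-SHA-256",
        sasl_plain_username=username,
        sasl_plain_password=password,
        value_serializer=serialize,
    )
    while True:
        message = make_message()
        producer.send(topic, value=message)
        producer.flush()  # block until the broker acknowledges
        print(f"sent: {message}")
        time.sleep(5)  # publish every 5 seconds


# To run, substitute your Upstash cluster's values:
#   run("your-cluster.upstash.io:9092", "USERNAME", "PASSWORD")
```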
- consumer.py Python base code

- In Visual Studio, created a consumer.py that fetches producer.py's random messages from Kafka every 2 seconds

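The consumer step could look like the following kafka-python sketch; again, the connection details, group id, and topic name are placeholder assumptions:

```python
import json
import time


def deserialize(raw):
    """Decode UTF-8 JSON bytes from Kafka into a Python object."""
    return json.loads(raw.decode("utf-8"))


def run(bootstrap, username, password, topic="demo-topic"):
    # Deferred import so deserialize() works even where kafka-python
    # is not installed.
    from kafka import KafkaConsumer

    # All connection values are placeholders for your Upstash cluster.
    consumer = KafkaConsumer(
        topic,
        bootstrap_servers=bootstrap,
        security_protocol="SASL_SSL",
        sasl_mechanism="SCRAM-SHA-256",
        sasl_plain_username=username,
        sasl_plain_password=password,
        group_id="demo-group",
        auto_offset_reset="earliest",
        value_deserializer=deserialize,
    )
    while True:
        # poll() returns {TopicPartition: [records]}; an empty dict if none
        batch = consumer.poll(timeout_ms=1000)
        for records in batch.values():
            for record in records:
                print(f"received: {record.value}")
        time.sleep(2)  # fetch every 2 seconds
```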
- After activating the environment: $ conda activate kafka-env
- In Git Bash: $ python producer.py

- After activating the environment: $ conda activate kafka-env
- In Git Bash: $ python consumer.py

- Using a SQLAlchemy engine, consumer.py pushes all consumed data into the PostgreSQL database (managed via PgAdmin)

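The database step might be sketched with SQLAlchemy and the psycopg2 driver as below. The connection values and the `kafka_messages` table name are illustrative assumptions, not taken from the repo:

```python
import json

from sqlalchemy import create_engine, text


def make_engine(user, password, host, db):
    """Build a SQLAlchemy engine for PostgreSQL via psycopg2.

    All connection values are placeholders for your own database.
    """
    return create_engine(
        f"postgresql+psycopg2://{user}:{password}@{host}:5432/{db}"
    )


def insert_message(engine, payload):
    """Insert one consumed Kafka message into the kafka_messages table."""
    with engine.begin() as conn:  # commits on success, rolls back on error
        conn.execute(
            text("INSERT INTO kafka_messages (payload) VALUES (:payload)"),
            {"payload": json.dumps(payload)},
        )
```

Inside the consumer's poll loop, each `record.value` would be passed to `insert_message(engine, record.value)`.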
# Beneficial links:
https://upstash.com/
https://pypi.org/project/psycopg2-binary/