Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/lpsm-dev/twitter-realtime-processing-covid
✔️ Twitter Realtime Processing about Covid-19 + Docker + Docker Compose + Apache Kafka + Elastic Stack (Elasticsearch + Kibana)
- Host: GitHub
- URL: https://github.com/lpsm-dev/twitter-realtime-processing-covid
- Owner: lpsm-dev
- License: wtfpl
- Created: 2020-06-11T13:25:15.000Z (over 4 years ago)
- Default Branch: main
- Last Pushed: 2023-10-29T01:46:43.000Z (about 1 year ago)
- Last Synced: 2024-04-20T09:22:57.521Z (7 months ago)
- Topics: alpine, covid, covid-19, docker, docker-compose, elasticsearch, kafka, kafka-consumer, kafka-producer, kibana, python, streaming, twitter, twitter-api, twitter-streaming-api
- Language: Python
- Homepage:
- Size: 352 KB
- Stars: 8
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
- Codeowners: .github/CODEOWNERS
Awesome Lists containing this project
README
Welcome to Twitter Realtime Processing Repository
Python realtime processing of tweets about COVID-19 using Kafka + Elasticsearch + Kibana + Docker + Docker Compose.

## ➤ Getting Started
To use this repository, start by making a **git clone**:
```bash
git clone --depth 1 https://github.com/lpsm-dev/twitter-realtime-processing-covid.git -b main
```

This will give you a copy of the project on your **local machine**.
## ➤ Pre-Requisites
For this project you need:
* Python 3.8.
* Docker and Docker Compose.
* Kafka ecosystem.
* Elasticsearch.
* Kibana.

## ➤ How to use it?
#### Local
1. Set the application environment variables.
2. Install the Python packages listed in requirements.txt.
3. Run docker-compose.yml to deploy the whole Kafka and Elastic ecosystem.
4. Profit.

#### Docker
1. Set all environment variables in dot-env files.
2. Create a Docker network.
3. Run docker-compose.yml to deploy the whole Kafka and Elastic ecosystem.
4. Run docker-compose-tools.yml to run the application.
5. Profit.

This system is fully containerised. You will need [Docker](https://docs.docker.com/install/) and [Docker Compose](https://docs.docker.com/compose/) to run it.
You simply need to create a Docker network called `kafka-network` to enable communication between the Kafka cluster and the apps:
```bash
$ docker network create kafka-network
```

All set!
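For context, the application's job at this point is to stream tweets and publish them to the Kafka topic that the Elasticsearch sink below consumes. Here is a minimal sketch of that producer, assuming tweepy 3.x and kafka-python; the actual code in this repository may differ:

```python
import json
import os

import tweepy
from kafka import KafkaProducer

# Credentials and broker address come from the environment variables
# documented later in this README.
auth = tweepy.OAuthHandler(os.environ["TWITTER_CONSUMER_KEY"],
                           os.environ["TWITTER_CONSUMER_SECRET"])
auth.set_access_token(os.environ["TWITTER_ACCESS_TOKEN"],
                      os.environ["TWITTER_ACCESS_TOKEN_SECRET"])

producer = KafkaProducer(
    bootstrap_servers=os.environ.get("KAFKA_BROKER_URL", "localhost:9092"),
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

class CovidListener(tweepy.StreamListener):
    def on_status(self, status):
        # Publish each tweet to the topic the Elasticsearch sink reads from.
        producer.send("dados-tweets", {
            "created_at": str(status.created_at),
            "user": status.user.screen_name,
            "text": status.text,
        })

# Filter the public stream for COVID-related keywords.
tweepy.Stream(auth, CovidListener()).filter(track=["covid", "covid-19"])
```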
### Sending Data to Elasticsearch
```bash
curl -X POST kafka-connect:8083/connectors -H "Content-Type: application/json" -d '{
"name": "elasticsearch-sink-kafka",
"config": {
"connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
"type.name": "kafka-connect",
"key.converter.schemas.enable": "false",
"tasks.max": "1",
"topics": "dados-tweets",
"value.converter.schemas.enable": "false",
"key.ignore": "true",
"connection.url": " http://elasticsearch:9200",
"value.converter": "org.apache.kafka.connect.json.JsonConverter",
"key.converter": "org.apache.kafka.connect.storage.StringConverter",
"schema.ignore": "true"
}
}'
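
# The two checks below are optional additions, not part of the original
# README. Kafka Connect exposes a REST API, so the connector state can be
# inspected once it is registered:
curl -s kafka-connect:8083/connectors/elasticsearch-sink-kafka/status

# And once tweets are flowing, documents should be searchable in the
# Elasticsearch index created from the topic name:
curl -s "http://elasticsearch:9200/dados-tweets/_search?size=1"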
```

### Environment variables
**Name** | **Description**
:---: | :---:
**TWITTER_CONSUMER_KEY** | Twitter Consumer Key
**TWITTER_CONSUMER_SECRET** | Twitter Consumer Secret
**TWITTER_ACCESS_TOKEN** | Twitter Access Token
**TWITTER_ACCESS_TOKEN_SECRET** | Twitter Access Token Secret
**LOG_PATH** | Just the Log Path
**LOG_FILE** | Just the Log File
**LOG_LEVEL** | Just the Log Level
**LOGGER_NAME** | Just the Logger name
**KAFKA_BROKER_URL** | Kafka Broker URL
**KAFKA_TOPIC** | Kafka Topic Name

### Environment file
We use decouple for strict separation of settings from code. It lets us store parameters in a .env file and properly convert each value to the correct data type.
Copy the .env-example file to a .env file and replace the values inside it.
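As an illustration, reading these settings with decouple looks roughly like this; the variable names match the table above, but the defaults shown are placeholders, not the project's values:

```python
# Hypothetical settings module using python-decouple.
from decouple import config

TWITTER_CONSUMER_KEY = config("TWITTER_CONSUMER_KEY")
TWITTER_CONSUMER_SECRET = config("TWITTER_CONSUMER_SECRET")
TWITTER_ACCESS_TOKEN = config("TWITTER_ACCESS_TOKEN")
TWITTER_ACCESS_TOKEN_SECRET = config("TWITTER_ACCESS_TOKEN_SECRET")
KAFKA_BROKER_URL = config("KAFKA_BROKER_URL", default="localhost:9092")
KAFKA_TOPIC = config("KAFKA_TOPIC", default="dados-tweets")
LOG_LEVEL = config("LOG_LEVEL", default="INFO")
```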
Ways to run and use this project.
Docker
Steps to build the Docker Image.
#### Build
```bash
# Build with an explicit Dockerfile path
docker image build -t <image-name> -f <dockerfile-path> .

# Build using the Dockerfile in the current context
docker image build -t <image-name> .
```

#### Run
Steps to run the Docker Container.
* **Linux** running:
```bash
docker container run -d -p <host-port>:<container-port> <image-name>
docker container run -it --rm --name <container-name> -p <host-port>:<container-port> <image-name>
```

* **Windows** running:

```bash
winpty docker.exe container run -it --rm <image-name>
```

For more information, access the [Docker](https://docs.docker.com/) documentation or [this](docs/docker.md).
Docker-Compose
Build and run all services with docker-compose:
```bash
docker-compose up --build
```

Take down all services deployed by docker-compose:
```bash
docker-compose down
```

Take down all services and delete all images:
```bash
docker-compose down --rmi all
```

Depending on what you are making, it can be a good idea to include screenshots or even a video (you'll frequently see GIFs rather than actual videos). Tools like ttygif can help, but check out Asciinema for a more sophisticated method.
👤 **Lucca Pessoa**
Hey!! If you like this project, or if you find any bugs, feel free to contact me through my channels:
>
> * Email: [email protected]
> * Website: https://github.com/lpmatos
> * GitHub: [@lpmatos](https://github.com/lpmatos)
> * GitLab: [@lpmatos](https://gitlab.com/lpmatos)
To check the change history, please access the [**CHANGELOG.md**](CHANGELOG.md) file.
If you have any problems, please contact [me](https://github.com/lpmatos).
This project is currently undergoing a reorganization 👾.
Give me a ⭐️ if this project helped you!
Made with 💜 by [me](https://github.com/lpmatos) :wave: inspired by [readme-md-generator](https://github.com/kefranabg/readme-md-generator)