
https://github.com/lpsm-dev/twitter-realtime-processing-covid

✔️ Twitter Realtime Processing about Covid-19 + Docker + Docker Compose + Apache Kafka + Elastic Stack (Elasticsearch + Kibana)

alpine covid covid-19 docker docker-compose elasticsearch kafka kafka-consumer kafka-producer kibana python streaming twitter twitter-api twitter-streaming-api



# Welcome to Twitter Realtime Processing Repository


Python realtime processing of COVID-19 tweets using Kafka + Elasticsearch + Kibana + Docker + Docker Compose.



*(Badges: Open Source · GitHub Contributors · Language Count · Top Language · Stars · Last Commit · Repository Size · Issues · MIT License)*

## ➤ Getting Started

If you want to use this repository, you need to make a **git clone**:

```bash
git clone --depth 1 https://github.com/lpmatos/twitter-realtime-processing-covid.git -b master
```

This will give you a copy of the project on your **local machine**.

## ➤ Pre-Requisites

To run this project you need the following (a quick check is sketched below):

* Python 3.8.
* Docker and Docker Compose.
* The Kafka ecosystem.
* Elasticsearch.
* Kibana.
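
The exact commands vary by platform, but something along these lines confirms the basic tooling is installed:

```bash
# Confirm the required tooling is available
python3 --version        # should report Python 3.8.x
docker --version
docker-compose --version
```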

## ➤ How to use it?

#### Local

1. Set the application environment variables.
2. Install the Python packages listed in requirements.txt.
3. Run docker-compose.yml to deploy the Kafka and Elastic ecosystem (a condensed sketch follows this list).
4. Profit.
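
Condensed, the local route looks roughly like this (assuming the compose file and `requirements.txt` sit at the repository root; the application entry point is omitted):

```bash
# 1. Set the application environment variables
cp .env-example .env

# 2. Install the Python packages
pip install -r requirements.txt

# 3. Deploy the Kafka and Elastic ecosystem
docker-compose up -d

# 4. Start the application and profit
```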

#### Docker

1. Set all environment variables in the dot-env files.
2. Create a Docker network.
3. Run docker-compose.yml to deploy the Kafka and Elastic ecosystem.
4. Run docker-compose-tools.yml to run the application.
5. Profit.

A condensed version of these commands appears after the network setup below.

This system is fully containerised. You will need [Docker](https://docs.docker.com/install/) and [Docker Compose](https://docs.docker.com/compose/) to run it.

You simply need to create a Docker network called `kafka-network` to enable communication between the Kafka cluster and the apps:

```bash
$ docker network create kafka-network
```

All set!
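
With the network in place, the Docker route boils down to two Compose invocations. A rough sketch (file names follow the steps above; adjust them to the compose files actually present in the repository):

```bash
# Deploy the Kafka and Elastic ecosystem in the background
docker-compose -f docker-compose.yml up -d

# Build and run the application services
docker-compose -f docker-compose-tools.yml up --build
```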

## ➤ Description

### Sending Data to Elasticsearch

```bash
curl -X POST kafka-connect:8083/connectors -H "Content-Type: application/json" -d '{
  "name": "elasticsearch-sink-kafka",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "type.name": "kafka-connect",
    "key.converter.schemas.enable": "false",
    "tasks.max": "1",
    "topics": "dados-tweets",
    "value.converter.schemas.enable": "false",
    "key.ignore": "true",
    "connection.url": "http://elasticsearch:9200",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "schema.ignore": "true"
  }
}'
```
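
Once the connector has been created, a quick sanity check (assuming `kafka-connect` and `elasticsearch` are reachable from wherever you run `curl`, for example with their ports published to the host; the sink typically writes to an index named after the topic):

```bash
# Confirm the connector and its task are RUNNING
curl -s kafka-connect:8083/connectors/elasticsearch-sink-kafka/status

# Count the documents indexed from the dados-tweets topic
curl -s elasticsearch:9200/dados-tweets/_count
```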

### Environment variables

**Name** | **Description**
:---: | :---
**TWITTER_CONSUMER_KEY** | Twitter consumer key
**TWITTER_CONSUMER_SECRET** | Twitter consumer secret
**TWITTER_ACCESS_TOKEN** | Twitter access token
**TWITTER_ACCESS_TOKEN_SECRET** | Twitter access token secret
**LOG_PATH** | Path to the directory where log files are written
**LOG_FILE** | Name of the log file
**LOG_LEVEL** | Logging level (e.g. DEBUG, INFO, WARNING)
**LOGGER_NAME** | Name of the application logger
**KAFKA_BROKER_URL** | URL of the Kafka broker
**KAFKA_TOPIC** | Name of the Kafka topic that receives the tweets

### Environment file

We use **decouple** for strict separation of settings from code. It helps us store parameters in a `.env` file and properly convert values to the correct data type.

Copy the file `.env-example` to `.env` and replace the values inside it. An illustrative example follows.
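
Purely as a reference, a `.env` following the table above might look like this (placeholder values only; the Kafka entries are assumptions, so adjust them to your setup):

```bash
# .env -- placeholder values only; do not commit real credentials
TWITTER_CONSUMER_KEY=your-consumer-key
TWITTER_CONSUMER_SECRET=your-consumer-secret
TWITTER_ACCESS_TOKEN=your-access-token
TWITTER_ACCESS_TOKEN_SECRET=your-access-token-secret
LOG_PATH=./logs
LOG_FILE=app.log
LOG_LEVEL=INFO
LOGGER_NAME=twitter-realtime-processing
# Assumed broker address; match it to your docker-compose service
KAFKA_BROKER_URL=kafka:9092
# Topic consumed by the Elasticsearch sink connector above
KAFKA_TOPIC=dados-tweets
```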

## ➤ Usage

Ways to run and use this project.

### Docker

Steps to build the Docker image.

#### Build

```bash
# Build with an explicit Dockerfile
docker image build -t <image-name> -f <path/to/Dockerfile> .

# Build using the Dockerfile in the current directory (this context)
docker image build -t <image-name> .
```

#### Run

Steps to run the Docker Container.

* **Linux** running:

```bash
# Detached, publishing a port
docker container run -d -p <host-port>:<container-port> <image-name>

# Interactive, removed on exit, with a container name and a published port
docker container run -it --rm --name <container-name> -p <host-port>:<container-port> <image-name>
```

* **Windows** running:

```bash
winpty docker.exe container run -it --rm <image-name>
```

For more information, see the [Docker](https://docs.docker.com/) documentation or [this repository's Docker guide](docs/docker.md).

### Docker Compose

Build and run the services with Docker Compose:

```bash
docker-compose up --build
```

Take down all the services deployed by Docker Compose:

```bash
docker-compose down
```

Take down all the services and delete all the images:

```bash
docker-compose down --rmi all
```

## ➤ Visuals

Depending on what you are making, it can be a good idea to include screenshots or even a video (you'll frequently see GIFs rather than actual videos). Tools like ttygif can help, but check out Asciinema for a more sophisticated method.

## ➤ Author

👤 **Lucca Pessoa**

Hey! If you like this project or find any bugs, feel free to contact me through any of my channels:

>
> * Email: [email protected]
> * Website: https://github.com/lpmatos
> * GitHub: [@lpmatos](https://github.com/lpmatos)
> * GitLab: [@lpmatos](https://gitlab.com/lpmatos)
>

## ➤ Versioning

To check the change history, please access the [**CHANGELOG.md**](CHANGELOG.md) file.

## ➤ Troubleshooting

If you have any problems, please contact [me](https://github.com/lpmatos).

## ➤ Project status

This project is currently undergoing a reorganization 👾.

## ➤ Show your support

Give me a ⭐️ if this project helped you!



Made with 💜 by [me](https://github.com/lpmatos) :wave:, inspired by [readme-md-generator](https://github.com/kefranabg/readme-md-generator)