https://github.com/rflcnunes/crawler_email_py
In this project I'm creating a web crawler to check email boxes and handle incoming messages.
- Host: GitHub
- URL: https://github.com/rflcnunes/crawler_email_py
- Owner: rflcnunes
- Created: 2022-12-23T04:50:15.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2022-12-27T05:12:42.000Z (over 2 years ago)
- Last Synced: 2025-02-01T03:27:40.491Z (5 months ago)
- Topics: aws-bucket, aws-bucket-s3, aws-s3, crawler, crawler-python, email, python, rabbitmq
- Language: Python
- Homepage:
- Size: 29.3 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
# crawler_email_py
This is a simple email crawler written in Python.
## Requirements
- Python 3.6+
- Docker

## Docker
- RabbitMQ
- Minio

## Usage
Create the `data`, `logs`, and `attachments` folders in the root directory.
```bash
mkdir data
mkdir logs
mkdir attachments
```

Add your variables to the `.env` file.
```bash
cp .env.example .env
```

Required variables:
- `IMAP_GMAIL_HOST`
- `IMAP_GMAIL_EMAIL`
- `IMAP_GMAIL_PASSWORD`
- `LOCAL_FILE_PATH`

Start RabbitMQ and Minio:
```bash
docker-compose up -d
```

Access RabbitMQ at http://localhost:15672/ with username and password `guest`.
Access Minio at http://localhost:9002/ with username and password `minioadmin`.
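For reference, a filled-in `.env` might look like the following. All values are illustrative placeholders (Gmail's IMAP host is `imap.gmail.com`, and Gmail typically requires an app password rather than the account password):

```
IMAP_GMAIL_HOST=imap.gmail.com
IMAP_GMAIL_EMAIL=you@example.com
IMAP_GMAIL_PASSWORD=your-app-password
LOCAL_FILE_PATH=./attachments
```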
Install dependencies:
```bash
pip install -r requirements.txt
```

Run the script:
```bash
python3 main.py
```
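To give a feel for the message handling involved, fetched messages can have their attachments pulled out with the stdlib `email` module. The helper below is a hypothetical sketch of that step, not the repo's actual `main.py` code:

```python
# Hypothetical sketch: extract attachments from a raw RFC 822 message,
# as fetched over IMAP. Uses only the stdlib `email` module.
import email
from email.message import EmailMessage


def extract_attachments(raw_bytes):
    """Parse a raw message and return a list of (filename, payload) pairs."""
    msg = email.message_from_bytes(raw_bytes)
    attachments = []
    for part in msg.walk():
        filename = part.get_filename()
        if filename:  # parts carrying a filename are treated as attachments
            # decode=True reverses the transfer encoding (e.g. base64)
            attachments.append((filename, part.get_payload(decode=True)))
    return attachments
```

The returned payloads could then be written under `LOCAL_FILE_PATH` or pushed to the Minio bucket, with a notification published to RabbitMQ.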