taskiq-collector
Taskiq runtime info collector
- Host: GitHub
- URL: https://github.com/taskiq-python/taskiq-collector
- Owner: taskiq-python
- Created: 2022-08-12T20:16:48.000Z (over 3 years ago)
- Default Branch: master
- Last Pushed: 2022-08-17T08:41:57.000Z (over 3 years ago)
- Last Synced: 2025-04-10T04:08:50.807Z (9 months ago)
- Language: Python
- Size: 967 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# taskiq_collector
This project was generated using fastapi_template.
## Poetry
This project uses poetry, a modern dependency management tool.
To run the project, use this set of commands:
```bash
poetry install
poetry run python -m taskiq_collector
```
This will start the server on the configured host.
You can find swagger documentation at `/api/docs`.
You can read more about poetry here: https://python-poetry.org/
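For reference, a poetry-managed project declares its metadata and dependencies in `pyproject.toml`. A minimal sketch of what such a file might look like (the actual versions and dependencies in this repo may differ):

```toml
[tool.poetry]
name = "taskiq_collector"
version = "0.1.0"
description = "Taskiq runtime info collector"

[tool.poetry.dependencies]
python = "^3.9"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```

`poetry install` reads this file (and the pinned versions in `poetry.lock`) to build the virtual environment.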
## Docker
You can start the project with docker using this command:
```bash
docker-compose -f deploy/docker-compose.yml --project-directory . up --build
```
If you want to develop in docker with autoreload add `-f deploy/docker-compose.dev.yml` to your docker command.
Like this:
```bash
docker-compose -f deploy/docker-compose.yml -f deploy/docker-compose.dev.yml --project-directory . up
```
This command exposes the web application on port 8000, mounts current directory and enables autoreload.
But you have to rebuild image every time you modify `poetry.lock` or `pyproject.toml` with this command:
```bash
docker-compose -f deploy/docker-compose.yml --project-directory . build
```
## Project structure
```bash
$ tree "taskiq_collector"
taskiq_collector
├── conftest.py # Fixtures for all tests.
├── db # module contains db configurations
│ ├── dao # Data Access Objects. Contains different classes to interact with the database.
│ └── models # Package contains different models for ORMs.
├── __main__.py # Startup script. Starts uvicorn.
├── services # Package for different external services such as rabbit or redis etc.
├── settings.py # Main configuration settings for project.
├── static # Static content.
├── tests # Tests for project.
└── web # Package contains web server. Handlers, startup config.
├── api # Package with all handlers.
│ └── router.py # Main router.
├── application.py # FastAPI application configuration.
└── lifetime.py # Contains actions to perform on startup and shutdown.
```
## Configuration
This application can be configured with environment variables.
You can create `.env` file in the root directory and place all
environment variables here.
All environment variables should start with the "TASKIQ_COLLECTOR_" prefix.
For example, if "taskiq_collector/settings.py" defines a variable named
`random_parameter`, you should set the "TASKIQ_COLLECTOR_RANDOM_PARAMETER"
environment variable to configure its value. This behaviour can be changed by overriding the `env_prefix` property
in `taskiq_collector.settings.Settings.Config`.
An example of a `.env` file:
```bash
TASKIQ_COLLECTOR_RELOAD="True"
TASKIQ_COLLECTOR_PORT="8000"
TASKIQ_COLLECTOR_ENVIRONMENT="dev"
```
You can read more about BaseSettings class here: https://pydantic-docs.helpmanual.io/usage/settings/
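The prefix convention above can be sketched in plain Python. This is a simplified stand-in for what pydantic's `BaseSettings` does under the hood; the `read_setting` helper is hypothetical and only illustrates how a field name maps to its environment variable:

```python
import os
from typing import Optional

# The prefix that all environment variables for this project share.
ENV_PREFIX = "TASKIQ_COLLECTOR_"


def read_setting(field_name: str, default: Optional[str] = None) -> Optional[str]:
    """Return the value of the prefixed, upper-cased environment variable
    for a settings field, or the default if it is unset."""
    return os.environ.get(ENV_PREFIX + field_name.upper(), default)


# e.g. the `port` field is configured via TASKIQ_COLLECTOR_PORT:
os.environ["TASKIQ_COLLECTOR_PORT"] = "8000"
print(read_setting("port"))  # -> 8000
```

In the real project, pydantic performs this lookup (plus type coercion and `.env` file loading) for every field declared on the `Settings` class.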
## Opentelemetry
If you want to start your project with opentelemetry collector
you can add `-f ./deploy/docker-compose.otlp.yml` to your docker command.
Like this:
```bash
docker-compose -f deploy/docker-compose.yml -f deploy/docker-compose.otlp.yml --project-directory . up
```
This command will start opentelemetry collector and jaeger.
After sending requests, you can see traces in Jaeger's UI
at http://localhost:16686/.
This docker configuration is not supposed to be used in production.
It's only for demo purposes.
You can read more about opentelemetry here: https://opentelemetry.io/
## Pre-commit
To install pre-commit simply run inside the shell:
```bash
pre-commit install
```
pre-commit is very useful for checking your code before publishing it.
It's configured using the `.pre-commit-config.yaml` file.
By default it runs:
* black (formats your code);
* mypy (validates types);
* isort (sorts imports in all files);
* flake8 (spots possible bugs);
* yesqa (removes useless `# noqa` comments).
You can read more about pre-commit here: https://pre-commit.com/
## Migrations
If you want to migrate your database, you should run following commands:
```bash
# To run all migrations up to the migration with revision_id.
alembic upgrade "<revision_id>"
# To perform all pending migrations.
alembic upgrade "head"
```
### Reverting migrations
If you want to revert migrations, you should run:
```bash
# Revert all migrations up to revision_id.
alembic downgrade <revision_id>
# Revert everything.
alembic downgrade base
```
### Migration generation
To generate migrations you should run:
```bash
# For automatic change detection.
alembic revision --autogenerate
# For empty file generation.
alembic revision
```
## Running tests
If you want to run it in docker, simply run:
```bash
docker-compose -f deploy/docker-compose.yml --project-directory . run --rm api pytest -vv .
docker-compose -f deploy/docker-compose.yml --project-directory . down
```
For running tests on your local machine:
1. You need to start a database.
I prefer doing it with docker:
```bash
docker run -p "5432:5432" -e "POSTGRES_PASSWORD=taskiq_collector" -e "POSTGRES_USER=taskiq_collector" -e "POSTGRES_DB=taskiq_collector" postgres:13.6-bullseye
```
2. Run pytest:
```bash
pytest -vv .
```