https://github.com/thoth-station/package-update-job
Aggregate updates from Packages hosted in Indices...
- Host: GitHub
- URL: https://github.com/thoth-station/package-update-job
- Owner: thoth-station
- License: gpl-3.0
- Created: 2020-02-03T19:10:57.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2023-10-18T00:34:54.000Z (over 1 year ago)
- Last Synced: 2025-03-24T13:51:22.717Z (3 months ago)
- Topics: hacktoberfest, python, thoth
- Language: Python
- Homepage:
- Size: 1.08 MB
- Stars: 2
- Watchers: 7
- Forks: 9
- Open Issues: 5
Metadata Files:
- Readme: README.rst
- Changelog: CHANGELOG.md
- License: LICENSE
- Codeowners: .github/CODEOWNERS
README
Thoth Package Update Job
------------------------

.. image:: https://img.shields.io/github/v/tag/thoth-station/package-update-job?style=plastic
   :target: https://github.com/thoth-station/package-update-job/tags
   :alt: GitHub tag (latest by date)

.. image:: https://quay.io/repository/thoth-station/package-update-job/status
   :target: https://quay.io/repository/thoth-station/package-update-job?tab=tags
   :alt: Quay - Build

This job iterates over the packages in our database to ensure that they:

* still exist
* haven't changed

This job is run periodically as an OpenShift CronJob. The job checks the availability
of packages as well as their hashes to make sure they match what Thoth has stored.

Logic behind package update
===========================

We get a list of all packages which we have analyzed. Then, we check whether:

* the package still exists on that index
* that version of the package still exists
* the SHA256 from source matches what we have stored

If we find any of these issues, we post to a Kafka topic so that a consumer can
decide how to handle the update.
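
As a rough illustration of that flow - not the job's actual code - here is a
minimal sketch that assumes the index exposes the PyPI JSON API, uses
``requests`` for the lookups and ``confluent-kafka`` for the producer, and
treats ``iter_analyzed_packages`` and the ``package-update`` topic name as
hypothetical stand-ins:

.. code-block:: python

   import requests
   from confluent_kafka import Producer

   producer = Producer({"bootstrap.servers": "localhost:9092"})

   def iter_analyzed_packages():
       # Hypothetical stand-in for the job's database query; yields
       # (package name, version, index URL, stored SHA256 digests).
       yield ("requests", "2.31.0", "https://pypi.org", {"<stored-digest>"})

   def check_package(index_url, name, version, stored_sha256):
       """Return an issue label if the stored record no longer matches the index."""
       # Does the package - and this specific version - still exist on the index?
       resp = requests.get(f"{index_url}/pypi/{name}/{version}/json")
       if resp.status_code == 404:
           return "missing"
       resp.raise_for_status()
       # Do the artifact hashes still match what we have stored?
       digests = {entry["digests"]["sha256"] for entry in resp.json()["urls"]}
       if digests != stored_sha256:
           return "hash-mismatch"
       return None

   for name, version, index_url, stored_sha256 in iter_analyzed_packages():
       issue = check_package(index_url, name, version, stored_sha256)
       if issue is not None:
           # Post to a Kafka topic so a consumer can decide how to handle it.
           producer.produce(
               "package-update",
               key=name,
               value=f"{issue}:{name}=={version}".encode(),
           )

   producer.flush()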

Installation and Deployment
===========================

The job is built as an OpenShift s2i build and deployed via Kustomize; the
deployment templates live in the `core repository `__.
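
As a very rough sketch, a manual deployment might look like the following; the
overlay path is illustrative, since the actual templates live in the core
repository:

.. code-block:: console

   # Illustrative only - the real Kustomize overlays live in the core repository.
   $ kustomize build <path-to-overlay> | oc apply -f -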

Running the job locally
=======================

You can run this job locally without a cluster deployment. To do so, prepare
your virtual environment:

.. code-block:: console

   $ pipenv install --dev  # Install all the requirements

After that, you need to run a local instance of the database - follow
`instructions in the README `__ file for
more info and prepare the database schema:

.. code-block:: console

   $ pipenv run python3 ./app.py

The job will talk to your local database instance, which is located at
``localhost:5432`` by default, and to your local Kafka instance, which is
``localhost:9092`` by default.
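
If your database or Kafka instance runs elsewhere, point the job at it through
its configuration. The variable names below are hypothetical placeholders shown
only to illustrate overriding the defaults; check the job's configuration for
the ones it actually reads:

.. code-block:: console

   # Hypothetical variable names - not necessarily what the job reads.
   $ export KNOWLEDGE_GRAPH_HOST=db.example.com
   $ export KAFKA_BOOTSTRAP_SERVERS=kafka.example.com:9092
   $ pipenv run python3 ./app.py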