{"id":16442886,"url":"https://github.com/albertodonato/query-exporter","last_synced_at":"2025-04-13T08:55:57.638Z","repository":{"id":40605313,"uuid":"84195415","full_name":"albertodonato/query-exporter","owner":"albertodonato","description":"Export Prometheus metrics from SQL queries","archived":false,"fork":false,"pushed_at":"2025-04-03T11:30:11.000Z","size":599,"stargazers_count":474,"open_issues_count":24,"forks_count":108,"subscribers_count":13,"default_branch":"main","last_synced_at":"2025-04-06T05:07:17.767Z","etag":null,"topics":["database","metrics","metrics-exporter","prometheus","prometheus-exporter","query","sql"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"gpl-3.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/albertodonato.png","metadata":{"files":{"readme":"README.rst","changelog":"CHANGES.rst","contributing":"docs/contributing.rst","funding":null,"license":"LICENSE.txt","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2017-03-07T12:14:52.000Z","updated_at":"2025-04-03T11:30:14.000Z","dependencies_parsed_at":"2024-06-19T03:03:06.543Z","dependency_job_id":"6a522008-c182-468c-b99f-3c75389e43f4","html_url":"https://github.com/albertodonato/query-exporter","commit_stats":{"total_commits":378,"total_committers":21,"mean_commits":18.0,"dds":0.06878306878306883,"last_synced_commit":"c2160925ec3ebdfdb6c9715b27aada20b2f66057"},"previous_names":[],"tags_count":60,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/albertodonato%2Fquery-exporter","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/albertodonato%2Fquery-exporter/tags","releases_url"
:"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/albertodonato%2Fquery-exporter/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/albertodonato%2Fquery-exporter/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/albertodonato","download_url":"https://codeload.github.com/albertodonato/query-exporter/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":248688545,"owners_count":21145764,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["database","metrics","metrics-exporter","prometheus","prometheus-exporter","query","sql"],"created_at":"2024-10-11T09:18:54.822Z","updated_at":"2025-04-13T08:55:57.619Z","avatar_url":"https://github.com/albertodonato.png","language":"Python","readme":"|query-exporter logo|\n\nExport Prometheus metrics from SQL queries\n==========================================\n\n|Latest Version| |Build Status| |PyPI Downloads| |Docker Pulls| |Snap Package|\n\n``query-exporter`` is a Prometheus_ exporter which allows collecting metrics\nfrom database queries at specified time intervals.\n\nIt uses SQLAlchemy_ to connect to different database engines, including\nPostgreSQL, MySQL, Oracle and Microsoft SQL Server.\n\nEach query can be run on multiple databases, and update multiple metrics.\n\nThe application is simply run as::\n\n  query-exporter\n\nwhich will look for a ``config.yaml`` configuration file in the current\ndirectory, containing the definitions of the databases to connect to and the\nqueries to run to update metrics.  
The configuration file can be overridden by\npassing the ``--config`` option (or setting the ``QE_CONFIG`` environment\nvariable).  The option can be provided multiple times to pass partial\nconfiguration files; the resulting configuration will be the merge of the\ncontent of each top-level section (``databases``, ``metrics``, ``queries``).\n\nA sample configuration file for the application looks like this:\n\n.. code:: yaml\n\n    databases:\n      db1:\n        dsn: sqlite://\n        connect-sql:\n          - PRAGMA application_id = 123\n          - PRAGMA auto_vacuum = 1\n        labels:\n          region: us1\n          app: app1\n      db2:\n        dsn: sqlite://\n        keep-connected: false\n        labels:\n          region: us2\n          app: app1\n\n    metrics:\n      metric1:\n        type: gauge\n        description: A sample gauge\n      metric2:\n        type: summary\n        description: A sample summary\n        labels: [l1, l2]\n        expiration: 24h\n      metric3:\n        type: histogram\n        description: A sample histogram\n        buckets: [10, 20, 50, 100, 1000]\n      metric4:\n        type: enum\n        description: A sample enum\n        states: [foo, bar, baz]\n\n    queries:\n      query1:\n        interval: 5\n        databases: [db1]\n        metrics: [metric1]\n        sql: SELECT random() / 1000000000000000 AS metric1\n      query2:\n        interval: 20\n        timeout: 0.5\n        databases: [db1, db2]\n        metrics: [metric2, metric3]\n        sql: |\n          SELECT abs(random() / 1000000000000000) AS metric2,\n                 abs(random() / 10000000000000000) AS metric3,\n                 \"value1\" AS l1,\n                 \"value2\" AS l2\n      query3:\n        schedule: \"*/5 * * * *\"\n        databases: [db2]\n        metrics: [metric4]\n        sql: |\n          SELECT metric4 FROM (\n            SELECT \"foo\" AS metric4 UNION\n            SELECT \"bar\" AS metric4 UNION\n            SELECT \"baz\" AS 
metric4\n          )\n          ORDER BY random()\n          LIMIT 1\n\n\nSee the `configuration file format`_ documentation for complete details on\navailable configuration options.\n\n\nExporter options\n----------------\n\nThe exporter provides the following options, which can be set via command-line\nswitches, environment variables or through the ``.env`` file:\n\n.. table::\n   :widths: auto\n\n  +-------------------------+------------------------+-----------------+-------------------------------------------------------------------+\n  | Command-line option     | Environment variable   | Default         | Description                                                       |\n  +=========================+========================+=================+===================================================================+\n  | ``-H``, ``--host``      | ``QE_HOST``            | ``localhost``   | Host addresses to bind. Multiple values can be provided.          |\n  +-------------------------+------------------------+-----------------+-------------------------------------------------------------------+\n  | ``-p``, ``--port``      | ``QE_PORT``            | ``9560``        | Port to run the webserver on.                                     |\n  +-------------------------+------------------------+-----------------+-------------------------------------------------------------------+\n  | ``--metrics-path``      | ``QE_METRICS_PATH``    | ``/metrics``    | Path under which metrics are exposed.                             |\n  +-------------------------+------------------------+-----------------+-------------------------------------------------------------------+\n  | ``-L``, ``--log-level`` | ``QE_LOG_LEVEL``       | ``info``        | Minimum level for log messages.                                   |\n  |                         |                        |                 | One of ``critical``, ``error``, ``warning``, ``info``, ``debug``. 
|\n  +-------------------------+------------------------+-----------------+-------------------------------------------------------------------+\n  | ``--log-format``        | ``QE_LOG_FORMAT``      | ``plain``       | Log output format. One of ``plain``, ``json``.                    |\n  +-------------------------+------------------------+-----------------+-------------------------------------------------------------------+\n  | ``--process-stats``     | ``QE_PROCESS_STATS``   | ``false``       | Include process stats in metrics.                                 |\n  +-------------------------+------------------------+-----------------+-------------------------------------------------------------------+\n  | ``--ssl-private-key``   | ``QE_SSL_PRIVATE_KEY`` |                 | Full path to the SSL private key.                                 |\n  +-------------------------+------------------------+-----------------+-------------------------------------------------------------------+\n  | ``--ssl-public-key``    | ``QE_SSL_PUBLIC_KEY``  |                 | Full path to the SSL public key.                                  |\n  +-------------------------+------------------------+-----------------+-------------------------------------------------------------------+\n  | ``--ssl-ca``            | ``QE_SSL_CA``          |                 | Full path to the SSL certificate authority (CA).                  |\n  +-------------------------+------------------------+-----------------+-------------------------------------------------------------------+\n  | ``--check-only``        | ``QE_CHECK_ONLY``      | ``false``       | Only check configuration, don't run the exporter.                 |\n  +-------------------------+------------------------+-----------------+-------------------------------------------------------------------+\n  | ``--config``            | ``QE_CONFIG``          | ``config.yaml`` | Configuration files. Multiple values can be provided.             
|\n  +-------------------------+------------------------+-----------------+-------------------------------------------------------------------+\n  |                         | ``QE_DOTENV``          | ``$PWD/.env``   | Path for the dotenv file where environment variables can be       |\n  |                         |                        |                 | provided.                                                         |\n  +-------------------------+------------------------+-----------------+-------------------------------------------------------------------+\n\n\nMetrics endpoint\n----------------\n\nThe exporter listens on port ``9560``, providing the standard ``/metrics``\nendpoint.\n\nBy default, the port is bound on ``localhost``. Note that if the name resolves\nto both IPv4 and IPv6 addresses, the exporter will bind to both.\n\n\nBuiltin metrics\n---------------\n\nThe exporter provides a few builtin metrics which can be useful to track query execution:\n\n``database_errors{database=\"db\"}``:\n  a counter reporting the number of errors, per database.\n\n``queries{database=\"db\",query=\"q\",status=\"[success|error|timeout]\"}``:\n  a counter with the number of executed queries, per database, query and status.\n\n``query_interval{query=\"q\"}``:\n  a gauge reporting the configured execution interval in seconds, if set, per query.\n\n``query_latency{database=\"db\",query=\"q\"}``:\n  a histogram with query latencies, per database and query.\n\n``query_timestamp{database=\"db\",query=\"q\"}``:\n  a gauge with the last execution timestamp, per database and query.\n\nIn addition, metrics for resource usage of the exporter process can be\nincluded by passing ``--process-stats`` in the command line.\n\n\nDatabase engines\n----------------\n\nSQLAlchemy_ doesn't depend on specific Python database modules at\ninstallation. This means additional modules might need to be installed for\nengines in use. 
These can be installed as follows::\n\n  pip install SQLAlchemy[postgresql] SQLAlchemy[mysql] ...\n\nbased on which database engines are needed.\n\nSee `supported databases`_ for details.\n\n\nRun in Docker\n=============\n\n``query-exporter`` can be run inside Docker_ containers, and is available from\nthe `Docker Hub`_::\n\n  docker run --rm -it -p 9560:9560/tcp -v \"$CONFIG_DIR:/config\" adonato/query-exporter:latest\n\nwhere ``$CONFIG_DIR`` is the absolute path of a directory containing a\n``config.yaml`` file, the configuration file to use. Alternatively, a volume\nname can be specified.\n\nIf a ``.env`` file is present in the specified volume for ``/config``, its\ncontent is loaded and applied to the environment for the exporter. The location\nof the dotenv file can be customized by setting the ``QE_DOTENV`` environment\nvariable.\n\nThe image has support for connecting to the following databases:\n\n- PostgreSQL (``postgresql://``)\n- MySQL (``mysql://``)\n- SQLite (``sqlite://``)\n- Microsoft SQL Server (``mssql://``)\n- IBM DB2 (``db2://``)\n- Oracle (``oracle://``)\n- ClickHouse (``clickhouse+native://``)\n\nA `Helm chart`_ to run the container in Kubernetes is also available.\n\nAutomated builds from the ``main`` branch are available on the `GitHub container registry`_ via::\n\n  docker pull ghcr.io/albertodonato/query-exporter:main\n\n\nODBC driver version\n-------------------\n\nA different ODBC driver version can be specified when building the image,\nby passing ``--build-arg ODBC_DRIVER_VERSION``, e.g.::\n\n  docker build . 
--build-arg ODBC_DRIVER_VERSION=17\n\n\nInstall from Snap\n=================\n\n|Get it from the Snap Store|\n\n``query-exporter`` can be installed from the `Snap Store`_ on systems where\nSnaps are supported, via::\n\n  sudo snap install query-exporter\n\nThe snap provides both the ``query-exporter`` command and a daemon instance of\nthe command, managed via a systemd service.\n\nTo configure the daemon:\n\n- create or edit ``/var/snap/query-exporter/current/config.yaml`` with the\n  configuration\n- optionally, create a ``/var/snap/query-exporter/current/.env`` file with\n  environment variable definitions for additional config options\n- run ``sudo snap restart query-exporter``\n\nThe snap has support for connecting to the following databases:\n\n- PostgreSQL (``postgresql://``)\n- MySQL (``mysql://``)\n- SQLite (``sqlite://``)\n- Microsoft SQL Server (``mssql://``)\n- IBM DB2 (``db2://``) on supported architectures (x86_64, ppc64le and\n  s390x)\n\n\nContributing\n============\n\nThe project welcomes contributions of any form. Please refer to the\n`contribution guide`_ for details on how to contribute.\n\nFor general purpose questions, you can use `Discussions`_ on GitHub.\n\n\n.. _Prometheus: https://prometheus.io/\n.. _SQLAlchemy: https://www.sqlalchemy.org/\n.. _`supported databases`:\n   http://docs.sqlalchemy.org/en/latest/core/engines.html#supported-databases\n.. _`Snap Store`: https://snapcraft.io\n.. _Docker: http://docker.com/\n.. _`Docker Hub`: https://hub.docker.com/r/adonato/query-exporter\n.. _`configuration file format`: docs/configuration.rst\n.. _`contribution guide`: docs/contributing.rst\n.. _`Helm chart`: https://github.com/makezbs/helm-charts/tree/main/charts/query-exporter\n.. _`GitHub container registry`: https://github.com/albertodonato/query-exporter/pkgs/container/query-exporter\n.. _`Discussions`: https://github.com/albertodonato/query-exporter/discussions\n\n.. 
|query-exporter logo| image:: https://raw.githubusercontent.com/albertodonato/query-exporter/main/logo.svg\n   :alt: query-exporter logo\n.. |Latest Version| image:: https://img.shields.io/pypi/v/query-exporter.svg\n   :alt: Latest Version\n   :target: https://pypi.python.org/pypi/query-exporter\n.. |Build Status| image:: https://github.com/albertodonato/query-exporter/workflows/CI/badge.svg\n   :alt: Build Status\n   :target: https://github.com/albertodonato/query-exporter/actions?query=workflow%3ACI\n.. |Snap Package| image:: https://snapcraft.io/query-exporter/badge.svg\n   :alt: Snap Package\n   :target: https://snapcraft.io/query-exporter\n.. |Get it from the Snap Store| image:: https://snapcraft.io/static/images/badges/en/snap-store-black.svg\n   :alt: Get it from the Snap Store\n   :target: https://snapcraft.io/query-exporter\n.. |Docker Pulls| image:: https://img.shields.io/docker/pulls/adonato/query-exporter\n   :alt: Docker Pulls\n   :target: https://hub.docker.com/r/adonato/query-exporter\n.. |PyPI Downloads| image:: https://static.pepy.tech/badge/query-exporter/month\n   :alt: PyPI Downloads\n   :target: https://pepy.tech/projects/query-exporter\n","funding_links":[],"categories":["SQL Server Web Resources","Monitoring/Statistics/Perfomance"],"sub_categories":["Prometheus"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Falbertodonato%2Fquery-exporter","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Falbertodonato%2Fquery-exporter","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Falbertodonato%2Fquery-exporter/lists"}