Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/danielstankw/grafana-metrics-scraper
Reduce the cardinality of your metrics! - scrape all the metrics from Grafana dashboard and make informed decision.
- Host: GitHub
- URL: https://github.com/danielstankw/grafana-metrics-scraper
- Owner: danielstankw
- Created: 2024-07-17T15:37:20.000Z (7 months ago)
- Default Branch: main
- Last Pushed: 2024-07-19T12:58:16.000Z (7 months ago)
- Last Synced: 2025-01-05T11:42:35.256Z (about 1 month ago)
- Topics: cardinality, grafana, grafana-dashboard, monitoring, prometheus, prometheus-metrics, python3, relabel, scrape
- Language: Python
- Homepage:
- Size: 34.2 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
## About
The `grafana_metrics_used.py` script extends the functionality of my [Query Scraper](https://github.com/danielstankw/Grafana-metrics/tree/main/query_scraper).
It connects to the Grafana API and scrapes all the dashboards for two key items: **variables** and **PromQL queries**.
The goal is to gather all the metrics used in the dashboards.
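In outline, that scraping step can be sketched with the standard library alone (the actual script uses `requests`). The endpoints are Grafana's stock HTTP API routes, but `grafana_get` and `collect_exprs` are hypothetical helper names for illustration, not functions from this repo:

```py
import json
import urllib.request

def grafana_get(base_url, api_key, path):
    """GET a Grafana HTTP API path, authenticating with a Bearer token."""
    req = urllib.request.Request(
        base_url.rstrip("/") + path,
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def collect_exprs(panels):
    """Pull every PromQL expression out of a dashboard's panel tree."""
    exprs = []
    for panel in panels:
        for target in panel.get("targets", []):
            if target.get("expr"):
                exprs.append(target["expr"])
        # Row panels can nest further panels one level down.
        exprs.extend(collect_exprs(panel.get("panels", [])))
    return exprs

# Typical flow (URL and key are placeholders):
#   dashboards = grafana_get(url, key, "/api/search?type=dash-db")
#   for hit in dashboards:
#       dash = grafana_get(url, key, f"/api/dashboards/uid/{hit['uid']}")
#       exprs = collect_exprs(dash["dashboard"].get("panels", []))
```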
After collecting the queries and variables, the script preprocesses them to extract only the metric names, removes any duplicates, and sorts the result alphabetically.

#### Step 1: We go from this
```json
{
"dashboard": "MyDashboard / K8s / Cluster / Ephemeral Storage",
"metric_names": [
"node_filesystem_avail_bytes",
"node_filesystem_size_bytes",
"up"
]
},
{
"dashboard": "MyDashboard / K8s / Cluster View",
"metric_names": [
"kube_namespace_created",
"container_cpu_usage_seconds_total",
    "kube_ingress_info",
```
#### Step 2: ... to this - clean :)
```json
"cluster:namespace:pod_cpu:active:kube_pod_container_resource_limits",
"cluster:namespace:pod_cpu:active:kube_pod_container_resource_requests",
"cluster:namespace:pod_memory:active:kube_pod_container_resource_limits",
"cluster:namespace:pod_memory:active:kube_pod_container_resource_requests",
"container_cpu_usage_seconds_total",
"container_memory_cache",
"container_memory_rss",
"container_memory_swap",
"container_memory_working_set_bytes",
"container_network_receive_bytes_total",
"container_network_receive_packets_total",
"container_network_transmit_bytes_total",
"container_network_transmit_packets_total",
"kube_configmap_info",
```

By obtaining this list of metrics, users can identify exactly which metrics the existing dashboards need in order to function correctly. Any unnecessary metrics can then be restricted or deleted, reducing the number of metrics Prometheus scrapes. The result is faster query times, lower memory usage, and a more robust monitoring solution.
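As a rough illustration of the preprocessing, candidate metric names can be pulled out of the collected PromQL strings with a regex, then de-duplicated and sorted. This is a simplification rather than the repo's actual parsing logic, and the stop-word list is deliberately incomplete:

```py
import re

# PromQL identifiers, including recording-rule names with colons.
METRIC_RE = re.compile(r"\b[a-zA-Z_:][a-zA-Z0-9_:]*\b")

# A few PromQL keywords/functions the regex would otherwise pick up.
STOPWORDS = {"sum", "rate", "irate", "avg", "max", "min", "count",
             "by", "on", "and", "or", "unless", "without"}

def extract_metrics(queries):
    """Return sorted, de-duplicated candidate metric names from PromQL strings."""
    names = set()
    for query in queries:
        names.update(t for t in METRIC_RE.findall(query) if t not in STOPWORDS)
    return sorted(names)
```

A real implementation would also need to ignore label names and matcher values inside `{...}`, which this sketch does not attempt.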
## How to run the script?
### 1. Docker
### 2. Python
To run the script, edit `grafana_query_scraper.py` and provide your Grafana URL as well as a Grafana API key (*read further to see how to get one*).
```py
if __name__ == "__main__":
    # Your Grafana URL
    grafana_url = ""
    # API key - read the README for instructions
    api_key = ""
    # File name where the scrape results will be saved
    file_name = 'grafana_queries.json'
```

Install the relevant dependencies and execute the script:
```bash
pip install requests
python3 grafana_query_scraper.py
```

## Getting Grafana API key
To connect to the Grafana API, an API key is required. Below is a guide on how to obtain one.
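Once you have a key, a quick way to confirm it works is to query the dashboard search endpoint directly. The URL and environment variable below are placeholders; substitute your own instance and token:

```bash
# Placeholder URL and env var - substitute your own Grafana instance and token.
curl -H "Authorization: Bearer $GRAFANA_API_KEY" \
     "https://your-grafana.example.com/api/search?type=dash-db"
```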