{"id":24946394,"url":"https://github.com/machinezone/spark-metrics","last_synced_at":"2025-03-28T19:14:33.237Z","repository":{"id":76807783,"uuid":"139510751","full_name":"machinezone/Spark-Metrics","owner":"machinezone","description":null,"archived":false,"fork":false,"pushed_at":"2018-07-03T17:29:30.000Z","size":9,"stargazers_count":1,"open_issues_count":0,"forks_count":2,"subscribers_count":3,"default_branch":"master","last_synced_at":"2025-02-02T20:28:18.797Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"Scala","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"bsd-3-clause","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/machinezone.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2018-07-03T01:03:31.000Z","updated_at":"2018-07-09T20:22:32.000Z","dependencies_parsed_at":"2023-07-04T22:16:22.939Z","dependency_job_id":null,"html_url":"https://github.com/machinezone/Spark-Metrics","commit_stats":null,"previous_names":[],"tags_count":1,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/machinezone%2FSpark-Metrics","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/machinezone%2FSpark-Metrics/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/machinezone%2FSpark-Metrics/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/machinezone%2FSpark-Metrics/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/machinezone","download_url":"https://codeload.github.com/machinezone/Spark-Metrics/
tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":246085638,"owners_count":20721212,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2025-02-02T20:24:20.683Z","updated_at":"2025-03-28T19:14:33.231Z","avatar_url":"https://github.com/machinezone.png","language":"Scala","readme":"\n# Metrics\n\nA lightweight custom metrics library that exposes Apache Spark's internal metric registry.\n\n# Motivation\n\nThis library is a lightweight way to inject custom metrics into your Apache Spark application by leveraging Spark's\ninternal metric registry.\n\nDo use this library if you want to send metrics to a remote system (e.g. Graphite).\n\nDon't use this library if you want to see all of the metrics aggregated on the driver.\n\n# Usage\n\n```scala\nimport org.apache.spark.metrics.mz.CustomMetrics\n\nval metrics = new CustomMetrics(\"my-metrics\")\nval meter = metrics.meter(\"met\")\nmeter.mark(10)\n```\n\nYou will see metrics populated under the key `APP_ID.EXECUTOR_ID.my-metrics.met`.\n\n# Development\n\nClone this repository and run `mvn clean test`.\n\nTo build for a custom version of Spark/Scala, run\n`mvn clean package \\\n-Dscala.major.version=\u003cSCALA_MAJOR\u003e \\\n-Dscala.minor.version=\u003cSCALA_MINOR\u003e \\\n-Dspark.version=\u003cSPARK_VERSION\u003e`\n\ne.g. 
\n```bash\nmvn clean package \\\n-Dscala.major.version=2.10 \\\n-Dscala.minor.version=2.10.5 \\\n-Dspark.version=1.5.2\n```\n\n## Build profiles\n\nAlternatively, one can build against a limited number of pre-defined profiles.\nSee the [pom](pom.xml) for a list of the profiles.\n\nExample builds with profiles:\n\n`mvn clean package -Pspark_2.3,scala_2.11`\n\n`mvn clean package -Pspark_2.0,scala_2.10`\n\n`mvn clean package -Pspark_1.6,scala_2.11`\n\n# Support\n\nHere is a handy table of supported build version combinations:\n\n| Apache Spark | Scala |\n|:------------:|:-----:|\n| 1.5.x        | 2.10  |\n| 1.5.x        | 2.11  |\n| 1.6.x        | 2.10  |\n| 1.6.x        | 2.11  |\n| 2.0.x        | 2.10  |\n| 2.0.x        | 2.11  |\n| 2.1.x        | 2.10  |\n| 2.1.x        | 2.11  |\n| 2.2.x        | 2.10  |\n| 2.2.x        | 2.11  |\n| 2.3.x        | 2.11  |\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmachinezone%2Fspark-metrics","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fmachinezone%2Fspark-metrics","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmachinezone%2Fspark-metrics/lists"}