Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/humphd/github-contrib-stats
Download and count GitHub Issue, Repo, and PR stats from web pages with GitHub URLs included
- Host: GitHub
- URL: https://github.com/humphd/github-contrib-stats
- Owner: humphd
- License: MIT
- Created: 2019-04-12T19:07:16.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2022-06-24T23:24:47.000Z (over 2 years ago)
- Last Synced: 2024-04-11T12:59:24.468Z (7 months ago)
- Language: JavaScript
- Size: 36.1 KB
- Stars: 0
- Watchers: 3
- Forks: 0
- Open Issues: 3
Metadata Files:
- Readme: README.md
- License: LICENSE
README
### GitHub Contrib Stats
This is a set of scripts I use to help me extract GitHub contribution
stats for my open source classes. It takes a set of URLs to web pages
with links to GitHub Issues and/or Pull Requests, then downloads and
extracts data about them.

### Setup
First, install all deps:
```
$ npm install
```

Next, copy the [env.sample](env.sample) file to `.env` and alter it
to add your information. See [the Octokit docs](https://github.com/octokit/rest.js/#authentication) for details on how to get a GitHub token.

You need to include a list of URLs to the web pages that contain
the URLs you want to process. These can be HTML, Markdown, or any
other text format. The page(s) will be downloaded and the URLs extracted.

```
# List of URLs to process, separated by spaces
URL_LIST="https://url1.com/file.html https://url2.com/file.md"
```

### How to Run
With everything installed and the `.env` setup, you can now run the analysis:
```
npm start
```

This will do a number of things:
1. Use the URLs in `URL_LIST` to download all URLs specified
1. Extract all GitHub URLs from the text of each page
1. Build a list of repos, issues, and pull requests in `data/github-data.json` and make requests for each of these items from the GitHub API. All results are cached in `data/`, so you can interrupt the run and retry later.
1. Generate stats for all Issues in `data/issue-data.json`
1. Generate stats for all Pull Requests in `data/pr-data.json`
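
The URL-extraction step above can be sketched roughly as follows. This is a hypothetical illustration, not the project's actual code: `extractGitHubUrls` and its regex are assumptions about how page text might be scanned and each link classified as a repo, issue, or pull request.

```javascript
// Hypothetical sketch of step 2: pull GitHub links out of downloaded
// page text (HTML, Markdown, etc.) and classify each one.
const GITHUB_URL =
  /https:\/\/github\.com\/([\w.-]+)\/([\w.-]+)(?:\/(issues|pull)\/(\d+))?/g;

function extractGitHubUrls(text) {
  const results = [];
  for (const m of text.matchAll(GITHUB_URL)) {
    const [url, owner, repo, kind, number] = m;
    results.push({
      url,
      owner,
      repo,
      // "pull" URLs are PRs, "issues" URLs are issues, anything else is a repo link
      type: kind === "pull" ? "pr" : kind === "issues" ? "issue" : "repo",
      number: number ? Number(number) : null,
    });
  }
  return results;
}

// Example: a page body containing one issue link and one PR link
const page = `
See https://github.com/octokit/rest.js/issues/42 and
https://github.com/humphd/github-contrib-stats/pull/7 for details.
`;
console.log(extractGitHubUrls(page));
```

Classifying the links up front like this is what lets the later steps fetch each item from the matching GitHub API endpoint and cache the results per type.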