Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/arshadkazmi42/github-scanner-local
Locally scan all the repositories of a github organization
- Host: GitHub
- URL: https://github.com/arshadkazmi42/github-scanner-local
- Owner: arshadkazmi42
- License: MIT
- Created: 2021-11-05T13:58:27.000Z (about 3 years ago)
- Default Branch: main
- Last Pushed: 2024-03-08T19:54:16.000Z (11 months ago)
- Last Synced: 2024-10-07T06:41:26.941Z (4 months ago)
- Topics: bounty, bug, bug-bounty, crawler, github, local, no-api, scanner
- Language: Shell
- Homepage:
- Size: 24.4 KB
- Stars: 3
- Watchers: 3
- Forks: 2
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# github-scanner-local
Locally scan all the repositories of a github organization.
## Setup
- Clone the repositories below into the same parent directory as `github-scanner-local`
- [gh-crawl](https://github.com/arshadkazmi42/gh-crawl)
- [bash-scripts](https://github.com/arshadkazmi42/bash-scripts)
- Install [npm-name-cli](https://github.com/sindresorhus/npm-name-cli) globally
- Install [get-dependencies](https://github.com/SharonGrossman/get-dependencies) globally
- Install [pip-name](https://github.com/danishprakash/pip-name)

```
$ npm i npm-name-cli -g
$ npm i get-dependencies -g
$ pip install pip-name
```

## Usage
```
$ sh run.sh {GITHUB_ORGANIZATION_NAME}
$ sh runNpmDeps.sh {FOLDER_NAME}
$ sh runPipDeps.sh {FOLDER_NAME}
```

## Working
#### `run.sh`
Uses `clone.sh`, `grepURL.sh` and `broken.sh`.

#### `clone.sh`
Clones all the repositories of the organization (note: it does not clone `archived` repositories)
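`clone.sh` itself is not shown on this page; the following is a minimal sketch of the cloning step, assuming `target/target.txt` (a file name that appears in the Commands section) lists one clone URL per line. The real script also skips archived repositories, which this sketch does not handle.

```shell
# Hypothetical sketch: clone every repository listed in target/target.txt
# into the target/ directory, one subdirectory per repository.
mkdir -p target
while read -r url; do
  git clone "$url" "target/$(basename "$url" .git)"
done < target/target.txt
```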
#### `grepURL.sh`
Finds all the URLs across the cloned repositories of the organization (it searches the cloned folder) and stores them in the `results` directory
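A minimal sketch of this URL-extraction step, assuming cloned repositories live under `target/` and results are written one file per repository under `results/` (the per-repository file layout is an assumption):

```shell
# Hypothetical sketch: grep each cloned repository for http(s) URLs
# and store the matches in results/<repo-name>.txt.
mkdir -p results
for repo in target/*/; do
  name=$(basename "$repo")
  grep -r -E -o "https?://[^ \"'<>)]+" "$repo" > "results/$name.txt" || true
done
```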
#### `broken.sh`
Checks every URL found by `grepURL.sh` and records its HTTP status code
## Commands
- Search for string in all cloned repositories
```
ls -d -1 target/* | grep -v target.txt | xargs -I {} sh search.sh {} "{{STRING_TO_SEARCH}}"
```
- Search for the string directly with `grep`:
```
grep -r -o "{{STRING_TO_SEARCH}}" target/.
```
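`search.sh` is not shown on this page; a minimal version consistent with the invocation above might look like this (the argument handling is an assumption):

```shell
# Hypothetical search.sh: recursively grep one cloned repository
# for a string, printing file names and line numbers.
# Usage: sh search.sh <repo_dir> <string>
repo="$1"
needle="$2"
grep -r -n "$needle" "$repo"
```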