Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/gsmith257-cyber/GraphCrawler
GraphQL automated security testing toolkit
api api-hacking automated-testing cybersecurity graphql graphql-api graphql-security pentesting
- Host: GitHub
- URL: https://github.com/gsmith257-cyber/GraphCrawler
- Owner: gsmith257-cyber
- License: MIT
- Created: 2022-06-17T20:54:16.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2024-02-20T22:15:38.000Z (10 months ago)
- Last Synced: 2024-10-30T08:26:14.313Z (about 1 month ago)
- Topics: api, api-hacking, automated-testing, cybersecurity, graphql, graphql-api, graphql-security, pentesting
- Language: Python
- Homepage:
- Size: 1.4 MB
- Stars: 300
- Watchers: 5
- Forks: 23
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
- Support: support/graphql-path-enum
Awesome Lists containing this project
- awesome-rainmana - gsmith257-cyber/GraphCrawler - GraphQL automated security testing toolkit (Python)
- Awesome-Pentest - GraphCrawler - GraphQL automated security testing toolkit. (Web Tools / GraphQL)
- awesome-graphql-security - GraphCrawler - A GraphQL automated security toolkit. Grab introspection, search for sensitive queries, and then test authorization. (Offensive Security / Exploitation)
- awesome-hacking-lists - gsmith257-cyber/GraphCrawler - GraphQL automated security testing toolkit (Python)
README
# GraphCrawler
![](https://github.com/gsmith257-cyber/GraphCrawler/raw/main/GraphCrawler.PNG)
## THE GraphQL automated testing toolkit
Graph Crawler is the most powerful automated testing toolkit for any GraphQL endpoint.
### Version 1.2 is out!!
NEW: Graph Crawler can now search for endpoints for you using Escape Technology's powerful [Graphinder](https://github.com/Escape-Technologies/graphinder) tool. Just point it at a domain and add the '-e' option; Graphinder will do subdomain enumeration and search popular directories for GraphQL endpoints, and GraphCrawler will then take over and work through each find. If the schema cannot be introspected, you can supply it as a JSON file with the '-s' option. GraphCrawler will check whether mutation is enabled, check for any sensitive queries available (such as users and files), and test any easy queries it finds to see if authentication is required.
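For concreteness, invocations using the '-e' and '-s' options described above could look roughly like this. These are illustrative sketches only: the domain, endpoint, and schema file name are placeholders, and the exact flag syntax should be confirmed against the tool's help output.

```bash
# Illustrative: point GraphCrawler at a domain and let Graphinder discover endpoints first
python graphCrawler.py -u test.com -e

# Illustrative: supply a schema as a JSON file when introspection is disabled on the endpoint
python graphCrawler.py -u https://test.com/graphql/api -s schema.json
```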
If introspection is not enabled on the endpoint, it will check whether it is an Apollo Server and can then run [Clairvoyance](https://github.com/nikitastupin/clairvoyance) to brute-force field suggestions and try to build the schema itself. (See the Clairvoyance project for greater detail on this.)
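For context, checking whether introspection is enabled usually amounts to sending a small introspection query and seeing whether the server answers it. A minimal probe (not GraphCrawler's actual code; the endpoint is a placeholder) looks like:

```bash
# Send a minimal introspection query; a JSON response containing __schema data
# means introspection is enabled, while an error response means it is blocked.
curl -s -X POST https://test.com/graphql/api \
  -H "Content-Type: application/json" \
  -d '{"query": "{ __schema { queryType { name } } }"}'
```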
It will then score the findings from 1 to 10, with 10 being the most critical. If you want to dig deeper into the schema, you can also use [graphql-path-enum](https://gitlab.com/dee-see/graphql-path-enum/) to look for paths to certain types, like user IDs, emails, etc.
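As a rough sketch, running graphql-path-enum against a saved introspection file to hunt for paths to a user-related type might look like the line below. The `-i`/`-t` flags and the `User` type name are assumptions taken from that project's documentation, so confirm them against its README before relying on them.

```bash
# Assumed flags: -i = introspection JSON file, -t = target type to find query paths to
graphql-path-enum -i ./schema.json -t User
```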
I hope this saves you as much time as it has saved me.
## Usage
1. As Baremetal Install
```bash
python graphCrawler.py -u https://test.com/graphql/api -o -a ""

 ██████╗ ██████╗  █████╗ ██████╗ ██╗  ██╗ ██████╗██████╗  █████╗ ██╗    ██╗██╗     ███████╗██████╗
██╔════╝ ██╔══██╗██╔══██╗██╔══██╗██║  ██║██╔════╝██╔══██╗██╔══██╗██║    ██║██║     ██╔════╝██╔══██╗
██║  ███╗██████╔╝███████║██████╔╝███████║██║     ██████╔╝███████║██║ █╗ ██║██║     █████╗  ██████╔╝
██║   ██║██╔══██╗██╔══██║██╔═══╝ ██╔══██║██║     ██╔══██╗██╔══██║██║███╗██║██║     ██╔══╝  ██╔══██╗
╚██████╔╝██║  ██║██║  ██║██║     ██║  ██║╚██████╗██║  ██║██║  ██║╚███╔███╔╝███████╗███████╗██║  ██║
 ╚═════╝ ╚═╝  ╚═╝╚═╝  ╚═╝╚═╝     ╚═╝  ╚═╝ ╚═════╝╚═╝  ╚═╝╚═╝  ╚═╝ ╚══╝╚══╝ ╚══════╝╚══════╝╚═╝  ╚═╝
```
The output option is not required; by default, output is written to schema.json.
2. As Docker
Note: Since GraphCrawler uses some options that take files as input or produce output files, it is advised to keep all of those files in a working directory and mount it at the container path /workspace. All of your output will be saved to that mounted directory.
```bash
docker build -t graphcrawler:1.2 .
docker run -v /path_to_workspace:/workspace -it graphcrawler:1.2 -u https://test.com/graphql/api -o -a ""
```
### Example output:
### Requirements
- Python3
- Docker
- Install all Python dependencies with pip (see the example below)
- Wordlist from [google-10000-english](https://github.com/first20hours/google-10000-english)
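Assuming the repository ships a standard `requirements.txt` (an assumption, not confirmed here), installing the Python dependencies would look like:

```bash
# Install Python dependencies (assumes a requirements.txt in the repository root)
pip3 install -r requirements.txt
```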
### TODO
- Add option for "full report" following the endpoint search where it will run clairvoyance and all other aspects of the toolkit on the endpoints found
- Default to "simple scan" to just find endpoints when this feature is added
- Way Future: help craft queries based on the schema provided