Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/waived/google-drive-crawler
Proxy-based crawler to expose public (shared) Google Drive links
crawler crawler-python file-crawler google-drive-api shared-folders web-spider
Last synced: about 2 months ago
- Host: GitHub
- URL: https://github.com/waived/google-drive-crawler
- Owner: waived
- Created: 2024-11-21T20:08:33.000Z (2 months ago)
- Default Branch: main
- Last Pushed: 2024-12-04T13:47:49.000Z (about 2 months ago)
- Last Synced: 2024-12-04T14:37:21.239Z (about 2 months ago)
- Topics: crawler, crawler-python, file-crawler, google-drive-api, shared-folders, web-spider
- Language: Python
- Homepage:
- Size: 46.9 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.txt
Awesome Lists containing this project
README
\\\\---====================================---////
 >>>>  Google Drive - Shared Folder Crawler  <<<<
////---====================================---\\\\

Purpose:
The goal of this script is to mimic the URL scheme used by Google Drive
and attempt to generate valid public shared-folder links. From there,
one can browse the exposed content and potentially spot sensitive or
otherwise interesting information.

Because of Google's security measures, this script requires the end
user to supply SOCKS4 proxies. If you need a fresh, checked list of
proxies (SOCKS4 or otherwise), you can grab one from my other project
here: https://github.com/waived/proxy-scraper
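
Below is a minimal sketch (not the project's actual code) of those two
ideas: building candidate URLs that follow Google Drive's shared-folder
URL scheme, and probing them through a SOCKS4 proxy with the requests
library. The folder-ID length/alphabet and the proxy address shown here
are illustrative assumptions, not values taken from the script.

    # Sketch only: candidate URL generation + SOCKS4-proxied probe.
    # The ID alphabet/length and proxy address are assumptions.
    import random
    import string
    from typing import Optional

    import requests  # needs requests[socks] (PySocks) for socks4:// proxies

    ID_ALPHABET = string.ascii_letters + string.digits + "-_"

    def random_folder_url(id_length: int = 33) -> str:
        """Build a candidate URL in Google Drive's shared-folder scheme."""
        folder_id = "1" + "".join(random.choices(ID_ALPHABET, k=id_length - 1))
        return f"https://drive.google.com/drive/folders/{folder_id}"

    def fetch_status(url: str, proxy: str = "socks4://127.0.0.1:1080") -> Optional[int]:
        """Request the URL through a SOCKS4 proxy and return the HTTP status."""
        proxies = {"http": proxy, "https": proxy}
        try:
            return requests.get(url, proxies=proxies, timeout=10).status_code
        except requests.RequestException:
            return None  # unreachable through this proxy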
Report:
If a request yields an HTTP 200 OK response, the URL is reported to the
user. Any other status code (e.g. 504) is not reported, and the URL is
considered non-existent/dead.
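
An illustrative reporting loop matching that rule, reusing the
hypothetical helpers from the sketch above:

    # Only HTTP 200 is alerted; anything else (or a failed request)
    # is silently treated as dead.
    for _ in range(100):
        url = random_folder_url()
        if fetch_status(url) == 200:
            print(f"[+] OK 200 - live shared folder: {url}")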
Bugs:
None known at this time. However, please feel free to report any :)