Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/k0nxt3d/web-scrapers
Web Scraping Scripts in PHP and Bash
- Host: GitHub
- URL: https://github.com/k0nxt3d/web-scrapers
- Owner: K0NXT3D
- License: gpl-3.0
- Created: 2021-12-06T05:34:13.000Z (almost 3 years ago)
- Default Branch: main
- Last Pushed: 2021-12-31T07:04:24.000Z (almost 3 years ago)
- Last Synced: 2023-12-24T11:27:27.642Z (11 months ago)
- Topics: bash, bot, clone, cloning, crawler, curl, curlphp, download, mirroring, scraping, scraping-websites, seo, seo-optimization, shell-script, spider, wget
- Language: Shell
- Homepage:
- Size: 164 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# web-scrapers
Web Scraping Scripts in PHP and Bash

TerrorBot - Paparazzi - BashKat
These bots are all nice and simple, and easy on processor and server load. They work great with proxychains, so if you're still using that, you're set. The code for each is really the same, with general variations to the bot itself and to how I use them on my end.
Using cron is the best way to run them, if you're able to route your traffic through proxychains. I only suggest that because these bots can get your IP blacklisted.
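A minimal sketch of a cron entry along those lines, assuming the bot folder is `~/bots/bashkat` and `proxychains4` is installed (both names are assumptions, not from the repo):

```shell
# Hypothetical crontab entry (add via `crontab -e`); runs every 6 hours.
# proxychains4 forces the bot's traffic through the proxies listed in
# /etc/proxychains4.conf, so the target site never sees your real IP.
0 */6 * * * cd "$HOME/bots/bashkat" && proxychains4 ./bot >> bot.log 2>&1
```

Appending stdout and stderr to a log file matters under cron, since there is no terminal to print to.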
Other options include sleeping between runs (e.g. `sleep 6h 6m 5s`), BUT it helps to start a completely
new instance of the bot and to refresh your IP and browser each time. I'll be working on the domain spoofing.
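The sleep-and-restart pattern can be sketched as a small helper (hypothetical code, not from the repo; the count argument and `BOT_PAUSE` variable are my additions to keep the sketch bounded, and GNU `sleep` takes each duration suffix as a separate argument):

```shell
#!/bin/sh
# Hypothetical relaunch loop: run CMD as a fresh process, pause, repeat.
# Usage: relaunch <count> <cmd> [args...]
relaunch() {
    count=$1; shift
    i=0
    while [ "$i" -lt "$count" ]; do
        "$@"                      # a brand-new bot process each pass
        sleep "${BOT_PAUSE:-6h}"  # pause between runs; default 6h
        i=$((i + 1))
    done
}

# Example: relaunch 4 proxychains4 ./bot
```

Starting a fresh process each pass (rather than one long-running bot) is what lets you rotate the proxy and browser identity between runs.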
It's probably right in front of me. Normally I don't even keep the original filenames; each script sits in its own folder as `bot`, so I `chmod +x bot` and run `./bot`. I have the multi-bot launcher too, of course; I'm sure it's around here somewhere. For example, from the `BashKat/` directory, run `./bot`.
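The launcher mentioned above isn't shown in this README, but the one-folder-per-bot layout suggests something like this sketch (hypothetical code; the folder names are taken from the bot names, not from the repo):

```shell
#!/bin/sh
# Hypothetical multi-bot launcher: start every folder's ./bot in parallel.
launch_all() {
    for dir in "$@"; do
        # Only launch folders that actually contain an executable "bot".
        if [ -x "$dir/bot" ]; then
            ( cd "$dir" && ./bot ) &   # each bot runs in its own subshell
        fi
    done
    wait   # block until every launched bot has finished
}

# Example: launch_all BashKat Paparazzi TerrorBot
```

Backgrounding each subshell with `&` and then calling `wait` keeps the bots independent while still letting the launcher exit only after all of them are done.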
Pretty simple. Have fun making it your own.