Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/creekorful/bathyscaphe
Fast, highly configurable, cloud native dark web crawler.
- Host: GitHub
- URL: https://github.com/creekorful/bathyscaphe
- Owner: creekorful
- License: gpl-3.0
- Created: 2020-03-12T09:06:33.000Z (almost 5 years ago)
- Default Branch: main
- Last Pushed: 2023-07-03T06:20:26.000Z (over 1 year ago)
- Last Synced: 2024-10-13T13:31:40.310Z (2 months ago)
- Topics: architecture, crawler, crawling, elasticsearch, golang, hidden-services, kibana, tor, web-crawler
- Language: Go
- Homepage: https://blog.creekorful.com/building-fast-modern-web-crawler/
- Size: 830 KB
- Stars: 92
- Watchers: 6
- Forks: 22
- Open Issues: 4
- Metadata Files:
  - Readme: README.md
  - Changelog: CHANGELOG.md
  - License: LICENSE
Awesome Lists containing this project
README
# Bathyscaphe dark web crawler
![CI](https://github.com/creekorful/bathyscaphe/workflows/CI/badge.svg)
Bathyscaphe is a fast, highly configurable, cloud-native dark web crawler written in Go.
# How to start the crawler
To start the crawler, you just need to execute the following command:
```sh
$ ./scripts/docker/start.sh
```

and wait for all containers to start.
## Notes
- You can start the crawler in detached mode by passing `--detach` to start.sh (see the sketch below).
- Ensure you have at least 3 GB of memory available, as the Elasticsearch stack alone requires 2 GB.
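For example, to run everything in the background and confirm the containers came up (the `docker ps` check is a generic Docker command, not something start.sh provides):

```sh
# Start the full stack in the background
$ ./scripts/docker/start.sh --detach

# Verify that all containers are up and running
$ docker ps
```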
# How to initiate crawling

You can use the RabbitMQ dashboard available at http://localhost:15003 and publish a new JSON object to the **crawlingQueue**. The object should look like this:
```json
{
"url": "https://facebookcorewwwi.onion"
}
```
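If you prefer the command line over the dashboard, the RabbitMQ management API (served on the same port as the dashboard) can publish the message for you. This is a sketch assuming the broker still uses RabbitMQ's default guest/guest credentials:

```sh
# Publish a crawl request to crawlingQueue via the default exchange.
# guest/guest are RabbitMQ's defaults; adjust if your setup differs.
$ curl -u guest:guest \
    -H 'Content-Type: application/json' \
    -X POST 'http://localhost:15003/api/exchanges/%2F/amq.default/publish' \
    -d '{
      "properties": {},
      "routing_key": "crawlingQueue",
      "payload": "{\"url\": \"https://facebookcorewwwi.onion\"}",
      "payload_encoding": "string"
    }'
```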
## How to speed up crawling

If you want to speed up crawling, you can scale up the crawler component to increase performance.
This may be done by issuing the following command after the crawler is started:

```sh
$ ./scripts/docker/start.sh -d --scale crawler=5
```

This will set the number of crawler instances to 5.
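To check that the extra instances are actually running, you can list the matching containers (assuming Docker Compose includes the service name in the container names, which is its default behaviour):

```sh
# List the running crawler containers
$ docker ps --filter "name=crawler"
```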
# How to view results
You can use the Kibana dashboard available at http://localhost:15004. You will need to create an index pattern named 'resources', and when it asks for the time field, choose 'time'.
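The index pattern can also be created from the command line through Kibana's saved objects API; this sketch assumes the bundled Kibana is a 7.x release exposing the standard API without authentication:

```sh
# Create the 'resources' index pattern with 'time' as the time field
$ curl -X POST 'http://localhost:15004/api/saved_objects/index-pattern' \
    -H 'kbn-xsrf: true' \
    -H 'Content-Type: application/json' \
    -d '{"attributes": {"title": "resources", "timeFieldName": "time"}}'
```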
# How to hack the crawler

If you've made a change to one of the crawler components and wish to use the updated version when running start.sh, you just need to issue the following command:

```sh
$ goreleaser --snapshot --skip-publish --rm-dist
```

This will rebuild all the images using your local changes. After that, just run start.sh again to get the updated version running.

# Architecture
The architecture details are available [here](docs/architecture.png).