Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/jxltom/scrapymon
Simple Web UI for Scrapy spider management via Scrapyd
scrapy scrapy-dashboard scrapy-ui scrapyd scrapyd-dashboard scrapyd-ui
- Host: GitHub
- URL: https://github.com/jxltom/scrapymon
- Owner: jxltom
- License: mit
- Created: 2017-02-13T11:41:20.000Z (almost 8 years ago)
- Default Branch: master
- Last Pushed: 2018-06-25T14:34:33.000Z (over 6 years ago)
- Last Synced: 2024-07-15T11:52:06.943Z (4 months ago)
- Topics: scrapy, scrapy-dashboard, scrapy-ui, scrapyd, scrapyd-dashboard, scrapyd-ui
- Language: Python
- Homepage: https://scrapymon.herokuapp.com/
- Size: 2.61 MB
- Stars: 49
- Watchers: 4
- Forks: 11
- Open Issues: 6
Metadata Files:
- Readme: README.md
- Changelog: CHANGES
- License: LICENSE
Awesome Lists containing this project
README
# scrapymon
A simple management UI for Scrapyd. A demo is available at [http://scrapymon.demo.jxltom.me/](http://scrapymon.demo.jxltom.me/) with ```admin``` as both the username and password. Note that the demo resets every 40 minutes and may take a moment to spin up if it has not been accessed for a while.
## Features
- Show all projects from a Scrapyd server
- Show all versions of each project
- Show all spiders in each project
- Show all pending, running and finished jobs from a Scrapyd server
- Show logs of each job
- Schedule spider runs
- Cancel pending or running jobs
- Delete a project or a specific version
- HTTP basic access authentication supported
- Served by [Gevent](https://github.com/gevent/gevent) for production use
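Each of these features maps onto the standard [Scrapyd JSON API](https://scrapyd.readthedocs.io/). A minimal sketch of the underlying calls, assuming a Scrapyd instance on the default ```http://127.0.0.1:6800``` and placeholder ```myproject```/```myspider``` names:

```sh
# Sketch of the Scrapyd endpoints behind the features above
# ("myproject", "myspider" and <job_id> are placeholders)
curl http://127.0.0.1:6800/listprojects.json                                       # all projects
curl "http://127.0.0.1:6800/listversions.json?project=myproject"                   # versions of a project
curl "http://127.0.0.1:6800/listspiders.json?project=myproject"                    # spiders in a project
curl "http://127.0.0.1:6800/listjobs.json?project=myproject"                       # pending/running/finished jobs
curl http://127.0.0.1:6800/schedule.json -d project=myproject -d spider=myspider   # schedule a spider run
curl http://127.0.0.1:6800/cancel.json -d project=myproject -d job=<job_id>        # cancel a job
curl http://127.0.0.1:6800/delproject.json -d project=myproject                    # delete a project
```

Job logs themselves are served by Scrapyd under its ```/logs/``` path.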
## Screenshots

![projects_dash](https://github.com/jxltom/scrapymon/blob/master/docs/screenshots/projects_dash.png)
![jobs_dash](https://github.com/jxltom/scrapymon/blob/master/docs/screenshots/jobs_dash.png)
![logs_dash](https://github.com/jxltom/scrapymon/blob/master/docs/screenshots/logs_dash.png)

## Getting Started
- Install via ```pip install scrapymon```.
- Run via ```scrapymon [--host=] [--port=] [--server=] [--auth=]```.
- Default ```--host``` is ```0.0.0.0```
- Default ```--port``` is ```5000```
- Default ```--server``` is ```http://127.0.0.1:6800```
- Default ```--auth``` is ```admin:admin```
- Alternatively, run ```scrapymon``` with the environment variables ```$HOST```, ```$PORT```, ```$SCRAPYD_SERVER``` and ```$BASIC_AUTH``` set (see the example below).
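For example, the two invocations below are equivalent. This is a minimal sketch that spells out the documented defaults; point ```--server```/```$SCRAPYD_SERVER``` at your own Scrapyd instance, and note that ```$BASIC_AUTH``` is assumed here to take the same ```user:password``` form as ```--auth```:

```sh
pip install scrapymon

# Run with explicit flags (these values are the documented defaults)
scrapymon --host=0.0.0.0 --port=5000 --server=http://127.0.0.1:6800 --auth=admin:admin

# Or configure the same settings via environment variables
export HOST=0.0.0.0 PORT=5000 SCRAPYD_SERVER=http://127.0.0.1:6800 BASIC_AUTH=admin:admin
scrapymon
```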
## TODO
- Support scheduling a spider run with arguments.
- Highlighted and searchable logs with categories
- Log auto-refresh and pagination
- Create a project via ```addversion.json``` (see the example after this list).
- Time Localization
- Add a Dockerfile
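For reference, ```addversion.json``` is the standard Scrapyd endpoint for uploading a packaged project; a sketch of the call such a feature would wrap, with placeholder project, version and egg names:

```sh
# Upload a new project version to Scrapyd via addversion.json
# ("myproject", "r1" and myproject.egg are placeholders)
curl http://127.0.0.1:6800/addversion.json \
  -F project=myproject -F version=r1 -F egg=@myproject.egg
```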
## Contributing
Contributions are welcome!