Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/suddi/fundscraper
Collection of web crawlers to scrape fund data using Scrapy
- Host: GitHub
- URL: https://github.com/suddi/fundscraper
- Owner: suddi
- License: apache-2.0
- Created: 2017-10-04T04:26:56.000Z (about 7 years ago)
- Default Branch: master
- Last Pushed: 2024-04-08T16:08:31.000Z (7 months ago)
- Last Synced: 2024-04-14T22:55:34.680Z (7 months ago)
- Topics: crawler, funds, scraper, scrapy
- Language: Python
- Size: 28.3 KB
- Stars: 1
- Watchers: 3
- Forks: 2
- Open Issues: 2
- Metadata Files:
  - Readme: README.md
  - License: LICENSE
README
# fundscraper
[![CircleCI](https://circleci.com/gh/suddi/fundscraper.svg?style=svg)](https://circleci.com/gh/suddi/fundscraper)
[![Codacy Badge](https://api.codacy.com/project/badge/Grade/d43e89675ba64685ad3635b884bf4957)](https://www.codacy.com/app/Suddi/fundscraper)
[![license](https://img.shields.io/github/license/suddi/fundscraper.svg)](https://github.com/suddi/fundscraper/blob/master/LICENSE)

[Scrapy](https://scrapy.org/) web crawlers to collect fund data
## Installation
If you have `virtualenvwrapper`, set up a virtual environment with the following command:
````
mkvirtualenv fundscraper
````

Install dependencies:

````
pip install -r requirements.txt
````

Install dev dependencies:

````
pip install -r test_requirements.txt
````

## Usage
To run a crawl with a specific spider:
````
scrapy crawl aia.com.hk
````

To calculate returns from historical data:

````
python setup.py compute_returns
````

To compute the best-performing funds:

````
python setup.py compute_performing_funds
````
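The `compute_returns` and `compute_performing_funds` commands above are custom `setup.py` commands whose implementations are not reproduced on this page. As a minimal sketch of the underlying idea only, simple period-over-period returns from a historical price series could be computed like this (the function name and the NAV figures are illustrative, not taken from the repository):

````python
# Hypothetical sketch: derive simple period returns from a series of
# historical fund prices (NAVs). Each return is the fractional change
# from one period to the next.

def compute_returns(prices):
    """Return the list of simple period returns for a price series."""
    return [(curr - prev) / prev for prev, curr in zip(prices, prices[1:])]

# Example with made-up quarterly NAVs for a single fund:
navs = [100.0, 105.0, 102.9, 108.045]
returns = compute_returns(navs)
print([round(r, 4) for r in returns])  # → [0.05, -0.02, 0.05]
````

Ranking funds by a summary of these returns (e.g. cumulative or mean return) is one plausible basis for a `compute_performing_funds`-style command.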