Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/zfael/scrape-it-all
Modular web scraper for Node.js
- Host: GitHub
- URL: https://github.com/zfael/scrape-it-all
- Owner: zfael
- License: MIT
- Created: 2019-12-09T23:24:27.000Z (about 5 years ago)
- Default Branch: master
- Last Pushed: 2022-12-10T12:09:53.000Z (about 2 years ago)
- Last Synced: 2024-12-16T04:36:22.018Z (about 1 month ago)
- Topics: crawler, scraper, scraping, scraping-websites, web-scraping
- Language: TypeScript
- Homepage:
- Size: 1.45 MB
- Stars: 0
- Watchers: 4
- Forks: 1
- Open Issues: 16
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
Awesome Lists containing this project
README
# Scrape-it-all 🕷️🕸️
> Modular web scraper for Node.js

Have you ever wanted to know when something changes on a webpage and thought about building a scraper to keep you posted? Yeah? Then the scrape-it-all project aims to solve that initial setup overhead for you!
Basically, it wraps the whole logic of a given scraper, so you just call a function and get the result you want. For instance:
```ts
import coinMarketCap from '@scrape-it-all/coin-market-cap';
const result = await coinMarketCap.cryptoDetails({ metadata: { crypto: 'bitcoin' } });

// result:
// {
//   "name": " Bitcoin (BTC)",
//   "icon": "https://s2.coinmarketcap.com/static/img/coins/64x64/1.png?_=b4ab82b",
//   "currentValue": "$9,897.98 USD"
// }
```

## How is the repo structured?
The scrape-it-all project is a monorepo built on top of [Lerna](https://github.com/lerna/lerna), so each available scraper can be published and versioned as its own package.
In addition, there may be core functionality used across scrapers, which also makes sense to expose as an individual package; a hypothetical sketch of what a scraper package could look like follows below.
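As a rough illustration, here is a minimal sketch of an individual scraper package under these assumptions: the shared package name `@scrape-it-all/core`, its `fetchHtml` helper, and the parsing details are all made up for illustration and are not the project's actual API.

```ts
// Hypothetical scraper package, e.g. packages/coin-market-cap/src/index.ts.
// `@scrape-it-all/core` and `fetchHtml` are assumed names for shared helpers.
import { fetchHtml } from '@scrape-it-all/core';

export interface CryptoDetails {
  name: string;
  currentValue: string;
}

// Each scraper package wraps its whole scraping logic behind one exported
// function, so consumers just call it and receive the parsed result.
export async function cryptoDetails({ metadata }: { metadata: { crypto: string } }): Promise<CryptoDetails> {
  const html = await fetchHtml(`https://coinmarketcap.com/currencies/${metadata.crypto}/`);

  // Real parsing would be more robust; naive regexes keep this sketch self-contained.
  const name = /class="coin-name"[^>]*>([^<]+)</.exec(html)?.[1] ?? metadata.crypto;
  const currentValue = /class="price"[^>]*>([^<]+)</.exec(html)?.[1] ?? 'unknown';

  return { name, currentValue };
}
```

A consumer would then call this function exactly like the coin-market-cap example above, without worrying about the scraping details inside the package.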
## Wanna help?

Want to report a bug, ask for a feature/scraper (or be the one to implement it), or improve something you feel is worth improving? Alright, we've got you covered! Read up on the [contributing](https://github.com/zfael/scrape-it-all/blob/master/CONTRIBUTING.md) file to stay in sync with our guidelines!