Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/node-js-libs/node.io
- Host: GitHub
- URL: https://github.com/node-js-libs/node.io
- Owner: node-js-libs
- License: mit
- Created: 2010-11-03T08:53:10.000Z (almost 14 years ago)
- Default Branch: master
- Last Pushed: 2015-11-18T03:22:35.000Z (almost 9 years ago)
- Last Synced: 2024-07-10T14:23:12.518Z (2 months ago)
- Language: JavaScript
- Homepage:
- Size: 3.1 MB
- Stars: 1,808
- Watchers: 51
- Forks: 147
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Changelog: HISTORY.md
- License: LICENSE
Awesome Lists containing this project
- awesome-github-repos - node-js-libs/node.io - (JavaScript)
README
**Note: this library is no longer maintained.**
I wrote node.io in 2010, when node.js was still in its infancy and the npm repository didn't have the wide choice of libraries that it does today.
Since it's now quite trivial to write your own scraper, I've decided to stop maintaining the library.
Here's an example using [request](https://github.com/mikeal/request), [cheerio](https://github.com/MatthewMueller/cheerio) and [async](https://github.com/caolan/async).
```javascript
var request = require('request')
  , cheerio = require('cheerio')
  , async = require('async')
  , format = require('util').format;

var reddits = [ 'programming', 'javascript', 'node' ]
  , concurrency = 2;

// Scrape each subreddit, at most `concurrency` requests at a time.
async.eachLimit(reddits, concurrency, function (reddit, next) {
    var url = format('http://reddit.com/r/%s', reddit);
    request(url, function (err, response, body) {
        if (err) return next(err);
        var $ = cheerio.load(body);
        $('a.title').each(function () {
            console.log('%s (%s)', $(this).text(), $(this).attr('href'));
        });
        next();
    });
}, function (err) {
    if (err) throw err;
});
```

Happy scraping.