https://github.com/reactual/hacker-news-favorites-api
A simple script that will download the favorites for the provided hacker news user id
- Host: GitHub
- URL: https://github.com/reactual/hacker-news-favorites-api
- Owner: reactual
- License: MIT
- Created: 2018-07-01T20:56:16.000Z (almost 7 years ago)
- Default Branch: master
- Last Pushed: 2023-01-24T08:53:19.000Z (over 2 years ago)
- Last Synced: 2024-11-14T17:51:03.504Z (6 months ago)
- Language: JavaScript
- Size: 1.72 MB
- Stars: 68
- Watchers: 3
- Forks: 7
- Open Issues: 7
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# hacker-news-favorites-api
A simple script that scrapes the favorites for a provided Hacker News user id.
[Before](https://news.ycombinator.com/favorites?id=sbr464) ⇨ [After](https://hnfavs.reactual.autocode.gg/?id=sbr464&limit=1) 👍
## Why?
The official Hacker News API from Y Combinator doesn't include the favorites endpoint. I use favorites like bookmarks and wanted to sync them to another app.
## Usage
You can run it yourself, or use the API endpoint hosted at the URL below. Be sure to pass the query params described below.
##### Base Endpoint:
`https://hnfavs.reactual.autocode.gg/`
##### Example URL:
This example would download one page of 30 favorites for the user `sbr464`:
`https://hnfavs.reactual.autocode.gg/?id=sbr464&limit=1`
---
Accepts three query params; only `id` is required. A request sketch follows the parameter list.
#### id
_required_ | `string` | default='' | The Hacker News username.
#### limit
_optional_ | `number` | default=1 | Max # of pages to download (30/page).
#### offset
_optional_ | `number` | default=1 (none) | Pagination offset, the page to begin on.
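For reference, here is a minimal Node.js sketch (Node 18+, built-in `fetch`) that builds a request against the hosted endpoint. The base URL and parameter names come from this README; the helper name and error handling are just illustrative.

```js
// Minimal sketch: call the hosted endpoint with the three query params.
// fetchFavorites is an illustrative helper name, not part of this project.
const base = 'https://hnfavs.reactual.autocode.gg/'

async function fetchFavorites (id, limit = 1, offset = 1) {
  const url = new URL(base)
  url.searchParams.set('id', id)         // required: HN username
  url.searchParams.set('limit', limit)   // optional: max pages (30 favorites/page)
  url.searchParams.set('offset', offset) // optional: page to begin on

  const res = await fetch(url)
  if (!res.ok) throw new Error(`Request failed with status ${res.status}`)
  return res.json()
}

fetchFavorites('sbr464', 1).then(console.log)
```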
---
## Data Format
The API will return a JSON array of objects containing the favorites, as below. You can use the article id to obtain more info via the [Hacker News API](https://github.com/HackerNews/API); a lookup sketch follows the example. There is more info (comments link, score) available to scrape from favorites, but the HN API is more reliable.
```js
[
  {
    // HN article id
    id: '17437185',
    // Article's direct url (not comments)
    link: 'https://github.com/serhii-londar/open-source-mac-os-apps',
    // Title of the article on HN
    title: 'Awesome macOS open source applications'
  }
]
```
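As a quick illustration of that second step, here is a sketch that looks an item up in the official Hacker News API. The Firebase endpoint is the one documented in that API's README; the printed fields are just examples of what it returns.

```js
// Sketch: enrich a favorite using the official Hacker News API
// (https://github.com/HackerNews/API).
async function getItem (id) {
  const res = await fetch(`https://hacker-news.firebaseio.com/v0/item/${id}.json`)
  return res.json() // includes fields such as title, url, score, descendants
}

getItem('17437185').then(item => {
  console.log(item.title, item.score, item.descendants)
})
```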
## Notes
Currently, the request timeout is manually set to just under 3 minutes, but this could change. If you are downloading hundreds or thousands of favorites and hit a timeout, provide an offset and make multiple requests, adjusting the limit/offset and retrying; a paging sketch follows. We are using [autocode](https://autocode.com/) to host the API (stdlib.com rebranded to autocode.com).
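Here is a hypothetical paging sketch using the `limit`/`offset` params described above. It assumes the endpoint returns an empty array once you page past the last favorite, which this README does not confirm.

```js
// Hypothetical sketch: download a large favorites list one page at a time
// to stay under the ~3 minute timeout.
const base = 'https://hnfavs.reactual.autocode.gg/'

async function fetchAllFavorites (id, maxPages = 20) {
  const all = []
  for (let page = 1; page <= maxPages; page++) {
    const res = await fetch(`${base}?id=${encodeURIComponent(id)}&limit=1&offset=${page}`)
    if (!res.ok) throw new Error(`Request failed with status ${res.status}`)
    const batch = await res.json()
    if (!Array.isArray(batch) || batch.length === 0) break // assumed end of favorites
    all.push(...batch)
  }
  return all
}

fetchAllFavorites('sbr464').then(favs => console.log(`${favs.length} favorites`))
```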
## More Info
Additional docs and a live testing tool are available at [autocode](https://autocode.com/lib/reactual/hnfavs/).
## Installation via npm
Recently published as `hnfavs` on npm; only basic testing has been done so far.
`x-ray` is a required peer dependency, so you'll need to add it to your project.
```sh
# npm
npm install --save hnfavs x-ray

# yarn
yarn add hnfavs x-ray
```
```js
// Basic usage
const hnfavs = require('hnfavs')

hnfavs('sbr464').then(console.log)
```
## Local Installation
It's a Node.js app that uses [x-ray](https://github.com/matthewmueller/x-ray) for the web scraping. See `src/main.test.js` for example usage locally; a rough scraping sketch follows the commands below.
```sh
# Clone repo
$ git clone git@github.com:reactual/hacker-news-favorites-api.git
$ cd hacker-news-favorites-api
# Install with yarn or npm
$ yarn

# Run tests
$ yarn test
```
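For a sense of how the scraping works, here is a rough x-ray sketch. It is not the repo's actual code: the CSS selectors for the favorites page (`tr.athing`, `.titleline > a`, `a.morelink`) are assumptions about the current HN markup and may need adjusting.

```js
// Rough x-ray sketch (not this repo's source) for scraping HN favorites.
const Xray = require('x-ray')
const x = Xray()

const scrape = x('https://news.ycombinator.com/favorites?id=sbr464', 'tr.athing', [{
  id: '@id',                  // assumed: the row id is the HN item id
  title: '.titleline > a',    // assumed selector for the article title
  link: '.titleline > a@href' // assumed selector for the article URL
}])
  .paginate('a.morelink@href') // follow the "More" link to the next page
  .limit(2)                    // stop after 2 pages

scrape((err, favorites) => {
  if (err) throw err
  console.log(favorites)
})
```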