Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/poga/hyperfeed
decentralized rss publishing
Last synced: about 1 month ago
- Host: GitHub
- URL: https://github.com/poga/hyperfeed
- Owner: poga
- License: mit
- Created: 2016-09-23T17:53:41.000Z (about 8 years ago)
- Default Branch: master
- Last Pushed: 2017-04-25T07:22:30.000Z (over 7 years ago)
- Last Synced: 2024-09-16T01:04:10.844Z (3 months ago)
- Topics: dat, decentralization, hyperdrive, p2p, publishing, rss
- Language: JavaScript
- Homepage:
- Size: 72.3 KB
- Stars: 69
- Watchers: 5
- Forks: 3
- Open Issues: 0
- Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-peer-to-peer - hyperfeed
- awesome-dat - hyperfeed - publish decentralized rss, atom or rdf feeds, based on `hyperdrive` and `feed` (Outdated / Other Related Dat Project Modules)
- awesome-starred - poga/hyperfeed - decentralized rss publishing (publishing)
README
# Hyperfeed
[![NPM Version](https://img.shields.io/npm/v/hyperfeed.svg)](https://www.npmjs.com/package/hyperfeed) [![JavaScript Style Guide](https://img.shields.io/badge/code%20style-standard-brightgreen.svg)](http://standardjs.com/)
Hyperfeed is a self-archiving P2P live feed: it converts any RSS/Atom/RDF feed into a P2P network that publishes live updates.
* **Self-archiving**: Feed items and their linked pages are archived within hyperfeed.
* **Decentralized**: Feed contents can still be distributed between readers even if the original host is down.
* **Live**: No need to poll the original feed. Updates will be pushed to you.

## Install

```
npm install hyperfeed
```

## Synopsis
Publish your RSS feed through hyperfeed:
```js
const request = require('request')
const hyperfeed = require('hyperfeed')
const hyperdrive = require('hyperdrive')
const swarm = require('hyperdiscovery')

const url = 'https://medium.com/feed/google-developers'
var archive = hyperdrive('./feed')
var feed = hyperfeed(archive)
feed.ready(() => {
  swarm(archive) // announce the archive on the p2p network
  console.log(feed.key.toString('hex')) // share this key with readers
  feed.update(request(url), (err) => {
    console.log('feed imported')
  })
})
```
Now you can replicate the hyperfeed through a p2p network:
```js
const hyperfeed = require('hyperfeed')
const swarm = require('hyperdiscovery')
const hyperdrive = require('hyperdrive')

// pass the publisher's feed key (printed above) as the second argument
var archive = hyperdrive('./anotherFeed', '')
var feed = hyperfeed(archive)
swarm(archive) // load the feed from the p2p network
feed.list((err, entries) => {
  console.log(entries) // all entries in the feed (including history entries)
})
```

## API
#### `var feed = hyperfeed(archive, [opts])`
Create a new Hyperfeed instance. `opts` includes:
```javascript
{
  scrapLink: true // set to false to stop archiving the linked page for each feed item
}
```

#### `feed.key`
The public key identifying the feed.
#### `feed.discoveryKey`
A key derived from the public key that can be used to discover other peers sharing this feed.
#### `feed.meta`
The metadata of the feed.
#### `feed.ready(cb)`
Wait until the feed is fully ready and all of its properties have been populated.
#### `feed.update(feedStream, cb(err, feed))`
Import an RSS feed into `feed`. Accepts a readable stream of the feed.
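Since `feed.update` takes any readable stream, the source does not have to be an HTTP request. A minimal sketch, assuming a `feed` instance from the synopsis and a local `feed.xml` file (both the instance and the file path are illustrative):

```js
const fs = require('fs')

// import a locally stored RSS/Atom file into the hyperfeed
feed.update(fs.createReadStream('./feed.xml'), (err) => {
  if (err) throw err
  console.log('local feed imported')
})
```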
#### `feed.setMeta(metadataObject, cb(err))`
Set the feed's metadata.
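A minimal sketch, assuming the metadata object uses the same fields as the `feed` package (`title`, `description`, `link`); the exact field names and values below are assumptions, not part of hyperfeed's documented API:

```js
// metadata fields are assumed to follow the `feed` package's feed options
feed.setMeta({
  title: 'Google Developers (mirror)',
  description: 'A hyperfeed mirror of the original feed',
  link: 'https://medium.com/feed/google-developers'
}, (err) => {
  if (err) throw err
  console.log('metadata updated')
})
```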
#### `feed.list(cb(err, entries))`
List the archived items in the feed.
#### `feed.save(item, [scrappedData], cb(err))`
Save a new feed item. Check [https://github.com/jpmonette/feed](https://github.com/jpmonette/feed) for item detail.
If you already have the scraped data for the given item, pass it as `scrappedData` to avoid a redundant request.
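A minimal sketch of saving a single item, assuming the item fields follow the `feed` package's item format and that the optional `scrappedData` argument can simply be omitted so hyperfeed fetches the linked page itself; the field values and URL are illustrative:

```js
// item fields follow the `feed` package's item format; scrappedData is omitted,
// so hyperfeed is left to fetch the linked page on its own
feed.save({
  title: 'Hello Hyperfeed',
  link: 'https://example.com/hello-hyperfeed',
  date: new Date(),
  description: 'A manually added entry'
}, (err) => {
  if (err) throw err
  console.log('item saved')
})
```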
#### `feed.export(count, cb(err, rss))`
Export an RSS 2.0 feed containing the latest `count` items.
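For example, the exported XML can be written to a static file and served over plain HTTP; the item count and file path below are illustrative:

```js
const fs = require('fs')

// export the 20 most recent items as RSS 2.0 and write them to a file
feed.export(20, (err, rss) => {
  if (err) throw err
  fs.writeFileSync('./feed-export.xml', rss)
})
```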
## License
The MIT License