Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
JSON representation
Node-powered scraper that iterates through all the internal links of the specified URL. It works on CSR pages (React, Angular) with dynamic URLs.
- Host: GitHub
- URL: https://github.com/jvidalv/super-simple-sitemap-generator
- Owner: jvidalv
- Created: 2020-02-06T20:06:08.000Z (almost 5 years ago)
- Default Branch: master
- Last Pushed: 2024-06-18T00:56:44.000Z (7 months ago)
- Last Synced: 2024-09-19T13:49:27.252Z (4 months ago)
- Topics: csr, dynamic, google-search-console, node, sitemap, xml
- Language: JavaScript
- Homepage:
- Size: 59.6 KB
- Stars: 16
- Watchers: 2
- Forks: 2
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
README
[![License](http://img.shields.io/npm/l/super-simple-sitemap-generator.svg?style=flat-square)](http://opensource.org/licenses/MIT)
[![NPM Version](http://img.shields.io/npm/v/super-simple-sitemap-generator.svg?style=flat-square)](https://npmjs.com/package/super-simple-sitemap-generator)
[![NPM Downloads](https://img.shields.io/npm/dm/super-simple-sitemap-generator.svg?style=flat-square)](https://npmjs.com/package/super-simple-sitemap-generator)

A Node.js-powered scraper 🔥 that iterates through all the internal links of the specified URL.
It works on CSR pages (React, Angular) with dynamic URLs.
Once it is done, it generates a ``sitemap.xml`` file with all the URLs found, ready to be uploaded to Google Search Console.
#### Usage:
``` bash
$ sitemap https://vvlog.dev
```

#### Params:
Parameter | Type | Default | Description
--- | --- | --- | ---
--wait | integer | 1500 | Time in milliseconds to wait (so fetches are completed) before starting to parse the page.
--limit | integer | 999999 | Maximum number of URLs to parse before stopping the scraper.

#### Todo:
* [x] Make it a NPM package.
* [ ] Make wait time dynamic based on the fetches pending inside the URL.
* [ ] New param that lets you specify how deep to crawl inside the URL.
* [ ] Integrate it as part of build process of a create-react-app.
* [ ] Clean old code.
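For reference, the two params from the table above might be combined in a single run like this. The flag names come from the table; the exact invocation syntax is an assumption based on the usage example.

```shell
# Hypothetical: wait 3000 ms for client-side fetches to settle,
# and stop after 500 URLs have been parsed.
$ sitemap https://vvlog.dev --wait 3000 --limit 500
```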