Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/amandeepmittal/robotize
Generates a robots.txt
javascript nodejs npm npm-package robots robots-generator robots-txt
Last synced: about 1 month ago
JSON representation
Generates a robots.txt
- Host: GitHub
- URL: https://github.com/amandeepmittal/robotize
- Owner: amandeepmittal
- Created: 2016-03-09T18:39:21.000Z (almost 9 years ago)
- Default Branch: master
- Last Pushed: 2019-11-01T13:49:48.000Z (about 5 years ago)
- Last Synced: 2024-05-02T20:59:38.929Z (8 months ago)
- Topics: javascript, nodejs, npm, npm-package, robots, robots-generator, robots-txt
- Language: JavaScript
- Homepage: https://www.npmjs.com/package/robotize
- Size: 13.7 KB
- Stars: 4
- Watchers: 4
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: Readme.md
- Changelog: History.md
Awesome Lists containing this project
README
# robotize
[![npm version][version-badge]][version-url]
[![build status][build-badge]][build-url]
[![downloads][downloads-badge]][downloads-url]

> Generates a robots.txt
This module generates a robots.txt. The generated robots.txt conforms to the
[standards set by Google](https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt).
Use it to programmatically generate a robots.txt file for your site.

## Installation
```
$ npm install robotize
```

## Example
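Before the library example below, here is a self-contained sketch of the kind of output robotize produces, hand-rolled from the option names in this README (this is an illustration, not the library's actual implementation; the directive order is assumed):

```javascript
// Hand-rolled approximation of a robots.txt generator.
// Field names (useragent, allow, disallow, sitemap) follow this README;
// the library's internals may differ.
function buildRobotsTxt({ useragent = '*', allow = [], disallow = [], sitemap } = {}) {
  if (!allow.length && !disallow.length && !sitemap) {
    throw new Error('Pass at least one of allow, disallow or sitemap');
  }
  const lines = [`User-agent: ${useragent}`];
  for (const url of allow) lines.push(`Allow: ${url}`);
  for (const url of disallow) lines.push(`Disallow: ${url}`);
  if (sitemap) lines.push(`Sitemap: ${sitemap}`);
  return lines.join('\n');
}

console.log(buildRobotsTxt({
  useragent: 'googlebot',
  allow: ['index.html'],
  disallow: ['404.html'],
  sitemap: 'https://www.site.com/sitemap.xml'
}));
```

The library itself is used through a callback, as shown next.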
```javascript
const robotize = require('robotize');

const opts = {
  useragent: 'googlebot',
  allow: ['index.html', 'about.html'],
  disallow: ['404.html'],
  sitemap: 'https://www.site.com/sitemap.xml'
};

robotize(opts, (err, robots) => {
  if (err) {
    throw err;
  } else {
    console.log(robots);
  }
});
```

Will log:
```
User-agent: googlebot
Allow: index.html
Allow: about.html
Disallow: 404.html
Sitemap: https://www.site.com/sitemap.xml
```

## Options
Robotize accepts an object with options. The options are:
- `useragent`: the user agent - String, default: `*`
- `allow`: an array of the URL(s) to allow - Array of Strings
- `disallow`: an array of the URL(s) to disallow - Array of Strings
- `sitemap`: the sitemap URL - String

Robotize expects at least one of the last three options, so either `allow`, `disallow`, or `sitemap` must be passed.

## Credits
Forked from
[robots-generator](https://github.com/haydenbleasel/robots-generator).

## License
MIT
[build-badge]: https://travis-ci.org/superwolff/robotize.svg
[build-url]: https://travis-ci.org/superwolff/robotize
[downloads-badge]: https://img.shields.io/npm/dm/robotize.svg
[downloads-url]: https://www.npmjs.com/package/robotize
[version-badge]: https://img.shields.io/npm/v/robotize.svg
[version-url]: https://www.npmjs.com/package/robotize