
## Webpack Robots.txt

A webpack plugin for generating robots.txt files.

### Why is this plugin useful?

When you have multiple environments such as dev, QA, and production, you want to ensure that your non-production environments
are not exposed to search engines such as Google.

This plugin lets you generate a robots.txt file from your environment-specific webpack configuration.

### Install the plugin

`npm install --save-dev @tanepiper/robots-webpack-plugin`

### How do I use this plugin?

In your webpack config:

```js
const RobotsPlugin = require('@tanepiper/robots-webpack-plugin');

const webpackConfig = {
  entry: 'src/index.js',
  output: {
    path: '/public',
    filename: 'index.js'
  },
  plugins: [new RobotsPlugin()]
};
```
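Since the point of the plugin is per-environment output, one way to wire it up is to switch the options on `NODE_ENV`. This is just a sketch of one possible setup, using the `userAgents` option described later in this README:

```js
const RobotsPlugin = require('@tanepiper/robots-webpack-plugin');

// Sketch: block all crawlers outside production (exact setup is up to you).
const isProduction = process.env.NODE_ENV === 'production';

const webpackConfig = {
  entry: 'src/index.js',
  output: {
    path: '/public',
    filename: 'index.js'
  },
  plugins: [
    new RobotsPlugin({
      // In dev/QA, disallow everything; in production, allow everything.
      userAgents: [{
        name: '*',
        disallow: isProduction ? [] : ['/'],
        allow: isProduction ? ['/'] : []
      }]
    })
  ]
};
```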

The default output is an `Allow` directive for all user agents. To configure the output, pass an array to the `userAgents` option.
Each entry is an object with a `name: string`, a `disallow: array`, and an `allow: array`, in any combination.
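To make the shape of these objects concrete, here is an illustrative helper in plain JavaScript (not the plugin's actual internals) that renders a `userAgents` array into robots.txt directives:

```js
// Illustrative only: map userAgents entries to robots.txt directive lines.
function renderUserAgents(userAgents) {
  return userAgents
    .map(({ name, disallow = [], allow = [] }) => [
      `User-agent: ${name}`,
      ...disallow.map((path) => `Disallow: ${path}`),
      ...allow.map((path) => `Allow: ${path}`)
    ].join('\n'))
    .join('\n\n');
}
```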

There is also a `sitemap` option, which outputs a `Sitemap` directive at the end of the file. Provide the full URL of the sitemap file.

```js
plugins: [new RobotsPlugin({
  sitemap: 'https://example.com/sitemap.xml',
  userAgents: [{
    name: '*',
    disallow: ['/', '/cgi-bin'],
    allow: ['/blog']
  }, {
    name: 'Googlebot',
    disallow: ['/cgi-bin', '/some-path'],
    allow: ['/']
  }]
})]
```
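For reference, standard robots.txt syntax for the configuration above would look roughly like this (the exact ordering and formatting of the generated file are up to the plugin):

```
User-agent: *
Disallow: /
Disallow: /cgi-bin
Allow: /blog

User-agent: Googlebot
Disallow: /cgi-bin
Disallow: /some-path
Allow: /

Sitemap: https://example.com/sitemap.xml
```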