https://github.com/friendsofshopware/froshrobotstxt
This plugin allows you to manage your robots.txt file.
- Host: GitHub
- URL: https://github.com/friendsofshopware/froshrobotstxt
- Owner: FriendsOfShopware
- License: MIT
- Created: 2021-09-07T22:07:31.000Z (over 3 years ago)
- Default Branch: main
- Last Pushed: 2024-04-22T07:19:58.000Z (about 1 year ago)
- Last Synced: 2024-04-23T10:50:03.855Z (about 1 year ago)
- Language: PHP
- Size: 27.3 KB
- Stars: 9
- Watchers: 2
- Forks: 0
- Open Issues: 1
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
# FroshRobotsTxt
[Twitter](https://twitter.com/ShopwareFriends)
[Slack](https://slack.shopware.com)

This plugin provides a `robots.txt` for a Shopware 6 shop. Currently it is not possible to distinguish between different user agents. Note that in general only the `robots.txt` at the root of a host [will be considered](https://developers.google.com/search/docs/advanced/robots/create-robots-txt#format_location).
## `Allow` and `Disallow` rules
The plugin currently ships the following default rules:
```
Allow: /
Disallow: /*?
Allow: /*theme/
Allow: /media/*?ts=
```
If you need to modify these defaults, do so via a template modification. If you have other general rules that would be useful for others, consider creating a pull request.

Additional `Disallow` and `Allow` rules can be configured in the plugin configuration. Each line must start with `Allow:` or `Disallow:`, followed by a URI path. The generated `robots.txt` will contain each path prefixed with the absolute base path of the sales channel domain.
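To make the prefixing concrete, here is a minimal sketch of the idea. It is written in Python purely for illustration (the plugin itself is PHP), and the function name and ordering choice are assumptions, not the plugin's actual implementation:

```python
# Illustrative sketch: prefix every configured rule path with each
# sales-channel base path. NOT the plugin's actual PHP code.

def generate_rules(base_paths, configured_rules):
    """Return one rule line per (base path, configured rule) pair.

    Longer base paths are emitted first, so e.g. the /en rules precede
    the root rules, matching the example output in this README.
    """
    lines = []
    for base in sorted(base_paths, key=len, reverse=True):
        for rule in configured_rules:
            directive, path = rule.split(" ", 1)
            lines.append(f"{directive} {base}{path}")
    return lines

rules = [
    "Disallow: /account/",
    "Allow: /widgets/cms/",
]
print("\n".join(generate_rules(["", "/en"], rules)))
```

Running this prints the `/en`-prefixed rules followed by the unprefixed root rules, mirroring the example below.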
For example, suppose you have two domains configured for a sales channel, `example.com` and `example.com/en`, and the following plugin configuration:
```
Disallow: /account/
Disallow: /checkout/
Disallow: /widgets/
Allow: /widgets/cms/
Allow: /widgets/menu/offcanvas
```
The `robots.txt` at `example.com/robots.txt` contains:
```
Disallow: /en/account/
Disallow: /en/checkout/
Disallow: /en/widgets/
Allow: /en/widgets/cms/
Allow: /en/widgets/menu/offcanvas
Disallow: /account/
Disallow: /checkout/
Disallow: /widgets/
Allow: /widgets/cms/
Allow: /widgets/menu/offcanvas
```

## Sitemaps
In addition to the rules, the sitemap of each configured domain is linked. Again, suppose the domains `example.com` and `example.com/en` are configured; the `robots.txt` will then contain:
```
Sitemap: https://example.com/en/sitemap.xml
Sitemap: https://example.com/sitemap.xml
```
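The sitemap links follow the same per-domain pattern as the rules. A short illustrative sketch (Python again, not the plugin's PHP; the hard-coded `https://` scheme and the ordering are assumptions for this example):

```python
# Illustrative sketch: emit one Sitemap line per configured domain.
# NOT the plugin's actual PHP implementation.

def sitemap_lines(domains):
    # Longer (more specific) domains first, matching the README example.
    ordered = sorted(domains, key=len, reverse=True)
    return [f"Sitemap: https://{d}/sitemap.xml" for d in ordered]

for line in sitemap_lines(["example.com", "example.com/en"]):
    print(line)
```

This reproduces the two `Sitemap:` lines shown above.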