Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Simple robots.txt file serving for web applications
https://github.com/dabapps/django-simple-robots
- Host: GitHub
- URL: https://github.com/dabapps/django-simple-robots
- Owner: dabapps
- License: bsd-2-clause
- Created: 2017-10-09T16:28:04.000Z (about 7 years ago)
- Default Branch: master
- Last Pushed: 2023-07-17T14:13:19.000Z (over 1 year ago)
- Last Synced: 2023-07-17T20:45:14.230Z (over 1 year ago)
- Language: Python
- Size: 25.4 KB
- Stars: 2
- Watchers: 20
- Forks: 2
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE
README
django-simple-robots
====================

[![Build Status](https://travis-ci.org/dabapps/django-simple-robots.svg)](https://travis-ci.org/dabapps/django-simple-robots)
[![pypi release](https://img.shields.io/pypi/v/django-simple-robots.svg)](https://pypi.python.org/pypi/django-simple-robots)

Most web applications shouldn't be indexed by Google. This app just provides a view that serves a "deny all" robots.txt.
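A conventional "deny all" robots.txt consists of just two lines (the package's bundled template may differ slightly in detail):

    User-agent: *
    Disallow: /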
In some cases, you do want your app to be indexed - but only in your production environment (not any staging environments). For this case, you can set `ROBOTS_ALLOW_HOST`. If the incoming hostname matches this setting, an "allow all" robots.txt will be served. Otherwise, the "deny all" will be served.
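As a rough sketch of how such a host check can work (illustrative only, not the package's actual source):

    # Illustrative sketch of a host-gated robots.txt view -- not the
    # package's actual implementation.
    from django.conf import settings
    from django.shortcuts import render


    def serve_robots(request):
        # Serve the "allow all" template only when the incoming hostname
        # matches ROBOTS_ALLOW_HOST; otherwise serve the "deny all" one.
        allow_host = getattr(settings, "ROBOTS_ALLOW_HOST", None)
        if allow_host and request.get_host() == allow_host:
            template = "robots.txt"
        else:
            template = "robots-disallow.txt"
        return render(request, template, content_type="text/plain")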
Tested against Django 3.2, 4.0, 4.1 and 4.2 on Python 3.8, 3.9, 3.10 and 3.11.
### Installation
Install from PyPI:

    pip install django-simple-robots
In your root urlconf, add an entry as follows:
    from django.urls import path

    from simple_robots.views import serve_robots

    urlpatterns = [
        path("robots.txt", serve_robots),
        # ..... other stuff
    ]

Then, add `simple_robots` to `INSTALLED_APPS` in your `settings.py`.
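For example (the other entries are placeholders):

    INSTALLED_APPS = [
        # ... your other apps
        "simple_robots",
    ]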
Optionally, set the `ROBOTS_ALLOW_HOST` settings variable:

    ROBOTS_ALLOW_HOST = "myproductionurl.com"
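You can check the behaviour with Django's test client. This small test is a sketch and assumes the default deny-all template contains a `Disallow` rule; the test client's default hostname (`testserver`) will not match `ROBOTS_ALLOW_HOST`:

    from django.test import TestCase


    class RobotsTest(TestCase):
        def test_deny_all_served_by_default(self):
            # Without a matching ROBOTS_ALLOW_HOST, the deny-all
            # template should be returned.
            response = self.client.get("/robots.txt")
            self.assertEqual(response.status_code, 200)
            self.assertIn(b"Disallow", response.content)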
That's it!
### Customization
The allow and disallow templates are stored at `robots.txt` and `robots-disallow.txt` respectively. You can override these in your project's templates directory to customize the responses.
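For example, you could add your own `templates/robots.txt` to your project to adjust the rules served on the allowed host (the rules below are purely illustrative):

    # templates/robots.txt -- overrides the package's "allow all" template
    User-agent: *
    Disallow: /admin/
    Sitemap: https://myproductionurl.com/sitemap.xml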
### Code of conduct
For guidelines regarding the code of conduct when contributing to this repository, please review [https://www.dabapps.com/open-source/code-of-conduct/](https://www.dabapps.com/open-source/code-of-conduct/)