Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Django robots.txt generator
https://github.com/voronind/django-url-robots
Last synced: 28 days ago
JSON representation
- Host: GitHub
- URL: https://github.com/voronind/django-url-robots
- Owner: voronind
- License: other
- Created: 2012-12-28T19:36:08.000Z (almost 12 years ago)
- Default Branch: master
- Last Pushed: 2020-12-08T15:13:45.000Z (almost 4 years ago)
- Last Synced: 2024-10-08T15:10:33.658Z (about 1 month ago)
- Language: Python
- Size: 32.2 KB
- Stars: 6
- Watchers: 3
- Forks: 4
- Open Issues: 1
Metadata Files:
- Readme: README.rst
- Changelog: CHANGELOG
- License: LICENSE
Awesome Lists containing this project
README
=========================
django-url-robots
=========================

``Django`` ``robots.txt`` generator, based on decorating ``django.conf.urls.url``.
It takes ``urlpatterns`` and replaces ambiguous parts with ``*``.

Installation & Usage
=========================

The recommended way to install django-url-robots is with ``pip``.
1. Install from PyPI with ``easy_install`` or ``pip``::

    pip install django-url-robots
2. Add ``'url_robots'`` to your ``INSTALLED_APPS``::

    INSTALLED_APPS = (
        ...
        'url_robots',
        ...
    )

3. Add the url_robots view to your root URLconf::
    urlpatterns += [
        url(r'^robots\.txt$', url_robots.views.robots_txt),
    ]

4. Describe rules with the boolean keyword argument ``robots_allow``, using ``url_robots.utils.url`` instead of ``django.conf.urls.url``::
    from url_robots.utils import url

    urlpatterns += [
        url('^profile/private$', views.some_view, robots_allow=False),
    ]
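To give an intuition for how ambiguous pattern parts end up as ``*`` wildcards in the generated file, here is a rough standalone sketch under stated assumptions (the function name is hypothetical and this is not url_robots' actual code):

```python
import re

def pattern_to_rule(pattern):
    """Hypothetical sketch: turn a URL regex into a robots.txt path,
    replacing ambiguous parts (capture groups) with '*'.
    Not url_robots' actual implementation."""
    rule = pattern.lstrip('^')
    # A capture group matches varying text, so it becomes a wildcard.
    rule = re.sub(r'\(\?P<\w+>[^)]*\)|\([^)]+\)', '*', rule)
    rule = rule.replace(r'\.', '.')  # unescape literal dots
    if rule.endswith('$'):
        # '$' anchors an exact match; robots.txt uses '$' the same way.
        return '/' + rule
    # An unanchored pattern can match any suffix.
    return '/' + rule + '*'

print(pattern_to_rule(r'^profile/(?P<username>\w+)/private'))
# -> /profile/*/private*
```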
``django-url-robots`` is tested with ``Django-1.8+``. Unicode characters are percent-encoded.

Settings
====================

At the moment there is only one setting, which defines the template of the ``robots.txt`` file::
    urlpatterns += [
        url(r'^robots\.txt$', url_robots.views.robots_txt, {'template': 'my_awesome_robots_template.txt'}),
    ]

Example
===================
robots_template.txt::

    User-agent: *
    Disallow: /*    # disallow all
    {{ rules|safe }}

urls.py::
    from django.conf.urls import include, url

    urlpatterns = [
        url(r'^profile', include('url_robots.tests.urls_profile')),
    ]

urls_profile.py::
    from url_robots.utils import url

    urlpatterns = [
        url(r'^s$', views.some_view, name='profiles', robots_allow=True),
        url(r'^/(?P<username>\w+)$', views.some_view),
        url(r'^/(?P<username>\w+)/private', views.some_view, name='profile_private', robots_allow=False),
        url(r'^/(?P<username>\w+)/public', views.some_view, name='profile_public', robots_allow=True),
    ]

Resulting robots.txt::
    User-agent: *
    Disallow: /*    # disallow all
    Allow: /profiles$
    Disallow: /profile/*/private*
    Allow: /profile/*/public*
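To check which concrete URLs these rules cover, the ``*``/``$`` syntax can be translated to a Python regex by hand. An illustrative sketch (real crawlers also apply rule-precedence logic that is omitted here):

```python
import re

def rule_to_regex(rule):
    """Translate a robots.txt path rule ('*' wildcard, optional trailing
    '$' anchor) into a compiled regex. Illustrative sketch only."""
    anchored = rule.endswith('$')
    body = rule[:-1] if anchored else rule
    # Escape everything except '*', which becomes '.*'.
    pattern = '.*'.join(re.escape(part) for part in body.split('*'))
    return re.compile('^' + pattern + ('$' if anchored else ''))

# Rules from the example above:
assert rule_to_regex('/profile/*/private*').match('/profile/alice/private')
assert rule_to_regex('/profiles$').match('/profiles')
assert not rule_to_regex('/profiles$').match('/profiles/extra')
```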