Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/ameygawade/streamlit-robots_txt_generator
This Streamlit app lets users generate and customize a robots.txt file by selecting user-agents, specifying disallowed paths, setting an optional crawl delay, and providing a sitemap URL.
- Host: GitHub
- URL: https://github.com/ameygawade/streamlit-robots_txt_generator
- Owner: ameygawade
- Created: 2024-04-30T11:32:09.000Z (6 months ago)
- Default Branch: main
- Last Pushed: 2024-04-30T12:56:48.000Z (6 months ago)
- Last Synced: 2024-06-03T09:46:57.527Z (5 months ago)
- Topics: config, data-science, front, generative, generator, google, robots-txt, search-algorithm, search-engine, seo, seo-optimization, stream, streamlit, txt-files, web, webapp, webapplication
- Language: Python
- Homepage: https://www.whoamey.com/projects-by-amey-gawade/data-science/streamlit/robots_txt_generator
- Size: 98.3 MB
- Stars: 1
- Watchers: 0
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
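The generator's job, as described above, is to assemble a valid robots.txt from a few user selections. A minimal sketch of that assembly logic is below; the function name and parameters are hypothetical, not taken from the repository's actual code, and the real app wraps this kind of logic in Streamlit widgets.

```python
def build_robots_txt(user_agents, disallowed_paths, crawl_delay=None, sitemap_url=None):
    """Assemble robots.txt content from user selections.

    Each selected user-agent gets its own rule group listing the
    disallowed paths and, optionally, a Crawl-delay directive.
    A Sitemap line is appended at the end if a URL was provided.
    """
    lines = []
    for agent in user_agents:
        lines.append(f"User-agent: {agent}")
        for path in disallowed_paths:
            lines.append(f"Disallow: {path}")
        if crawl_delay is not None:
            lines.append(f"Crawl-delay: {crawl_delay}")
        lines.append("")  # blank line separates rule groups
    if sitemap_url:
        lines.append(f"Sitemap: {sitemap_url}")
    return "\n".join(lines)


# Example: two agents, two blocked paths, a 10-second delay, and a sitemap.
content = build_robots_txt(
    ["Googlebot", "*"],
    ["/admin/", "/tmp/"],
    crawl_delay=10,
    sitemap_url="https://example.com/sitemap.xml",
)
print(content)
```

In a Streamlit front end, the inputs would typically come from `st.multiselect` (user-agents), `st.text_area` (paths), and `st.text_input` (sitemap URL), with the result offered via `st.download_button`.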
Awesome Lists containing this project
- jimsghstars - ameygawade/streamlit-robots_txt_generator - This Streamlit app allows users to generate and customize a robots.txt file by selecting user-agents, specifying disallowed paths, enabling crawler delay, and providing a sitemap URL. (Python)