Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/austinsonger/sitemapsandrobotsaroundtheweb
Sitemaps and Robots.txt for websites around the world.
- Host: GitHub
- URL: https://github.com/austinsonger/sitemapsandrobotsaroundtheweb
- Owner: austinsonger
- Created: 2020-09-15T14:43:15.000Z (over 4 years ago)
- Default Branch: master
- Last Pushed: 2020-09-18T13:38:21.000Z (over 4 years ago)
- Last Synced: 2024-11-21T05:25:04.383Z (3 months ago)
- Topics: bug-bounty, bugbounty, ethical-hacking, footprinting, hacking, information-gathering, osint, penetration-testing, reconnaissance, robots, robots-txt, scanning, search, searching, security, security-research, sitemap, sitemap-xml, sitemaps, webpentest
- Homepage:
- Size: 137 KB
- Stars: 4
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
README
# Sitemaps and Robots around the Web
> Robots.txt for websites around the world. This collection is intended for educational purposes, security researchers, and bug bounty programs.
## Robots.txt
### US Companies
- [Amazon](robots/us-companies/amazon-robots.txt)
- [Apple](robots/us-companies/apple-robots.txt)
- [Facebook](robots/us-companies/facebook-robots.txt)
- [Github](robots/us-companies/github-robots.txt)
- [Gitlab](robots/us-companies/gitlab-robots.txt)
- [Google](robots/us-companies/google-robots.txt)
- [Target](robots/us-companies/target-robots.txt)
- [Walmart](robots/us-companies/walmart-robots.txt)
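
The snapshots above can be inspected with nothing beyond the Python standard library. A minimal sketch, assuming the repository has been cloned locally: it loads the GitHub snapshot listed above and asks whether a generic crawler may fetch a path (the `/search` path and the `*` user agent are purely illustrative).

```python
# Minimal sketch: load one of the stored robots.txt snapshots and check
# whether a generic crawler would be allowed to fetch a given path.
# The file path mirrors the repo layout above; "/search" is an arbitrary example.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
with open("robots/us-companies/github-robots.txt") as f:
    # parse() accepts the robots.txt content as an iterable of lines.
    parser.parse(f.read().splitlines())

print(parser.can_fetch("*", "/search"))  # True/False per the snapshot's rules
```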
## Sitemaps
> Only add sitemaps that are reasonable in size.

- [Google](Sitemaps/Google/README.md)
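
As a rough way to judge whether a sitemap is "reasonable in size" before adding it, a standard-library sketch like the one below can count and preview its `<loc>` entries. The sitemap URL is a placeholder, not a file in this repository, and the sketch assumes a plain `<urlset>` sitemap rather than a sitemap index.

```python
# Hedged sketch: fetch a sitemap.xml and list its <loc> entries to gauge size.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical target
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)  # the HTTP response is file-like

locs = [loc.text for loc in tree.getroot().findall("sm:url/sm:loc", NS)]
print(f"{len(locs)} URLs")   # gauge size before committing the file
print("\n".join(locs[:10]))  # preview the first few entries
```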
## How To
- [Robots.txt Basics](robots-basics.md)
- [Sitemap Basics](sitemap-basics.md)
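
The two basics documents above cover the file formats themselves. As a quick illustration of how they connect in practice, the hedged sketch below (placeholder domain, standard library only) fetches a site's robots.txt and lists any `Sitemap:` directives it declares.

```python
# Illustrative sketch: discover sitemap URLs advertised in a robots.txt file.
import urllib.request

DOMAIN = "https://www.example.com"  # hypothetical site, not a real target

with urllib.request.urlopen(f"{DOMAIN}/robots.txt") as resp:
    body = resp.read().decode("utf-8", errors="replace")

# robots.txt may declare zero or more "Sitemap: <url>" lines.
sitemaps = [line.split(":", 1)[1].strip()
            for line in body.splitlines()
            if line.lower().startswith("sitemap:")]
print(sitemaps)
```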
# Contributing
Any contributions you make are **greatly appreciated**.
1. Fork the Project
2. Commit your Changes
3. Push to the Branch
4. Open a Pull Request