Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/ShielderSec/webtech
Identify technologies used on websites.
- Host: GitHub
- URL: https://github.com/ShielderSec/webtech
- Owner: ShielderSec
- License: gpl-3.0
- Created: 2018-09-06T13:36:05.000Z (about 6 years ago)
- Default Branch: master
- Last Pushed: 2024-01-14T21:42:14.000Z (10 months ago)
- Last Synced: 2024-08-31T06:36:25.074Z (2 months ago)
- Topics: burpsuite, webtech
- Language: Python
- Homepage: https://www.shielder.it/
- Size: 278 KB
- Stars: 274
- Watchers: 14
- Forks: 44
- Open Issues: 3
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# WebTech
Identify technologies used on websites. [More info on the release's blogpost](https://www.shielder.it/blog/webtech-identify-technologies-used-on-websites/).

## CLI Installation
WebTech is available on pip:
```
pip install webtech
```

It can also be installed via setup.py:
```
python setup.py install --user
```

## Burp Integration
Download Jython 2.7.0 standalone and install it into Burp.
In "Extender" > "Options" > "Python Environment":
- Select the Jython jar location

Finally, in "Extender" > "Extension":
- Click "Add"
- Select "py" or "Python" as extension format
- Select the `Burp-WebTech.py` file in this folder

## Usage
Scan a website:
```
$ webtech -u https://example.com/
Target URL: https://example.com
...

$ webtech -u file://response.txt
Target URL:
...
```

Full usage:
```
$ webtech -h
Usage: webtech [options]
Options:
  -h, --help            show this help message and exit
  -u URLS, --urls=URLS  url(s) to scan
  --ul=URLS_FILE, --urls-file=URLS_FILE
                        url(s) list file to scan
  --ua=USER_AGENT, --user-agent=USER_AGENT
                        use this user agent
  --rua, --random-user-agent
                        use a random user agent
  --db=DB_FILE, --database-file=DB_FILE
                        custom database file
  --oj, --json          output json-encoded report
  --og, --grep          output grepable report
  --udb, --update-db    force update of remote db files
```
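For instance, a list of targets could be scanned with a random user agent and a JSON report by combining the flags above; `urls.txt` is an illustrative file name, not something shipped with the tool:

```
# urls.txt is an illustrative file containing one URL per line
$ webtech --ul urls.txt --rua --oj
```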
## Use WebTech as a library
```
import webtech

# you can use options, same as from the command line
wt = webtech.WebTech(options={'json': True})

# scan a single website
try:
report = wt.start_from_url('https://shielder.it')
print(report)
except webtech.utils.ConnectionException:
print("Connection error")
```

For more examples see `webtech_example.py`.
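Building on the snippet above, a minimal sketch for scanning several targets with one `WebTech` instance might look like this; the target list and report handling are illustrative, and only `start_from_url` and `ConnectionException` from the example above are assumed:

```
import webtech

# reuse a single WebTech instance for several targets
wt = webtech.WebTech(options={'json': True})

# illustrative target list
targets = ['https://shielder.it', 'https://example.com']

reports = {}
for url in targets:
    try:
        # start_from_url returns the report for one target
        reports[url] = wt.start_from_url(url)
    except webtech.utils.ConnectionException:
        # skip unreachable targets instead of aborting the whole run
        reports[url] = None

print(reports)
```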
## Resources for database matching
- HTTP Headers information: http://netinfo.link/http/headers.html
- Cookie names: https://webcookies.org/top-cookie-names