# PartyLoud

### Get your privacy back

Generate fake web browsing and mitigate tracking

PartyLoud is a highly configurable and straightforward free tool that helps you prevent tracking directly from your Linux terminal; no special skills required. Once started, you can forget it is running. It provides several flags, and each flag lets you customize your experience and change PartyLoud's behaviour according to your needs.


Please submit bug reports and feature requests, and help me continuously improve this project.

For questions or feedback, please contact me Here.



- **Simple.** 3 files only, no installation required; just clone this repo and you're ready to go.
- **Powerful.** Thread-based navigation.
- **Stealthy.** Optimized to emulate user navigation.
- **Portable.** You can use this script on any Unix-based OS.

This project was inspired by [noisy.py](https://github.com/1tayH/noisy "noisy.py")

### [📝 Changelog](CHANGELOG.md)

## ⚙️ `How It Works`

1. URLs and keywords are loaded (either from partyloud.conf and badwords or from user-defined files)
2. If a proxy flag has been used, the proxy configuration is tested
3. For each URL in the URL list a thread is started, and each thread has a [user agent](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/User-Agent) associated with it
4. Each thread starts by sending an [HTTP](https://www.scaler.com/topics/hypertext-transfer-protocol/) request to the given URL
5. The response is filtered using the keywords in order to prevent 404s and malformed URLs
6. A new URL is chosen from the list generated after filtering
7. The current thread sleeps for a random time
8. Steps 4 to 7 are repeated with the new URL until the user sends a kill signal (CTRL-C or the enter key); a simplified sketch of this loop is shown below
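To make the flow above concrete, here is a minimal, hypothetical sketch of what a single engine's loop might look like. It is not the actual `partyloud.sh` code; the starting URL, user agent, timings and link parsing are illustrative assumptions only.

```sh
#!/usr/bin/env bash
# Hypothetical sketch of one engine's loop (not the real partyloud.sh implementation).
# Assumes curl, grep and shuf are available and that a "badwords" keyword file exists.

URL="https://example.com"                                        # illustrative starting URL
USER_AGENT="Mozilla/5.0 (X11; Linux x86_64; rv:115.0) Gecko/20100101 Firefox/115.0"

while true; do
    # Step 4: send an HTTP request with a spoofed User-Agent
    PAGE=$(curl -s -L -A "$USER_AGENT" "$URL")

    # Step 5: extract candidate links and drop anything matching the keyword blocklist
    LINKS=$(printf '%s\n' "$PAGE" \
        | grep -Eo 'https?://[^"'\'' >]+' \
        | grep -i -v -f badwords)

    # Step 6: pick the next URL at random from the filtered list (keep the old one if empty)
    NEXT=$(printf '%s\n' "$LINKS" | shuf -n 1)
    [ -n "$NEXT" ] && URL="$NEXT"

    # Step 7: sleep a random number of seconds before the next request
    sleep $(( RANDOM % 10 + 1 ))
done
```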

## 🚀 `Features`

- Configurable URL list and blocklist
- Random DNS mode: each request is sent to a different DNS server
- Multi-threaded request engine (the number of threads equals the number of URLs in partyloud.conf)
- Error recovery mechanism to protect engines from failures
- Spoofed user agents to help prevent fingerprinting (each engine has a different user agent)
- Dynamic UI

## 🎉 `Setup`

Clone the repository:
```sh
git clone https://github.com/realtho/PartyLoud.git
```
Navigate to the directory and make the script executable:
```sh
cd PartyLoud
chmod +x partyloud.sh
```
Run `partyloud.sh`:
```sh
./partyloud.sh
```

## 📋 `Usage`

```sh
Usage: ./partyloud.sh [options...]

 -d --dns           DNS servers are sourced from the specified FILE,
                    each request will use a different DNS server
                    from the list
                    !!WARNING: THIS FEATURE IS EXPERIMENTAL!!
                    !!PLEASE REPORT ANY ISSUES ON GITHUB!!
 -l --url-list      read the URL list from the specified FILE
 -b --blocklist     read the blocklist from the specified FILE
 -p --http-proxy    set an HTTP proxy
 -s --https-proxy   set an HTTPS proxy
 -n --no-wait       disable the wait between one request and the next
 -h --help          display this help
```
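
For example (the file names and proxy address below are placeholders, not shipped defaults):

```sh
# Run with a custom URL list and route traffic through a local HTTP proxy
./partyloud.sh -l my-urls.conf -p http://127.0.0.1:8080

# Try the experimental random DNS mode with a custom server list
./partyloud.sh --dns my-dns-servers.txt

# Generate traffic as fast as possible (no pause between requests)
./partyloud.sh -n
```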

##### To stop the script, press either enter or CTRL-C

## ⚠️ `File Specifications`


In the current release there is no input validation on files.

If you find bugs or have suggestions on how to improve this feature, please help me by opening an issue on GitHub.



### Intro

###### If you don't have special needs, the default config files are just fine to get you started.

Default files are located in:

* [badwords](badwords)
* [partyloud.conf](partyloud.conf)
* [DNSList](DNSList)

Please note that file names and extensions are not important; only the content of the files matters.

#### [badwords](badwords) - Keyword-based blocklist

[badwords](badwords) is a keyword-based blocklist used to filter out non-HTML content: images, documents and so on.
The default config has been created after several weeks of testing. If you really think you need a custom blocklist, my suggestion is to start by copying and modifying the default config according to your needs.
Here are some hints on how to create a great blocklist file:

| DO ✅ | DON'T 🚫 |
| ------------- | ------------- |
| Use only ASCII chars | Define one-site-only rules |
| Try to keep the rules as general as possible | Define case-sensitive rules |
| Prefer relative paths | Place more than one rule per line |
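
As an illustration only (these entries are not the shipped defaults), a blocklist following the hints above might contain one simple, case-insensitive keyword per line, such as:

```
.css
.js
.png
.jpg
logout
```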

#### [partyloud.conf](partyloud.conf) - URL List

[partyloud.conf](partyloud.conf) is a URL list used as the starting point for the fake navigation generators.
The goal here is to create a good list of sites containing a lot of URLs.
Aside from suggesting that you avoid Google, YouTube and social-network-related links, I have no particular hints for you.
###### Note #1 - To work properly, the URLs must be [well-formed](https://earthsci.stanford.edu/computing/hosting/urlsyntax/index.php)
###### Note #2 - Even if the file contains 1000 lines, only 10 are used (the first 10; randomness is a work in progress)
###### Note #3 - Only one URL per line is allowed
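
Purely as an illustration (these sites are examples, not the shipped defaults), a conf file could look like this, with one well-formed URL per line:

```
https://www.wikipedia.org
https://www.bbc.com
https://stackoverflow.com
```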

#### [DNSList](DNSList) - DNS List

[DNSList](DNSList) is a list of DNS servers used as the argument for the random DNS feature. Random DNS is not enabled by default, so the "default file" is really just a guideline and a test used while developing the feature to see if everything was working as expected.
The only suggestion here is to add as many addresses as possible to increase randomness.
###### Note #1 - Only one address per line is allowed
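
For illustration, a DNS list with one resolver address per line might look like this (these well-known public resolvers are examples, not the shipped file):

```
1.1.1.1
8.8.8.8
9.9.9.9
```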

## 📖 `FAQ`

**Isn't this literally just a CLI-based frontend to curl?**

The core of the script is a curl request, but this tool does more than that. When you run the script, several threads are started. Each thread makes a different HTTP request and parses the output to choose the next URL, simulating web navigation. Unless the user stops the script (either by pressing enter or via CTRL-C), it will stay alive.

**How does the error recovery mechanism work?**

"Error recovery mechanism" is an elegant way of saying that if an HTTP request returns a status code starting with 4 or 5 (an error), the script will use a backup URL in order to continue normal execution.
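
A hypothetical sketch of that idea, using curl to read only the status code (this is not the actual script's code; `URL`, `USER_AGENT` and `BACKUP_URL` are illustrative variable names):

```sh
# Fetch just the HTTP status code for the current URL
STATUS=$(curl -s -o /dev/null -w '%{http_code}' -A "$USER_AGENT" "$URL")

# On a 4xx or 5xx response, fall back to a backup URL and keep going
case "$STATUS" in
    4*|5*) URL="$BACKUP_URL" ;;
esac
```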

**May I fork your project?**

Look Here 😉

**How easy is this fake traffic to detect?**

Unfortunately it's pretty easy, but keep in mind that this is a beta and I'll fix this "issue" in upcoming releases.

**What does badwords do?**

badwords is just a list of keywords used to filter URLs in order to prevent 404s and to avoid traversing non-HTML content (like images, CSS, JS). You can create your own, but unless you have special needs, I recommend you use the default one, or at least use it as a template.

**What does partyloud.conf do?**

partyloud.conf is just a list of root URLs used to start the fake navigation. You can create your own conf file, but keep in mind that the more URLs you add, the more threads you start (a quick way to check this count is shown below). This is an "open issue": upcoming releases will enforce a maximum thread number in order to avoid fork bombs.
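
Since one thread is started per URL in the conf file (see Features), a convenient way to see how many engines a custom file would spawn is to count its non-empty lines; this is just a rough check, not part of the tool itself:

```sh
grep -c . partyloud.conf   # number of non-empty lines = number of engine threads
```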