# malware-filter Add-on (Splunk)

- [Installation](#installation)
- [Usage](#usage)
- [geturlhausfilter](#geturlhausfilter)
- [getphishingfilter](#getphishingfilter)
- [getpupfilter](#getpupfilter)
- [getvnbadsitefilter](#getvnbadsitefilter)
- [getbotnetfilter](#getbotnetfilter)
- [getbotnetip](#getbotnetip)
- [getopendbl](#getopendbl)
- [Disable individual commands](#disable-individual-commands)
- [Build](#build)
- [Download failover](#download-failover)

Provides custom search commands to update [malware-filter](https://gitlab.com/malware-filter) lookups. Each command downloads a source CSV and emits rows as events, which can then be piped to a lookup file or used as a subsearch. Each command is exported globally and can be used in any app. This add-on currently does not have any UI.

[Lookup files](./lookups/) can be updated using the bundled scheduled reports, which run every 12 hours (every 15 minutes for botnet_ip.csv and opendbl_ip.csv). The scheduled reports are disabled by default; enable the schedule that corresponds to the required lookup file. Modify the search string to add [optional arguments](#usage), as sketched below.
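For illustration, a local override of one of the bundled scheduled reports might look like the following. The stanza name, app folder and search string are assumptions; copy the actual report name from the add-on's `default/savedsearches.conf`.

```
# $SPLUNK_HOME/etc/apps/<add-on folder>/local/savedsearches.conf
# Stanza name below is hypothetical; use the report name shipped in default/savedsearches.conf.
[Update urlhaus-filter-splunk-online lookup]
enableSched = 1
cron_schedule = 0 */12 * * *
search = | geturlhausfilter wildcard_prefix=path message="urlhaus-filter match" \
| outputlookup override_if_empty=false urlhaus-filter-splunk-online.csv
```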

Source CSVs will be downloaded via a proxy if configured in "$SPLUNK_HOME/etc/system/local/[server.conf](https://docs.splunk.com/Documentation/Splunk/latest/Admin/Serverconf#Splunkd_http_proxy_configuration)".
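A minimal proxy stanza, assuming the standard Splunkd proxy settings in server.conf (host and port values are placeholders):

```
# $SPLUNK_HOME/etc/system/local/server.conf
[proxyConfig]
http_proxy = http://proxy.example.com:3128
https_proxy = http://proxy.example.com:3128
no_proxy = localhost, 127.0.0.1, ::1
```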

Refer to [this article](https://mdleom.com/blog/2023/04/16/splunk-lookup-malware-filter/) for a more comprehensive guide on detecting malicious domains, URLs, IPs and CIDR ranges.

Tested on Splunk 9.x.

## Installation

Releases are available on [Splunkbase](https://splunkbase.splunk.com/app/6970) and [GitLab](https://gitlab.com/malware-filter/splunk-malware-filter/-/releases).

Instructions for building the main branch are available in the [Build](#build) section.

## Usage

```
| geturlhausfilter wildcard_prefix= wildcard_suffix= wildcard_affix= message=
| outputlookup override_if_empty=false urlhaus-filter-splunk-online.csv
```

Optional arguments:

- **wildcard_prefix**: comma-separated list of column names to have a wildcard "\*" prefixed to their _non-empty_ values. A new column named "{column_name}\_wildcard_prefix" is created for each. Non-existent columns are silently ignored. Accepted values: `"column_name"`, `"columnA,columnB"`.
- **wildcard_suffix**: same as wildcard_prefix, but with the wildcard suffixed instead.
- **wildcard_affix**: same as wildcard_prefix, but with the wildcard both prefixed and suffixed.
- **message**: adds a custom message column. A new column "custom_message" is created.

Example:

```
| geturlhausfilter
| outputlookup override_if_empty=false urlhaus-filter-splunk-online.csv
```

| host | path | message | updated |
| ------------ | ---------- | ----------------------------------------- | -------------------- |
| example2.com | /some-path | urlhaus-filter malicious website detected | 2022-12-21T12:34:56Z |

```
| geturlhausfilter wildcard_prefix=path message="lorem ipsum"
| outputlookup override_if_empty=false urlhaus-filter-splunk-online.csv
```

| host         | path       | message                                   | updated              | path_wildcard_prefix | custom_message |
| ------------ | ---------- | ----------------------------------------- | -------------------- | -------------------- | -------------- |
| example2.com | /some-path | urlhaus-filter malicious website detected | 2022-12-21T12:34:56Z | \*/some-path         | lorem ipsum    |
| example.com  |            | urlhaus-filter malicious website detected | 2022-12-21T12:34:56Z |                      | lorem ipsum    |

## Lookup files

Lookup files are bundled but empty; run the relevant `| getsomething | outputlookup some-filter.csv` search to fetch the latest data before using any of them.

- urlhaus-filter-splunk-online.csv
- phishing-filter-splunk.csv
- pup-filter-splunk.csv
- vn-badsite-filter-splunk.csv
- botnet-filter-splunk.csv
- botnet_ip.csv
- opendbl_ip.csv
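To verify that a lookup has been populated, a quick check such as the following can be run (substitute any of the file names above):

```
| inputlookup urlhaus-filter-splunk-online.csv
| stats count
```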

## geturlhausfilter

```
| geturlhausfilter wildcard_prefix= wildcard_suffix= wildcard_affix= message=
| outputlookup override_if_empty=false urlhaus-filter-splunk-online.csv
```

Output columns are listed at https://gitlab.com/malware-filter/urlhaus-filter#splunk

## getphishingfilter

```
| getphishingfilter wildcard_prefix= wildcard_suffix= wildcard_affix= message=
| outputlookup override_if_empty=false phishing-filter-splunk.csv
```

Output columns are listed at https://gitlab.com/malware-filter/phishing-filter#splunk

## getpupfilter

```
| getpupfilter wildcard_prefix= wildcard_suffix= wildcard_affix= message=
| outputlookup override_if_empty=false pup-filter-splunk.csv
```

Output columns are listed at https://gitlab.com/malware-filter/pup-filter#splunk

## getvnbadsitefilter

```
| getvnbadsitefilter wildcard_prefix= wildcard_suffix= wildcard_affix= message=
| outputlookup override_if_empty=false vn-badsite-filter-splunk.csv
```

Output columns are listed at https://gitlab.com/malware-filter/vn-badsite-filter#splunk

## getbotnetfilter

It is highly recommended to use [`getbotnetip`](#getbotnetip) instead.

```
| getbotnetfilter message=
| outputlookup override_if_empty=false botnet-filter-splunk.csv
```

Output columns are listed at https://gitlab.com/malware-filter/botnet-filter#splunk

## getbotnetip

Recommend to update the lookup file "botnet_ip.csv" every 5 minutes (cron `*/5 * * * *`).

```
| getbotnetip message=
| outputlookup override_if_empty=false botnet_ip.csv
```

Columns:

| first_seen_utc | dst_ip | dst_port | c2_status | last_online | malware | updated |
| ------------------- | ------------- | -------- | --------- | ----------- | ------- | -------------------- |
| 2021-01-17 07:44:46 | 51.178.161.32 | 4643 | online | 2023-01-26 | Dridex | 2023-01-25T17:41:16Z |

Source: https://feodotracker.abuse.ch/downloads/ipblocklist.csv
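As a sketch, the `c2_status` column can be used to keep only C2 servers reported as online; the event field name `dest` below is an assumption and should match your data:

```
| lookup botnet_ip dst_ip AS dest OUTPUT malware, c2_status
| where c2_status="online"
```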

## getopendbl

Recommend to update the lookup file "opendbl_ip.csv" every 15 minutes (cron `*/15 * * * *`).

```
| getopendbl message=
| outputlookup override_if_empty=false opendbl_ip.csv
```

| start | end | netmask | cidr_range | name | updated |
| --------------- | --------------- | ------- | ------------------ | ----------------------------------------- | -------------------- |
| 187.190.252.167 | 187.190.252.167 | 32 | 187.190.252.167/32 | Emerging Threats: Known Compromised Hosts | 2023-01-30T08:03:00Z |
| 89.248.163.0 | 89.248.163.255 | 24 | 89.248.163.0/24 | Dshield | 2023-01-30T08:01:00Z |

Source: https://opendbl.net/
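Matching an event IP against the `cidr_range` column with the `lookup` command requires a lookup definition with CIDR match_type. The add-on may already ship one; a minimal transforms.conf sketch for a self-maintained definition (the definition name "opendbl_cidr" is an assumption) is:

```
# local/transforms.conf -- "opendbl_cidr" is a hypothetical definition name
[opendbl_cidr]
filename = opendbl_ip.csv
match_type = CIDR(cidr_range)
max_matches = 1
```

It could then be used as `| lookup opendbl_cidr cidr_range AS src OUTPUT name AS blocklist`, where `src` is the event field holding the IP.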

## Example usage

```
| tstats summariesonly=true allow_old_summaries=true count FROM datamodel=Web WHERE Web.action="allowed"
BY Web.user, Web.src, Web.dest, Web.site, Web.url, Web.category, Web.action, index, _time span=1s
| rename Web.* AS *
| lookup urlhaus-filter-splunk-online host AS site, host AS dest OUTPUT message AS description, updated
| lookup urlhaus-filter-splunk-online path_wildcard_prefix AS vendor_url, host AS site, host AS dest OUTPUT message AS description2, updated AS updated2
| lookup phishing-filter-splunk host AS site, host AS dest OUTPUT message AS description3, updated AS updated3
| lookup phishing-filter-splunk path_wildcard_prefix AS vendor_url, host AS site, host AS dest OUTPUT message AS description4, updated AS updated4
| lookup pup-filter-splunk host AS site, host AS dest OUTPUT message AS description5, updated AS updated5
| lookup vn-badsite-filter-splunk host AS site, host AS dest OUTPUT message AS description6, updated AS updated6
| lookup botnet_ip dst_ip AS dest OUTPUT malware AS description7, updated AS updated7
| eval Description=coalesce(description, description2, description3, description4, description5, description6, description7)
| search Description=*
| eval updated=coalesce(updated, updated2, updated3, updated4, updated5, updated6, updated7), "Signature Last Updated"=strftime(strptime(updated." +0000","%Y-%m-%dT%H:%M:%SZ %z"),"%Y-%m-%d %H:%M:%S %z"), Time=strftime(_time, "%Y-%m-%d %H:%M:%S %z"), "Source IP"=src, Username=user, Domain=site, "Destination IP"=dest, URL=url, Action=action
| table Time, index, "Signature Last Updated", "Source IP", Username, Domain, "Destination IP", Description, Action, URL
```
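The `path_wildcard_prefix` lookups above rely on wildcard matching, which requires `WILDCARD` match_type in the lookup definition. The bundled definitions may already configure this; a minimal transforms.conf sketch for a self-maintained definition (the stanza name mirrors the bundled lookup and is an assumption) is:

```
# local/transforms.conf -- stanza name is an assumption
[urlhaus-filter-splunk-online]
filename = urlhaus-filter-splunk-online.csv
match_type = WILDCARD(path_wildcard_prefix)
max_matches = 1
```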

It is not recommended to use a subsearch (e.g. `[| inputlookup urlhaus-filter-splunk-online.csv | fields host ]`) for these [lookup tables](./lookups/), especially [urlhaus-filter](./lookups/urlhaus-filter-splunk-online.csv) and [phishing-filter](./lookups/phishing-filter-splunk.csv), because they usually have more than 30,000 rows, which exceeds the soft limit of [10,000 rows](https://docs.splunk.com/Documentation/SplunkCloud/latest/Search/Aboutsubsearches#Subsearch_performance_considerations) returned by a subsearch.

## Disable individual commands

Settings -> All configurations -> filter by the "malware_filter" app, then disable the relevant command(s).

## Build

```
git clone https://gitlab.com/malware-filter/splunk-malware-filter
cd splunk-malware-filter
python build.py
```

## Download failover

For `get*filter` search commands, the script will attempt to download from the following domains **in sequence** (check out the `DOWNLOAD_URLS` constant in each script):

- malware-filter.gitlab.io
- curbengh.github.io
- malware-filter.pages.dev
- \*-filter.pages.dev

It is not necessary to allow outbound connections to all of the above domains; it depends on how much redundancy you prefer.

## Disclaimer

`getbotnetip.py` and `getopendbl.py` are included simply for convenience; their upstream sources are not affiliated with malware-filter.

## Repository Mirrors

https://gitlab.com/curben/blog#repository-mirrors

## License

[Creative Commons Zero v1.0 Universal](LICENSE-CC0.md) and [MIT License](LICENSE)

[splunk-sdk-python](https://github.com/splunk/splunk-sdk-python), bundled with the package under the lib folder: [Apache License 2.0](https://choosealicense.com/licenses/apache-2.0/)