Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/mitre/saf
The MITRE Security Automation Framework (SAF) Command Line Interface (CLI) brings together applications, techniques, libraries, and tools developed by MITRE and the security community to streamline security automation for systems and DevOps pipelines
compliance devsecops json mitre mitre-corporation mitre-saf security security-automation security-automation-framework
Last synced: about 4 hours ago
JSON representation
The MITRE Security Automation Framework (SAF) Command Line Interface (CLI) brings together applications, techniques, libraries, and tools developed by MITRE and the security community to streamline security automation for systems and DevOps pipelines
- Host: GitHub
- URL: https://github.com/mitre/saf
- Owner: mitre
- License: other
- Created: 2021-11-29T16:50:20.000Z (about 3 years ago)
- Default Branch: main
- Last Pushed: 2025-01-18T02:27:28.000Z (1 day ago)
- Last Synced: 2025-01-18T02:38:16.012Z (1 day ago)
- Topics: compliance, devsecops, json, mitre, mitre-corporation, mitre-saf, security, security-automation, security-automation-framework
- Language: TypeScript
- Homepage: https://saf-cli.mitre.org
- Size: 67.7 MB
- Stars: 139
- Watchers: 22
- Forks: 38
- Open Issues: 125
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-software-supply-chain-security - mitre/saf: The MITRE Security Automation Framework (SAF) Command Line Interface (CLI) brings together applications, techniques, libraries, and tools developed by MITRE and the security community to streamline security automation for systems and DevOps pipelines
README
# Security Automation Framework CLI
The MITRE Security Automation Framework (SAF) Command Line Interface (CLI) brings together applications, techniques, libraries, and tools developed by MITRE and the security community to streamline security automation for systems and DevOps pipelines.
The SAF CLI is the successor to [Heimdall Tools](https://github.com/mitre/heimdall_tools) and [InSpec Tools](https://github.com/mitre/inspec_tools).
## Terminology
- "[Heimdall](https://github.com/mitre/heimdall2)" - Our visualizer for all security result data
- "[Heimdall Data Format (HDF)](https://saf.mitre.org/#/normalize)" - Our common data format to preserve and transform security data## Installation
* [Via NPM](#installation-via-npm)
* [Update via NPM](#update-via-npm)
* [Via Brew](#installation-via-brew)
* [Update via Brew](#update-via-brew)
* [Via Docker](#installation-via-docker)
* [Update via Docker](#update-via-docker)
* [Via Windows Installer](#installation-via-windows-installer)
* [Update via Windows Installer](#update-via-windows-installer)

## Developers
For detailed information about development, testing, and contributing to the SAF project, refer to the [MITRE SAF Development](https://github.com/mitre/saf/blob/main/docs/contributors-guide.md) guide.

## Usage
### Attest HDF Data
* [Attest](#attest)
* [Create Attestations](#create-attestations)
* [Apply Attestations](#apply-attestations)
### Convert HDF to Other Formats

* [Get Help with Convert](#convert)
* [Convert From HDF](#convert-from-hdf)
* [HDF to ASFF](#hdf-to-asff)
* [HDF to Splunk](#hdf-to-splunk)
* [HDF to XCCDF Results](#hdf-to-xccdf-results)
* [HDF to Checklist](#hdf-to-checklist)
* [HDF to CSV](#hdf-to-csv)
* [HDF to Condensed JSON](#hdf-to-condensed-json)

### Convert Other Formats to HDF
* [Convert To HDF](#convert-to-hdf)
* [Anchore Grype to HDF](#anchore-grype-to-hdf)
* [ASFF to HDF](#asff-to-hdf)
* [AWS Config to HDF](#aws-config-to-hdf)
* [Burp Suite to HDF](#burp-suite-to-hdf)
* [CKL to POA&M](#ckl-to-poam)
* [CycloneDX SBOM to HDF](#cyclonedx-sbom-to-hdf)
* [DBProtect to HDF](#dbprotect-to-hdf)
* [Dependency-Track to HDF](#dependency-track-to-hdf)
* [Fortify to HDF](#fortify-to-hdf)
* [gosec to HDF](#gosec-to-hdf)
* [Ion Channel 2 HDF](#ion-channel-2-hdf)
* [JFrog Xray to HDF](#jfrog-xray-to-hdf)
* [Tenable Nessus to HDF](#tenable-nessus-to-hdf)
* [Microsoft Secure Score to HDF](#microsoft-secure-score-to-hdf)
* [Netsparker to HDF](#netsparker-to-hdf)
* [NeuVector to HDF](#neuvector-to-hdf)
* [Nikto to HDF](#nikto-to-hdf)
* [Prisma to HDF](#prisma-to-hdf)
* [Prowler to HDF](#prowler-to-hdf)
* [Sarif to HDF](#sarif-to-hdf)
* [Scoutsuite to HDF](#scoutsuite-to-hdf)
* [Snyk to HDF](#snyk-to-hdf)
* [SonarQube to HDF](#sonarqube-to-hdf)
* [Splunk to HDF](#splunk-to-hdf)
* [Trivy to HDF](#trivy-to-hdf)
* [Trufflehog to HDF](#trufflehog-to-hdf)
* [Twistlock to HDF](#twistlock-to-hdf)
* [Veracode to HDF](#veracode-to-hdf)
* [XCCDF Results to HDF](#xccdf-results-to-hdf)
* [OWASP ZAP to HDF](#owasp-zap-to-hdf)

### eMASSer Client
* [eMASS API CLI](#emass-api-cli)
### View HDF Summaries and Data
* [View](#view)
* [Heimdall](#heimdall)
* [Summary](#summary)

### Validate HDF Thresholds
* [Validate](#validate)
* [Thresholds](#thresholds)

### Generate Data Reports and More
* [Generate](#generate)
* [Delta](#delta)
* [Delta Supporting Commands](#delta-supporting-options)
* [CKL Templates](#ckl-templates)
* [InSpec Metadata](#inspec-metadata)
* [InSpec Profile](#inspec-profile)
* [Thresholds](#thresholds-1)
* [Spreadsheet (csv/xlsx) to InSpec](#spreadsheet-csvxlsx-to-inspec)
* [DoD Stub vs CIS Stub Formatting](#dod-stub-vs-cis-stub-formatting)
* [Mapping Files](#mapping-files)

### Enhance and Supplement HDF Data
* [Supplement](#supplement)
* [Passthrough](#passthrough)
* [Read](#read)
* [Write](#write)
* [Target](#target)
* [Read](#read-1)
* [Write](#write-1)

### License and Authors
* [License and Author](#license-and-author)
---
## Installation
___
### Installation via NPM
The SAF CLI can be installed and kept up to date using `npm`, which is included with most versions of [NodeJS](https://nodejs.org/en/).
```bash
npm install -g @mitre/saf
```

#### Update via NPM
To update the SAF CLI with `npm`:
```bash
npm update -g @mitre/saf
```
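A quick way to confirm that an install or update worked is to ask the CLI for its version; `--version` is the same flag referenced in the Docker examples below:

```bash
saf --version
```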
[top](#installation)

---
### Installation via Brew
The SAF CLI can be installed and kept up to date using `brew`.
```bash
brew install mitre/saf/saf-cli
```

#### Update via Brew

To update the SAF CLI with `brew`:

```bash
brew upgrade mitre/saf/saf-cli
```
[top](#installation)

---
### Installation via Docker
**On Linux and Mac:**
The docker command below can be used to run the SAF CLI one time, where `arguments` contains the command and flags you want to run, for example `--version` or `view summary -i hdf-results.json`.

```
docker run -it -v$(pwd):/share mitre/saf <arguments>
```

To run the SAF CLI with a persistent shell for one or more commands, use the following, then run each full command, for example `saf --version` or `saf view summary -i hdf-results.json`. You can change the entrypoint you wish to use; for example, run with `--entrypoint sh` to open in a shell terminal. If the specified entrypoint is not found, try using the full path, such as `--entrypoint /bin/bash`.

```
docker run --rm -it --entrypoint bash -v$(pwd):/share mitre/saf
```

**On Windows:**

The docker command below can be used to run the SAF CLI one time, where `arguments` contains the command and flags you want to run, for example `--version` or `view summary -i hdf-results.json`.

```
docker run -it -v%cd%:/share mitre/saf <arguments>
```

To run the SAF CLI with a persistent shell for one or more commands, use the following, then run each full command, for example `saf --version` or `saf view summary -i hdf-results.json`. You can change the entrypoint you wish to use; for example, run with `--entrypoint sh` to open in a shell terminal. If the specified entrypoint is not found, try using the full path, such as `--entrypoint /bin/bash`.

```
docker run --rm -it --entrypoint sh -v%cd%:/share mitre/saf
```

**NOTE:**
Remember to use Docker CLI flags as necessary to run the various subcommands.
For example, to run the `emasser configure` subcommand, you need to pass in a volume that contains your certificates and where you can store the resultant .env. Furthermore, you need to pass in flags for enabling the pseudo-TTY and interactivity.
```
docker run -it -v "$(pwd)":/share mitre/saf emasser configure
```

Other commands might not require the `-i` or `-t` flags and instead only need a bind-mounted volume, such as a file-based `convert`.

```
docker run --rm -v "$(pwd)":/share mitre/saf convert -i test/sample_data/trivy/sample_input_report/trivy-image_golang-1.12-alpine_sample.json -o test.json
```

Other flags exist to open up network ports or pass through environment variables, so make sure to use whichever ones are required to successfully run a command.
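As an illustration, a sketch of what that can look like using Docker's standard `-e` (pass an environment variable) and `-p` (publish a port) flags; the variable name `MY_API_KEY` and port `3000` are illustrative placeholders, not SAF-specific requirements:

```
docker run --rm -e MY_API_KEY -p 3000:3000 -v "$(pwd)":/share mitre/saf <arguments>
```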
#### Update via Docker
To update the SAF CLI with `docker`:
```bash
docker pull mitre/saf:latest
```
[top](#installation)

---
### Installation via Windows Installer
To install the latest release of the SAF CLI on Windows, download and run the most recent installer for your system architecture from the [Releases](https://github.com/mitre/saf/releases) page.
#### Update via Windows Installer
To update the SAF CLI on Windows, uninstall any existing version from your system and then download and run the most recent installer for your system architecture from the [Releases](https://github.com/mitre/saf/releases) page.
[top](#installation)
## Usage
---

### Attest
Attest to 'Not Reviewed' controls: sometimes requirements can't be tested automatically by security tools and hence require manual review, whereby someone interviews people and/or examines a system to confirm (i.e., attest as to) whether the control requirements have been satisfied.
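For context, an attestation file is a set of manual-review records keyed to control IDs. A minimal JSON sketch is below; the field names and values are illustrative only (generate a real file with `saf attest create`, described next):

```json
[
  {
    "control_id": "V-72081",
    "explanation": "Audit policy reviewed with the ISSO; requirement is met",
    "frequency": "quarterly",
    "status": "passed",
    "updated": "2024-01-01",
    "updated_by": "Jane Analyst, Security Team"
  }
]
```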
#### Create Attestations
```
attest create              Create attestation files for use with `saf attest apply`

USAGE
  $ saf attest create -o <value> [-i <value> -t <value>]

FLAGS
  -i, --input=<value>    (optional) An input HDF file to search for controls
  -o, --output=<value>   (required) The output filename
  -t, --format=<value>   [default: json] (optional) The output file type

GLOBAL FLAGS
  -h, --help               Show CLI help
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

EXAMPLES
  $ saf attest create -o attestation.json -i hdf.json

  $ saf attest create -o attestation.xlsx -t xlsx
```
[top](#usage)
#### Apply Attestations
```
attest apply               Apply one or more attestation files to one or more HDF results sets

USAGE
  $ saf attest apply -i <value>... -o <value>

FLAGS
  -i, --input=<value>...   (required) Your input HDF and Attestation file(s)
  -o, --output=<value>     (required) Output file or folder (for multiple executions)

GLOBAL FLAGS
  -h, --help               Show CLI help
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

EXAMPLES
  $ saf attest apply -i hdf.json attestation.json -o new-hdf.json

  $ saf attest apply -i hdf1.json hdf2.json attestation.xlsx -o outputDir
```
[top](#usage)
### Convert

Translating your data to and from Heimdall Data Format (HDF) is done using the `saf convert` command.

Want to recommend or help develop a converter? See [the wiki](https://github.com/mitre/saf/wiki/How-to-recommend-development-of-a-mapper) on how to get started.
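As a quick orientation, a typical round trip converts a scanner result into HDF and then inspects it; both commands below are taken from examples elsewhere in this README (the Burp Suite converter and the `view summary` command):

```
saf convert burpsuite2hdf -i burpsuite_results.xml -o output-hdf-name.json
saf view summary -i output-hdf-name.json
```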
### Convert From HDF
[top](#convert-other-formats-to-hdf)
#### Anchore Grype to HDF
```
convert anchoregrype2hdf    Translate an Anchore Grype output file into an HDF results set

USAGE
  $ saf convert anchoregrype2hdf -i <value> -o <value> [-h] [-w]

FLAGS
  -i, --input=<value>    (required) Input Anchore Grype file
  -o, --output=<value>   (required) Output HDF JSON File
  -w, --includeRaw       Include raw data from the input Anchore Grype file

GLOBAL FLAGS
  -h, --help               Show CLI help
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

EXAMPLES
  $ saf convert anchoregrype2hdf -i anchoregrype.json -o output-hdf-name.json
```

#### HDF to ASFF
***Note:*** Uploading findings into AWS Security Hub requires configuration of the AWS CLI, see [the AWS documentation](https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-configure.html), or configuration of environment variables via Docker.
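If you are using environment variables (for example, when running via Docker), the standard AWS CLI variables apply; a sketch with placeholder values:

```
export AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE        # placeholder
export AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG   # placeholder
export AWS_DEFAULT_REGION=us-east-1
```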
```
convert hdf2asff            Translate a Heimdall Data Format JSON file into
                            AWS Security Findings Format JSON file(s) and/or
                            upload to AWS Security Hub

USAGE
  $ saf convert hdf2asff -a <value> -r <value> -i <value> -t <value> [-h] [-R] (-u [-I -C <value>] | [-o <value>])

FLAGS
  -C, --certificate=<value>      Trusted signing certificate file
  -I, --insecure                 Disable SSL verification, this is insecure.
  -R, --specifyRegionAttribute   Manually specify the top-level `Region` attribute - SecurityHub
                                 populates this attribute automatically and prohibits one from
                                 updating it using `BatchImportFindings` or `BatchUpdateFindings`
  -a, --accountId=<value>        (required) AWS Account ID
  -i, --input=<value>            (required) Input HDF JSON File
  -o, --output=<value>           Output ASFF JSON Folder
  -r, --region=<value>           (required) SecurityHub Region
  -t, --target=<value>           (required) Unique name for target to track findings across time
  -u, --upload                   Upload findings to AWS Security Hub

GLOBAL FLAGS
  -h, --help               Show CLI help
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

EXAMPLES
  Send output to local file system
    $ saf convert hdf2asff -i rhel7-scan_02032022A.json -a 123456789 -r us-east-1 -t rhel7_example_host -o rhel7.asff
  Upload findings to AWS Security Hub
    $ saf convert hdf2asff -i rds_mysql_i123456789scan_03042022A.json -a 987654321 -r us-west-1 -t Instance_i123456789 -u
  Upload findings to AWS Security Hub and send output to local file system
    $ saf convert hdf2asff -i snyk_acme_project5_hdf_04052022A.json -a 2143658798 -r us-east-1 -t acme_project5 -o snyk_acme_project5 -u
```
[top](#convert-hdf-to-other-formats)
#### HDF to Splunk

**Notice**: HDF to Splunk requires configuration on the Splunk server. See [Splunk Configuration](https://github.com/mitre/saf/wiki/Splunk-Configuration).
```
convert hdf2splunk          Translate and upload a Heimdall Data Format JSON file into a Splunk server

USAGE
  $ saf convert hdf2splunk -i <value> -H <value> -I <value> [-h] [-P <value>] [-s http|https] [-u <value> | -t <value>] [-p <value>] [-L info|warn|debug|verbose]

FLAGS
  -H, --host=<value>       (required) Splunk Hostname or IP
  -I, --index=<value>      (required) Splunk index to import HDF data into
  -P, --port=<value>       [default: 8089] Splunk management port (also known as the Universal Forwarder port)
  -i, --input=<value>      (required) Input HDF file
  -p, --password=<value>   Your Splunk password
  -s, --scheme=<value>     [default: https] HTTP Scheme used for communication with splunk
  -t, --token=<value>      Your Splunk API Token
  -u, --username=<value>   Your Splunk username

GLOBAL FLAGS
  -h, --help               Show CLI help
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

EXAMPLES
  User name/password Authentication
    $ saf convert hdf2splunk -i rhel7-results.json -H 127.0.0.1 -u admin -p Valid_password! -I hdf
  Token Authentication
    $ saf convert hdf2splunk -i rhel7-results.json -H 127.0.0.1 -t your.splunk.token -I hdf
```

For HDF Splunk Schema documentation, visit [Heimdall converter schemas](https://github.com/mitre/heimdall2/blob/master/libs/hdf-converters/src/converters-from-hdf/splunk/Schemas.md#schemas).
**Previewing HDF Data Within Splunk:**
An example of a full raw search query:
```sql
index="<>" meta.subtype=control | stats values(meta.filename) values(meta.filetype) list(meta.profile_sha256) values(meta.hdf_splunk_schema) first(meta.status) list(meta.status) list(meta.is_baseline) values(title) last(code) list(code) values(desc) values(descriptions.*) values(id) values(impact) list(refs{}.*) list(results{}.*) list(source_location{}.*) values(tags.*) by meta.guid id
| join meta.guid
[search index="<>" meta.subtype=header | stats values(meta.filename) values(meta.filetype) values(meta.hdf_splunk_schema) list(statistics.duration) list(platform.*) list(version) by meta.guid]
| join meta.guid
[search index="<>" meta.subtype=profile | stats values(meta.filename) values(meta.filetype) values(meta.hdf_splunk_schema) list(meta.profile_sha256) list(meta.is_baseline) last(summary) list(summary) list(sha256) list(supports{}.*) last(name) list(name) list(copyright) list(maintainer) list(copyright_email) last(version) list(version) list(license) list(title) list(parent_profile) list(depends{}.*) list(controls{}.*) list(attributes{}.*) list(status) by meta.guid]```
An example of a formatted table search query:
```sql
index="<>" meta.subtype=control | stats values(meta.filename) values(meta.filetype) list(meta.profile_sha256) values(meta.hdf_splunk_schema) first(meta.status) list(meta.status) list(meta.is_baseline) values(title) last(code) list(code) values(desc) values(descriptions.*) values(id) values(impact) list(refs{}.*) list(results{}.*) list(source_location{}.*) values(tags.*) by meta.guid id
| join meta.guid
[search index="<>" meta.subtype=header | stats values(meta.filename) values(meta.filetype) values(meta.hdf_splunk_schema) list(statistics.duration) list(platform.*) list(version) by meta.guid]
| join meta.guid
[search index="<>" meta.subtype=profile | stats values(meta.filename) values(meta.filetype) values(meta.hdf_splunk_schema) list(meta.profile_sha256) list(meta.is_baseline) last(summary) list(summary) list(sha256) list(supports{}.*) last(name) list(name) list(copyright) list(maintainer) list(copyright_email) last(version) list(version) list(license) list(title) list(parent_profile) list(depends{}.*) list(controls{}.*) list(attributes{}.*) list(status) by meta.guid]
| rename values(meta.filename) AS "Results Set", values(meta.filetype) AS "Scan Type", list(statistics.duration) AS "Scan Duration", first(meta.status) AS "Control Status", list(results{}.status) AS "Test(s) Status", id AS "ID", values(title) AS "Title", values(desc) AS "Description", values(impact) AS "Impact", last(code) AS Code, values(descriptions.check) AS "Check", values(descriptions.fix) AS "Fix", values(tags.cci{}) AS "CCI IDs", list(results{}.code_desc) AS "Results Description", list(results{}.skip_message) AS "Results Skip Message (if applicable)", values(tags.nist{}) AS "NIST SP 800-53 Controls", last(name) AS "Scan (Profile) Name", last(summary) AS "Scan (Profile) Summary", last(version) AS "Scan (Profile) Version"
| table meta.guid "Results Set" "Scan Type" "Scan (Profile) Name" ID "NIST SP 800-53 Controls" Title "Control Status" "Test(s) Status" "Results Description" "Results Skip Message (if applicable)" Description Impact Severity Check Fix "CCI IDs" Code "Scan Duration" "Scan (Profile) Summary" "Scan (Profile) Version"
```
[top](#convert-hdf-to-other-formats)
#### HDF to XCCDF Results
```
convert hdf2xccdf           Translate an HDF file into an XCCDF XML

USAGE
  $ saf convert hdf2xccdf -i <value> -o <value> [-h]

FLAGS
  -i, --input=<value>    (required) Input HDF file
  -o, --output=<value>   (required) Output XCCDF XML File

GLOBAL FLAGS
  -h, --help               Show CLI help
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

EXAMPLES
  $ saf convert hdf2xccdf -i hdf_input.json -o xccdf-results.xml
```
[top](#convert-hdf-to-other-formats)
#### HDF to Checklist
```
convert hdf2ckl             Translate a Heimdall Data Format JSON file into a
                            DISA checklist file

USAGE
  $ saf convert hdf2ckl -i <value> -o <value> [-h] [-m <value>] [--profilename <value>] [--profiletitle <value>] [--version <value>] [--releasenumber <value>] [--releasedate <value>] [--marking <value>] [-H <value>] [-I <value>] [-M <value>] [-F <value>] [--targetcomment <value>] [--role Domain Controller|Member Server|None|Workstation] [--assettype Computing|Non-Computing] [--techarea Application Review|Boundary Security|CDS Admin Review|CDS Technical Review|Database Review|Domain Name System (DNS)|Exchange Server|Host Based System Security (HBSS)|Internal Network|Mobility|Other Review|Releasable Networks (REL)|Releaseable Networks (REL)|Traditional Security|UNIX OS|VVOIP Review|Web Review|Windows OS] [--stigguid <value>] [--targetkey <value>] [--webdbsite <value> --webordatabase] [--webdbinstance <value>] [--vulidmapping gid|id]

FLAGS
  -h, --help             Show CLI help.
  -i, --input=<value>    (required) Input HDF file
  -o, --output=<value>   (required) Output CKL file

CHECKLIST METADATA FLAGS
  -F, --fqdn=<value>            Fully Qualified Domain Name
  -H, --hostname=<value>        The name assigned to the asset within the network
  -I, --ip=<value>              IP address
  -M, --mac=<value>             MAC address
  -m, --metadata=<value>        Metadata JSON file, generate one with "saf generate ckl_metadata"
      --assettype=<value>       The category or classification of the asset
      --marking=<value>         A security classification or designation of the asset, indicating its sensitivity level
      --profilename=<value>     Profile name
      --profiletitle=<value>    Profile title
      --releasedate=<value>     Profile release date
      --releasenumber=<value>   Profile release number
      --role=<value>            The primary function or role of the asset within the network or organization
      --stigguid=<value>        A unique identifier associated with the STIG for the asset
      --targetcomment=<value>   Additional comments or notes about the asset
      --targetkey=<value>       A unique key or identifier for the asset within the checklist or inventory system
      --techarea=<value>        The technical area or domain to which the asset belongs
      --version=<value>         Profile version number
      --vulidmapping=<value>    Which type of control identifier to map to the checklist ID
      --webdbinstance=<value>   The specific instance of the web application or database running on the server
      --webdbsite=<value>       The specific site or application hosted on the web or database server
      --webordatabase           Indicates whether the STIG is primarily for either a web or database server

DESCRIPTION
  Translate a Heimdall Data Format JSON file into a DISA checklist file

EXAMPLES
  $ saf convert hdf2ckl -i rhel7-results.json -o rhel7.ckl --fqdn reverseproxy.example.org --hostname reverseproxy --ip 10.0.0.3 --mac 12:34:56:78:90:AB

  $ saf convert hdf2ckl -i rhel8-results.json -o rhel8.ckl -m rhel8-metadata.json
```
[top](#convert-hdf-to-other-formats)
#### HDF to CSV
```
convert hdf2csv             Translate a Heimdall Data Format JSON file into a
                            Comma Separated Values (CSV) file

USAGE
  $ saf convert hdf2csv -i <value> -o <value> [-h] [-f <value>] [-t]

FLAGS
  -f, --fields=<value>   [default: All Fields] Fields to include in output CSV, separated by commas
  -i, --input=<value>    (required) Input HDF file
  -o, --output=<value>   (required) Output CSV file
  -t, --noTruncate       Don't truncate fields longer than 32,767 characters (the cell limit in Excel)

GLOBAL FLAGS
  -h, --help               Show CLI help
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

EXAMPLES
  Running the CLI interactively
    $ saf convert hdf2csv --interactive
  Providing flags at the command line
    $ saf convert hdf2csv -i rhel7-results.json -o rhel7.csv --fields "Results Set,Status,ID,Title,Severity"
```
[top](#convert-hdf-to-other-formats)
#### HDF to Condensed JSON
```
convert hdf2condensed       Condensed format used by some community members
                            to pre-process data for elasticsearch and custom dashboards

USAGE
  $ saf convert hdf2condensed -i <value> -o <value> [-h]

FLAGS
  -i, --input=<value>    (required) Input HDF file
  -o, --output=<value>   (required) Output condensed JSON file

GLOBAL FLAGS
  -h, --help               Show CLI help
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

EXAMPLES
  $ saf convert hdf2condensed -i rhel7-results.json -o rhel7-condensed.json
```
[top](#convert-hdf-to-other-formats)

---
### Convert To HDF

#### ASFF to HDF
Output|Use|Command
---|---|---
ASFF json|All the findings that will be fed into the mapper|aws securityhub get-findings > asff.json
AWS SecurityHub enabled standards json|Get all the enabled standards so you can get their identifiers|aws securityhub get-enabled-standards > asff_standards.json
AWS SecurityHub standard controls json|Get all the controls for a standard that will be fed into the mapper|aws securityhub describe-standards-controls --standards-subscription-arn "arn:aws:securityhub:us-east-1:123456789123:subscription/cis-aws-foundations-benchmark/v/1.2.0" > asff_cis_standard.json

```
convert asff2hdf            Translate an AWS Security Finding Format JSON into
                            Heimdall Data Format JSON file(s)

USAGE
  $ saf convert asff2hdf -o <value> [-h] (-i <value> [--securityhub <value>]... | -a -r <value> [-I | -C <value>] [-t <value>]) [-L info|warn|debug|verbose]

FLAGS
  -C, --certificate=<value>    Trusted signing certificate file
  -I, --insecure               Disable SSL verification, this is insecure
  -H, --securityHub=<value>    Additional input files to provide context that an ASFF file needs
                               such as the CIS AWS Foundations or AWS Foundational Security Best
                               Practices documents (in ASFF compliant JSON form)
  -a, --aws                    Pull findings from AWS Security Hub
  -i, --input=<value>          (required if not using AWS) Input ASFF JSON file
  -o, --output=<value>         (required) Output HDF JSON folder
  -r, --region=<value>         Security Hub region to pull findings from
  -t, --target=<value>...      Target ID(s) to pull from Security Hub (maximum 10), leave blank for non-HDF findings

GLOBAL FLAGS
  -h, --help               Show CLI help
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

EXAMPLES
  Using ASFF JSON file
    $ saf convert asff2hdf -i asff-findings.json -o output-folder-name
  Using ASFF JSON file with additional input files
    $ saf convert asff2hdf -i asff-findings.json --securityhub <standard-1.json> ... --securityhub <standard-n.json> -o output-folder-name
  Using AWS to pull ASFF JSON findings
    $ saf convert asff2hdf --aws -o out -r us-west-2 --target rhel7
```
[top](#convert-other-formats-to-hdf)
#### AWS Config to HDF

***Note:*** Pulling AWS Config results data requires configuration of the AWS CLI, see [the AWS documentation](https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-configure.html), or configuration of environment variables via Docker.
```
convert aws_config2hdf      Pull Configuration findings from AWS Config and convert
                            into a Heimdall Data Format JSON file

USAGE
  $ saf convert aws_config2hdf -r <value> -o <value> [-h] [-a <value>] [-s <value>] [-t <value>] [-i]

FLAGS
  -a, --accessKeyId=<value>       Access key ID
  -i, --insecure                  Disable SSL verification, this is insecure.
  -o, --output=<value>            (required) Output HDF JSON File
  -r, --region=<value>            (required) Region to pull findings from
  -s, --secretAccessKey=<value>   Secret access key
  -t, --sessionToken=<value>      Session token

GLOBAL FLAGS
  -h, --help               Show CLI help
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

EXAMPLES
  $ saf convert aws_config2hdf -a ABCDEFGHIJKLMNOPQRSTUV -s +4NOT39A48REAL93SECRET934 -r us-east-1 -o output-hdf-name.json
```
[top](#convert-other-formats-to-hdf)
#### Burp Suite to HDF
```
convert burpsuite2hdf       Translate a BurpSuite Pro XML file into a Heimdall
                            Data Format JSON file

USAGE
  $ saf convert burpsuite2hdf -i <value> -o <value> [-h] [-w]

FLAGS
  -i, --input=<value>    (required) Input Burpsuite Pro XML File
  -o, --output=<value>   (required) Output HDF JSON File
  -w, --includeRaw       Include raw input file in HDF JSON file

GLOBAL FLAGS
  -h, --help               Show CLI help
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

EXAMPLES
  $ saf convert burpsuite2hdf -i burpsuite_results.xml -o output-hdf-name.json
```
[top](#convert-other-formats-to-hdf)
#### CKL to POA&M

Note: The included CCI-to-NIST mappings are extracted from NIST.gov; for mappings specific to eMASS, use [this](https://github.com/mitre/ckl2POAM/blob/main/resources/cci2nist.json) file instead.
```
convert ckl2POAM            Translate DISA Checklist CKL file(s) to POA&M files

USAGE
  $ saf convert ckl2POAM -i <value> -o <value> [-h] [-O <value>] [-d <value>] [-s <value>]

FLAGS
  -O, --officeOrg=<value>    Default value for Office/org (prompts for each file if not set)
  -d, --deviceName=<value>   Name of target device (prompts for each file if not set)
  -i, --input=<value>...     (required) Path to the DISA Checklist File(s)
  -o, --output=<value>       (required) Path to output POA&M File(s)
  -s, --rowsToSkip=<value>   [default: 4] Rows to leave between POA&M Items for milestones

GLOBAL FLAGS
  -h, --help               Show CLI help
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

ALIASES
  $ saf convert ckl2poam

EXAMPLES
  $ saf convert ckl2POAM -i checklist_file.ckl -o output-folder -d abcdefg -s 2
```

[top](#convert-other-formats-to-hdf)
#### CycloneDX SBOM to HDF

Note: Currently, only the CycloneDX SBOM, VEX, and HBOM formats are officially supported by the CycloneDX SBOM convert command (formats like SaaSBOM are NOT supported and will result in errors). To convert other non-CycloneDX SBOM formats, first convert your current SBOM data file into the CycloneDX SBOM data format with [their provided utility](https://github.com/CycloneDX/cyclonedx-cli) and then convert the CycloneDX SBOM file to OHDF with the `saf convert cyclonedx_sbom2hdf` command.

For example, to convert the SPDX SBOM format to the CycloneDX SBOM format using the [CycloneDX CLI](https://github.com/CycloneDX/cyclonedx-cli):

```
cyclonedx-cli convert --input-file spdx-sbom.json --output-file cyclonedx-sbom.json --input-format spdxjson --output-format json
```

Then use the resulting CycloneDX SBOM file to convert to OHDF.
```
convert cyclonedx_sbom2hdf  Translate a CycloneDX SBOM report into an HDF results set

USAGE
  $ saf convert cyclonedx_sbom2hdf -i <value> -o <value> [-h] [-w]

FLAGS
  -i, --input=<value>    (required) Input CycloneDX SBOM File
  -o, --output=<value>   (required) Output HDF JSON File
  -w, --includeRaw       Include raw input file in HDF JSON file

GLOBAL FLAGS
  -h, --help               Show CLI help
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

EXAMPLES
  $ saf convert cyclonedx_sbom2hdf -i cyclonedx_sbom.json -o output-hdf-name.json
```

[top](#convert-other-formats-to-hdf)
#### DBProtect to HDF
```
convert dbprotect2hdf Translate a DBProtect report in "Check Results
Details" XML format into a Heimdall Data Format JSON file
USAGE
  $ saf convert dbprotect2hdf -i <value> -o <value> [-h] [-w]

FLAGS
  -i, --input=<value>    (required) 'Check Results Details' XML File
  -o, --output=<value>   (required) Output HDF JSON File
  -w, --includeRaw       Include raw input file in HDF JSON file

GLOBAL FLAGS
  -h, --help               Show CLI help
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

EXAMPLES
  $ saf convert dbprotect2hdf -i check_results_details_report.xml -o output-hdf-name.json
```

[top](#convert-other-formats-to-hdf)
#### Dependency-Track to HDF
```
convert dependency_track2hdf Translate a Dependency-Track results JSON
file into a Heimdall Data Format JSON file
USAGE
  $ saf convert dependency_track2hdf -i <value> -o <value> [-h] [-w]

FLAGS
  -h, --help             Show CLI help.
  -i, --input=<value>    (required) Input Dependency-Track FPF file
  -o, --output=<value>   (required) Output HDF file
  -w, --with-raw

GLOBAL FLAGS
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

EXAMPLES
  $ saf convert dependency_track2hdf -i dt-fpf.json -o output-hdf-name.json
```

[top](#convert-other-formats-to-hdf)
#### Fortify to HDF
```
convert fortify2hdf Translate a Fortify results FVDL file into a Heimdall
Data Format JSON file; the FVDL file is an XML that can be
extracted from the Fortify FPR project file using standard
file compression tools
USAGE
  $ saf convert fortify2hdf -i <value> -o <value> [-h] [-w]

FLAGS
  -i, --input=<value>    (required) Input FVDL File
  -o, --output=<value>   (required) Output HDF JSON File
  -w, --includeRaw       Include raw input file in HDF JSON file

GLOBAL FLAGS
  -h, --help               Show CLI help
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

EXAMPLES
  $ saf convert fortify2hdf -i audit.fvdl -o output-hdf-name.json
```

[top](#convert-other-formats-to-hdf)
#### gosec to HDF
```
convert gosec2hdf Translate a gosec (Golang Security Checker) results file
into a Heimdall Data Format JSON file
USAGE
  $ saf convert gosec2hdf -i <value> -o <value> [-h] [-w]

FLAGS
  -h, --help             Show CLI help.
  -i, --input=<value>    (required) Input gosec Results JSON File
  -o, --output=<value>   (required) Output HDF JSON File
  -w, --includeRaw       Include raw input file in HDF JSON file

GLOBAL FLAGS
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

EXAMPLES
  $ saf convert gosec2hdf -i gosec_results.json -o output-hdf-name.json
```

[top](#convert-other-formats-to-hdf)
#### Ion Channel 2 HDF
```
convert ionchannel2hdf Pull and translate SBOM data from Ion Channel
into Heimdall Data Format
USAGE
  $ saf convert ionchannel2hdf -o <value> [-h] (-i <value> | -a <value> -t <value> [--raw] [-p <value>] [-A]) [-L info|warn|debug|verbose]

FLAGS
  -A, --allProjects          Pull all projects available within your team
  -a, --apiKey=<value>       API Key from Ion Channel user settings
  -i, --input=<value>...     Input IonChannel JSON file
  -o, --output=<value>       (required) Output JSON folder
  -p, --project=<value>...   The name of the project(s) you would like to pull
  -t, --teamName=<value>     Your team name that contains the project(s) you would like to pull data from
      --raw                  Output Ion Channel raw data

GLOBAL FLAGS
  -h, --help               Show CLI help
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

EXAMPLES
  Using Input IonChannel JSON file
    $ saf convert ionchannel2hdf -o output-folder-name -i ion-channel-file.json
  Using IonChannel API Key (pull one project)
    $ saf convert ionchannel2hdf -o output-folder-name -a ion-channel-apikey -t team-name -p project-name-to-pull --raw
  Using IonChannel API Key (pull all projects)
    $ saf convert ionchannel2hdf -o output-folder-name -a ion-channel-apikey -t team-name -A --raw
```
[top](#convert-other-formats-to-hdf)
#### JFrog Xray to HDF
```
convert jfrog_xray2hdf Translate a JFrog Xray results JSON file into a
Heimdall Data Format JSON file
USAGE
  $ saf convert jfrog_xray2hdf -i <value> -o <value> [-h] [-w]

FLAGS
  -i, --input=<value>    (required) Input JFrog JSON File
  -o, --output=<value>   (required) Output HDF JSON File
  -w, --includeRaw       Include raw input file in HDF JSON file

GLOBAL FLAGS
  -h, --help               Show CLI help
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

EXAMPLES
  $ saf convert jfrog_xray2hdf -i xray_results.json -o output-hdf-name.json
```

[top](#convert-other-formats-to-hdf)
#### Tenable Nessus to HDF
```
convert nessus2hdf          Translate a Nessus XML results file into a Heimdall Data Format JSON file.
                            The current iteration maps all plugin families except for 'Policy Compliance'.
                            A separate HDF JSON is generated for each host reported in the Nessus Report.

USAGE
  $ saf convert nessus2hdf -i <value> -o <value> [-h] [-w]

FLAGS
  -i, --input=<value>    (required) Input Nessus XML File
  -o, --output=<value>   (required) Output HDF JSON File
  -w, --includeRaw       Include raw input file in HDF JSON file

GLOBAL FLAGS
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

EXAMPLES
  $ saf convert nessus2hdf -i nessus_results.xml -o output-hdf-name.json
```

[top](#convert-other-formats-to-hdf)
#### Microsoft Secure Score to HDF
Output|Use|Command
---|---|---
Microsoft Secure Score JSON|This file contains the Graph API response for the `security/secureScore` endpoint|PowerShell: `Get-MgSecuritySecureScore -Top 500`
Microsoft Secure Score Control Profiles JSON|This file contains the Graph API response for the `security/secureScoreControlProfiles` endpoint|PowerShell: `Get-MgSecuritySecureScoreControlProfile -Top 500`
Combined JSON|Combine the outputs from the `security/secureScore` and `security/secureScoreControlProfiles` endpoints|`jq -s '{"secureScore": .[0], "profiles": .[1]}' secureScore.json secureScoreControlProfiles.json`

```
convert msft_secure2hdf     Translate a Microsoft Secure Score report and Secure Score Control to a Heimdall Data Format JSON file

USAGE
  $ saf convert msft_secure2hdf -p <value> -r <value> -o <value> [-h]
  $ saf convert msft_secure2hdf -t <value> -a <value> -s <value> -o <value> [-h]
  $ saf convert msft_secure2hdf -i <value> -o <value> [-h]

FLAGS
  -C, --certificate=<value>      Trusted signing certificate file
  -I, --insecure                 Disable SSL verification, this is insecure.
  -a, --appId=<value>            Azure application ID
  -i, --combinedInputs=<value>   JSON File combining the outputs from the Microsoft Graph API endpoints
                                 {secureScore: ..., profiles: ...}
  -o, --output=<value>           (required) Output HDF JSON file
  -p, --inputProfiles=<value>    Input Microsoft Graph API "GET /security/secureScoreControlProfiles" output JSON File
  -r, --inputScoreDoc=<value>    Input Microsoft Graph API "GET /security/secureScores" output JSON File
  -s, --appSecret=<value>        Azure application secret
  -t, --tenantId=<value>         Azure tenant ID
  -w, --includeRaw               Include raw input file in HDF JSON file

GLOBAL FLAGS
  -h, --help               Show CLI help
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

EXAMPLES
  Using input files
    $ saf convert msft_secure2hdf -p secureScore.json -r secureScoreControlProfiles.json -o output-hdf-name.json [-w]
  Using Azure tenant ID
    $ saf convert msft_secure2hdf -t "12345678-1234-1234-1234-1234567890abcd" \
        -a "12345678-1234-1234-1234-1234567890abcd" \
        -s "aaaaa~bbbbbbbbbbbbbbbbbbbbbbbbb-cccccccc" \
        -o output-hdf-name.json [-I | -C <value>]
  Using combined inputs
    $ saf convert msft_secure2hdf -i <(jq '{"secureScore": .[0], "profiles": .[1]}' secureScore.json secureScoreControlProfiles.json) \
        -o output-hdf-name.json [-w]
```
[top](#convert-other-formats-to-hdf)
#### Netsparker to HDF
```
convert netsparker2hdf Translate a Netsparker XML results file into a
Heimdall Data Format JSON file. The current
iteration only works with Netsparker Enterprise
Vulnerabilities Scan.
USAGE
  $ saf convert netsparker2hdf -i <value> -o <value> [-h] [-w]

FLAGS
  -i, --input=<value>    (required) Input Netsparker XML File
  -o, --output=<value>   (required) Output HDF JSON File
  -w, --includeRaw       Include raw input file in HDF JSON file

GLOBAL FLAGS
  -h, --help               Show CLI help
  -L, --logLevel=<value>   [default: info] Specify level for logging (if implemented by the CLI command)
      --interactive        Collect input tags interactively (not available on all CLI commands)

EXAMPLES
  $ saf convert netsparker2hdf -i netsparker_results.xml -o output-hdf-name.json
```

[top](#convert-other-formats-to-hdf)
#### NeuVector to HDF
```
convert neuvector2hdf       Translate a NeuVector results JSON to a Heimdall Data Format JSON file

USAGE
$ saf convert neuvector2hdf -i