{"id":14158673,"url":"https://github.com/curl/wcurl","last_synced_at":"2026-01-06T07:11:57.051Z","repository":{"id":247155092,"uuid":"822755345","full_name":"curl/wcurl","owner":"curl","description":"a simple wrapper around curl to easily download files","archived":false,"fork":false,"pushed_at":"2025-04-20T22:53:13.000Z","size":105,"stargazers_count":313,"open_issues_count":5,"forks_count":15,"subscribers_count":11,"default_branch":"main","last_synced_at":"2025-05-11T19:34:25.560Z","etag":null,"topics":["command-line","curl","download","url"],"latest_commit_sha":null,"homepage":"https://curl.se/wcurl","language":"Shell","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"other","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/curl.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":"AUTHORS","dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2024-07-01T18:52:09.000Z","updated_at":"2025-05-09T13:34:33.000Z","dependencies_parsed_at":"2024-12-29T12:02:33.974Z","dependency_job_id":"b203d81d-dd59-4a0f-8aea-12c4cc00fdaa","html_url":"https://github.com/curl/wcurl","commit_stats":{"total_commits":58,"total_committers":8,"mean_commits":7.25,"dds":"0.39655172413793105","last_synced_commit":"898c516d710571862e911e62b127499236b68e3d"},"previous_names":["debian/wcurl","curl/wcurl"],"tags_count":8,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/curl%2Fwcurl","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/curl%2Fwcurl/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/curl%2Fwcurl/releases","manifests_url":"https:
//repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/curl%2Fwcurl/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/curl","download_url":"https://codeload.github.com/curl/wcurl/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":254436944,"owners_count":22070946,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["command-line","curl","download","url"],"created_at":"2024-08-17T09:02:28.438Z","updated_at":"2026-01-06T07:11:57.046Z","avatar_url":"https://github.com/curl.png","language":"Shell","readme":"\u003c!--\nCopyright (C) Samuel Henrique \u003csamueloph@debian.org\u003e, Sergio Durigan\nJunior \u003csergiodj@debian.org\u003e and many contributors, see the AUTHORS\nfile.\n\nSPDX-License-Identifier: curl\n--\u003e\n\n# [![wcurl logo](https://curl.se/logo/wcurl-logo.svg)](https://curl.se/wcurl)\n\n# Install wcurl\n\nFirst check whether your distro/OS vendor ships `wcurl` as part of their official\nrepositories; `wcurl` might be shipped as part of the `curl` package.\n\nIf they do not ship it, consider making a request for it.\n\nYou can always install wcurl by simply downloading the script:\n\n```sh\ncurl -fLO https://github.com/curl/wcurl/releases/latest/download/wcurl\nchmod +x wcurl\nsudo mv wcurl /usr/local/bin/wcurl\n```\n\n# Install wcurl's manpage\n\n```sh\ncurl -fLO https://github.com/curl/wcurl/releases/latest/download/wcurl.1\nsudo mkdir -p /usr/local/share/man/man1/\nsudo mv wcurl.1 /usr/local/share/man/man1/wcurl.1\nsudo mandb\n```\n\n# wcurl(1)\n\n**wcurl**\n\n* a simple wrapper 
around curl to easily download files.\n\n# Synopsis\n\n```text\nwcurl \u003cURL\u003e...\nwcurl [--curl-options \u003cCURL_OPTIONS\u003e]... [--no-decode-filename] [-o|-O|--output \u003cPATH\u003e] [--dry-run] [--] \u003cURL\u003e...\nwcurl [--curl-options=\u003cCURL_OPTIONS\u003e]... [--no-decode-filename] [--output=\u003cPATH\u003e] [--dry-run] [--] \u003cURL\u003e...\nwcurl -V|--version\nwcurl -h|--help\n```\n\n# Description\n\n**wcurl** is a simple curl wrapper that lets you use curl to download files\nwithout having to remember any parameters.\n\nSimply call **wcurl** with a list of URLs you want to download and **wcurl** picks\nsane defaults.\n\nIf you need anything more complex, you can provide any of curl's supported\nparameters via the `--curl-options` option. Just beware that you likely\nshould be using curl directly if your use case is not covered.\n\n* By default, **wcurl** does:\n  * Percent-encode whitespace in URLs;\n  * Download multiple URLs in parallel if the installed curl's version is \u003e= 7.66.0 (`--parallel`);\n  * Limit parallel connections to 5 per protocol + hostname + port number target if the installed curl's version is \u003e= 8.16.0 (`--parallel-max-host`);\n  * Follow redirects;\n  * Automatically choose a filename as output;\n  * Avoid overwriting files if the installed curl's version is \u003e= 7.83.0 (`--no-clobber`);\n  * Perform retries;\n  * Set the downloaded file timestamp to the value provided by the server, if available;\n  * Disable **curl**'s URL globbing parser so `{}` and `[]` characters in URLs are not treated specially;\n  * Percent-decode the resulting filename;\n  * Use "index.html" as default filename if there is none in the URL.\n\n# Options\n\n* `--curl-options, --curl-options=\u003cCURL_OPTIONS\u003e`...\n\n  Specify extra options to be passed when invoking curl. 
May be specified more than once.\n\n* `-o, -O, --output, --output=\u003cPATH\u003e`\n\n  Use the provided output path instead of getting it from the URL. If multiple\n  URLs are provided, resulting files share the same name with a number appended to\n  the end (curl \u003e= 7.83.0). If this option is provided multiple times, only the\n  last value is considered.\n\n* `--no-decode-filename`\n\n  Do not percent-decode the output filename, even if the percent-encoding in the\n  URL was done by wcurl, e.g. when the URL contained whitespace.\n\n* `--dry-run`\n\n  Do not actually execute curl; just print what would be invoked.\n\n* `-V, --version`\n\n  Print version information.\n\n* `-h, --help`\n\n  Print help message.\n\n# CURL_OPTIONS\n\nAny option supported by curl can be set here. This is not used by wcurl; it is\ninstead forwarded to the curl invocation.\n\n# URL\n\nURL to be downloaded. Anything that is not a parameter is considered\na URL. Whitespace is percent-encoded and the URL is passed to curl, which\nthen performs the parsing. May be specified more than once.\n\n# Examples\n\n* Download a single file:\n\n  ```sh\n  wcurl example.com/filename.txt\n  ```\n\n* Download two files in parallel:\n\n  ```sh\n  wcurl example.com/filename1.txt example.com/filename2.txt\n  ```\n\n* Download a file passing the `--progress-bar` and `--http2` flags to curl:\n\n  ```sh\n  wcurl --curl-options="--progress-bar --http2" example.com/filename.txt\n  ```\n\n* Resume from an interrupted download. The options necessary to resume the download (`--clobber --continue-at -`) must be the **last** options specified in `--curl-options`. 
Note that the only way to resume interrupted downloads is to allow wcurl to overwrite the destination file:\n\n  ```sh\n  wcurl --curl-options="--clobber --continue-at -" example.com/filename.txt\n  ```\n\n* Download multiple files without a limit of concurrent connections per host (the default limit is 5):\n\n  ```sh\n  wcurl --curl-options="--parallel-max-host 0" example.com/filename1.txt example.com/filename2.txt\n  ```\n\n# Running the testsuite\n\nTo run the tests, you first need to install the\n`shunit2` package. On both Debian-like and Fedora-like systems, the\npackage is called `shunit2`.\n\nAfter that, you can run the testsuite by simply invoking the test\nscript:\n\n```sh\n./tests/tests.sh\n```\n\n# Lint\n\nTo lint the shell scripts, you need to install `shellcheck` and `checkbashisms`. These tools check the scripts for issues and ensure they follow best practices.\n\n* On Debian-like systems: `apt install shellcheck devscripts`\n* On Fedora-like systems: `dnf install shellcheck devscripts`\n\nAfter installation, you can run `shellcheck` and `checkbashisms` by executing the following commands:\n\n```sh\nshellcheck wcurl ./tests/*\n\ncheckbashisms wcurl ./tests/*\n```\n\n# Authors\n\nSamuel Henrique \u0026lt;[samueloph@debian.org](mailto:samueloph@debian.org)\u0026gt;  \nSergio Durigan Junior \u0026lt;[sergiodj@debian.org](mailto:sergiodj@debian.org)\u0026gt;  \nand many contributors; see the AUTHORS file.\n\n# Reporting Bugs\n\nIf you experience any problems with **wcurl** that you do not experience with curl,\nsubmit an issue [here](https://github.com/curl/wcurl/issues).\n\n# Copyright\n\n**wcurl** is licensed under the curl 
license\n","funding_links":[],"categories":["Shell"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fcurl%2Fwcurl","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fcurl%2Fwcurl","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fcurl%2Fwcurl/lists"}