{"id":20770714,"url":"https://github.com/codemonauts/s3-ecouploader","last_synced_at":"2025-09-28T02:30:35.745Z","repository":{"id":82653946,"uuid":"235424999","full_name":"codemonauts/s3-ecouploader","owner":"codemonauts","description":"Sync a folder to an S3 bucket while using as little RAM, bandwidth and storage space as possible","archived":false,"fork":false,"pushed_at":"2020-02-12T10:48:27.000Z","size":13,"stargazers_count":4,"open_issues_count":0,"forks_count":1,"subscribers_count":3,"default_branch":"master","last_synced_at":"2025-04-09T14:12:53.641Z","etag":null,"topics":["hacktoberfest"],"latest_commit_sha":null,"homepage":"","language":"Go","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/codemonauts.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2020-01-21T19:31:42.000Z","updated_at":"2024-10-24T19:38:48.000Z","dependencies_parsed_at":null,"dependency_job_id":"9d0b0df1-83bf-4a30-a538-1c8e62e447e6","html_url":"https://github.com/codemonauts/s3-ecouploader","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/codemonauts/s3-ecouploader","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/codemonauts%2Fs3-ecouploader","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/codemonauts%2Fs3-ecouploader/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/codemonauts%2Fs3-ecouploader/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/codemonauts%2Fs3-ecouploader/manifests","own
er_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/codemonauts","download_url":"https://codeload.github.com/codemonauts/s3-ecouploader/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/codemonauts%2Fs3-ecouploader/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":277315624,"owners_count":25797669,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-09-28T02:00:08.834Z","response_time":79,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["hacktoberfest"],"created_at":"2024-11-17T12:11:33.943Z","updated_at":"2025-09-28T02:30:35.401Z","avatar_url":"https://github.com/codemonauts.png","language":"Go","readme":"# s3-ecouploader\n\nThis tool was created to run on the embedded low-end NAS of one of our customers, which was running a very limited\nLinux, so we decided to use Go and compile a static binary. To also save traffic and storage costs, the tool first\nchecks whether a file already exists in the destination bucket. If so, it also compares the `ETag` provided by S3 (which, for single-part uploads, is just the\nMD5 hash of the file). 
This way, a file is uploaded only if it is not yet present in the bucket or has changed since the last\nupload.\n\n## Credentials\nThe tool reads its credentials from the default awscli credentials file located at `~/.aws/credentials`.\n\n## Usage\n```\n  -bucket string\n        Destination S3 Bucket\n  -region string\n        Region of the S3 Bucket\n  -src string\n        Local folder to backup\n  -dest string\n        Remote prefix for S3\n  -debug\n        Enable debug logging\n  -force\n        Skip hashing and upload all files\n  -stdin\n        Read a list of files from stdin\n```\n\n## Optimize runtime for very big folders\nWhen running this tool every day on a big folder, most of the files have not been touched since the last run and therefore\ndon't need to be processed again. You can use `find` to get a list of only those files that were modified a short while\nago, which significantly reduces the number of files the tool has to process. We run our backup\nscripts every night and start the tool like this:\n```\nfind /mnt/data -mtime -2 -type f | ./s3-ecouploader -stdin ....\n```\nThis command selects all regular files (`-type f`) whose modification timestamp (`-mtime`) lies between now and two days ago\n(`-2`).\n\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fcodemonauts%2Fs3-ecouploader","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fcodemonauts%2Fs3-ecouploader","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fcodemonauts%2Fs3-ecouploader/lists"}