{"id":20899179,"url":"https://github.com/geekzter/files-sync","last_synced_at":"2025-12-25T09:19:14.094Z","repository":{"id":39711554,"uuid":"420926109","full_name":"geekzter/files-sync","owner":"geekzter","description":"PowerShell script wrappers for AzCopy \u0026 rsync","archived":false,"fork":false,"pushed_at":"2025-01-11T19:52:51.000Z","size":70,"stargazers_count":0,"open_issues_count":0,"forks_count":0,"subscribers_count":2,"default_branch":"main","last_synced_at":"2025-01-19T12:54:03.252Z","etag":null,"topics":["azcopy","azure-storage","file-sync","file-upload","lightroom","powershell","rsync","rsync-wrapper"],"latest_commit_sha":null,"homepage":"","language":"PowerShell","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/geekzter.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2021-10-25T07:46:44.000Z","updated_at":"2025-01-11T19:52:50.000Z","dependencies_parsed_at":"2024-01-21T13:30:37.065Z","dependency_job_id":"c239bdbb-84b3-4fd1-a9f4-6145bee7f3e3","html_url":"https://github.com/geekzter/files-sync","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/geekzter%2Ffiles-sync","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/geekzter%2Ffiles-sync/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/geekzter%2Ffiles-sync/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/geekzter%2Ffiles-sync/manifests","owner_url":"https://re
pos.ecosyste.ms/api/v1/hosts/GitHub/owners/geekzter","download_url":"https://codeload.github.com/geekzter/files-sync/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":243285625,"owners_count":20266849,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["azcopy","azure-storage","file-sync","file-upload","lightroom","powershell","rsync","rsync-wrapper"],"created_at":"2024-11-18T11:13:23.832Z","updated_at":"2025-12-25T09:19:09.068Z","avatar_url":"https://github.com/geekzter.png","language":"PowerShell","readme":"[![azcopy-ci](https://github.com/geekzter/files-sync/actions/workflows/azcopy-ci.yml/badge.svg)](https://github.com/geekzter/files-sync/actions/workflows/azcopy-ci.yml)\n[![install-azcopy-ci](https://github.com/geekzter/files-sync/actions/workflows/install-azcopy-ci.yml/badge.svg)](https://github.com/geekzter/files-sync/actions/workflows/install-azcopy-ci.yml)\n\n# File sync scripts for azcopy \u0026 rsync\n\nThis repo contains a number of scripts that act as wrappers around [azcopy](https://github.com/Azure/azure-storage-azcopy) and [rsync](https://github.com/WayneD/rsync). You can find them [here](./scripts).\n\nThe wrappers allow batch (i.e. multiple sync) operations to happen in sequence, and implement (additional) retries. They are implemented using PowerShell, with the batches configured as JSON files.\n\nWhy would you want to sync files to a redundant location? If you're using a cloud storage solution such as Dropbox, Google Drive, iCloud or OneDrive, you'd assume your files are safe. Well, yes and no. 
They protect you from any *hardware* problems you may have on the device you're syncing files from. That is, you still have an online copy of the files in case a device fails or is lost altogether.\nHowever, they do not (or only in a limited way) protect you against *logical* errors. A logical error is when you accidentally delete files in a file (move) operation. Your deletes will be synced to the cloud and the files will (eventually) be removed there as well. A classic example is overwriting a folder containing files with an empty folder using the Finder on macOS. Once you confirm the overwrite, your original files are deleted.\nIf you discover what happened too late, data may be lost forever.  \nLastly, there have been incidents in the past where cloud providers did lose data, e.g. [Adobe Creative Cloud](https://www.dpreview.com/news/8563369544/lightroom-cc-update-for-ios-ipados-permanently-deletes-photos-and-presets-for-some-users).  \n\nThis repo takes the approach of archiving files to a (remote) destination where they will remain: deletes are not synced. This works great for files you are unlikely to modify, e.g. media files.\n\n## Sync with rsync\n\n[rsync](https://github.com/WayneD/rsync) is a tool with a long history on Linux that is also preinstalled on macOS. The [sync_with_rsync.ps1](./scripts/sync_with_rsync.ps1) script takes as argument a settings file that configures directory pairs, with optional include patterns and an exclude list. 
See the example below:\n\n```json\n{\n    \"syncPairs\" : [\n        {\n            \"source\": \"~/Pictures/Lightroom/Photos\",\n            \"target\": \"/Volumes/External/LightroomBackup/Photos\"\n        }\n    ]\n}\n```\n\n### Syncing\n\nAdapt the [sample](./scripts/rsync-settings.json) and pass its path as argument to [sync_with_rsync.ps1](./scripts/sync_with_rsync.ps1):\n\n```powershell\nsync_with_rsync.ps1 -SettingsFile /path/to/settings.json\n```\n\nOr sync with the bash script [sync_with_rsync.sh](./scripts/sync_with_rsync.sh):\n\n```bash\nsync_with_rsync.sh --settings-file /path/to/settings.json\n```\n\n## Sync with azcopy\n\n[azcopy](https://github.com/Azure/azure-storage-azcopy) is a tool that allows you to sync (among other things) a local directory to an Azure Storage Account. The [sync_with_azcopy.ps1](./scripts/sync_with_azcopy.ps1) script takes as argument a settings file that configures directory pairs, with optional include patterns and an exclude list. See the example below:\n```json\n{\n    \"tenantId\" : \"00000000-0000-0000-0000-000000000000\",\n    \"syncPairs\" : [\n        {\n            \"source\": \"~/Pictures/Lightroom/Photos\",\n            \"target\": \"https://mystorage.blob.core.windows.net/lightroom/photos\"\n        }\n    ]\n}\n```\n\nThis settings file requires an Azure Active Directory tenant to be configured through the `tenantId` field, which allows the script to locate the storage accounts referenced in the settings file using Azure Resource Graph. 
Alternatively, the `AZCOPY_TENANT_ID` environment variable or the `Tenant` argument can be used.\n\n### Azure Storage Account(s)\n\nYou can work with pre-existing storage accounts, or you can use the [create_storage_account.ps1](./scripts/create_storage_account.ps1) script to create one with recommended settings:\n- Cross-region (RA-GRS) data replication\n- Public access disabled\n- Resource lock to prevent accidental deletion\n- Soft delete enabled\n- Storage Firewall enabled (sync will open and close it as needed)\n\n```powershell\ncreate_storage_account.ps1 -Name mystorage `\n                           -ResourceGroup files-sync `\n                           -Location westeurope `\n                           -Container lightroom, pictures, video\n```\n\n### Syncing with Azure Storage\n\nAdapt the [sample](./scripts/azcopy-settings.jsonc) and pass its path as argument to [sync_with_azcopy.ps1](./scripts/sync_with_azcopy.ps1):\n```powershell\nsync_with_azcopy.ps1 -SettingsFile /path/to/settings.json\n```\nThis script separates Azure control plane and data plane operations, with the latter happening after the former. Once azcopy has started, control plane access is no longer required and SAS tokens are used for authentication.","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgeekzter%2Ffiles-sync","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fgeekzter%2Ffiles-sync","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgeekzter%2Ffiles-sync/lists"}