https://github.com/andreaferretti/backupper
A simple tool for incremental copies
- Host: GitHub
- URL: https://github.com/andreaferretti/backupper
- Owner: andreaferretti
- License: apache-2.0
- Created: 2020-01-19T16:20:21.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2020-01-21T19:16:41.000Z (over 5 years ago)
- Last Synced: 2025-01-29T22:31:52.402Z (8 months ago)
- Language: Nim
- Size: 6.84 KB
- Stars: 1
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Backupper
Backupper is a simple backup tool. It suits my use case and workflow, but it is
not guaranteed to suit everyone's.

## Disclaimer

Backupper works for me. It may eat your hard drive and set your hard disk on
fire. If that happens, I am sorry for you, but there is not much I can do.
Use at your own risk. No guarantee implied whatsoever.

If you want to use it, I suggest that you at least read the source to ensure
that it does what you expect; it is only 136 lines.

## Assumption
In my workflow, I maintain a set of documents on multiple machines. I want
to keep them in sync, but the machines are not networked, and the set of
documents is big. For this, I need a tool that lets me compute a digest of a
directory, and then compute a diff for the status of a directory with respect
to a previous digest. I can then apply the diff over the equivalent directory
on a different machine.

Diffs only contain the updated files and some metadata, so they are easily
portable on a USB key. Digests are just JSON files containing hashes.
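
To make that idea concrete, here is a minimal sketch in Nim of how a digest of this
kind could be produced. This is not Backupper's actual code: the name `makeDigest`,
the hash function and the exact JSON layout are assumptions, so check the source for
the real format.

```nim
# Minimal sketch (illustrative only): map each file, keyed by its path
# relative to the root directory, to a hash of its content.
import std/[os, json, sha1]

proc makeDigest(dir: string): JsonNode =
  result = newJObject()
  for path in walkDirRec(dir):          # yields regular files by default
    result[relativePath(path, dir)] = %($secureHashFile(path))

when isMainModule:
  writeFile("digest.json", pretty(makeDigest("dirB")))
```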
A final assumption: I don't delete documents (but I may move them). Backupper
has an option to remove documents, but it is disabled by default. It is not
smart, and will not look at timestamps; instead it will remove every file
present in the target directory but not in the digest. So DO NOT ENABLE IT
unless you have read the source, checked the content of the digest and are
sure of what you are doing.
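
For clarity, the behaviour described above amounts to something like the following
sketch. It is illustrative only, with a made-up name (`removeExtraneous`), not the
actual implementation; read the real source before trusting the real option.

```nim
# Illustration of the (disabled by default) removal behaviour: delete every
# file under `dir` whose relative path is absent from the digest, ignoring
# timestamps entirely.
import std/[os, json]

proc removeExtraneous(dir: string, digest: JsonNode) =
  for path in walkDirRec(dir):
    if not digest.hasKey(relativePath(path, dir)):
      removeFile(path)
```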
## Usage
Say you mainly use machine A, but sometimes you want to sync machine B as well.
On machine A, the documents are stored in `dirA`, on machine B in `dirB`.

1. On machine B, compute the latest digest
```
backupper digest --dir dirB --json digest.json
```
2. Copy `digest.json` to machine A
3. See if machine A has some updated content
```
backupper changes --dir dirA --json digest.json --digest-dir diff
```
Now `diff` should contain all files that are changed or updated since the
digest, together with some metadata (a sketch of this comparison follows these steps).
4. Copy `diff` to machine B
5. Apply the changes to keep machine B updated
```
backupper recover --dir dirB --digest-dir diff
```
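
As promised above, here is a rough sketch of the comparison behind the `changes`
step. The name `collectChanges` is hypothetical and the metadata that Backupper
also writes is omitted; the point is only that a file ends up in the diff directory
when its hash is missing from, or different to, the one recorded in the digest.

```nim
# Rough sketch (illustrative only): copy into `diffDir` every file whose
# hash does not match the digest, preserving the directory structure.
import std/[os, json, sha1]

proc collectChanges(dir, diffDir: string, digest: JsonNode) =
  for path in walkDirRec(dir):
    let rel = relativePath(path, dir)
    if not digest.hasKey(rel) or digest[rel].getStr != $secureHashFile(path):
      let dest = diffDir / rel
      createDir(parentDir(dest))   # make sure the target subdirectory exists
      copyFile(path, dest)
```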