Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/urethramancer/twofold
An exercise in duplicate-finding.
- Host: GitHub
- URL: https://github.com/urethramancer/twofold
- Owner: Urethramancer
- License: MIT
- Created: 2017-01-04T13:20:55.000Z (almost 8 years ago)
- Default Branch: main
- Last Pushed: 2021-12-27T21:01:08.000Z (almost 3 years ago)
- Last Synced: 2024-06-20T06:31:28.203Z (6 months ago)
- Topics: cli, exercise, find-duplicates
- Language: Go
- Size: 14.6 KB
- Stars: 1
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# twofold [![Build Status](https://travis-ci.org/Urethramancer/twofold.svg?branch=master)](https://travis-ci.org/Urethramancer/twofold)
An exercise in duplicate-finding.

## Purpose
Find duplicates within a folder and list, link or remove them. Slightly dangerous: no backsies if you screw up.

## Usage
The simplest invocation is to list duplicates in the current directory. Example:

```sh
$ twofold -l
Deep-scanning /Users/orb/go/twofold
```

You can specify a path as its sole standalone argument:
```sh
$ twofold -l ~/Downloads
Deep-scanning /Users/orb/Downloads
```

All duplicates found will be listed, grouped by hash, then the program exits.
Use the `-v` flag to also display checksumming progress while it traverses the path.
The slightly more destructive flags are:
- `--symlink` to remove duplicates and replace them with symlinks to the first file in the set
- `--hardlink` to remove duplicates and hardlink them to the inode of the first file in the set (the most convenient option, should the original move or be deleted)
- `--remove` to simply remove the duplicates (the most destructive option)

## LICENCE
MIT.

### TODO
- Maybe add date-sorting to make it possible to keep the oldest or newest of each set of duplicates.
- Search multiple supplied directories, finding duplicates between all of them.