Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
TFeat descriptor models for BMVC 2016 paper "Learning local feature descriptors with triplets and shallow convolutional neural networks"
https://github.com/vbalnt/tfeat
deep-learning descriptor pytorch tfeat-descriptor triplets
JSON representation
- Host: GitHub
- URL: https://github.com/vbalnt/tfeat
- Owner: vbalnt
- License: mit
- Created: 2016-09-21T14:00:38.000Z (over 7 years ago)
- Default Branch: master
- Last Pushed: 2021-01-16T11:26:01.000Z (over 3 years ago)
- Last Synced: 2024-05-02T02:17:33.379Z (about 2 months ago)
- Topics: deep-learning, descriptor, pytorch, tfeat-descriptor, triplets
- Language: Jupyter Notebook
- Size: 30.1 MB
- Stars: 146
- Watchers: 18
- Forks: 45
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
Lists
- awesome-cbir-papers - Learning local feature descriptors with triplets and shallow convolutional neural networks
- awesome-image-retrieval-papers - Learning local feature descriptors with triplets and shallow convolutional neural networks
README
# TFeat shallow convolutional patch descriptor
Code for the BMVC 2016 paper [Learning local feature descriptors with triplets and shallow convolutional neural networks](http://www.bmva.org/bmvc/2016/papers/paper119/paper119.pdf).

## Pre-trained models
We provide the following pre-trained models:

| network name | model link | training dataset |
| ------------- | :-------------: | -----: |
| `tfeat-liberty` | [tfeat-liberty.params](./pretrained-models/tfeat-liberty.params) | liberty (UBC) |
| `tfeat-yosemite` | [tfeat-yosemite.params](./pretrained-models/tfeat-yosemite.params) | yosemite (UBC) |
| `tfeat-notredame` | [tfeat-notredame.params](./pretrained-models/tfeat-notredame.params) | notredame (UBC) |
| `tfeat-ubc` | coming soon... | all UBC |
| `tfeat-hpatches` | coming soon... | HPatches (split A) |
| `tfeat-all` | coming soon... | all of the above |

## Pre-trained models usage with Kornia
TFeat has been integrated into [Kornia](https://github.com/kornia/kornia).
First install Kornia: `pip install kornia`
```python
import torch
import kornia as K

input = torch.rand(16, 1, 32, 32)
tfeat = K.feature.TFeat(pretrained=True)
descs = tfeat(input) # 16x128
```

## Quick start guide
To run `TFeat` on a tensor of patches:

```python
import os

import torch
import tfeat_model  # module from this repository defining TNet

tfeat = tfeat_model.TNet()
net_name = 'tfeat-liberty'
models_path = 'pretrained-models'
tfeat.load_state_dict(torch.load(os.path.join(models_path, net_name + ".params")))
tfeat.cuda()
tfeat.eval()

x = torch.rand(10, 1, 32, 32).cuda()
descrs = tfeat(x)
print(descrs.size())  # torch.Size([10, 128])
```

Note that no normalisation is needed for the input patches; it is done internally inside the network.

## Testing `TFeat`: Examples (WIP)
We provide an `ipython` notebook that shows how to load and use
the pre-trained networks. We also provide the following examples:

- extracting descriptors from image patches
- matching two images using `openCV`
- matching two images using `vlfeat`

For the testing example code, check the [tfeat-test notebook](tfeat-test.ipynb).
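The matching step used in these examples boils down to nearest-neighbour search between two sets of 128-D TFeat descriptors. As a minimal, library-agnostic sketch (mutual nearest neighbours in plain NumPy; this is illustrative and not the notebook's exact code):

```python
import numpy as np

def match_descriptors(d1, d2):
    """Mutual nearest-neighbour matching between two descriptor sets.

    d1: (N, 128) array, d2: (M, 128) array; returns a list of (i, j) pairs
    such that d2[j] is the closest descriptor to d1[i] and vice versa.
    """
    # Pairwise Euclidean distances between every descriptor in d1 and d2.
    dists = np.linalg.norm(d1[:, None, :] - d2[None, :, :], axis=2)
    nn12 = dists.argmin(axis=1)  # best match in d2 for each row of d1
    nn21 = dists.argmin(axis=0)  # best match in d1 for each row of d2
    # Keep only matches that agree in both directions.
    return [(i, j) for i, j in enumerate(nn12) if nn21[j] == i]

# Toy usage: identical descriptor sets should match one-to-one.
rng = np.random.default_rng(0)
d1 = rng.standard_normal((10, 128)).astype(np.float32)
matches = match_descriptors(d1, d1.copy())
print(len(matches))  # 10
```

The same pairs can then be fed to, e.g., OpenCV's geometric verification; a ratio test or distance threshold is usually added on top of the mutual check.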
## Re-training `TFeat`
We provide an `ipython` notebook with examples on how to train
`TFeat`. Training can use the `UBC` datasets (`Liberty`,
`Notredame`, `Yosemite`), the `HPatches` dataset, or combinations
of all of the above.

For the training code, check the [tfeat-train notebook](tfeat-train.ipynb).