https://github.com/mikoto10032/automaticweightedloss
Multi-task learning using uncertainty to weigh losses for scene geometry and semantics, Auxiliary Tasks in Multi-task Learning
- Host: GitHub
- URL: https://github.com/mikoto10032/automaticweightedloss
- Owner: Mikoto10032
- License: apache-2.0
- Created: 2020-06-07T11:06:39.000Z (over 4 years ago)
- Default Branch: master
- Last Pushed: 2020-06-20T06:07:20.000Z (over 4 years ago)
- Last Synced: 2024-12-22T10:11:59.382Z (14 days ago)
- Topics: auxiliary-tasks, deep-learning, multi-task, multi-task-learning, pytorch, weigh-losses
- Language: Python
- Size: 8.79 KB
- Stars: 591
- Watchers: 5
- Forks: 83
- Open Issues: 13
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# AutomaticWeightedLoss
A PyTorch implementation of Liebel & Körner, "[Auxiliary Tasks in Multi-task Learning](https://arxiv.org/pdf/1805.06334)", arXiv preprint arXiv:1805.06334, 2018.
The above paper improves on "[Multi-task learning using uncertainty to weigh losses for scene geometry and semantics](http://openaccess.thecvf.com/content_cvpr_2018/html/Kendall_Multi-Task_Learning_Using_CVPR_2018_paper.html)" (Kendall et al., CVPR 2018) so that the total loss cannot become negative during training.
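Concretely, Liebel & Körner combine the task losses as L_total = Σᵢ 1/(2σᵢ²)·Lᵢ + ln(1 + σᵢ²), with one learnable σᵢ per task; since ln(1 + σ²) ≥ 0, this regularizer (unlike the ln σ term in Kendall et al.) cannot drive the total loss negative. The following is a minimal illustrative PyTorch sketch of that scheme; the class name and details are assumptions for illustration and may differ from the module shipped in this repository:

```python
import torch
import torch.nn as nn

class AutomaticWeightedLossSketch(nn.Module):
    """Illustrative sketch of uncertainty-based loss weighting
    (Liebel & Koerner, 2018):
        sum_i 0.5 / sigma_i**2 * L_i + log(1 + sigma_i**2)
    """

    def __init__(self, num_losses=2):
        super().__init__()
        # one learnable weight parameter (sigma) per task loss
        self.params = nn.Parameter(torch.ones(num_losses))

    def forward(self, *losses):
        total = 0
        for sigma, loss in zip(self.params, losses):
            # 0.5 / sigma^2 scales the task loss; log(1 + sigma^2) >= 0
            # regularizes sigma without letting the sum go negative
            total = total + 0.5 / (sigma ** 2) * loss + torch.log(1 + sigma ** 2)
        return total
```

Because the σᵢ appear in the loss itself, they receive gradients like any other parameter, which is why the optimizer in the examples below is also given `awl.parameters()`.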
## Requirements
* Python
* PyTorch

## How to Train with Your Model
* Clone the repository
```bash
git clone git@github.com:Mikoto10032/AutomaticWeightedLoss.git
```
* Create an `AutomaticWeightedLoss` module
```python
from AutomaticWeightedLoss import AutomaticWeightedLoss

awl = AutomaticWeightedLoss(2)  # we have 2 losses
loss1 = 1
loss2 = 2
loss_sum = awl(loss1, loss2)
```
* Create an optimizer to learn the weight coefficients (`weight_decay` is set to 0 for the loss weights, since they should not be regularized toward zero)
```python
from torch import optim

model = Model()
optimizer = optim.Adam([
    {'params': model.parameters()},
    {'params': awl.parameters(), 'weight_decay': 0}
])
```
* A complete example
```python
from torch import optim
from AutomaticWeightedLoss import AutomaticWeightedLoss

model = Model()
awl = AutomaticWeightedLoss(2) # we have 2 losses
loss_1 = ...
loss_2 = ...

# learnable parameters
optimizer = optim.Adam([
    {'params': model.parameters()},
    {'params': awl.parameters(), 'weight_decay': 0}
])

for i in range(epoch):
    for data, label1, label2 in data_loader:
        # forward
        pred1, pred2 = model(data)
        # calculate losses
        loss1 = loss_1(pred1, label1)
        loss2 = loss_2(pred2, label2)
        # weigh losses
        loss_sum = awl(loss1, loss2)
        # backward
        optimizer.zero_grad()
        loss_sum.backward()
        optimizer.step()
```

## Something to Say
In practice this weighting is not always effective, but I hope it can help you.
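If it does not help, one thing worth checking is the learned weight parameters themselves. Assuming they are exposed as a `params` attribute, as in the sketch earlier (the attribute name in the actual module may differ), you can log them during training:

```python
import torch

# Inspect the learned per-task weights of an AutomaticWeightedLoss instance
# `awl` (as created in the examples above). `awl.params` is the attribute
# name assumed from the sketch earlier; check the module for the real name.
with torch.no_grad():
    sigmas = awl.params
    scales = 0.5 / sigmas ** 2  # effective scale applied to each task loss
    print(f"sigmas: {sigmas.tolist()}, scales: {scales.tolist()}")
```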