Structured Self-Attention Weights Encode Semantics in Sentiment Analysis
https://github.com/frankaging/lat-for-transformer
- Host: GitHub
- URL: https://github.com/frankaging/lat-for-transformer
- Owner: frankaging
- Created: 2020-04-24T19:33:57.000Z (about 6 years ago)
- Default Branch: master
- Last Pushed: 2020-10-08T23:04:08.000Z (over 5 years ago)
- Last Synced: 2025-04-05T12:24:50.550Z (about 1 year ago)
- Topics: attention-flows, attention-weights, semantics, sentiment-analysis
- Language: Jupyter Notebook
- Homepage: https://arxiv.org/abs/2010.04922
- Size: 11.6 MB
- Stars: 6
- Watchers: 2
- Forks: 2
- Open Issues: 0