Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
A Simple but Powerful SOTA NER Model | Official Code For Label Supervised LLaMA Finetuning
https://github.com/4ai/ls-llama
- Host: GitHub
- URL: https://github.com/4ai/ls-llama
- Owner: 4AI
- License: mit
- Created: 2023-09-30T03:09:04.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-03-17T06:21:30.000Z (10 months ago)
- Last Synced: 2024-11-08T21:05:54.396Z (about 2 months ago)
- Topics: conll2003, llama, llama2, llms, named-entity-recognition, ontonotes, sequence-classification, token-classification
- Language: Python
- Homepage: https://arxiv.org/abs/2310.01208
- Size: 3.54 MB
- Stars: 141
- Watchers: 1
- Forks: 24
- Open Issues: 9
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# LS-LLaMA: Label Supervised LLaMA Finetuning
📢 For convenience, we have built BiLLM, a bidirectional LLM toolkit for language understanding. You are welcome to use it.
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/label-supervised-llama-finetuning/named-entity-recognition-on-conll03-4)](https://paperswithcode.com/sota/named-entity-recognition-on-conll03-4?p=label-supervised-llama-finetuning)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/label-supervised-llama-finetuning/named-entity-recognition-on-ontonotes-5-0-1)](https://paperswithcode.com/sota/named-entity-recognition-on-ontonotes-5-0-1?p=label-supervised-llama-finetuning)
## Usage
Our implementation currently supports the following sequence classification benchmarks:
1. SST2 (2 classes) / SST5 (5 classes)
2. AGNews (4 classes)
3. Twitter Financial News Sentiment (twitterfin, 3 classes)

as well as token classification benchmarks for named entity recognition (NER): CoNLL2003 and OntoNotes v5 (a minimal data-loading sketch follows the list).
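As a rough illustration only (this is not the repository's own data-loading code), the benchmarks above correspond to datasets available on the Hugging Face Hub. The dataset identifiers below are assumptions and may differ from what the training scripts load internally:

```python
# Illustrative only: Hub dataset IDs are assumptions, not the repo's actual loaders.
from datasets import load_dataset

sst2       = load_dataset('sst2')        # sequence classification, 2 classes
agnews     = load_dataset('ag_news')     # sequence classification, 4 classes
twitterfin = load_dataset('zeroshot/twitter-financial-news-sentiment')  # 3 classes
conll03    = load_dataset('conll2003')   # token classification (NER)

example = conll03['train'][0]
print(example['tokens'])    # word-level tokens
print(example['ner_tags'])  # integer NER labels
```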
Commands for training LS-LLaMA and LS-unLLaMA on different tasks follow the template below:
```console
foo@bar:~$ CUDA_VISIBLE_DEVICES=0 python file_name.py dataset_name model_size
```

`file_name.py` can be one of `unllama_seq_clf.py`, `unllama_token_clf.py`, `llama_seq_clf.py`, and `llama_token_clf.py`, for training LS-LLaMA and LS-unLLaMA on sequence- and token-level classification.
`dataset_name` can be one of `sst2`, `sst5`, `agnews`, `twitterfin`, `conll03`, and `ontonotesv5`.
`model_size` can be `7b` or `13b`, corresponding to LLaMA-2-7B and LLaMA-2-13B.
For example, the following command will train LS-unLLaMA based on LLaMA-2-7B on AGNews for sequence classification:
```console
foo@bar:~$ CUDA_VISIBLE_DEVICES=0 python unllama_seq_clf.py agnews 7b
```

## Implementations
### Load Pretrained Models
```python
from transformers import AutoTokenizer
from modeling_llama import (
LlamaForSequenceClassification, LlamaForTokenClassification,
UnmaskingLlamaForSequenceClassification, UnmaskingLlamaForTokenClassification,
)

model_id = 'meta-llama/Llama-2-7b'
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Choose one of the four heads below:
model = LlamaForSequenceClassification.from_pretrained(model_id).bfloat16()           # causal LLaMA, sequence-level
model = LlamaForTokenClassification.from_pretrained(model_id).bfloat16()              # causal LLaMA, token-level
model = UnmaskingLlamaForSequenceClassification.from_pretrained(model_id).bfloat16()  # unmasked LLaMA, sequence-level
model = UnmaskingLlamaForTokenClassification.from_pretrained(model_id).bfloat16()     # unmasked LLaMA, token-level
```

For more usage, please refer to `unllama_seq_clf.py`, `unllama_token_clf.py`, `llama_seq_clf.py`, and `llama_token_clf.py`.
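As a quick sanity check of the call pattern, a loaded token-classification head can be driven in the usual `transformers` style. The snippet below is a hedged sketch, assuming the repo's classes follow the standard token-classification interface (logits of shape `(batch, seq_len, num_labels)`); the label list and `num_labels` are illustrative assumptions, and an untrained head will of course produce meaningless predictions until fine-tuned.

```python
# Sketch only: assumes the repo's heads follow the standard transformers
# token-classification interface; the label list below is an assumption.
import torch
from transformers import AutoTokenizer
from modeling_llama import UnmaskingLlamaForTokenClassification

model_id = 'meta-llama/Llama-2-7b'
labels = ['O', 'B-PER', 'I-PER', 'B-ORG', 'I-ORG', 'B-LOC', 'I-LOC', 'B-MISC', 'I-MISC']

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = UnmaskingLlamaForTokenClassification.from_pretrained(
    model_id, num_labels=len(labels)
).bfloat16()
model.eval()

inputs = tokenizer("EU rejects German call to boycott British lamb .", return_tensors='pt')
with torch.no_grad():
    logits = model(**inputs).logits           # (1, seq_len, num_labels)

pred_ids = logits.argmax(dim=-1)[0].tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs['input_ids'][0])
for tok, pred in zip(tokens, pred_ids):
    print(f'{tok}\t{labels[pred]}')
```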
## Citation
```
@article{li2023label,
title={Label supervised llama finetuning},
author={Li, Zongxi and Li, Xianming and Liu, Yuzhang and Xie, Haoran and Li, Jing and Wang, Fu-lee and Li, Qing and Zhong, Xiaoqin},
journal={arXiv preprint arXiv:2310.01208},
year={2023}
}
```