Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
awesome-bert
BERT NLP papers, applications, and GitHub resources, including the newest XLNet; BERT and XLNet related papers and GitHub projects
https://github.com/Jiakui/awesome-bert
-
other resources for BERT:
- brightmart/bert_language_understanding - Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
- Y1ran/NLP-BERT--ChineseVersion
- JayYip/bert-multiple-gpu
- HighCWu/keras-bert-tpu - Pre-trained models for feature extraction and prediction on TPU
- Willyoung2017/Bert_Attempt
- Pydataman/bert_examples
- Microsoft/AzureML-BERT - End-to-end walkthrough for fine-tuning BERT using Azure Machine Learning
- bigboNed3/bert_serving
- yoheikikuta/bert-japanese
- algteam/bert-examples - bert-demo
- cedrickchee/awesome-bert-nlp
- cnfive/cnbert
- brightmart/bert_customized
- yaserkl/BERTvsULMFIT
- 1234560o/Bert-model-code-interpretation
- cdathuraliya/bert-inference
- gameofdimension/java-bert-predict
- guotong1988/BERT-chinese - Pre-training of Deep Bidirectional Transformers for Language Understanding (Chinese)
- yangbisheng2009/cn-bert
- zhongyunuestc/bert_multitask
- yuanxiaosc/BERT_Paper_Chinese_Translation - Chinese translation of the paper "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding". https://yuanxiaosc.github.io/2018/12/…
- JayYip/bert-multitask-learning
- kpot/keras-transformer
-
BERT NER task:
- kyzhouhzau/Bert-BiLSTM-CRF - Based on bert-as-service; model structure: BERT embedding + BiLSTM + CRF (a minimal sketch of this architecture follows this list).
- Hoiy/berserker - Berserker (BERt chineSE woRd toKenizER), a Chinese tokenizer built on top of Google's BERT model.
- Kyubyong/bert_ner
- jiangpinglei/BERT_ChineseWordSegment - A Chinese word-segmentation model based on BERT, F-Score 97%
- lemonhu/NER-BERT-pytorch - PyTorch NER with a pre-trained BERT model.
- zhpmatrix/bert-sequence-tagging
- kyzhouhzau/BERT-NER - CoNLL-2003 NER with Google BERT.
- king-menin/ner-bert - NER (BERT-Bi-LSTM-CRF) with Google BERT, https://github.com/google-research.
- macanv/BERT-BiLSMT-CRF-NER - BiLSTM-CRF model with Google BERT fine-tuning.
- FuYanzhe2/Name-Entity-Recognition - LSTM-CRF, Lattice-CRF, BERT-NER, plus a follow-up of recent NER papers.
- mhcao916/NER_Based_on_BERT
- ProHiryu/bert-chinese-ner
- yanwii/ChineseNER - Chinese NER (organization and person names) based on Bi-GRU + CRF; supports the Google BERT model.
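Several of the repos above (kyzhouhzau/Bert-BiLSTM-CRF, macanv/BERT-BiLSMT-CRF-NER, king-menin/ner-bert) share one architecture: BERT token embeddings feeding a BiLSTM, with a CRF decoder on top. The sketch below is a minimal, hedged illustration of the BERT + BiLSTM part using the Hugging Face transformers package; it is not taken from any listed repo, and the CRF layer those repos add is replaced here by a plain per-token classifier.

```python
# Hedged sketch: BERT embeddings -> BiLSTM -> per-token tag logits.
# Assumes `torch` and `transformers` are installed; not any listed repo's code.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

class BertBiLstmTagger(nn.Module):
    def __init__(self, num_tags, bert_name="bert-base-chinese", lstm_hidden=256):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        self.lstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden,
                            batch_first=True, bidirectional=True)
        # A CRF layer would normally sit here; a linear classifier keeps the sketch short.
        self.classifier = nn.Linear(2 * lstm_hidden, num_tags)

    def forward(self, input_ids, attention_mask):
        hidden = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        lstm_out, _ = self.lstm(hidden)
        return self.classifier(lstm_out)          # (batch, seq_len, num_tags)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
batch = tokenizer(["北京大学坐落在北京。"], return_tensors="pt")
model = BertBiLstmTagger(num_tags=9)              # e.g. BIO tags for CoNLL-style NER
logits = model(batch["input_ids"], batch["attention_mask"])
print(logits.argmax(-1))                          # predicted tag id per token (untrained)
```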
-
BERT chatbot:
- guillaume-chevalier/ReuBERT - A question-answering chatbot, simply.
- GaoQ1/rasa_nlu_gq
- GaoQ1/rasa_chatbot_cn - A dialogue-system demo built with rasa-nlu and rasa-core.
- GaoQ1/rasa-bert-finetune - BERT fine-tuning with rasa-nlu support.
- geodge831012/bert_robot
- yuanxiaosc/BERT-for-Sequence-Labeling-and-Text-Classification - CoNLL-2003 named entity recognition, Snips slot filling, and intent prediction.
-
BERT language model and embedding:
- YC-wind/embedding_study
- Kyubyong/bert-token-embeddings
- xu-song/bert_as_language_model - BERT as a language model, based on google-research/bert.
- yuanxiaosc/Deep_dynamic_word_representation - Pre-trained models for deep dynamic word representation (DDWR); combines the BERT model and ELMo's deep contextual word representation.
- imgarylai/bert-embedding - Token-level embeddings from BERT, https://bert-embedding.readthedocs.io/ (a generic token-embedding sketch follows this list).
- terrifyzhao/bert-utils
- fennuDetudou/BERT_implement
- charles9n/bert-sklearn
- NVIDIA/Megatron-LM
- hankcs/BERT-token-level-embedding
- facebookresearch/LAMA - LAnguage Model Analysis: probing pre-trained language models.
- whqwill/seq2seq-keyphrase-bert - Adds BERT to seq2seq-keyphrase-pytorch.
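Most of the embedding repos above wrap the same operation: run text through a pre-trained BERT and take the hidden states as contextual token vectors. A minimal, hedged sketch with Hugging Face transformers (an assumption; not the API of any repo listed here):

```python
# Hedged sketch: extract token-level (and a simple sentence-level) embedding from BERT.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased").eval()

inputs = tokenizer("BERT gives contextual embeddings.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

token_embeddings = outputs.last_hidden_state   # (1, seq_len, 768): one vector per WordPiece
sentence_embedding = token_embeddings[:, 0]    # [CLS] vector, a common sentence summary
print(token_embeddings.shape, sentence_embedding.shape)
```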
-
BERT Text Summarization Task:
- nlpyang/BertSum - Fine-tune BERT for extractive summarization (a simplified sentence-scoring sketch follows this list).
- santhoshkolloju/Abstractive-Summarization-With-Transfer-Learning
- nayeon7lee/bert-summarization - Implementation of "Pretraining-Based Natural Language Generation for Text Summarization", paper: https://arxiv.org/pdf/1902.09243.pdf
- dmmiller612/lecture-summarizer
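As a rough illustration of extractive summarization with BERT, the hedged sketch below scores each sentence with an (untrained) linear head over its [CLS] vector and keeps the top-scoring ones. It simplifies the idea behind repos such as nlpyang/BertSum and is not their actual code.

```python
# Hedged sketch of extractive summarization as sentence scoring:
# embed each sentence with BERT, rank with a (here untrained) linear scorer, keep top-k.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased").eval()
scorer = nn.Linear(bert.config.hidden_size, 1)   # would be trained on (sentence, keep?) labels

def extract_summary(sentences, k=2):
    with torch.no_grad():
        batch = tokenizer(sentences, return_tensors="pt", padding=True, truncation=True)
        cls_vectors = bert(**batch).last_hidden_state[:, 0]   # one [CLS] vector per sentence
        scores = scorer(cls_vectors).squeeze(-1)              # higher = more summary-worthy
    top = scores.topk(min(k, len(sentences))).indices.sort().values
    return [sentences[int(i)] for i in top]

doc = ["BERT is a pre-trained transformer encoder.",
       "It was released by Google in 2018.",
       "Fine-tuning it achieves strong results on many NLP tasks."]
print(extract_summary(doc, k=2))
```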
-
BERT Text Generation Task:
- asyml/texar - A general-purpose text generation toolkit; BERT is implemented here for classification, and text generation applications can be built by combining it with Texar's other modules.
- voidful/BertGenerate
- Tiiiger/bert_score - BERTScore: evaluating generated text with BERT (a simplified matching sketch follows this list).
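Tiiiger/bert_score evaluates generated text by matching candidate and reference tokens in BERT embedding space. The hedged sketch below shows only the core idea, greedy cosine matching combined into an F1; the real package adds refinements such as optional IDF weighting and baseline rescaling, omitted here.

```python
# Hedged sketch of a BERTScore-style metric: greedy cosine matching of token embeddings.
# Simplified relative to Tiiiger/bert_score; not that package's implementation.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased").eval()

def embed(text):
    with torch.no_grad():
        out = bert(**tokenizer(text, return_tensors="pt")).last_hidden_state[0]
    return torch.nn.functional.normalize(out, dim=-1)   # unit-length token vectors

def bertscore_f1(candidate, reference):
    c, r = embed(candidate), embed(reference)
    sim = c @ r.T                              # pairwise cosine similarities
    precision = sim.max(dim=1).values.mean()   # best reference match per candidate token
    recall = sim.max(dim=0).values.mean()      # best candidate match per reference token
    return (2 * precision * recall / (precision + recall)).item()

print(bertscore_f1("the cat sat on the mat", "a cat was sitting on the mat"))
```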
-
BERT Knowledge Graph Task:
- lvjianxin/Knowledge-extraction - Chinese knowledge extraction; baseline: Bi-LSTM+CRF, upgraded with BERT pre-training.
- aditya-AI/Information-Retrieval-System-using-BERT
- jkszw2014/bert-kbqa-NLPCC2017
- yuanxiaosc/Schema-based-Knowledge-Extraction
- yuanxiaosc/Entity-Relation-Extraction
- WenRichard/KBQA-BERT
- zhpmatrix/BERTem
- sakuranew/BERT-AttributeExtraction - Attribute extraction for knowledge graphs via BERT fine-tuning and feature extraction, applied to person entries from Baidu Baike (a generic relation-classification sketch follows this list).
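A common way to apply BERT to the relation and attribute extraction tasks these repos address is to classify a sentence with a marked entity pair. The sketch below is a hedged, generic illustration of that framing with Hugging Face transformers; the marker tokens and relation labels are illustrative assumptions, not any listed repo's actual scheme.

```python
# Hedged sketch: relation classification by marking the two entities in the sentence
# and letting a BERT sequence classifier predict the relation label.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

RELATIONS = ["no_relation", "place_of_birth", "employer"]   # hypothetical label set
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
tokenizer.add_special_tokens(
    {"additional_special_tokens": ["[E1]", "[/E1]", "[E2]", "[/E2]"]})
model = BertForSequenceClassification.from_pretrained("bert-base-uncased",
                                                      num_labels=len(RELATIONS))
model.resize_token_embeddings(len(tokenizer))   # make room for the marker tokens

text = "[E1] Marie Curie [/E1] was born in [E2] Warsaw [/E2] ."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits             # (1, num_relations); head is untrained here
print(RELATIONS[int(logits.argmax())])
```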
-
BERT Coreference Resolution:
- ianycxu/RGCN-with-BERT - Relational Graph Convolutional Networks (RGCN) with BERT for Coreference Resolution Task
- isabellebouchard/BERT_for_GAP-coreference
-
BERT visualization toolkit:
-
BERT Text Match:
-
BERT tutorials:
- graykode/nlp-tutorial
- dragen1860/TensorFlow-2.x-Tutorials - TensorFlow 2.x tutorials and examples, including Auto-Encoders, Faster R-CNN, GPT, BERT, etc.; introductory example code and hands-on tutorials for TF 2.0.
-
official implementation:
- google-research/bert - TensorFlow code and pre-trained models for BERT.
-
implementations of BERT besides TensorFlow:
- codertimo/BERT-pytorch
- dmlc/gluon-nlp
- dbiir/UER-py - UER-py is a toolkit for pre-training on general-domain corpora and fine-tuning on downstream tasks. UER-py maintains model modularity and supports research extensibility; it facilitates the use of different pre-training models (e.g. BERT) and provides interfaces for users to extend it further.
- BrikerMan/Kashgari - Keras-powered multilingual NLP framework; lets you build models in 5 minutes for named entity recognition (NER), part-of-speech (PoS) tagging, and text classification. Includes BERT, GPT-2, and word2vec embeddings.
- kaushaltrivedi/fast-bert
- Separius/BERT-keras - Keras implementation of BERT with pre-trained weights.
- innodatalabs/tbert
- dreamgonfly/BERT-pytorch
- CyberZHG/keras-bert - Pre-trained models for feature extraction and prediction.
- MaZhiyuanBUAA/bert-tf1.4.0 - BERT for TensorFlow 1.4.0.
- dhlee347/pytorchic-bert
- miroozyx/BERT_with_keras
- conda-forge/pytorch-pretrained-bert-feedstock - A conda-smithy repository for pytorch-pretrained-bert.
- Rshcaroline/BERT_Pytorch_fastNLP
- nghuyong/ERNIE-Pytorch
- guotong1988/BERT-tensorflow - BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding.
- soskek/bert-chainer - Chainer implementation of "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding".
-
Pretrained BERT weights:
- brightmart/roberta_zh
- ymcui/Chinese-BERT-wwm - Pre-training with Whole Word Masking for Chinese BERT (Chinese BERT-wwm pre-trained models), https://arxiv.org/abs/1906.08101 (a whole-word-masking sketch follows this list).
- thunlp/OpenCLaP - Open Chinese Language Pre-trained model zoo; a multi-domain open-source repository of Chinese pre-trained language models.
- brightmart/xlnet_zh - Pre-trained Chinese XLNet_Large.
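The whole-word-masking variant above changes only how pre-training masks are chosen: when one WordPiece of a word is selected, every piece of that word is masked together. A hedged sketch of that selection step (simplified; real pre-training also leaves some selected pieces unchanged or replaces them with random tokens):

```python
# Hedged sketch of whole-word masking over WordPiece tokens.
import random
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

def whole_word_mask(tokens, mask_prob=0.15):
    # Group subword pieces: a piece starting with "##" belongs to the previous word.
    words = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and words:
            words[-1].append(i)
        else:
            words.append([i])
    masked = list(tokens)
    for piece_ids in words:
        if random.random() < mask_prob:
            for i in piece_ids:            # mask all pieces of the chosen word together
                masked[i] = "[MASK]"
    return masked

tokens = tokenizer.tokenize("whole word masking masks multi-piece words together")
print(tokens)
print(whole_word_mask(tokens, mask_prob=0.3))
```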
-
improvements over BERT:
- zihangdai/xlnet
- kimiyoung/transformer-xl - Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context; code in both PyTorch and TensorFlow for the paper.
- GaoPeng97/transformer-xl-chinese
- PaddlePaddle/ERNIE - Pre-training models and fine-tuning tools; ERNIE, a Chinese improvement over BERT.
- facebookresearch/SpanBERT
- brightmart/albert_zh - A Lite BERT for Self-Supervised Learning of Language Representations, https://arxiv.org/pdf/1909.11942.pdf
- lonePatient/albert_pytorch - A Lite BERT for Self-Supervised Learning of Language Representations.
- kpe/bert-for-tf2 - A Keras TensorFlow 2 implementation of BERT. https://github.com/kpe/bert-for-tf2
- thunlp/ERNIE
- pytorch/fairseq - Facebook AI Research Sequence-to-Sequence Toolkit written in Python; RoBERTa: A Robustly Optimized BERT Pretraining Approach.
-
domain-specific BERT:
-
BERT Deploy Tricks:
-
BERT QA & RC task:
- sogou/SMRCToolkit
- benywon/ChineseBert
- matthew-z/R-net - R-net in PyTorch, with BERT and ELMo.
- nyu-dl/dl4marco-bert - Passage re-ranking with BERT.
- chiayewken/bert-qa
- ankit-ai/BertQA-Attention-on-Steroids - BertQA: Attention on Steroids.
- NoviScl/BERT-RACE - Based on pytorch-pretrained-BERT; adapts the original BERT model to multiple-choice machine comprehension (a generic reading-comprehension sketch follows this list).
- eva-n27/BERT-for-Chinese-Question-Answering
- allenai/allennlp-bert-qa-wrapper - A simple wrapper on top of pytorch-pretrained-bert to make AllenNLP model archives, so that you can serve demos from AllenNLP.
- edmondchensj/ChineseQA-with-BERT
- graykode/toeicbert - TOEIC question solving using the pytorch-pretrained-BERT model.
- graykode/KorQuAD-beginner
- krishna-sharma19/SBU-QA
- basketballandlearn/Dureader-Bert
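Most QA/RC repos in this section fine-tune BERT to predict the start and end positions of the answer span inside the passage. A hedged inference-time sketch using a publicly available SQuAD-fine-tuned checkpoint from Hugging Face (an assumption; not the code of any repo listed here):

```python
# Hedged sketch: extractive QA as start/end span prediction over the passage tokens.
import torch
from transformers import BertTokenizer, BertForQuestionAnswering

model_name = "bert-large-uncased-whole-word-masking-finetuned-squad"  # public HF checkpoint
tokenizer = BertTokenizer.from_pretrained(model_name)
model = BertForQuestionAnswering.from_pretrained(model_name).eval()

question = "Who created BERT?"
passage = "BERT was created and published in 2018 by researchers at Google."
inputs = tokenizer(question, passage, return_tensors="pt")

with torch.no_grad():
    out = model(**inputs)
start = int(out.start_logits.argmax())          # most likely answer start token
end = int(out.end_logits.argmax()) + 1          # most likely answer end token (inclusive)
print(tokenizer.decode(inputs["input_ids"][0][start:end].tolist()))
```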
-
BERT classification task:
- maksna/bert-fine-tuning-for-chinese-multiclass-classification - Use Google's pre-trained BERT model and fine-tune it for Chinese multiclass classification (a minimal fine-tuning sketch follows this list).
- NLPScott/bert-Chinese-classification-task
- Socialbird-AILab/BERT-Classification-Tutorial
- fooSynaptic/BERT_classifer_trial
- xiaopingzhong/bert-finetune-for-classfier
- pengming617/bert_classification
- xieyufei1993/Bert-Pytorch-Chinese-TextClassification
- liyibo/text-classification-demos
- circlePi/BERT_Chinese_Text_Class_By_pytorch
- kaushaltrivedi/bert-toxic-comments-multilabel
- lonePatient/BERT-chinese-text-classification-pytorch
- zhpmatrix/Kaggle-Quora-Insincere-Questions-Classification - A BERT-based fine-tuning approach plus a tensor2tensor-based Transformer encoder approach.
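The classification repos above largely follow one recipe: add a softmax head on BERT's [CLS] representation and fine-tune end to end. A minimal, hedged sketch of a single training step with Hugging Face transformers (model name, labels, and data are illustrative, not from any repo above):

```python
# Hedged sketch: one fine-tuning step of BERT for (Chinese) text classification.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertForSequenceClassification.from_pretrained("bert-base-chinese", num_labels=3)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

texts = ["这部电影很好看", "物流太慢了", "一般般吧"]      # toy batch
labels = torch.tensor([2, 0, 1])                          # hypothetical label ids

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
outputs = model(**batch, labels=labels)                   # cross-entropy loss on [CLS] head
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()
print(float(outputs.loss), outputs.logits.argmax(-1).tolist())
```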
-
BERT Sentiment Analysis:
- Chung-I/Douban-Sentiment-Analysis
- lynnna-xu/bert_sa
- HSLCY/ABSA-BERT-pair - Utilizing BERT for Aspect-Based Sentiment Analysis via Constructing Auxiliary Sentence (NAACL 2019), https://arxiv.org/abs/1903.09588 (a sentence-pair sketch follows this list).
- songyouwei/ABSA-PyTorch
- howardhsu/BERT-for-RRC-ABSA - "BERT Post-Training for Review Reading Comprehension and Aspect-based Sentiment Analysis".
- brightmart/sentiment_analysis_fine_grain - Multi-label classification with BERT; fine-grained sentiment analysis from AI Challenger.
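The ABSA-BERT-pair approach above turns aspect-based sentiment analysis into sentence-pair classification: the review is paired with an auxiliary sentence built from the aspect. A hedged sketch of the input construction (the template and label set are illustrative, not taken from the repo):

```python
# Hedged sketch: ABSA via an auxiliary sentence, framed as sentence-pair classification.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

LABELS = ["negative", "neutral", "positive"]             # hypothetical polarity labels
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased",
                                                      num_labels=len(LABELS))

review = "The battery life is great but the screen scratches easily."
aspect = "screen"
auxiliary = f"what do you think of the {aspect} ?"       # QA-style template, illustrative

pair = tokenizer(review, auxiliary, return_tensors="pt")  # [CLS] review [SEP] auxiliary [SEP]
with torch.no_grad():
    logits = model(**pair).logits                         # untrained head; fine-tune in practice
print(LABELS[int(logits.argmax())])
```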
Categories
other resources for BERT: 47
implementations of BERT besides TensorFlow: 34
BERT QA & RC task: 29
BERT NER task: 27
BERT language model and embedding: 27
BERT classification task: 24
improvements over BERT: 21
BERT Knowledge Graph Task: 16
BERT chatbot: 12
BERT Sentiment Analysis: 12
BERT Text Match: 12
BERT Deploy Tricks: 10
Pretrained BERT weights: 9
domain-specific BERT: 8
BERT Text Summarization Task: 8
BERT Text Generation Task: 6
BERT Coreference Resolution: 4
BERT tutorials: 4
BERT visualization toolkit: 2
official implementation: 2
Keywords
bert: 67
nlp: 40
tensorflow: 27
pytorch: 20
transformer: 19
natural-language-processing: 14
text-classification: 12
ner: 12
bert-model: 11
language-model: 11
named-entity-recognition: 9
deep-learning: 9
transfer-learning: 8
chinese: 7
python: 6
machine-learning: 6
roberta: 6
elmo: 5
pretrained-models: 5
neural-network: 4
relation-extraction: 4
conll-2003: 4
sentiment-analysis: 4
gpt-2: 4
xlnet: 4
attention: 4
google-bert: 4
part-of-speech: 3
multitask-learning: 3
multi-task-learning: 3
encoder-decoder: 3
cws: 3
squad: 3
word-segmentation: 3
albert: 3
question-answering: 3
classification: 3
fine-tuning: 3
rasa-nlu: 3
entity-extraction: 3
rasa: 3
sentence-similarity: 3
transformers: 3
natural-language-understanding: 3
keras: 3
paper: 2
nlu: 2
tensorflow-serving: 2
google: 2
gpt2: 2