Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/zhpmatrix/Kaggle-Quora-Insincere-Questions-Classification
New Kaggle competition (baseline): a BERT fine-tuning solution plus a tensor2tensor-based Transformer Encoder solution
bert kaggle quora tensor2tensor transformer
Last synced: 6 days ago
JSON representation
New Kaggle competition (baseline): a BERT fine-tuning solution plus a tensor2tensor-based Transformer Encoder solution
- Host: GitHub
- URL: https://github.com/zhpmatrix/Kaggle-Quora-Insincere-Questions-Classification
- Owner: zhpmatrix
- Created: 2018-11-10T13:59:22.000Z (almost 6 years ago)
- Default Branch: master
- Last Pushed: 2018-11-11T06:22:51.000Z (almost 6 years ago)
- Last Synced: 2024-08-02T08:09:54.265Z (3 months ago)
- Topics: bert, kaggle, quora, tensor2tensor, transformer
- Language: Python
- Homepage:
- Size: 652 KB
- Stars: 61
- Watchers: 5
- Forks: 28
- Open Issues: 1
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
- awesome-transformer-nlp - zhpmatrix/Kaggle-Quora-Insincere-Questions-Classification - Kaggle baseline—fine-tuning BERT and tensor2tensor based Transformer encoder solution. (Tasks / Classification)
- awesome-bert - zhpmatrix/Kaggle-Quora-Insincere-Questions-Classification - A BERT fine-tuning solution plus a tensor2tensor-based Transformer Encoder solution (BERT classification task:)
README
# Kaggle-Quora-Insincere-Questions-Classification
[New Kaggle competition](https://www.kaggle.com/c/quora-insincere-questions-classification): a baseline based on fine-tuning BERT.
submit.csv contains the predictions for the test set.
The code follows the project [BERT-based Chinese sequence tagging](https://github.com/zhpmatrix/bert-sequence-tagging); only the key fine-tuning code and the run scripts are provided. A sketch of such a fine-tuning setup is shown below.
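For context, a minimal sketch of a BERT fine-tuning baseline of this kind, written with the Hugging Face `transformers` library. The model name, hyperparameters, and data split are assumptions; the column names `question_text` and `target` come from the Kaggle `train.csv`. This is an illustrative re-implementation, not the repository's original script.

```python
# Hypothetical BERT fine-tuning sketch for the binary "insincere question" label.
# Not the repository's code; model name and hyperparameters are assumptions.
import pandas as pd
import torch
from torch.utils.data import Dataset
from transformers import (BertTokenizerFast, BertForSequenceClassification,
                          Trainer, TrainingArguments)

class QuoraDataset(Dataset):
    """Tokenizes question texts and pairs them with 0/1 labels."""
    def __init__(self, texts, labels, tokenizer, max_len=128):
        self.enc = tokenizer(list(texts), truncation=True,
                             padding="max_length", max_length=max_len,
                             return_tensors="pt")
        self.labels = torch.tensor(list(labels))

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, idx):
        item = {k: v[idx] for k, v in self.enc.items()}
        item["labels"] = self.labels[idx]
        return item

def main():
    # Kaggle training file with columns `question_text` and `target`.
    df = pd.read_csv("train.csv")
    train_df = df.sample(frac=0.9, random_state=42)   # assumed 90/10 split
    valid_df = df.drop(train_df.index)

    tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
    model = BertForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=2)

    args = TrainingArguments(output_dir="out", num_train_epochs=1,
                             per_device_train_batch_size=32)
    trainer = Trainer(
        model=model, args=args,
        train_dataset=QuoraDataset(train_df.question_text,
                                   train_df.target, tokenizer),
        eval_dataset=QuoraDataset(valid_df.question_text,
                                  valid_df.target, tokenizer))
    trainer.train()
    print(trainer.evaluate())   # validation metrics after fine-tuning

if __name__ == "__main__":
    main()
```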
Validation-set results for the BERT-based solution:
|class| precision | recall | f1-score |
|-----| ------ | ------ | ------ |
|0| 0.98 | 0.98 | 0.98 |
|1| 0.65 | 0.62 | 0.63 |
|micro avg| 0.96 | 0.96 | 0.96 |
|macro avg| 0.81 | 0.80 | 0.81 |
|weighted avg| 0.96 | 0.96 | 0.96 |

Validation-set results for the tensor2tensor-based Transformer Encoder solution:
|class| precision | recall | f1-score |
|-----| ------ | ------ | ------ |
|0| 0.98 | 0.96 | 0.96 |
|1| 0.23 | 0.19 | 0.21 |
|micro avg| 0.92 | 0.92 | 0.92 |
|macro avg| 0.59 | 0.57 | 0.58 |
|weighted avg| 0.91 | 0.92 | 0.91 |
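
The second half of the solution trains a Transformer encoder from scratch as the classifier. The repository's tensor2tensor configuration is not reproduced here; the following is a generic PyTorch sketch of such an encoder classifier, with layer sizes, learned positional embeddings, and mean pooling chosen as assumptions for illustration.

```python
# Generic Transformer-encoder classifier sketch (PyTorch), in the spirit of the
# tensor2tensor-based solution; sizes and pooling are illustrative assumptions.
import torch
import torch.nn as nn

class TransformerClassifier(nn.Module):
    def __init__(self, vocab_size, d_model=256, nhead=4, num_layers=2,
                 num_classes=2, max_len=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Embedding(max_len, d_model)        # learned positions
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.classifier = nn.Linear(d_model, num_classes)

    def forward(self, token_ids, padding_mask=None):
        positions = torch.arange(token_ids.size(1), device=token_ids.device)
        x = self.embed(token_ids) + self.pos(positions)
        x = self.encoder(x, src_key_padding_mask=padding_mask)
        return self.classifier(x.mean(dim=1))            # mean-pool, then classify

# Example: a batch of 8 sequences of length 128 over a 30k-token vocabulary.
model = TransformerClassifier(vocab_size=30000)
logits = model(torch.randint(0, 30000, (8, 128)))
print(logits.shape)  # torch.Size([8, 2])
```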