# BERT-Classification-Tutorial

Labeling data is arguably the most arduous part of training an AI model, and annotation for natural language processing demands especially heavy human effort. Unlike image annotation in computer vision, text labeling usually has no single correct answer: how a sentence is understood varies from person to person, which makes the job harder still.

But Google's recently released BERT goes a long way toward solving this problem. In our experiments, BERT brings a significant improvement in classification accuracy on multi-class text classification, even with very little data. Our main comparison is against ULMFiT (https://arxiv.org/abs/1801.06146), the state-of-the-art transfer-learning language model released only five months earlier, and the improvement is clear.

![alt text](https://github.com/Socialbird-AILab/BERT-Classification-Tutorial/blob/master/pictures/Results.png)

As the figure above shows, BERT performs very well across the different datasets. Our experimental data comes in three sizes: 1,000, 6,700, and 12,000 examples, each including test data, with an 80%/20% train/test split. The datasets were collected from multiple web sources and passed through a series of category mappings. The "Noisy" dataset carries noticeable noise; sampling puts the noise rate at roughly 20%. The experiments compare several models: a basic convolutional network as the baseline, a convolutional network with pretrained GloVe embeddings, then ULMFiT and BERT.

# 1. Environment

The TensorFlow version is 1.10.0 GPU on Windows; for installation instructions see https://www.tensorflow.org/install/pip?lang=python3. The Anaconda version is 1.9.2.

# 2. Hardware

The experiments ran on an NVIDIA GeForce GTX 1080 Ti; the BERT base model uses about 9.5 GB of GPU memory.

# 3. Download the model

Once the environment is set up, download the BERT base model we used in the experiments here: https://storage.googleapis.com/bert_models/2018_10_18/uncased_L-12_H-768_A-12.zip
After downloading, place it in BERT_BASE_DIR.

# 4. Prepare the input data

Split the text data into three files:

- Train: train.tsv
- Evaluate: dev.tsv
- Test: test.tsv

The format of each file is very simple: one column holds the text to classify, and the other holds the corresponding label.

The data folder contains 1,000 sample examples across 10 classes, split into training and test sets.

![alt text](https://github.com/Socialbird-AILab/BERT-Classification-Tutorial/blob/master/pictures/Our%20data%20example.png)

# 5. Running the classifier

Run run_classifier.py to perform the 10-class classification task on the 1,000 samples. For implementation details, see the tutorial: https://mp.weixin.qq.com/s/XmeDjHSFI0UsQmKeOgwnyA
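The two-column TSV layout described above can be produced with a short Python sketch. The sample texts, labels, file names other than the three listed, and the 80%/10%/10% split below are illustrative assumptions, not the repository's actual data or its splitting code.

```python
import csv
import random

# Hypothetical sample data: (text, label) pairs. The real repository ships
# 1,000 examples across 10 classes in its data folder; these rows only
# illustrate the two-column, tab-separated layout.
samples = [
    ("great movie, would watch again", "positive"),
    ("the plot made no sense at all", "negative"),
    ("an average film at best", "neutral"),
] * 10

def split_and_write(rows, train_path="train.tsv", dev_path="dev.tsv",
                    test_path="test.tsv", seed=42):
    """Shuffle rows and write an assumed 80%/10%/10% split as TSV files."""
    rng = random.Random(seed)
    rows = rows[:]          # copy so the caller's list is untouched
    rng.shuffle(rows)
    n = len(rows)
    cut1, cut2 = int(n * 0.8), int(n * 0.9)
    for path, chunk in [(train_path, rows[:cut1]),
                        (dev_path, rows[cut1:cut2]),
                        (test_path, rows[cut2:])]:
        with open(path, "w", newline="", encoding="utf-8") as f:
            # Tab-delimited: column one is the text, column two the label.
            csv.writer(f, delimiter="\t").writerows(chunk)

split_and_write(samples)
```

Fixing the shuffle seed keeps the split reproducible across runs, which matters when comparing models on the same held-out data.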