# OpenCLaP: A Multi-Domain Open-Source Chinese Pre-trained Language Model Zoo

## Contents
* [About](#about)
* [Model Overview](#model-overview)
* [Usage](#usage)
* [Project Website](#project-website)
* [Citation](#citation)
* [Authors and Acknowledgements](#authors-and-acknowledgements)

## About

OpenCLaP (Open **C**hinese **La**nguage **P**re-trained Model Zoo) is a repository of multi-domain Chinese pre-trained language models released by the Center for Natural Language Processing and Social Humanities Computing, Institute for Artificial Intelligence, Tsinghua University. A pre-trained language model, trained on a large-scale text corpus, can supply the parameters or input representations of a downstream NLP model and thereby improve its overall performance. The zoo has the following features:

- Multi-domain. We have so far trained models on legal documents and on Baidu Baike encyclopedia text, offering a diverse set of models to choose from.
- Strong capability. We use the mainstream BERT model as the pre-training architecture and support inputs of up to 512 tokens to accommodate a wider range of tasks.
- Continuously updated. We will add more pre-trained models in the near future, for example models trained on more diverse corpora or with the latest Whole Word Masking training strategy.

## Model Overview

The models we have publicly released so far:

| Name | Base model | Data source | Training data size | Vocab size | Model size | Download |
| ------------ | --------- | ----------------------------------- | ------------ | -------- | -------- | -------- |
| Civil-case BERT (民事文书BERT) | bert-base | all civil judgment documents | 26.54M documents | 22554 | 370 MB | [Download](https://thunlp.oss-cn-qingdao.aliyuncs.com/bert/ms.zip) |
| Criminal-case BERT (刑事文书BERT) | bert-base | all criminal judgment documents | 6.63M documents | 22554 | 370 MB | [Download](https://thunlp.oss-cn-qingdao.aliyuncs.com/bert/xs.zip) |
| Baidu Baike BERT (百度百科BERT) | bert-base | [Baidu Baike](http://baike.baidu.com/) | 9.03M entries | 22166 | 367 MB | [Download](https://thunlp.oss-cn-qingdao.aliyuncs.com/bert/baike.zip) |

## Usage

Our models can be used directly with the open-source [pytorch-pretrained-BERT](https://github.com/huggingface/pytorch-pretrained-BERT) project. Taking Civil-case BERT as an example, usage takes two steps:

* First download and unpack the model:

```
wget https://thunlp.oss-cn-qingdao.aliyuncs.com/bert/ms.zip
unzip ms.zip
```

* Then point your script at the unpacked model folder at runtime with `--bert_model $model_folder`.

## Project Website

Please visit http://zoo.thunlp.org for more information.

## Citation

BibTeX:

```tex
@techreport{zhong2019openclap,
  title  = {Open Chinese Language Pre-trained Model Zoo},
  author = {Zhong, Haoxi and Zhang, Zhengyan and Liu, Zhiyuan and Sun, Maosong},
  year   = {2019},
  url    = {https://github.com/thunlp/openclap},
}
```

## Authors and Acknowledgements

Haoxi Zhong (钟皓曦, M.S. student), Zhengyan Zhang (张正彦, undergraduate), [Zhiyuan Liu](http://nlp.csai.tsinghua.edu.cn/~lzy/) (刘知远, Associate Professor), [Maosong Sun](http://nlp.csai.tsinghua.edu.cn/site2/index.php/zh/people?id=16) (孙茂松, Professor).

We thank [PowerLaw AI (幂律智能)](http://powerlaw.ai/) for their generous support of this project.

<img src="http://zoo.thunlp.org/static/images/powerlaw.png" height="120px">
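
The two usage steps above can also be sketched in Python with the pytorch-pretrained-BERT package itself, instead of a command-line flag. This is a minimal sketch, not part of the official instructions: it assumes the unzipped `ms/` folder follows the standard BERT checkpoint layout (`bert_config.json`, `pytorch_model.bin`, `vocab.txt`) that `from_pretrained` expects, and it skips loading gracefully when the checkpoint has not been downloaded yet.

```python
# Sketch: loading an OpenCLaP checkpoint (here Civil-case BERT) with
# pytorch-pretrained-BERT. Folder name "ms" matches the zip from the
# download step; the file layout inside it is an assumption based on
# the standard BERT checkpoint format.
import os

MODEL_DIR = "ms"  # produced by `unzip ms.zip`


def load_openclap_bert(model_dir=MODEL_DIR):
    """Return (tokenizer, model) if the unzipped checkpoint is present,
    else None so callers can prompt for the wget/unzip step."""
    if not os.path.isdir(model_dir):
        return None
    # Imported lazily so this sketch runs even without torch installed.
    from pytorch_pretrained_bert import BertModel, BertTokenizer

    tokenizer = BertTokenizer.from_pretrained(model_dir)
    model = BertModel.from_pretrained(model_dir)
    model.eval()  # inference mode for feature extraction
    return tokenizer, model


if __name__ == "__main__":
    loaded = load_openclap_bert()
    if loaded is None:
        print("checkpoint not found; run the wget/unzip step first")
    else:
        print("loaded Civil-case BERT from", MODEL_DIR)
```

The returned `model` can then be used like any other BERT encoder; equivalently, fine-tuning scripts that accept `--bert_model` can simply be given the same folder path.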