{"id":13534927,"url":"https://github.com/dreamgonfly/BERT-pytorch","last_synced_at":"2025-04-02T00:31:03.034Z","repository":{"id":148651537,"uuid":"153815494","full_name":"dreamgonfly/BERT-pytorch","owner":"dreamgonfly","description":"PyTorch implementation of BERT in \"BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding\"","archived":false,"fork":false,"pushed_at":"2018-11-01T12:36:00.000Z","size":1608,"stargazers_count":97,"open_issues_count":4,"forks_count":27,"subscribers_count":7,"default_branch":"master","last_synced_at":"2024-11-02T22:32:53.115Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":"https://arxiv.org/abs/1810.04805","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"unlicense","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/dreamgonfly.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2018-10-19T17:00:33.000Z","updated_at":"2024-10-19T13:56:15.000Z","dependencies_parsed_at":"2023-05-20T18:45:21.640Z","dependency_job_id":null,"html_url":"https://github.com/dreamgonfly/BERT-pytorch","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dreamgonfly%2FBERT-pytorch","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dreamgonfly%2FBERT-pytorch/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dreamgonfly%2FBERT-pytorch/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dreamgonfly%2FBERT-pytorch/manifests","owner
_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/dreamgonfly","download_url":"https://codeload.github.com/dreamgonfly/BERT-pytorch/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":246734975,"owners_count":20825211,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-08-01T08:00:46.874Z","updated_at":"2025-04-02T00:31:03.001Z","avatar_url":"https://github.com/dreamgonfly.png","language":"Python","readme":"# BERT-pytorch\nPyTorch implementation of BERT in \"BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding\" (https://arxiv.org/abs/1810.04805)\n\n## Requirements\n- Python 3.6+\n- [PyTorch 0.4.1+](http://pytorch.org/)\n- [tqdm](https://github.com/tqdm/tqdm)\n\nAll dependencies can be installed via:\n\n```\npip install -r requirements.txt\n```\n\n## Quickstart\n\n### Prepare data\nFirst things first, you need to prepare your data in an appropriate format. \nYour corpus is assumed to follow the constraints below.\n\n- Each line is a *document*.\n- A *document* consists of *sentences*, separated by a vertical bar (|).\n- A *sentence* is assumed to be already tokenized. Tokens are separated by spaces.\n- A *sentence* has no more than 256 tokens.\n- A *document* has at least 2 sentences. 
\n- You have two distinct data files, one for training data and the other for validation data.\n\nThis repo comes with example data for pretraining in the data/example directory.\nHere is the content of the data/example/train.txt file.\n\n```\nOne, two, three, four, five,|Once I caught a fish alive,|Six, seven, eight, nine, ten,|Then I let go again.\nI’m a little teapot|Short and stout|Here is my handle|Here is my spout.\nJack and Jill went up the hill|To fetch a pail of water.|Jack fell down and broke his crown,|And Jill came tumbling after.  \n```\n\nAlso, this repo includes SST-2 data in the data/SST-2 directory for sentiment classification.\n\n### Build dictionary\n```\npython bert.py preprocess-index data/example/train.txt --dictionary=dictionary.txt\n```\nRunning the above command produces a dictionary.txt file in your current directory.\n\n### Pre-train the model\n```\npython bert.py pretrain --train_data data/example/train.txt --val_data data/example/val.txt --checkpoint_output model.pth\n```\nThis step trains the BERT model with an unsupervised objective. This step also:\n- logs the training procedure every epoch\n- outputs a model checkpoint periodically\n- reports the best checkpoint based on the validation metric\n\n### Fine-tune the model\nYou can fine-tune a pretrained BERT model on a downstream task.\nFor example, you can fine-tune your model on the SST-2 sentiment classification task. \n```\npython bert.py finetune --pretrained_checkpoint model.pth --train_data data/SST-2/train.tsv --val_data data/SST-2/dev.tsv\n```\nThis command also logs the procedure, outputs checkpoints, and reports the best checkpoint.\n\n## See also\n- [Transformer-pytorch](https://github.com/dreamgonfly/Transformer-pytorch) : My own implementation of Transformer. 
This BERT implementation is based on this repo.\n\n## Author\n[@dreamgonfly](https://github.com/dreamgonfly)","funding_links":[],"categories":["implement of BERT besides tensorflow:","Transformer Implementations By Communities"],"sub_categories":["PyTorch"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdreamgonfly%2FBERT-pytorch","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fdreamgonfly%2FBERT-pytorch","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdreamgonfly%2FBERT-pytorch/lists"}