{"id":13535002,"url":"https://github.com/whqwill/seq2seq-keyphrase-bert","last_synced_at":"2025-04-02T00:31:15.665Z","repository":{"id":94749704,"uuid":"159603246","full_name":"whqwill/seq2seq-keyphrase-bert","owner":"whqwill","description":"add BERT to encoder part for https://github.com/memray/seq2seq-keyphrase-pytorch","archived":false,"fork":false,"pushed_at":"2018-12-04T02:21:10.000Z","size":123,"stargazers_count":80,"open_issues_count":1,"forks_count":15,"subscribers_count":7,"default_branch":"master","last_synced_at":"2024-11-02T22:32:55.318Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/whqwill.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2018-11-29T03:40:42.000Z","updated_at":"2024-01-04T16:28:30.000Z","dependencies_parsed_at":"2023-05-24T21:15:51.628Z","dependency_job_id":null,"html_url":"https://github.com/whqwill/seq2seq-keyphrase-bert","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/whqwill%2Fseq2seq-keyphrase-bert","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/whqwill%2Fseq2seq-keyphrase-bert/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/whqwill%2Fseq2seq-keyphrase-bert/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/whqwill%2Fseq2seq-keyphrase-bert/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/Gi
tHub/owners/whqwill","download_url":"https://codeload.github.com/whqwill/seq2seq-keyphrase-bert/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":246735028,"owners_count":20825212,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-08-01T08:00:48.264Z","updated_at":"2025-04-02T00:31:15.640Z","avatar_url":"https://github.com/whqwill.png","language":"Python","readme":"# seq2seq-keyphrase-bert\n\n\nThe original code is from https://github.com/memray/seq2seq-keyphrase-pytorch, which performs keyphrase generation using a seq2seq model with attention. \n\n\nBERT (https://arxiv.org/abs/1810.04805) has recently become very popular for many NLP tasks, so I added BERT (https://github.com/huggingface/pytorch-pretrained-BERT) to the encoder part of the seq2seq model. I added a new model, \"Seq2SeqBERT\", which uses BERT as the encoder and a GRU as the decoder. \n\n\nSpecifically, I changed some code in preprocess.py so that it preprocesses data with the BERT tokenizer, added the new model in pykp/model.py, adapted the beam search methods in beam_search.py accordingly, and made further changes in pykp/io.py, train.py, and evaluate.py.\n\n\nRight now the results are not good, and I am still investigating why. Here is the experiment report: https://github.com/huggingface/pytorch-pretrained-BERT/files/2623599/RNN.vs.BERT.in.Keyphrase.generation.pdf \n\n\nAdvice on where I went wrong is welcome.\n\n\nYou can use train.py to train the seq2seq model. 
To use BERT, set encoder_type to 'bert', and the model will be initialized as \"Seq2SeqBERT\". The encoder details are in the Seq2SeqBERT model in pykp/model.py.\n","funding_links":[],"categories":["Other Resources","BERT language model and embedding:"],"sub_categories":["Other"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fwhqwill%2Fseq2seq-keyphrase-bert","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fwhqwill%2Fseq2seq-keyphrase-bert","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fwhqwill%2Fseq2seq-keyphrase-bert/lists"}