{"id":13546784,"url":"https://github.com/graph4ai/graph4nlp","last_synced_at":"2025-05-15T11:06:47.866Z","repository":{"id":38692019,"uuid":"280043130","full_name":"graph4ai/graph4nlp","owner":"graph4ai","description":"Graph4nlp is the library for the easy use of Graph Neural Networks for NLP. Welcome to visit our DLG4NLP website (https://dlg4nlp.github.io/index.html) for various learning resources! ","archived":false,"fork":false,"pushed_at":"2024-06-24T03:38:13.000Z","size":489856,"stargazers_count":1680,"open_issues_count":13,"forks_count":204,"subscribers_count":28,"default_branch":"master","last_synced_at":"2025-04-14T19:57:03.169Z","etag":null,"topics":["deep-learning","graph-neural-networks","machine-learning","natural-language-processing","nlp","pytorch"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/graph4ai.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2020-07-16T03:28:48.000Z","updated_at":"2025-04-09T12:27:04.000Z","dependencies_parsed_at":"2024-09-25T00:15:34.724Z","dependency_job_id":null,"html_url":"https://github.com/graph4ai/graph4nlp","commit_stats":null,"previous_names":[],"tags_count":3,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/graph4ai%2Fgraph4nlp","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/graph4ai%2Fgraph4nlp/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/graph4ai%2Fgraph4nlp/releases","manifests_url":"
https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/graph4ai%2Fgraph4nlp/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/graph4ai","download_url":"https://codeload.github.com/graph4ai/graph4nlp/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":254328385,"owners_count":22052632,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["deep-learning","graph-neural-networks","machine-learning","natural-language-processing","nlp","pytorch"],"created_at":"2024-08-01T12:00:44.914Z","updated_at":"2025-05-15T11:06:42.854Z","avatar_url":"https://github.com/graph4ai.png","language":"Python","readme":"\u003cp align=\"center\"\u003e\u003ca href=\"https://dlg4nlp.github.io/index.html\"\u003e\n\u003cimg src=\"./imgs/graph4nlp_logo.png\" width=\"800\" class=\"center\" alt=\"logo\"/\u003e\n    \u003cbr/\u003e\n    \u003ca/\u003e\n\u003c/p\u003e\n   \n[pypi-image]: https://badge.fury.io/py/graph4nlp.svg\n\n[pypi-url]: 
https://pypi.org/project/graph4nlp\n\n[license-image]:https://img.shields.io/badge/License-Apache%202.0-blue.svg\n\n[license-url]:https://github.com/graph4ai/graph4nlp/blob/master/LICENSE\n\n[contributor-image]:https://img.shields.io/github/contributors/graph4ai/graph4nlp\n\n[contributor-url]:https://github.com/graph4ai/graph4nlp/contributors\n\n[contributing-image]:https://img.shields.io/badge/contributions-welcome-brightgreen.svg?style=flat\n\n[contributing-url]:to_be_add\n\n[issues-image]:https://img.shields.io/github/issues/graph4ai/graph4nlp\n\n[issues-url]:https://github.com/graph4ai/graph4nlp/issues\n\n[forks-image]:https://img.shields.io/github/forks/graph4ai/graph4nlp\n\n[forks-url]:https://github.com/graph4ai/graph4nlp/fork\n\n[stars-image]:https://img.shields.io/github/stars/graph4ai/graph4nlp\n\n[stars-url]:https://github.com/graph4ai/graph4nlp/stars\n\n![Last Commit](https://img.shields.io/github/last-commit/graph4ai/graph4nlp)\n[![pypi][pypi-image]][pypi-url]\n[![Contributors][contributor-image]][contributor-url]\n[![Contributing][contributing-image]][contributing-url]\n[![License][license-image]][license-url]\n[![Issues][issues-image]][issues-url]\n[![Fork][forks-image]][forks-url]\n[![Star][stars-image]][stars-url]\n\n# Graph4NLP\n\n***Graph4NLP*** is an easy-to-use library for R\u0026D at the intersection of **Deep Learning on Graphs** and\n**Natural Language Processing** (i.e., DLG4NLP). It provides **full implementations** of state-of-the-art models for data scientists, as well as **flexible interfaces** for researchers and developers to build customized models, with whole-pipeline support. Built upon highly optimized runtime libraries including [DGL](https://github.com/dmlc/dgl), ***Graph4NLP*** offers both high running efficiency and great extensibility. The architecture of ***Graph4NLP*** is shown in the following figure, where boxes with dashed lines represent features under development. 
Graph4NLP consists of four different layers: 1) Data Layer, 2) Module Layer, 3) Model Layer, and 4) Application Layer.\n\n\u003cp align=\"center\"\u003e\n    \u003cimg src=\"docs/arch.png\" alt=\"architecture\" width=\"700\" /\u003e\n    \u003cbr\u003e\n    \u003cb\u003eFigure\u003c/b\u003e: Graph4NLP Overall Architecture\n\u003c/p\u003e\n\n## \u003cimg src=\"docs/new.png\" alt='new' width=30 /\u003e Graph4NLP news\n**01/20/2022:** The **v0.5.5 release**. Try it out! \u003cbr\u003e\n**09/26/2021:** The **v0.5.1 release**. Try it out! \u003cbr\u003e\n**09/01/2021:** Welcome to visit our **DLG4NLP website (https://dlg4nlp.github.io/index.html)** for various learning resources! \u003cbr\u003e\n**06/05/2021:** The **v0.4.1 release**. \n\n## Major Releases\n\n| Releases | Date       | Features                                                     |\n| -------- | ---------- | ------------------------------------------------------------ |\n| v0.5.5   | 2022-01-20 | - Support the model.predict API by introducing wrapper functions. \u003cbr /\u003e - Introduce three new inference wrapper functions: classifier_inference_wrapper, generator_inference_wrapper, generator_inference_wrapper_for_tree. \u003cbr /\u003e - Add the inference and inference_advance examples in each application. \u003cbr /\u003e - Separate the graph topology and graph embedding processes. \u003cbr /\u003e - Renew all the graph construction functions. \u003cbr /\u003e - Module graph_embedding is divided into graph_embedding_initialization and graph_embedding_learning. \u003cbr /\u003e - Unify the parameters in Dataset: we remove the ambiguous parameter ``graph_type`` and introduce ``graph_name`` to indicate the graph construction method and ``static_or_dynamic`` to indicate static or dynamic graph construction. \u003cbr /\u003e - New: The dataset can now automatically choose the default methods (e.g., ``topology_builder``) given only one parameter ``graph_name``. 
|\n| v0.5.1   | 2021-09-26 | - Lint the code \u003cbr /\u003e - Support testing with users' own data \u003cbr /\u003e - Fix a bug: the word embedding size was hard-coded in the 0.4.1 version; it now equals the \"word_emb_size\" parameter. \u003cbr /\u003e - Fix a bug: build_vocab() was called twice in the 0.4.1 version. \u003cbr /\u003e - Fix a bug: the two main files of the knowledge graph completion example omitted the optional parameter \"kg_graph\" in ranking_and_hits() when resuming training the model. \u003cbr /\u003e - Fix the preprocessing path error in the KGC readme. \u003cbr /\u003e - Fix an embedding construction bug when setting emb_strategy to 'w2v'. |\n| v0.4.1   | 2021-06-05 | - Support the whole pipeline of Graph4NLP \u003cbr /\u003e - GraphData and Dataset support |\n\n## Quick tour\n\n***Graph4NLP*** aims to make it incredibly easy to use GNNs in NLP tasks (check out the [Graph4NLP Documentation](https://graph4ai.github.io/graph4nlp/)). Here is an example of how to use the [*Graph2seq*](https://graph4ai.github.io/graph4nlp/) model, which is widely used in machine translation, question answering, semantic parsing, and various other NLP tasks that can be abstracted as graph-to-sequence problems, and which has shown superior performance.\n\n\u003c!-- If you want to further improve model performance, we also support pre-trained models including [BERT](https://arxiv.org/abs/1810.04805), etc.\n --\u003e\nWe also offer other high-level model APIs such as graph-to-tree models. 
If you are interested in DLG4NLP-related research problems, you are very welcome to use our library and to refer to our [graph4nlp survey](http://arxiv.org/abs/2106.06090).\n\n```python\nfrom graph4nlp.pytorch.datasets.jobs import JobsDataset\nfrom graph4nlp.pytorch.modules.graph_construction.dependency_graph_construction import DependencyBasedGraphConstruction\nfrom graph4nlp.pytorch.modules.config import get_basic_args\nfrom graph4nlp.pytorch.models.graph2seq import Graph2Seq\nfrom graph4nlp.pytorch.modules.utils.config_utils import update_values, get_yaml_config\n\n# Build the dataset. A Stanford CoreNLP server must be running in the background\n# to construct the dependency graphs.\njobs_dataset = JobsDataset(root_dir='graph4nlp/pytorch/test/dataset/jobs',\n                           topology_builder=DependencyBasedGraphConstruction,\n                           topology_subdir='DependencyGraph')\nvocab_model = jobs_dataset.vocab_model\n\n# Build the model.\nuser_args = get_yaml_config(\"examples/pytorch/semantic_parsing/graph2seq/config/dependency_gcn_bi_sep_demo.yaml\")\nargs = get_basic_args(graph_construction_name=\"node_emb\", graph_embedding_name=\"gat\", decoder_name=\"stdrnn\")\nupdate_values(to_args=args, from_args_list=[user_args])\ngraph2seq = Graph2Seq.from_args(args, vocab_model)\n\n# Run a forward pass on a small batch.\nbatch_data = JobsDataset.collate_fn(jobs_dataset.train[0:12])\n\nscores = graph2seq(batch_data[\"graph_data\"], batch_data[\"tgt_seq\"])  # [batch_size, seq_len, vocab_size]\n```\n\n## Overview\n\nThe Graph4NLP computing flow is shown below.\n\u003cp align=\"center\"\u003e\n\u003cimg src=\"./imgs/graph4nlp_flow.png\" width=\"1000\" class=\"center\" alt=\"logo\"/\u003e\n    \u003cbr/\u003e\n\u003c/p\u003e\n\n## Graph4NLP Models and Applications\n\n### Graph4NLP models\n\n- [Graph2Seq](https://github.com/graph4ai/graph4nlp/blob/master/graph4nlp/pytorch/models/graph2seq.py): a general end-to-end neural encoder-decoder model that maps an input graph to a sequence of tokens.  
\n- [Graph2Tree](https://github.com/graph4ai/graph4nlp/blob/master/graph4nlp/pytorch/models/graph2tree.py): a general end-to-end neural encoder-decoder model that maps an input graph to a tree structure.\n\n### Graph4NLP applications\n\nWe provide a comprehensive collection of NLP applications, together with detailed examples:\n\n- [Text classification](https://github.com/graph4ai/graph4nlp/tree/master/examples/pytorch/text_classification): to assign an appropriate label to a sentence or document.\n- [Semantic parsing](https://github.com/graph4ai/graph4nlp/tree/master/examples/pytorch/semantic_parsing): to translate natural language into a machine-interpretable formal meaning representation.\n- [Neural machine translation](https://github.com/graph4ai/graph4nlp/tree/master/examples/pytorch/nmt): to translate a sentence from a source language into a different target language.\n- [Summarization](https://github.com/graph4ai/graph4nlp/tree/master/examples/pytorch/summarization): to generate a shorter version of the input text that preserves its major meaning.\n- [KG completion](https://github.com/graph4ai/graph4nlp/tree/master/examples/pytorch/kg_completion): to predict missing relations between two existing entities in knowledge graphs.\n- [Math word problem solving](https://github.com/graph4ai/graph4nlp/tree/master/examples/pytorch/math_word_problem): to automatically solve mathematical exercises that provide background information about a problem in easy-to-understand language.\n- [Named entity recognition](https://github.com/graph4ai/graph4nlp/tree/master/examples/pytorch/name_entity_recognition): to tag entities in input texts with their corresponding types.\n- [Question generation](https://github.com/graph4ai/graph4nlp/tree/master/examples/pytorch/question_generation): to generate a valid and fluent question based on a given passage and (optionally) a target answer.\n\n\n## Performance\n\nEnvironment: torch 1.8, Ubuntu 16.04 with 2080 Ti GPUs\n\n| Task              
         |              Dataset             |   GNN    Model      | Graph construction                           | Evaluation         |          Performance          |\n|----------------------------|:--------------------------------:|:-------------------:|----------------------------------------------|--------------------|:-----------------------------:|\n| Text classification        | TRECT\u003cbr\u003e CAirline\u003cbr\u003e CNSST\u003cbr\u003e |           GAT       | Dependency\u003cbr\u003e Constituency\u003cbr\u003e Dependency\u003cbr\u003e |      Accuracy    | 0.948\u003cbr\u003e 0.785\u003cbr\u003e 0.538\u003cbr\u003e |\n| Semantic Parsing           |               JOBS               |           SAGE      | Constituency                                 | Execution accuracy |             0.936             |\n| Question generation        |               SQuAD             |           GGNN       | Dependency                                      | BLEU-4             |             0.15175\t            |\n| Machine translation        |              IWSLT14             |           GCN       | Dynamic                                      | BLEU-4             |             0.3212            |\n| Summarization              |             CNN(30k)             |           GCN       | Dependency                                   | ROUGE-1            |              26.4             |\n| Knowledge graph completion | Kinship                          |           GCN      | Dependency                                    | MRR                | 82.4                          |\n| Math word problem          |              MAWPS               | SAGE                | Dynamic                                      | Solution accuracy   | 76.4                    |\n\n\n## Installation\n\nCurrently, users can install Graph4NLP via **pip** or **source code**. 
Graph4NLP supports the following OSes:\n\n- Linux-based systems (tested on Ubuntu 18.04 and later)\n- macOS (CPU version only)\n- Windows 10 (requires PyTorch \u003e= 1.8)\n\n### Installation via pip (binaries)\nWe provide pip wheels for all major OS/PyTorch/CUDA combinations. Note that we highly recommend that Windows users follow `Installation via source code` due to compatibility issues.\n\n#### Ensure that at least PyTorch (\u003e=1.6.0) is installed:\nAny version `\u003e=1.6.0` works:\n``` bash\n$ python -c \"import torch; print(torch.__version__)\"\n\u003e\u003e\u003e 1.6.0\n```\n#### Find the CUDA version PyTorch was installed with (for GPU users):\n```bash\n$ python -c \"import torch; print(torch.version.cuda)\"\n\u003e\u003e\u003e 10.2\n```\n\n#### Install the relevant dependencies:\n`torchtext` is needed since Graph4NLP relies on it to implement embeddings.\nPlease check the PyTorch version requirements before installing `torchtext` with the following command; for detailed version matching, please refer to [the torchtext page](https://pypi.org/project/torchtext/).\n``` bash\npip install torchtext # \u003e=0.7.0\n```\n\n\n#### Install Graph4NLP\n```bash\npip install graph4nlp${CUDA}\n```\nwhere `${CUDA}` should be replaced by the specific CUDA version suffix (empty for the CPU version, `-cu92`, `-cu101`, `-cu102`, or `-cu110`). The following table shows the concrete command lines. 
For CUDA 11.1 users, please refer to `Installation via source code`.\n\n| Platform  | Command                       |\n| --------- | ----------------------------- |\n| CPU       | `pip install graph4nlp`   |\n| CUDA 9.2  | `pip install graph4nlp-cu92`  |\n| CUDA 10.1 | `pip install graph4nlp-cu101` |\n| CUDA 10.2 | `pip install graph4nlp-cu102` |\n| CUDA 11.0 | `pip install graph4nlp-cu110` |\n\n### Installation via source code\n\n#### Ensure that at least PyTorch (\u003e=1.6.0) is installed:\nAny version `\u003e=1.6.0` works:\n``` bash\n$ python -c \"import torch; print(torch.__version__)\"\n\u003e\u003e\u003e 1.6.0\n```\n#### Find the CUDA version PyTorch was installed with (for GPU users):\n```bash\n$ python -c \"import torch; print(torch.version.cuda)\"\n\u003e\u003e\u003e 10.2\n```\n\n#### Install the relevant dependencies:\n`torchtext` is needed since Graph4NLP relies on it to implement embeddings.\nPlease check the PyTorch version requirements before installing `torchtext` with the following command; for detailed version matching, please refer to [the torchtext page](https://pypi.org/project/torchtext/).\n``` bash\npip install torchtext # \u003e=0.7.0\n```\n\n#### Download the source code of `Graph4NLP` from GitHub:\n```bash\ngit clone https://github.com/graph4ai/graph4nlp.git\ncd graph4nlp\n```\n#### Configure the CUDA version\nThen run `./configure` (or `./configure.bat` if you are using Windows 10) to configure your installation. The configuration program will ask you to specify your CUDA version. 
If you do not have a GPU, please type 'cpu'.\n```bash\n./configure\n```\n\n#### Install Graph4NLP\n\nFinally, install the package:\n\n```shell\npython setup.py install\n```\n\n## Hyperparameter tuning\n\nWe list some of the hyperparameters that are often tuned\n [here](https://docs.google.com/spreadsheets/d/e/2PACX-1vQaE3BTKYt4NX0z5oJrzVESdE7Kx3dnmTCG7zTdtTqj6zuRX12qBz7OoEf0ckTDini0BljFLA9JuF5v/pubhtml?gid=0\u0026single=true).\n\n\n## New to Deep Learning on Graphs for NLP?\n\nIf you want to learn more about applying Deep Learning on Graphs techniques to NLP tasks, you are welcome to visit our DLG4NLP website (https://dlg4nlp.github.io/index.html) for various learning resources! You can refer to our survey paper, which provides an overview of this research direction. For a detailed reference to our library, please see our docs.\n\n\u003c!-- [Docs]() | [Graph4nlp survey]() | [Related paper list]() | [Workshops]() --\u003e\n- Documentation: [Docs](https://graph4ai.github.io/graph4nlp/)  \n- Graph4NLP Survey: [Graph4nlp survey](http://arxiv.org/abs/2106.06090)  \n- Graph4NLP Tutorials: \n    - [Graph4NLP-NAACL'21, SIGIR'21, IJCAI'21, KDD'21](https://dlg4nlp.github.io/tutorials.html)\n    - [SyncedReview Invited Chinese talk](https://app6ca5octe2206.pc.xiaoe-tech.com/detail/v_60e832f8e4b0876c0c23c1a7/3?fromH5=true) ([video](https://pan.baidu.com/s/1Lltz_kx7ECDOTLecVC9E9w) (password: wppp), [slides](https://pan.baidu.com/s/1pmgX456Me_lu30VGDY3aaw) (password: flwv))  \n- Graph4NLP Workshops: \n    - [DLG4NLP-ICLR'22](https://dlg4nlp-workshop.github.io/dlg4nlp-iclr22/index.html)  \n- Graph4NLP Demo: [Demo](https://github.com/graph4ai/graph4nlp_demo)\n- Graph4NLP Literature Review: [Literature Lists](https://github.com/graph4ai/graph4nlp_literature)  \n\n## Contributing\n\nPlease let us know if you encounter a bug or have any suggestions by filing an issue.\n\nWe welcome all contributions, from bug fixes to new features and extensions.\n\nWe expect 
all contributions to be discussed in the issue tracker and to go through pull requests. \n\n## Citation\n\nIf you find this code useful, please consider citing the following papers.\n\n- [1] Lingfei Wu, Yu Chen, Kai Shen, Xiaojie Guo, Hanning Gao, Shucheng Li, Jian Pei, and Bo Long. [**\"Graph Neural Networks for Natural Language Processing: A Survey\"**](https://arxiv.org/abs/2106.06090).\n- [2] [**NeurIPS 2020**] Yu Chen, Lingfei Wu and Mohammed J Zaki, [**\"Iterative Deep Graph Learning for Graph Neural Networks: Better and Robust Node Embeddings\"**](https://arxiv.org/abs/2006.13009).\n- [3] [**ICLR 2020**] Yu Chen, Lingfei Wu and Mohammed J. Zaki, [**\"Reinforcement Learning Based Graph-to-Sequence Model for Natural Question Generation\"**](https://arxiv.org/abs/1908.04942).\n- [4] Kun Xu, Lingfei Wu, Zhiguo Wang, Yansong Feng, Michael Witbrock and Vadim Sheinin, [**\"Graph2Seq: Graph to Sequence Learning with Attention-based Neural Networks\"**](https://arxiv.org/abs/1804.00823).\n- [5] [**EMNLP 2020**] Shucheng Li, Lingfei Wu, Shiwei Feng, Fangli Xu, Fengyuan Xu and Sheng Zhong, [**\"Graph-to-Tree Neural Networks for Learning Structured Input-Output Translation with Applications to Semantic Parsing and Math Word Problem\"**](https://aclanthology.org/2020.findings-emnlp.255.pdf).\n- [6] [**ACL 2020**] Luyang Huang, Lingfei Wu and Lu Wang, [**\"Knowledge Graph-Augmented Abstractive Summarization with Semantic-Driven Cloze Reward\"**](https://arxiv.org/abs/2005.01159).\n- [7] [**EMNLP 2018**] Lingfei Wu, Ian E.H. Yen, Kun Xu, Fangli Xu, Avinash Balakrishnan, Pin-Yu Chen, Pradeep Ravikumar and Michael J. 
Witbrock, [**\"Word Mover's Embedding: From Word2Vec to Document Embedding\"**](https://arxiv.org/abs/1811.01713).\n- [8][**IJCAI 2020**] Yu Chen, Lingfei Wu and Mohammed J Zaki, [**\"GraphFlow: Exploiting Conversation Flow with Graph Neural Networks for Conversational Machine Comprehension\"**](https://www.ijcai.org/Proceedings/2020/171).\n- [9] [**IJCAI 2020**] Kai Shen, Lingfei Wu, Fangli Xu, Siliang Tang, Jun Xiao and Yueting Zhuang, [**\"Hierarchical Attention Based Spatial-Temporal Graph-to-Sequence Learning for Grounded Video Description\"**](https://www.ijcai.org/Proceedings/2020/171).\n- [10] [**IJCAI 2020**] Hanning Gao, Lingfei Wu, Po Hu and Fangli Xu, [**\"RDF-to-Text Generation with Graph-augmented Structural Neural Encoders\"**](https://www.ijcai.org/Proceedings/2020/419).\n\n```\n@article{wu2021graph,\n  title={Graph Neural Networks for Natural Language Processing: A Survey},\n  author={Lingfei Wu and Yu Chen and Kai Shen and Xiaojie Guo and Hanning Gao and Shucheng Li and Jian Pei and Bo Long},\n  journal={arXiv preprint arXiv:2106.06090},\n  year={2021}\n}\n\n@inproceedings{chen2020iterative,\n  title={Iterative Deep Graph Learning for Graph Neural Networks: Better and Robust Node Embeddings},\n  author={Chen, Yu and Wu, Lingfei and Zaki, Mohammed J},\n  booktitle={Proceedings of the 34th Conference on Neural Information Processing Systems},\n  month={Dec. 6-12,},\n  year={2020}\n}\n\n@inproceedings{chen2020reinforcement,\n  author    = {Chen, Yu and Wu, Lingfei and Zaki, Mohammed J.},\n  title     = {Reinforcement Learning Based Graph-to-Sequence Model for Natural Question Generation},\n  booktitle = {Proceedings of the 8th International Conference on Learning Representations},\n  month = {Apr. 
26-30,},\n  year      = {2020}\n}\n\n@article{xu2018graph2seq,\n  title={Graph2seq: Graph to sequence learning with attention-based neural networks},\n  author={Xu, Kun and Wu, Lingfei and Wang, Zhiguo and Feng, Yansong and Witbrock, Michael and Sheinin, Vadim},\n  journal={arXiv preprint arXiv:1804.00823},\n  year={2018}\n}\n\n@inproceedings{li-etal-2020-graph-tree,\n    title = {Graph-to-Tree Neural Networks for Learning Structured Input-Output Translation with Applications to Semantic Parsing and Math Word Problem},\n    author = {Li, Shucheng  and\n      Wu, Lingfei  and\n      Feng, Shiwei  and\n      Xu, Fangli  and\n      Xu, Fengyuan  and\n      Zhong, Sheng},\n    booktitle = {Findings of the Association for Computational Linguistics: EMNLP 2020},\n    month = {Nov},\n    year = {2020}\n}\n\n@inproceedings{huang-etal-2020-knowledge,\n    title = {Knowledge Graph-Augmented Abstractive Summarization with Semantic-Driven Cloze Reward},\n    author = {Huang, Luyang  and\n      Wu, Lingfei  and\n      Wang, Lu},\n    booktitle = {Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics},\n    month = {Jul},\n    year = {2020},\n    pages = {5094--5107}\n}\n\n@inproceedings{wu-etal-2018-word,\n    title = {Word Mover{'}s Embedding: From {W}ord2{V}ec to Document Embedding},\n    author = {Wu, Lingfei  and\n      Yen, Ian En-Hsu  and\n      Xu, Kun  and\n      Xu, Fangli  and\n      Balakrishnan, Avinash  and\n      Chen, Pin-Yu  and\n      Ravikumar, Pradeep  and\n      Witbrock, Michael J.},\n    booktitle = {Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing},\n    pages = {4524--4534},\n    year = {2018},\n}\n\n@inproceedings{chen2020graphflow,\n  author    = {Yu Chen and\n               Lingfei Wu and\n               Mohammed J. 
Zaki},\n  title     = {GraphFlow: Exploiting Conversation Flow with Graph Neural Networks\n               for Conversational Machine Comprehension},\n  booktitle = {Proceedings of the Twenty-Ninth International Joint Conference on\n               Artificial Intelligence, {IJCAI} 2020},\n  publisher = {International Joint Conferences on Artificial Intelligence Organization},\n  pages     = {1230--1236},\n  year      = {2020}\n}\n\n@inproceedings{shen2020hierarchical,\n  title={Hierarchical Attention Based Spatial-Temporal Graph-to-Sequence Learning for Grounded Video Description},\n  author={Shen, Kai and Wu, Lingfei and Xu, Fangli and Tang, Siliang and Xiao, Jun and Zhuang, Yueting},\n  booktitle = {Proceedings of the Twenty-Ninth International Joint Conference on\n               Artificial Intelligence, {IJCAI} 2020},\n  publisher = {International Joint Conferences on Artificial Intelligence Organization},\n  pages     = {941--947},\n  year      = {2020}\n}\n\n@inproceedings{ijcai2020-419,\n  title     = {RDF-to-Text Generation with Graph-augmented Structural Neural Encoders},\n  author    = {Gao, Hanning and Wu, Lingfei and Hu, Po and Xu, Fangli},\n  booktitle = {Proceedings of the Twenty-Ninth International Joint Conference on\n               Artificial Intelligence, {IJCAI-20}},\n  publisher = {International Joint Conferences on Artificial Intelligence Organization},\n  pages     = {3030--3036},\n  year      = {2020}\n}\n\n\n```\n\n\n## Team\nGraph4AI Team: [**Lingfei Wu**](https://sites.google.com/a/email.wm.edu/teddy-lfwu/home) (team leader), Yu Chen, Kai Shen, Xiaojie Guo, Hanning Gao, Shucheng Li, Saizhuo Wang, Xiao Liu and Jing Hu. We are passionate about developing useful open-source libraries that aim to promote the easy use of various Deep Learning on Graphs techniques for Natural Language Processing. 
Our team consists of research scientists, applied data scientists, and graduate students from a variety of industrial and academic groups, including Pinterest (Lingfei Wu), Zhejiang University (Kai Shen), Facebook AI (Yu Chen), IBM T.J. Watson Research Center (Xiaojie Guo), Tongji University (Hanning Gao), Nanjing University (Shucheng Li), and HKUST (Saizhuo Wang).\n\n## Contact\nIf you have any technical questions, please open a new issue.\n\nFor any other questions, please contact us: [**Lingfei Wu**](https://sites.google.com/a/email.wm.edu/teddy-lfwu/home) **[lwu@email.wm.edu]** and Xiaojie Guo **[xiaojie.guo@jd.com]**.\n\n## License\nGraph4NLP uses the Apache License 2.0.\n","funding_links":[],"categories":["Graph","其他_NLP自然语言处理","Python","Deep Learning Repositories","🧰 NLP Toolkits"],"sub_categories":["Others","其他_文本生成、文本对话"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgraph4ai%2Fgraph4nlp","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fgraph4ai%2Fgraph4nlp","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fgraph4ai%2Fgraph4nlp/lists"}