![](thumbnail_DeText.png) **_"Relax like a sloth, let DeText do the understanding for you"_**

![Python 3.7 application](https://github.com/linkedin/detext/workflows/Python%203.7%20application/badge.svg)  ![tensorflow](https://img.shields.io/badge/tensorflow-2.4-green.svg) ![License](https://img.shields.io/badge/License-BSD%202--Clause-orange.svg)

DeText: A Deep Neural Text Understanding Framework
========
**DeText** is a <b>_De_</b>ep **_Text_** understanding framework for NLP-related ranking, classification, and language generation tasks. It leverages semantic matching with deep neural networks to understand member intents in search and recommender systems.
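The semantic matching idea above can be illustrated with a minimal sketch: embed two pieces of text and score them by cosine similarity. Everything here (the toy vocabulary, the random embedding table, the averaging "encoder") is illustrative only and is not DeText's API.

```python
import numpy as np

# Toy word-embedding table; the vocabulary and vectors are made up for illustration.
rng = np.random.default_rng(0)
vocab = {w: i for i, w in enumerate(["software", "engineer", "developer", "jobs", "sloth"])}
embedding_table = rng.normal(size=(len(vocab), 8))

def embed(text: str) -> np.ndarray:
    """Average the word vectors of in-vocabulary tokens (a stand-in for a trained encoder)."""
    ids = [vocab[w] for w in text.lower().split() if w in vocab]
    return embedding_table[ids].mean(axis=0)

def semantic_score(query: str, doc: str) -> float:
    """Cosine similarity between the query embedding and the document embedding."""
    q, d = embed(query), embed(doc)
    return float(q @ d / (np.linalg.norm(q) * np.linalg.norm(d)))

print(semantic_score("software engineer", "developer jobs"))
```

In a trained model the embeddings are learned so that semantically related texts score close to 1 even without exact term overlap; with the random table above, only the scoring mechanics are demonstrated.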
As a general NLP framework, DeText can be applied to many tasks, including search & recommendation ranking, multi-class classification, and query understanding.

More details can be found in the [LinkedIn Engineering blog post](https://engineering.linkedin.com/blog/2020/open-sourcing-detext).

## Highlights
* Natural language understanding powered by state-of-the-art deep neural networks
  * automatic feature extraction with deep models
  * end-to-end training
  * interaction modeling between ranking sources and targets
* A general framework with great flexibility
  * customizable model architectures
  * support for multiple text encoders
  * support for multiple data input types
  * various optimization choices
  * standard training flow control
* Easy to use
  * configuration-based modeling (e.g., all configurations set through the command line)

## General Model Architecture
DeText supports a general model architecture with the following components:

* **Word embedding layer**. Converts a sequence of n words into a d-by-n matrix.

* **Text encoding layer (CNN/BERT/LSTM)**. Takes the word embedding matrix as input and maps the text into a fixed-length embedding.

* **Interaction layer**. Generates deep features from the text embeddings. Options include concatenation, cosine similarity, etc.

* **Wide & deep feature processing**. Combines the traditional (wide) features with the interaction (deep) features in a wide & deep fashion.

* **MLP layer**. Combines the wide features and deep features to produce the final output.
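The data flow through these components can be sketched at the shape level in a few lines of NumPy. This is an illustration of the architecture described above, not DeText code: mean pooling stands in for the CNN/BERT/LSTM encoder, the parameters are randomly initialized rather than trained, and all names are made up.

```python
import numpy as np

rng = np.random.default_rng(42)
d, vocab_size, n_wide, hidden = 16, 100, 5, 8

E = rng.normal(size=(vocab_size, d))  # word embedding table: one d-dim vector per word

def encode(token_ids):
    """Encoder stand-in: map a variable-length token sequence to a fixed-length
    d-dim embedding by mean pooling (DeText would use CNN/BERT/LSTM here)."""
    return E[token_ids].mean(axis=0)

def interact(q, doc):
    """Interaction layer: deep features built from the two text embeddings via
    concatenation, Hadamard product, and cosine similarity."""
    cos = q @ doc / (np.linalg.norm(q) * np.linalg.norm(doc))
    return np.concatenate([q, doc, q * doc, [cos]])

def score(query_ids, doc_ids, wide_features, W1, b1, w2, b2):
    """Wide & deep: concatenate hand-crafted wide features with the deep
    interaction features, then apply a small MLP to produce a ranking score."""
    deep = interact(encode(query_ids), encode(doc_ids))
    x = np.concatenate([wide_features, deep])
    h = np.tanh(W1 @ x + b1)  # MLP hidden layer
    return float(w2 @ h + b2)

# Randomly initialized parameters; in DeText, all of these (including the
# embedding table and the encoder) are jointly updated during training.
in_dim = n_wide + 3 * d + 1
W1, b1 = rng.normal(size=(hidden, in_dim)), np.zeros(hidden)
w2, b2 = rng.normal(size=hidden), 0.0

s = score([1, 5, 7], [2, 5, 9], rng.normal(size=n_wide), W1, b1, w2, b2)
```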
All parameters are jointly updated to optimize the training objective.

![](detext_model_architecture.png)

### Model Configurables
DeText offers great flexibility for clients to build customized networks for their own use cases:

* **LTR/classification layer**: in-house LTR loss implementations or tf-ranking LTR losses; multi-class classification support.

* **MLP layer**: customizable number of layers and dimensions.

* **Interaction layer**: supports cosine similarity, Hadamard product, and concatenation.

* **Text embedding layer**: supports CNN, BERT, and LSTM with customizable filters, layers, dimensions, etc.

* **Continuous feature normalization**: element-wise rescaling, value normalization.

* **Categorical feature processing**: modeled as entity embeddings.

All of these can be customized via hyper-parameters in the DeText template. Note that tf-ranking is supported in the DeText framework, i.e., users can choose the LTR losses and metrics defined in tf-ranking.

## User Guide
### Dev environment setup
1. Create your virtualenv (Python version >= 3.7)
    ```shell script
    VENV_DIR=<your venv dir>
    python3 -m venv "$VENV_DIR"  # Make sure your Python version is >= 3.7
    source "$VENV_DIR/bin/activate"  # Enter the virtual environment
    ```
1. Upgrade pip and setuptools
    ```shell script
    pip3 install -U pip
    pip3 install -U setuptools
    ```
1. Install DeText in editable mode:
    ```shell script
    pip install -e .
    ```
1. Verify the environment setup through pytest. If all tests pass, the environment is correctly set up.
    ```shell script
    pytest
    ```
1. Refer to the training manual ([TRAINING.md](user_guide/TRAINING.md)) for information about customizing the model:
    * training data format and preparation
    * key parameters to customize and train DeText models
    * detailed information about all DeText training parameters for full customization
1. Train a model using DeText (e.g., [run_detext.sh](test/resources/run_detext.sh))

### Tutorial
If you would like to try out the library, refer to the following tutorial notebooks:
* [text_classification_demo.ipynb](user_guide/notebooks/text_classification_demo.ipynb)

    This notebook shows how to use DeText to train a multi-class text classification model on a public query intent classification dataset. Detailed instructions on data preparation, model training, and model inference are included.
* [autocompletion.ipynb](user_guide/notebooks/autocompletion.ipynb)

    This notebook shows how to use DeText to train a text ranking model on a public query auto-completion dataset. Detailed steps on data preparation, model training, and model inference are included.

## Citation
Please cite DeText in your publications if it helps your research:
```
@manual{guo-liu20,
  author    = {Weiwei Guo and
               Xiaowei Liu and
               Sida Wang and
               Huiji Gao and
               Bo Long},
  title     = {DeText: A Deep NLP Framework for Intelligent Text Understanding},
  url       = {https://engineering.linkedin.com/blog/2020/open-sourcing-detext},
  year      = {2020}
}

@inproceedings{guo-gao19,
  author    = {Weiwei Guo and
               Huiji Gao and
               Jun Shi and
               Bo Long},
  title     = {Deep Natural Language Processing for Search Systems},
  booktitle = {ACM SIGIR 2019},
  year      = {2019}
}

@inproceedings{guo-gao19-kdd,
  author    = {Weiwei Guo and
               Huiji Gao and
               Jun Shi and
               Bo Long and
               Liang Zhang and
               Bee-Chung Chen and
               Deepak Agarwal},
  title     = {Deep Natural Language Processing for Search and Recommender Systems},
  booktitle = {ACM SIGKDD 2019},
  year      = {2019}
}

@inproceedings{guo-liu20-cikm,
  author    = {Weiwei Guo and
               Xiaowei Liu and
               Sida Wang and
               Huiji Gao and
               Ananth Sankar and
               Zimeng Yang and
               Qi Guo and
               Liang Zhang and
               Bo Long and
               Bee-Chung Chen and
               Deepak Agarwal},
  title     = {DeText: A Deep Text Ranking Framework with BERT},
  booktitle = {ACM CIKM 2020},
  year      = {2020}
}

@inproceedings{jia-long20,
  author    = {Jun Jia and
               Bo Long and
               Huiji Gao and
               Weiwei Guo and
               Jun Shi and
               Xiaowei Liu and
               Mingzhou Zhou and
               Zhoutong Fu and
               Sida Wang and
               Sandeep Kumar Jha},
  title     = {Deep Learning for Search and Recommender Systems in Practice},
  booktitle = {ACM SIGKDD 2020},
  year      = {2020}
}

@inproceedings{wang-guo20,
  author    = {Sida Wang and
               Weiwei Guo and
               Huiji Gao and
               Bo Long},
  title     = {Efficient Neural Query Auto Completion},
  booktitle = {ACM CIKM 2020},
  year      = {2020}
}

@article{liu-guo20,
  author    = {Xiaowei Liu and
               Weiwei Guo and
               Huiji Gao and
               Bo Long},
  title     = {Deep Search Query Intent Understanding},
  journal   = {arXiv preprint arXiv:2008.06759},
  year      = {2020}
}
```