{"id":31959408,"url":"https://github.com/huggingface/torchmoji","last_synced_at":"2025-10-14T15:30:00.910Z","repository":{"id":37608347,"uuid":"105265298","full_name":"huggingface/torchMoji","owner":"huggingface","description":"😇A pyTorch implementation of the DeepMoji model: state-of-the-art deep learning model for analyzing sentiment, emotion, sarcasm etc","archived":false,"fork":false,"pushed_at":"2024-02-12T10:04:55.000Z","size":2450,"stargazers_count":925,"open_issues_count":21,"forks_count":191,"subscribers_count":36,"default_branch":"master","last_synced_at":"2025-09-30T18:02:31.897Z","etag":null,"topics":["deep-learning","machine-learning","natural-language-processing","python","pytorch"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/huggingface.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2017-09-29T11:32:57.000Z","updated_at":"2025-09-08T14:49:59.000Z","dependencies_parsed_at":"2022-07-19T18:09:25.424Z","dependency_job_id":"dcaf4694-2115-4aa5-bd1a-36295ad70926","html_url":"https://github.com/huggingface/torchMoji","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/huggingface/torchMoji","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/huggingface%2FtorchMoji","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/huggingface%2FtorchMoji/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/huggingface%2FtorchMoji/releases","
manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/huggingface%2FtorchMoji/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/huggingface","download_url":"https://codeload.github.com/huggingface/torchMoji/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/huggingface%2FtorchMoji/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":279019316,"owners_count":26086711,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-10-14T02:00:06.444Z","response_time":60,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["deep-learning","machine-learning","natural-language-processing","python","pytorch"],"created_at":"2025-10-14T15:29:59.284Z","updated_at":"2025-10-14T15:30:00.904Z","avatar_url":"https://github.com/huggingface.png","language":"Python","readme":"### ------ Update September 2018 ------\nIt's been a year since TorchMoji and DeepMoji were released. We're trying to understand how it's being used such that we can make improvements and design better models in the future. \n\nYou can help us achieve this by answering this [4-question Google Form](https://docs.google.com/forms/d/e/1FAIpQLSe1h4NSQD30YM8dsbJQEnki-02_9KVQD34qgP9to0bwAHBvBA/viewform \"DeepMoji Google Form\"). 
Thanks for your support!\n\n# 😇 TorchMoji\n\n\u003e **Read our blog post about the implementation process [here](https://medium.com/huggingface/understanding-emotions-from-keras-to-pytorch-3ccb61d5a983).**\n\nTorchMoji is a [pyTorch](http://pytorch.org/) implementation of the [DeepMoji](https://github.com/bfelbo/DeepMoji) model developed by Bjarke Felbo, Alan Mislove, Anders Søgaard, Iyad Rahwan and Sune Lehmann.\n\nThis model was trained on 1.2 billion tweets with emojis to understand how language is used to express emotions. Through transfer learning the model can obtain state-of-the-art performance on many emotion-related text modeling tasks.\n\nTry the online demo of DeepMoji on this 🤗 [Space](https://huggingface.co/spaces/Pendrokar/DeepMoji)! See the [paper](https://arxiv.org/abs/1708.00524), [blog post](https://medium.com/@bjarkefelbo/what-can-we-learn-from-emojis-6beb165a5ea0) or [FAQ](https://www.media.mit.edu/projects/deepmoji/overview/) for more details.\n\n## Overview\n* [torchmoji/](torchmoji) contains all the underlying code needed to convert a dataset to the vocabulary and use the model.\n* [examples/](examples) contains short code snippets showing how to convert a dataset to the vocabulary, load up the model and run it on that dataset.\n* [scripts/](scripts) contains code for processing and analysing datasets to reproduce results in the paper.\n* [model/](model) contains the pretrained model and vocabulary.\n* [data/](data) contains raw and processed datasets that we include in this repository for testing.\n* [tests/](tests) contains unit tests for the codebase.\n\nTo start with, have a look inside the [examples/](examples) directory. 
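The per-sentence output of the model is a probability distribution over 64 emoji classes, so turning a prediction into displayed emojis is essentially a top-k over that vector. A minimal, self-contained sketch of that ranking step (the probabilities below are dummy data standing in for real model output, and `top_k` is an illustrative helper, not part of the torchMoji API):

```python
def top_k(probs, k=5):
    """Return indices of the k highest-probability emoji classes, strongest first."""
    return sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]

# Dummy stand-in for the model's per-sentence output: a distribution
# over the 64 emoji classes (NOT real model output).
emoji_probs = [0.01] * 64
emoji_probs[2] = 0.30
emoji_probs[47] = 0.20
emoji_probs[10] = 0.05

print(top_k(emoji_probs, k=3))  # → [2, 47, 10]
```

The real pipeline produces `emoji_probs` by tokenizing text with the bundled vocabulary and running the pretrained model, as shown in the example scripts below.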
See [score_texts_emojis.py](examples/score_texts_emojis.py) for how to use DeepMoji to extract emoji predictions, [encode_texts.py](examples/encode_texts.py) for how to convert text into 2304-dimensional emotional feature vectors, or [finetune_youtube_last.py](examples/finetune_youtube_last.py) for how to use the model for transfer learning on a new dataset.\n\nPlease consider citing the DeepMoji [paper](https://arxiv.org/abs/1708.00524) if you use the model or code (see below for citation).\n\n## Installation\n\nWe assume that you're using [Python 2.7-3.5](https://www.python.org/downloads/) with [pip](https://pip.pypa.io/en/stable/installing/) installed.\n\nFirst you need to install [pyTorch (version 0.2+)](http://pytorch.org/), currently by:\n```bash\nconda install pytorch -c pytorch\n```\nAt the present stage the model can't make efficient use of CUDA. See details in the [Hugging Face blog post](https://medium.com/huggingface/understanding-emotions-from-keras-to-pytorch-3ccb61d5a983).\n\nWhen pyTorch is installed, run the following in the root directory to install the remaining dependencies:\n\n```bash\npip install -e .\n```\nThis will install the following dependencies:\n* [scikit-learn](https://github.com/scikit-learn/scikit-learn)\n* [text-unidecode](https://github.com/kmike/text-unidecode)\n* [emoji](https://github.com/carpedm20/emoji)\n\nThen, run the download script to download the pretrained torchMoji weights (~85MB) from [here](https://www.dropbox.com/s/q8lax9ary32c7t9/pytorch_model.bin?dl=0) and put them in the model/ directory:\n\n```bash\npython scripts/download_weights.py\n```\n\n## Testing\nTo run the tests, install [nose](http://nose.readthedocs.io/en/latest/). After installing, navigate to the [tests/](tests) directory and run:\n\n```bash\ncd tests\nnosetests -v\n```\n\nBy default, this will also run finetuning tests. These tests train the model for one epoch and then check the resulting accuracy, which may take several minutes to finish. 
If you'd prefer to exclude those, run the following instead:\n\n```bash\ncd tests\nnosetests -v -a '!slow'\n```\n\n## Disclaimer\nThis code has been tested to work with Python 2.7 and 3.5 on Ubuntu 16.04 and macOS Sierra machines. It has not been optimized for efficiency, but should be fast enough for most purposes. We do not give any guarantees that there are no bugs - use the code at your own risk!\n\n## Contributions\nWe welcome pull requests if you feel like something could be improved. You can also greatly help us by telling us how you felt when writing your most recent tweets. Just click [here](http://deepmoji.mit.edu/contribute/) to contribute.\n\n## License\nThis code and the pretrained model are licensed under the MIT license.\n\n## Benchmark datasets\nThe benchmark datasets are uploaded to this repository for convenience only. They were not released by us and we do not claim any rights on them. Use the datasets at your own risk and make sure you comply with the licenses under which they were released. If you use any of the benchmark datasets, please consider citing the original authors.\n\n## Citation\n```\n@inproceedings{felbo2017,\n  title={Using millions of emoji occurrences to learn any-domain representations for detecting sentiment, emotion and sarcasm},\n  author={Felbo, Bjarke and Mislove, Alan and S{\\o}gaard, Anders and Rahwan, Iyad and Lehmann, Sune},\n  booktitle={Conference on Empirical Methods in Natural Language Processing (EMNLP)},\n  year={2017}\n}\n```\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fhuggingface%2Ftorchmoji","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fhuggingface%2Ftorchmoji","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fhuggingface%2Ftorchmoji/lists"}