{"id":22096178,"url":"https://github.com/aspirincode/diffiupac","last_synced_at":"2025-07-24T22:31:39.686Z","repository":{"id":240385493,"uuid":"794453453","full_name":"AspirinCode/DiffIUPAC","owner":"AspirinCode","description":"Diffusion-based generative drug-like molecular editing with chemical natural language","archived":false,"fork":false,"pushed_at":"2024-11-20T06:42:47.000Z","size":4526,"stargazers_count":7,"open_issues_count":0,"forks_count":0,"subscribers_count":0,"default_branch":"main","last_synced_at":"2024-11-20T07:34:31.494Z","etag":null,"topics":["diffiupac","diffusion-models","iupac","qed","qeppi","transformer"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"gpl-3.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/AspirinCode.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2024-05-01T07:43:28.000Z","updated_at":"2024-11-20T06:42:50.000Z","dependencies_parsed_at":null,"dependency_job_id":"535415d9-869d-4de3-a3f0-307d40af73b6","html_url":"https://github.com/AspirinCode/DiffIUPAC","commit_stats":null,"previous_names":["aspirincode/diffiupac"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/AspirinCode%2FDiffIUPAC","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/AspirinCode%2FDiffIUPAC/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/AspirinCode%2FDiffIUPAC/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/AspirinCode%2FDiffIUPAC/manifest
s","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/AspirinCode","download_url":"https://codeload.github.com/AspirinCode/DiffIUPAC/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":227482494,"owners_count":17779968,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["diffiupac","diffusion-models","iupac","qed","qeppi","transformer"],"created_at":"2024-12-01T04:09:51.246Z","updated_at":"2025-07-24T22:31:39.674Z","avatar_url":"https://github.com/AspirinCode.png","language":"Python","readme":"[![License: GNU](https://img.shields.io/badge/License-GNU-yellow)](https://github.com/AspirinCode/DiffIUPAC)\n[![J. Pharm. Anal.](https://img.shields.io/badge/10.1016%2Fj.jpha.2024.101137-green)](https://doi.org/10.1016/j.jpha.2024.101137)\n\n\n## DiffIUPAC\n\n**Diffusion-based generative drug-like molecular editing with chemical natural language**  \n\nRecently, diffusion models have emerged as a promising paradigm for molecular\ndesign and optimization. However, most diffusion-based molecular generative models\nfocus on modeling 2D graphs or 3D geometries, with limited research on molecular\nsequence diffusion models. The International Union of Pure and Applied Chemistry\n(IUPAC) names are more akin to chemical natural language than the Simplified\nMolecular Input Line Entry System (SMILES) for organic compounds. 
In this work, we\napply an IUPAC-guided conditional diffusion model to facilitate molecular editing from\nchemical natural language to chemical language (SMILES) and explore whether the\npre-trained generative performance of diffusion models can be transferred to chemical\nnatural language. We propose DiffIUPAC, a controllable molecular editing diffusion\nmodel that converts IUPAC names to SMILES strings. Evaluation results demonstrate\nthat our model outperforms existing methods and successfully captures the semantic\nrules of both chemical languages. Chemical space and scaffold analysis show that the\nmodel can generate similar compounds with diverse scaffolds within the specified\nconstraints. Additionally, to illustrate the model's applicability in drug design, we\nconducted case studies in functional group editing, analogue design and linker design.\n\n\n![Model Architecture of DiffIUPAC](https://github.com/AspirinCode/DiffIUPAC/blob/main/figure/framework.png)\n\n\n## Acknowledgements\nWe thank the authors of C5T5: Controllable Generation of Organic Molecules with Transformers, IUPAC2Struct: Transformer-based artificial neural networks for the conversion between chemical notations, Deep molecular generative model based on variant transformer for antiviral drug design, and SeqDiffuSeq: Text Diffusion with Encoder-Decoder Transformers for releasing their code. The code in this repository is based on their source code releases (https://github.com/dhroth/c5t5, https://github.com/sergsb/IUPAC2Struct, https://github.com/AspirinCode/TransAntivirus, and https://github.com/yuanhy1997/seqdiffuseq). If you find this code useful, please consider citing their work.\n\n\n## News!\n\n**[2024/11/02]** Available [online](https://doi.org/10.1016/j.jpha.2024.101137) in **Journal of Pharmaceutical Analysis**, 2024.  \n\n**[2024/10/29]** Accepted in **Journal of Pharmaceutical Analysis**, 2024.  \n\n**[2024/05/14]** Submitted to **Journal of Pharmaceutical Analysis**, 2024.  
\n\n\n\n## Requirements\n```shell\nconda create -n diffiupac python=3.8\nconda install mpi4py\npip install torch==1.10.0+cu111 torchvision==0.11.0+cu111 torchaudio==0.10.0\npip install -r requirements.txt\n```\n\nhttps://github.com/rdkit/rdkit  \n\n\n\n\n## System Requirements\n*  Requires system memory larger than 228 GB.  \n\n*  (If a GPU is available) requires GPU memory larger than 80 GB.  \n\n\n\n\n## Data\n\n\n**PubChem**\n\nhttps://pubchem.ncbi.nlm.nih.gov/\n\nIUPAC Name-Canonical SMILES pairs\n\n```\n# example: Aspirin\n2-acetyloxybenzoic acid | CC(=O)OC1=CC=CC=C1C(=O)O\n```\n\n## IUPAC name ⇆ SMILES string\n\n\n### Structure/SMILES2IUPAC  \n\n**IUPAC Naming**  \n\nhttps://web.chemdoodle.com/demos/iupac-naming  \n\n\n**SMILES2IUPAC**  \n\nhttps://huggingface.co/knowledgator/SMILES2IUPAC-canonical-base  \n\n**Smiles-TO-iUpac-Translator**  \n\nhttps://github.com/Kohulan/Smiles-TO-iUpac-Translator  \n\n\n\n### IUPAC2SMILES\n\nhttps://www.antvaset.com/iupac-to-smiles\n\nhttps://web.chemdoodle.com/demos/iupac-naming  \n\n\n## Training\n\nTo run the code, we use iwslt14 en-de as an illustrative example:\n\n**Prepare the data:** \nLearn the BPE tokenizer with\n```\nsh ./tokenizer_utils.py train-byte-level iwslt14 10000\n```\n\n**To train, run the following:**  \n```\nmkdir ckpts\nbash ./train_scripts/train.sh 0 iupac smiles\n# (for en-to-de translation) bash ./train_scripts/iwslt_en_de.sh 0 smiles iupac\n```\n\nYou may modify the scripts in ./train_scripts for your own training settings.\n\n\n**To fine-tune, run the following:**  \n\n```\nbash ./train_scripts/fine_tune.sh 0 iupac smiles\n```\n\n## Generating\n\nExample data is provided in the example folder; to run generation:\n\n```\nbash ./train_scripts/gen_opt.sh\n```\n\n\n## Model Metrics\n\n### MOSES\n\nMolecular Sets (MOSES) is a benchmarking platform to support research on machine learning for drug discovery. 
MOSES implements several popular molecular generation models and provides a set of metrics to evaluate the quality and diversity of generated molecules. MOSES aims to standardize research on molecular generation and to facilitate the sharing and comparison of new models.  \nhttps://github.com/molecularsets/moses  \n\n### QEPPI\nQuantitative estimate of protein-protein interaction targeting drug-likeness.  \n\nhttps://github.com/ohuelab/QEPPI  \n\n\n## License\nCode is released under the GNU General Public License (GPL-3.0).\n\n\n## Cite:\n\n* J. Wang, P. Zhou, Z. Wang, W. Long, Y. Chen, K.T. No, D. Ouyang, J. Mao, X. Zeng, Diffusion-based generative drug-like molecular editing with chemical natural language, Journal of Pharmaceutical Analysis, https://doi.org/10.1016/j.jpha.2024.101137.  \n\n* Jiashun Mao, Jianmin Wang, Amir Zeb, Kwang-Hwi Cho, Haiyan Jin, Jongwan Kim, Onju Lee, Yunyun Wang, and Kyoung Tai No. \"Transformer-Based Molecular Generative Model for Antiviral Drug Design.\" Journal of Chemical Information and Modeling, 2023. [DOI: 10.1021/acs.jcim.3c00536](https://doi.org/10.1021/acs.jcim.3c00536)  \n\n* Yuan, Hongyi, Zheng Yuan, Chuanqi Tan, Fei Huang, and Songfang Huang. \"SeqDiffuSeq: Text Diffusion with Encoder-Decoder Transformers.\" arXiv preprint arXiv:2212.10325 (2022).  \n\n* Rothchild, Daniel, Alex Tamkin, Julie Yu, Ujval Misra, and Joseph Gonzalez. \"C5T5: Controllable Generation of Organic Molecules with Transformers.\" arXiv preprint arXiv:2108.10307 (2021).\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Faspirincode%2Fdiffiupac","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Faspirincode%2Fdiffiupac","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Faspirincode%2Fdiffiupac/lists"}