{"id":24618816,"url":"https://github.com/zjunlp/OmniThink","last_synced_at":"2025-10-06T10:32:22.059Z","repository":{"id":272883273,"uuid":"901162672","full_name":"zjunlp/OmniThink","owner":"zjunlp","description":"OmniThink: Expanding Knowledge Boundaries in Machine Writing through Thinking","archived":false,"fork":false,"pushed_at":"2025-01-17T07:02:04.000Z","size":13270,"stargazers_count":11,"open_issues_count":0,"forks_count":0,"subscribers_count":4,"default_branch":"main","last_synced_at":"2025-01-17T08:18:06.489Z","etag":null,"topics":["artificial-intelligence","generation","gpt","information-seeking","knowledge-augmented-generation","large-language-models","machine-writing","natural-language-processing","ominithink","qwen","retrieval-augmented-generation","slow-thinking"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/zjunlp.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2024-12-10T06:42:25.000Z","updated_at":"2025-01-17T08:09:55.000Z","dependencies_parsed_at":"2025-01-17T08:28:27.315Z","dependency_job_id":null,"html_url":"https://github.com/zjunlp/OmniThink","commit_stats":null,"previous_names":["zjunlp/omnithink"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/zjunlp%2FOmniThink","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/zjunlp%2FOmniThink/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/zjunlp%2FOmniThink/releases","
manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/zjunlp%2FOmniThink/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/zjunlp","download_url":"https://codeload.github.com/zjunlp/OmniThink/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":235519884,"owners_count":19003201,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["artificial-intelligence","generation","gpt","information-seeking","knowledge-augmented-generation","large-language-models","machine-writing","natural-language-processing","ominithink","qwen","retrieval-augmented-generation","slow-thinking"],"created_at":"2025-01-25T00:01:27.262Z","updated_at":"2025-10-06T10:32:22.052Z","avatar_url":"https://github.com/zjunlp.png","language":"Python","readme":"\n\n\u003c/div\u003e\n\u003cdiv align=\"center\"\u003e\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"assets/logo.png\" width=\"10%\" height=\"10%\" /\u003e\n\u003c/p\u003e\n\u003c/div\u003e\n\n\u003cdiv align=\"center\"\u003e\n\u003ch1\u003eOmniThink\u003c/h1\u003e\n\u003c/div\u003e\n\u003cdiv align=\"center\"\u003e\n\u003ch3\u003eExpanding Knowledge Boundaries in Machine Writing\nthrough Thinking\u003c/h3\u003e\n\u003c/div\u003e\n\n\u003cdiv align=\"center\"\u003e\n\n\n\u003c!-- **Affiliations:** --\u003e\n\n👏 Welcome to try OmniThink in our **[\u003cimg src=\"./assets/tongyi.png\" width=\"14px\" style=\"display:inline;\"\u003e Modelscope online demo](https://www.modelscope.cn/studios/iic/OmniThink) and [🤗HuggingFace online demo]( 
https://huggingface.co/spaces/zjunlp/OmniThink)**!\n\u003cp align=\"center\"\u003e\n\u003ca href=\"https://zjunlp.github.io/project/OmniThink\"\u003e[🤖Project]\u003c/a\u003e\n\u003ca href=\"https://arxiv.org/abs/2501.09751\"\u003e[📄Paper]\u003c/a\u003e\n\u003ca href=\"https://www.youtube.com/watch?v=5qQSJsiE0Sw\u0026t=152s\"\u003e[📺Youtube]\u003c/a\u003e \n\n\u003c!-- \u003ca href=\"## 🚩Citation\"\u003e[🚩Citation]\u003c/a\u003e --\u003e\n\n\u003c/div\u003e\n\u003cdiv align=\"center\"\u003e\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"assets/overview.jpg\" width=\"50%\" height=\"50%\" /\u003e\n\u003c/p\u003e\n\u003c/div\u003e\n\n## Table of Contents\n- 🚩[Acknowledgement](#Acknowledgement)\n- 🌻[Quick Start](#quick-start)\n- 🌟[Introduction](#Introduction)\n- 🔧[Dependencies](#Dependencies)\n- 🔍[Local Search Support](#-local-search-support)\n- 📉[Results](#Results)\n- 🧐[Evaluation](#evaluation)\n\n\n# 🔔News\n- `2025-08-24`, We have added **offline local search** support using RAGFlow technology! Now you can search local documents without an internet connection.\n- `2025-03-12`, We have optimized Docker usage for OmniThink.\n- `2025-02-20`, We have added the evaluation methods from the paper to OmniThink, and in the future, we will integrate more evaluation methods.\n- `2025-01-28`, We have provided support for the deepseek-reasoner model. 
You can try running ./examples/deepseekr1.py to test OmniThink's performance with deepseek-reasoner.\n\u003cdetails\u003e\n\u003csummary\u003e\u003cb\u003ePrevious News\u003c/b\u003e\u003c/summary\u003e\n  \n- `2025-01-18`, we open-sourced OmniThink, a machine writing framework.\n\n\u003c/details\u003e\n\n\n# 🌻Acknowledgement\n\n- This work is implemented with [DSPy](https://github.com/stanfordnlp/dspy) and [STORM](https://github.com/stanford-oval/storm). Sincere thanks for their efforts.\n- We are also very grateful to [Zhangjiabao-nudt](https://github.com/Zhangjiabao-nudt) and [techshoww](https://github.com/techshoww) for their contributions to this repository.\n- If you have any questions, please feel free to contact us via xizekun.xzk@alibaba-inc.com, 1786594371@qq.com, or xizekun2023@zju.edu.cn, or create an issue.\n\n\n## 📖 Quick Start\n\n- 🌏 The **Online Demo** is available at [ModelScope](https://www.modelscope.cn/studios/iic/OmniThink) now!\n\n\n\u003cimg src=\"assets/demo.gif\"\u003e\n\n# 📌 Introduction\n\nWelcome to **OmniThink**, an innovative machine writing framework designed to replicate the human cognitive process of iterative expansion and reflection in generating insightful long-form articles. 
\n\n- **Iterative Expansion and Reflection**: OmniThink uses a unique mechanism that simulates human cognitive behaviors to deepen the understanding of complex topics.\n- **Enhanced Knowledge Density**: OmniThink focuses on expanding knowledge boundaries, resulting in articles that are rich in information and insights.\n- **Comprehensive Article Generation**: OmniThink constructs outlines and generates articles, delivering high-quality content that is both coherent and contextually robust.\n\u003cdiv align=\"center\"\u003e\n    \u003cimg src=\"assets/main.jpg\" width=\"80%\" height=\"auto\" /\u003e\n\u003c/div\u003e\n\n\n\n# 🛠 Dependencies\n\n\n## 📦 Conda\n\n```bash\nconda create -n OmniThink python=3.11\nconda activate OmniThink\ngit clone https://github.com/zjunlp/OmniThink.git\ncd OmniThink\n# Install requirements\npip install -r requirements.txt\n```\n\n## 🔍 Local Search Support\n\nOmniThink now supports **offline local search** using RAGFlow technology! This feature allows you to:\n\n- **Search local documents** without an internet connection\n- **Use vector embeddings** for semantic search\n- **Index and retrieve** your own document collections\n- **Maintain data privacy** with local-only processing\n\n### Local Search Features\n\n- **OfflineRAGFlow**: Core RAG engine with FAISS vector database\n- **LocalSearch**: DSPy-compatible search interface\n- **Sentence Transformers**: High-quality text embeddings\n- **Smart Chunking**: Intelligent document segmentation\n- **Semantic Retrieval**: Context-aware search results\n\n### Quick Local Search Setup\n\n```python\nfrom src.tools.rm import OfflineRAGFlow, LocalSearch\n\n# Initialize the local RAG engine\nrag_engine = OfflineRAGFlow(\n    model_name=\"sentence-transformers/all-MiniLM-L6-v2\",\n    chunk_size=800,\n    overlap=120,\n    k=5\n)\n\n# Add documents to your local index\nrag_engine.ingest(\n    text=\"Your document content here...\",\n    meta={\"title\": \"Document Title\", \"doc_id\": \"doc1\"}\n)\n\n# Create DSPy-compatible 
search interface\nlocal_search = LocalSearch(search=rag_engine, k=3)\n\n# Use in your DSPy pipeline\nresults = local_search.forward(\"your search query\")\n```\n\n## 🐳 Docker\n```bash\ngit clone https://github.com/zjunlp/OmniThink.git\ndocker pull zjunlp/omnithink:latest\ndocker run -it zjunlp/omnithink:latest\n```\n\n🔑 Before running, please export the LM API key and SEARCH key as environment variables:\n\n\n```bash\nexport LM_KEY=YOUR_API_KEY\nexport SEARCHKEY=YOUR_SEARCHKEY\n```\n\n### Local Search Dependencies\n\nFor local search functionality, additional packages are required:\n\n```bash\n# Install local search dependencies\npip install sentence-transformers faiss-cpu numpy\n\n# Or use the updated requirements.txt\npip install -r requirements.txt\n```\n\n\u003e You can define your own [LM API](https://github.com/zjunlp/OmniThink/blob/main/src/tools/lm.py) and [SEARCH API](https://github.com/zjunlp/OmniThink/blob/main/src/tools/rm.py).\n\n\u003e Note that the output of the LM should be a LIST.\n\n# Results in OmniThink\nThe performance of OmniThink is shown below:\n\u003cdiv align=\"center\"\u003e\n    \u003cimg src=\"assets/table.jpg\" width=\"95%\" height=\"auto\" /\u003e\n\u003c/div\u003e\n\n# Generate Article in OmniThink\nJust one command is required:\n```bash\nsh run.sh\n```\nYou can find your article, outline, and mind map in ./results/.\n\n# 🔍 Evaluation\n\nWe provide convenient scripts for evaluating your method. The evaluation is divided into three categories: **Rubric_Grading**, **Knowledge_Density**, and **Information_Diversity**.\n\nWe use the `factscore` library. 
Please run the following code before starting the evaluation.\n```bash\ncd eval\ngit clone https://github.com/shmsw25/FActScore.git\n```\n\nFor Rubric Grading:\n```bash\npython Rubric_Grading.py \\\n  --articlepath articlepath \\\n  --modelpath modelpath\n```\n\nFor Information Diversity:\n```bash\npython Information_Diversity.py \\\n  --mappath mappath \\\n  --model_path model_path\n```\n\nFor Knowledge Density:\n```bash\npython Knowledge_Density.py \\\n  --articlepath articlepath \\\n  --api_path api_path \\\n  --threads threads\n```\n\n\n## Citation\nIf you find our repo useful in your research, please kindly consider citing it:\n```bibtex\n@misc{xi2025omnithinkexpandingknowledgeboundaries,\n      title={OmniThink: Expanding Knowledge Boundaries in Machine Writing through Thinking}, \n      author={Zekun Xi and Wenbiao Yin and Jizhan Fang and Jialong Wu and Runnan Fang and Ningyu Zhang and Jiang Yong and Pengjun Xie and Fei Huang and Huajun Chen},\n      year={2025},\n      eprint={2501.09751},\n      archivePrefix={arXiv},\n      primaryClass={cs.CL},\n      url={https://arxiv.org/abs/2501.09751}, \n}\n```\n\n","funding_links":[],"categories":["Agent应用"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fzjunlp%2FOmniThink","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fzjunlp%2FOmniThink","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fzjunlp%2FOmniThink/lists"}