https://github.com/MediaBrain-SJTU/MING
明医 (MING): A Chinese Medical Consultation LLM
- Host: GitHub
- URL: https://github.com/MediaBrain-SJTU/MING
- Owner: MediaBrain-SJTU
- License: apache-2.0
- Created: 2023-04-07T14:09:56.000Z (over 2 years ago)
- Default Branch: ming-moe
- Last Pushed: 2024-10-24T05:02:47.000Z (12 months ago)
- Last Synced: 2024-11-26T17:02:51.707Z (11 months ago)
- Topics: consultation, huggingface, llm, medical, pytorch, transformers
- Language: Python
- Homepage:
- Size: 134 MB
- Stars: 871
- Watchers: 15
- Forks: 109
- Open Issues: 21
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- StarryDivineSky - MediaBrain-SJTU/MedicalGPT-zh
- Awesome-LLM-Healthcare - 2023/07
README
# 明医 (MING): A Chinese Medical Consultation LLM
## 🌐 Project Overview
This project open-sources **明医 (MING)**, a Chinese medical consultation model fine-tuned on medical instruction data. The model currently provides the following capabilities:
* Medical Q&A: answers medical questions and analyzes clinical cases.
* Intelligent consultation: conducts multi-turn consultations, then gives a diagnosis and recommendations.
## 📄 Related Papers
* MING-MOE technical report: MING-MOE: Enhancing Medical Multi-Task Learning in Large Language Models with Sparse Mixture of Low-Rank Adapter Experts [[paper](https://arxiv.org/pdf/2404.09027.pdf)]
* An automatic multi-turn consultation evaluation framework based on multi-agent interaction: Automatic Interactive Evaluation for Large Language Models with State Aware Patient Simulator [[paper](https://arxiv.org/pdf/2403.08495.pdf)] [[code](https://github.com/BlueZeros/Automatic_Interactive_Evaluation)]
* A two-stage decoupled alignment method for clinical LLMs: MEDCARE: Advancing Medical LLMs through Decoupling Clinical Alignment and Knowledge Aggregation [[paper](https://arxiv.org/pdf/2406.17484v3)] [[code](https://github.com/BlueZeros/MedCare)]
* A medical agent with tool-adaptive learning and reflection, plus a multi-dimensional evaluation benchmark: ReflecTool: Towards Reflection-Aware Tool-Augmented Clinical Agents [[paper](https://arxiv.org/abs/2410.17657)] [[code](https://github.com/BlueZeros/ReflecTool)]
## 💫 Updates
* 🔥 [2024/04/14] Released MING-MOE, a mixture-of-experts model instruction-tuned from Qwen1.5
* [2024/03/14] Released MING-1.8B, instruction-tuned from Qwen1.5-1.8B
* [2023/07/25] Released MING-7B, instruction-tuned from bloomz-7b
* [2023/07/25] Renamed MedicalGPT-zh to **MING**
## 🔬 Open-Source Models

| Model | Base Model | HuggingFace |
| --- | --- | --- |
| MING-7B | bloomz-7b1-mt | 🤗MING-7B |
| MING-1.8B | Qwen1.5-1.8B | 🤗MING-1.8B |
| MING-MOE-1.8B | Qwen1.5-1.8B | 🤗MING-MOE-1.8B |
| MING-MOE-4B | Qwen1.5-4B | 🤗MING-MOE-4B |
| MING-MOE-7B | Qwen1.5-7B | 🤗MING-MOE-7B |
| MING-MOE-14B | Qwen1.5-14B | 🤗MING-MOE-14B |
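If you prefer to fetch the weights programmatically rather than through a browser, a minimal sketch using `huggingface_hub` is shown below. The repo id and local directory are placeholders, not the project's published ids; substitute the actual model id behind the 🤗 entries in the table above.

```python
# Minimal sketch: fetch model weights from the HuggingFace Hub.
# NOTE: the repo id below is a placeholder / assumption -- use the
# actual model id from the table above.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="MediaBrain-SJTU/MING-7B",  # hypothetical repo id
    local_dir="./checkpoints/MING-7B",  # target directory for the weights
)
print(f"weights downloaded to {local_path}")
```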
## ⚡ Quick Start
1. Configure the environment (tested with the versions below; adjust them to your setup)
* python==3.9.16
* pytorch==2.0.1+cu117
* peft==0.9.0
2. Install the project dependencies
```bash
git clone https://github.com/MediaBrain-SJTU/MING
cd MING
pip install -e .
```
3. Download the model weights and run (requires a single GPU with >= 15 GB of VRAM)
* MING-MOE
```bash
# --model_path: path to the MING-MOE checkpoint
# --model_base: path to the base model
# --max_new_token: maximum output length
CUDA_VISIBLE_DEVICES=0 python -m ming.serve.cli \
    --model_path {path_to_checkpoint} \
    --model_base {path_to_base_model} \
    --max_new_token 3072
```
* MING-1.8B
```bash
# --model_path: path to the MING-1.8B checkpoint
# --max_new_token: maximum output length
CUDA_VISIBLE_DEVICES=0 python -m ming.serve.cli \
    --model_path {path_to_checkpoint} \
    --max_new_token 2048
```
* MING-7B
```bash
# --model_path: path to the MING-7B checkpoint
# --conv_template: prompt (conversation) template
# --max_new_token: maximum output length
# --beam_size: beam search width
# --temperature: sampling temperature
CUDA_VISIBLE_DEVICES=0 python -m ming.serve.cli \
    --model_path {path_to_checkpoint} \
    --conv_template bloom \
    --max_new_token 512 \
    --beam_size 3 \
    --temperature 1.2
```
* Note: due to an issue in the transformers library, when beam-size > 1 the temperature must be >= 1.0, otherwise an error is raised.
4. Running from the command line
* Conversations support multiple turns.
* Typing the keyword `new chat` during a conversation starts a new session.
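Besides the CLI above, the full-parameter checkpoints (e.g. MING-1.8B) can in principle be loaded directly with `transformers`. Below is a minimal, hedged sketch assuming the published checkpoint is a standard HF causal LM with a chat template (as Qwen1.5-based models usually are); the model path is a placeholder. The MoE variants additionally need `--model_base` and the project's own loading code, so they are not covered here.

```python
# Minimal sketch, assuming the checkpoint is a standard HF causal LM with
# a chat template. The path below is a placeholder, not an official id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./checkpoints/MING-1.8B"  # placeholder path

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.float16,  # half precision, per the single-GPU VRAM note above
    device_map="auto",
)

# "I have been sleeping badly lately, what should I do?"
messages = [{"role": "user", "content": "最近总是失眠,应该怎么办?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=2048)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```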
## 🧭 Test Examples
## 🪶 Contributors
This project was jointly developed by the Cooperative Medianet Innovation Center at Shanghai Jiao Tong University and the Smart Healthcare Center of Shanghai AI Laboratory. The model, data, and system were primarily built by Yusheng Liao, Shuyang Jiang, Hongcheng Liu, and Yutong Meng, advised by Associate Professor [Yu Wang](https://mediabrain.sjtu.edu.cn/yuwang/).
## Disclaimer
The pretrained model is trained on large corpora and algorithmic models, and its training data may contain biases, errors, and incomplete information. The pretrained model released by this project is therefore intended for reference and research purposes only; its accuracy and reliability are not guaranteed. Results produced by the model may contain errors and biases and must not be used for real-world applications or decision-making. This project assumes no responsibility for any results produced by the pretrained model, nor for any losses arising from its use. Users bear the risks themselves and should verify outputs independently.
## Citation
If you use the data or code from this project, please cite:
```latex
@article{liao2024ming,
title={MING-MOE: Enhancing Medical Multi-Task Learning in Large Language Models with Sparse Mixture of Low-Rank Adapter Experts},
author={Liao, Yusheng and Jiang, Shuyang and Wang, Yu and Wang, Yanfeng},
journal={arXiv preprint arXiv:2404.09027},
year={2024}
}
```

```latex
@misc{MING,
author={Yusheng Liao and Yutong Meng and Hongcheng Liu and Yu Wang and Yanfeng Wang},
title = {明医 (MING):中文医疗问诊大模型},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/MediaBrain-SJTU/MING}},
}
```