{"id":25971813,"url":"https://github.com/SwanHubX/SwanLab","last_synced_at":"2025-03-05T00:02:05.487Z","repository":{"id":213707646,"uuid":"722915433","full_name":"SwanHubX/SwanLab","owner":"SwanHubX","description":"⚡️SwanLab: your ML experiment notebook. 你的AI实验笔记本，日志记录与可视化AI训练全流程。","archived":false,"fork":false,"pushed_at":"2024-10-29T06:45:44.000Z","size":31936,"stargazers_count":490,"open_issues_count":43,"forks_count":48,"subscribers_count":6,"default_branch":"main","last_synced_at":"2024-10-29T17:39:02.444Z","etag":null,"topics":["data-science","deep-learning","fastapi","jax","machine-learning","mlops","model-versioning","python","pytorch","tensorboard","tensorflow","tracking","transformers","visualization"],"latest_commit_sha":null,"homepage":"https://swanlab.cn?utm_source=github_description-hompage","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/SwanHubX.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":"CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":"SECURITY.md","support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2023-11-24T08:54:45.000Z","updated_at":"2024-10-29T16:49:59.000Z","dependencies_parsed_at":"2023-12-26T16:40:58.194Z","dependency_job_id":"ae25129a-2fca-4397-8c87-d0155a5fde7e","html_url":"https://github.com/SwanHubX/SwanLab","commit_stats":{"total_commits":428,"total_committers":18,"mean_commits":23.77777777777778,"dds":0.6401869158878505,"last_synced_commit":"63bd0971bbb6ccd09f620e68e6b271a5d0ca9561"},"previous_names":["swanhubx/swanlab"],"tags_count":67,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/r
epositories/SwanHubX%2FSwanLab","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SwanHubX%2FSwanLab/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SwanHubX%2FSwanLab/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SwanHubX%2FSwanLab/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/SwanHubX","download_url":"https://codeload.github.com/SwanHubX/SwanLab/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":241940572,"owners_count":20045881,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["data-science","deep-learning","fastapi","jax","machine-learning","mlops","model-versioning","python","pytorch","tensorboard","tensorflow","tracking","transformers","visualization"],"created_at":"2025-03-05T00:01:57.765Z","updated_at":"2025-03-05T00:02:05.459Z","avatar_url":"https://github.com/SwanHubX.png","language":"Python","readme":"\u003cdiv align=\"center\"\u003e\n\n\u003cpicture\u003e\n  \u003csource media=\"(prefers-color-scheme: dark)\" srcset=\"readme_files/swanlab-logo-single-dark.svg\"\u003e\n  \u003csource media=\"(prefers-color-scheme: light)\" srcset=\"readme_files/swanlab-logo-single.svg\"\u003e\n  \u003cimg alt=\"SwanLab\" src=\"readme_files/swanlab-logo-single.svg\" width=\"70\" height=\"70\"\u003e\n\u003c/picture\u003e\n\n\u003ch1\u003eSwanLab\u003c/h1\u003e\n\n一个开源、现代化设计的深度学习训练跟踪与可视化工具  \n同时支持云端/离线使用，适配30+主流框架，与你的实验代码轻松集成\n\n\u003ca href=\"https://swanlab.cn\"\u003e🔥SwanLab 在线版\u003c/a\u003e · 
\u003ca href=\"https://docs.swanlab.cn\"\u003e📃 文档\u003c/a\u003e · \u003ca href=\"https://github.com/swanhubx/swanlab/issues\"\u003e报告问题\u003c/a\u003e · \u003ca href=\"https://geektechstudio.feishu.cn/share/base/form/shrcnyBlK8OMD0eweoFcc2SvWKc\"\u003e建议反馈\u003c/a\u003e · \u003ca href=\"https://docs.swanlab.cn/zh/guide_cloud/general/changelog.html\"\u003e更新日志\u003c/a\u003e\n\n[![][release-shield]][release-link]\n[![][github-stars-shield]][github-stars-link]\n[![][github-issues-shield]][github-issues-shield-link]\n[![][github-contributors-shield]][github-contributors-link]\n[![][license-shield]][license-shield-link]  \n[![][tracking-swanlab-shield]][tracking-swanlab-shield-link]\n[![][last-commit-shield]][last-commit-shield-link]\n[![][pypi-version-shield]][pypi-version-shield-link]\n[![][wechat-shield]][wechat-shield-link]\n[![][pypi-downloads-shield]][pypi-downloads-shield-link]\n[![][colab-shield]][colab-shield-link]\n\n\n![](readme_files/swanlab-overview.png)\n\n中文 / [English](README_EN.md) / [日本語](README_JP.md) / [Русский](README_RU.md)\n\n👋 加入我们的[微信群](https://docs.swanlab.cn/zh/guide_cloud/community/online-support.html)\n\n\u003ca href=\"https://hellogithub.com/repository/b442a9fa270e4ccb8847c9ee3445e41b\" target=\"_blank\"\u003e\u003cimg src=\"https://abroad.hellogithub.com/v1/widgets/recommend.svg?rid=b442a9fa270e4ccb8847c9ee3445e41b\u0026claim_uid=Oh5UaGjfrblg0yZ\" alt=\"Featured｜HelloGitHub\" style=\"width: 250px; height: 54px;\" width=\"250\" height=\"54\" /\u003e\u003c/a\u003e\n\n\n\u003c/div\u003e\n\n\u003cbr/\u003e\n\n\n## 目录\n\n- [🌟 最近更新](#-最近更新)\n- [👋🏻 什么是SwanLab](#-什么是swanlab)\n- [📃 在线演示](#-在线演示)\n- [🏁 快速开始](#-快速开始)\n- [💻 自托管](#-自托管)\n- [🚗 框架集成](#-框架集成)\n- [🆚 与熟悉的工具的比较](#-与熟悉的工具的比较)\n- [👥 社区](#-社区)\n- [📃 协议](#-协议)\n\n\u003cbr/\u003e\n\n\n## 🌟 最近更新\n\n- 2025.02.24：我们与[EasyR1](https://github.com/hiyouga/EasyR1)完成了联合集成，[使用指引](https://github.com/hiyouga/EasyR1?tab=readme-ov-file#merge-checkpoint-in-hugging-face-format)\n\n- 2025.02.18：我们与 
[Swift](https://github.com/modelscope/ms-swift) 完成了联合集成，现在你可以在Swift的CLI/WebUI中使用SwanLab来**跟踪和可视化大模型微调实验**，[使用指引](https://docs.swanlab.cn/guide_cloud/integration/integration-swift.html)。\n\n- 2025.02.16：新增 **图表移动分组、创建分组** 功能。\n\n- 2025.02.09：我们与 [veRL](https://github.com/volcengine/verl) 完成了联合集成，现在你可以在veRL中使用SwanLab来**跟踪和可视化大模型强化学习实验**，[使用指引](https://docs.swanlab.cn/guide_cloud/integration/integration-verl.html)。\n\n- 2025.02.05：`swanlab.log`支持嵌套字典 [#812](https://github.com/SwanHubX/SwanLab/pull/812)，适配Jax框架特性；支持`name`与`notes`参数；\n\n- 2025.01.22：新增`sync_tensorboardX`与`sync_tensorboard_torch`功能，支持与此两种TensorBoard框架同步实验跟踪；\n\n- 2025.01.17：新增`sync_wandb`功能，[文档](https://docs.swanlab.cn/guide_cloud/integration/integration-wandb.html)，支持与Weights \u0026 Biases实验跟踪同步；大幅改进了日志渲染性能\n\n- 2025.01.11：云端版大幅优化了项目表格的性能，并支持拖拽、排序、筛选等交互\n\n- 2025.01.01：新增折线图**持久化平滑**、折线图拖拽式改变大小，优化图表浏览体验\n\n\n\u003cdetails\u003e\u003csummary\u003e完整更新日志\u003c/summary\u003e\n\n- 2024.12.22：我们与 [LLaMA Factory](https://github.com/hiyouga/LLaMA-Factory) 完成了联合集成，现在你可以在LLaMA Factory中使用SwanLab来**跟踪和可视化大模型微调实验**，[使用指引](https://github.com/hiyouga/LLaMA-Factory?tab=readme-ov-file#use-swanlab-logger)。\n\n- 2024.12.15：**硬件监控（0.4.0）** 功能上线，支持CPU、NPU（Ascend）、GPU（Nvidia）的系统级信息记录与监控。\n\n- 2024.12.06：新增对[LightGBM](https://docs.swanlab.cn/guide_cloud/integration/integration-lightgbm.html)、[XGBoost](https://docs.swanlab.cn/guide_cloud/integration/integration-xgboost.html)的集成；提高了对日志记录单行长度的限制。\n\n- 2024.11.26：环境选项卡-硬件部分支持识别**华为昇腾NPU**与**鲲鹏CPU**；云厂商部分支持识别青云**基石智算**。\n\n\u003c/details\u003e\n\n\u003cbr\u003e\n\n## 👋🏻 什么是SwanLab\n\nSwanLab 是一款开源、轻量的 AI 模型训练跟踪与可视化工具，提供了一个跟踪、记录、比较、和协作实验的平台。\n\nSwanLab 面向人工智能研究者，设计了友好的Python API 和漂亮的UI界面，并提供**训练可视化、自动日志记录、超参数记录、实验对比、多人协同**等功能。在SwanLab上，研究者能基于直观的可视化图表发现训练问题，对比多个实验找到研究灵感，并通过**在线网页**的分享与基于组织的**多人协同训练**，打破团队沟通的壁垒，提高组织训练效率。\n\n以下是其核心特性列表：\n\n**1. 
📊 实验指标与超参数跟踪**: 极简的代码嵌入您的机器学习 pipeline，跟踪记录训练关键指标\n\n- 支持**云端**使用（类似Weights \u0026 Biases），随时随地查看训练进展。[手机看实验的方法](https://docs.swanlab.cn/guide_cloud/general/app.html)\n- 支持**超参数记录**与表格展示\n- **支持的元数据类型**：标量指标、图像、音频、文本、...\n- **支持的图表类型**：折线图、媒体图（图像、音频、文本）、...\n- **后台自动记录**：日志logging、硬件环境、Git 仓库、Python 环境、Python 库列表、项目运行目录\n\n**2. ⚡️ 全面的框架集成**: PyTorch、🤗HuggingFace Transformers、PyTorch Lightning、🦙LLaMA Factory、MMDetection、Ultralytics、PaddleDetection、LightGBM、XGBoost、Keras、Tensorboard、Weights\u0026Biases、OpenAI、Swift、XTuner、Stable Baseline3、Hydra 在内的 **30+** 框架\n\n![](readme_files/integrations.png)\n\n**3. 💻 硬件监控**: 支持实时记录与监控CPU、NPU（昇腾Ascend）、GPU（英伟达Nvidia）、内存的系统级硬件指标\n\n**4. 📦 实验管理**: 通过专为训练场景设计的集中式仪表板，通过整体视图速览全局，快速管理多个项目与实验\n\n**5. 🆚 比较结果**: 通过在线表格与对比图表比较不同实验的超参数和结果，挖掘迭代灵感\n\n**6. 👥 在线协作**: 您可以与团队进行协作式训练，支持将实验实时同步在一个项目下，您可以在线查看团队的训练记录，基于结果发表看法与建议\n\n**7. ✉️ 分享结果**: 复制和发送持久的 URL 来共享每个实验，方便地发送给伙伴，或嵌入到在线笔记中\n\n**8. 💻 支持自托管**: 支持离线环境使用，自托管的社区版同样可以查看仪表盘与管理实验\n\n\u003e \\[!IMPORTANT]\n\u003e\n\u003e **收藏项目**，你将从 GitHub 上无延迟地接收所有发布通知～ ⭐️\n\n![star-us](readme_files/star-us.png)\n\n\u003cbr\u003e\n\n## 📃 在线演示\n\n来看看 SwanLab 的在线演示：\n\n| [ResNet50 猫狗分类][demo-cats-dogs] | [Yolov8-COCO128 目标检测][demo-yolo] |\n| :--------: | :--------: |\n| [![][demo-cats-dogs-image]][demo-cats-dogs] | [![][demo-yolo-image]][demo-yolo] |\n| 跟踪一个简单的 ResNet50 模型在猫狗数据集上训练的图像分类任务。 | 使用 Yolov8 在 COCO128 数据集上进行目标检测任务，跟踪训练超参数和指标。 |\n\n| [Qwen2 指令微调][demo-qwen2-sft] | [LSTM Google 股票预测][demo-google-stock] |\n| :--------: | :--------: |\n| [![][demo-qwen2-sft-image]][demo-qwen2-sft] | [![][demo-google-stock-image]][demo-google-stock] |\n| 跟踪 Qwen2 大语言模型的指令微调训练，完成简单的指令遵循。 | 使用简单的 LSTM 模型在 Google 股价数据集上训练，实现对未来股价的预测。 |\n\n| [ResNeXt101 音频分类][demo-audio-classification] | [Qwen2-VL COCO数据集微调][demo-qwen2-vl] |\n| :--------: | :--------: |\n| [![][demo-audio-classification-image]][demo-audio-classification] | [![][demo-qwen2-vl-image]][demo-qwen2-vl] |\n| 从ResNet到ResNeXt在音频分类任务上的渐进式实验过程 | 
基于Qwen2-VL多模态大模型，在COCO2014数据集上进行Lora微调。 |\n\n\n[更多案例](https://docs.swanlab.cn/zh/examples/mnist.html)\n\n\u003cbr\u003e\n\n## 🏁 快速开始\n\n### 1.安装\n\n```bash\npip install swanlab\n```\n\n### 2.登录并获取 API Key\n\n1. 免费[注册账号](https://swanlab.cn)\n\n2. 登录账号，在用户设置 \u003e [API Key](https://swanlab.cn/settings) 里复制您的 API Key\n\n3. 打开终端，输入：\n\n```bash\nswanlab login\n```\n\n出现提示时，输入您的 API Key，按下回车，完成登录。\n\n### 3.将 SwanLab 与你的代码集成\n\n```python\nimport swanlab\n\n# 初始化一个新的swanlab实验\nswanlab.init(\n    project=\"my-first-ml\",\n    config={'learning-rate': 0.003},\n)\n\n# 记录指标\nfor i in range(10):\n    swanlab.log({\"loss\": i, \"acc\": i})\n```\n\n大功告成！前往[SwanLab](https://swanlab.cn)查看你的第一个 SwanLab 实验。\n\n\u003cbr\u003e\n\n## 💻 自托管\n\n自托管社区版支持离线查看 SwanLab 仪表盘。\n\n### 离线实验跟踪\n\n在 swanlab.init 中设置`logdir`和`mode`这两个参数，即可离线跟踪实验：\n\n```python\n...\n\nswanlab.init(\n    logdir='./logs',\n    mode='local',\n)\n\n...\n```\n\n- 参数`mode`设置为`local`，关闭将实验同步到云端\n\n- 参数`logdir`的设置是可选的，它的作用是指定了 SwanLab 日志文件的保存位置（默认保存在`swanlog`文件夹下）\n\n  - 日志文件会在跟踪实验的过程中被创建和更新，离线看板的启动也将基于这些日志文件\n\n其他部分和云端使用完全一致。\n\n### 开启离线看板\n\n打开终端，使用下面的指令，开启一个 SwanLab 仪表板:\n\n```bash\nswanlab watch ./logs\n```\n\n运行完成后，SwanLab 会给你 1 个本地的 URL 链接（默认是[http://127.0.0.1:5092](http://127.0.0.1:5092)）\n\n访问该链接，就可以在浏览器用离线看板查看实验了。\n\n\u003cbr\u003e\n\n## 🚗 框架集成\n\n将你最喜欢的框架与 SwanLab 结合使用！  \n下面是我们已集成的框架列表，欢迎提交 [Issue](https://github.com/swanhubx/swanlab/issues) 来反馈你想要集成的框架。\n\n**基础框架**\n- [PyTorch](https://docs.swanlab.cn/guide_cloud/integration/integration-pytorch.html)\n- [MindSpore](https://docs.swanlab.cn/guide_cloud/integration/integration-ascend.html)\n- [Keras](https://docs.swanlab.cn/guide_cloud/integration/integration-keras.html)\n\n**专有/微调框架**\n- [PyTorch Lightning](https://docs.swanlab.cn/guide_cloud/integration/integration-pytorch-lightning.html)\n- [HuggingFace Transformers](https://docs.swanlab.cn/guide_cloud/integration/integration-huggingface-transformers.html)\n- 
[OpenMind](https://modelers.cn/docs/zh/openmind-library/1.0.0/basic_tutorial/finetune/finetune_pt.html#%E8%AE%AD%E7%BB%83%E7%9B%91%E6%8E%A7)\n- [LLaMA Factory](https://docs.swanlab.cn/guide_cloud/integration/integration-llama-factory.html)\n- [Modelscope Swift](https://docs.swanlab.cn/guide_cloud/integration/integration-swift.html)\n- [Sentence Transformers](https://docs.swanlab.cn/guide_cloud/integration/integration-sentence-transformers.html)\n- [Torchtune](https://docs.swanlab.cn/guide_cloud/integration/integration-pytorch-torchtune.html)\n- [XTuner](https://docs.swanlab.cn/guide_cloud/integration/integration-xtuner.html)\n- [MMEngine](https://docs.swanlab.cn/guide_cloud/integration/integration-mmengine.html)\n- [FastAI](https://docs.swanlab.cn/guide_cloud/integration/integration-fastai.html)\n- [LightGBM](https://docs.swanlab.cn/guide_cloud/integration/integration-lightgbm.html)\n- [XGBoost](https://docs.swanlab.cn/guide_cloud/integration/integration-xgboost.html)\n\n\n**计算机视觉**\n- [Ultralytics](https://docs.swanlab.cn/guide_cloud/integration/integration-ultralytics.html)\n- [MMDetection](https://docs.swanlab.cn/guide_cloud/integration/integration-mmdetection.html)\n- [MMSegmentation](https://docs.swanlab.cn/guide_cloud/integration/integration-mmsegmentation.html)\n- [PaddleDetection](https://docs.swanlab.cn/guide_cloud/integration/integration-paddledetection.html)\n- [PaddleYOLO](https://docs.swanlab.cn/guide_cloud/integration/integration-paddleyolo.html)\n\n**强化学习**\n- [Stable Baseline3](https://docs.swanlab.cn/guide_cloud/integration/integration-sb3.html)\n- [veRL](https://docs.swanlab.cn/guide_cloud/integration/integration-verl.html)\n- [HuggingFace trl](https://docs.swanlab.cn/guide_cloud/integration/integration-huggingface-trl.html)\n- [EasyR1](https://docs.swanlab.cn/guide_cloud/integration/integration-easyr1.html)\n\n**其他框架：**\n- [Tensorboard](https://docs.swanlab.cn/guide_cloud/integration/integration-tensorboard.html)\n- 
[Weights\u0026Biases](https://docs.swanlab.cn/guide_cloud/integration/integration-wandb.html)\n- [HuggingFace Accelerate](https://docs.swanlab.cn/guide_cloud/integration/integration-huggingface-accelerate.html)\n- [Unsloth](https://docs.swanlab.cn/guide_cloud/integration/integration-unsloth.html)\n- [Hydra](https://docs.swanlab.cn/guide_cloud/integration/integration-hydra.html)\n- [Omegaconf](https://docs.swanlab.cn/guide_cloud/integration/integration-omegaconf.html)\n- [OpenAI](https://docs.swanlab.cn/guide_cloud/integration/integration-openai.html)\n- [ZhipuAI](https://docs.swanlab.cn/guide_cloud/integration/integration-zhipuai.html)\n\n[更多集成](https://docs.swanlab.cn/zh/guide_cloud/integration/integration-pytorch-lightning.html)\n\n\u003cbr\u003e\n\n## 🆚 与熟悉的工具的比较\n\n### Tensorboard vs SwanLab\n\n- **☁️ 支持在线使用**：\n  通过 SwanLab 可以方便地将训练实验在云端在线同步与保存，便于远程查看训练进展、管理历史项目、分享实验链接、发送实时消息通知、多端看实验等。而 Tensorboard 是一个离线的实验跟踪工具。\n\n- **👥 多人协作**：\n  在进行多人、跨团队的机器学习协作时，通过 SwanLab 可以轻松管理多人的训练项目、分享实验链接、跨空间交流讨论。而 Tensorboard 主要为个人设计，难以进行多人协作和分享实验。\n\n- **💻 持久、集中的仪表板**：\n  无论你在何处训练模型，无论是在本地计算机上、在实验室集群还是在公有云的 GPU 实例中，你的结果都会记录到同一个集中式仪表板中。而使用 TensorBoard 需要花费时间从不同的机器复制和管理\n  TFEvent 文件。\n\n- **💪 更强大的表格**：\n  通过 SwanLab 表格可以查看、搜索、过滤来自不同实验的结果，可以轻松查看数千个模型版本并找到适合不同任务的最佳性能模型。\n  TensorBoard 不适用于大型项目。\n\n### Weights and Biases vs SwanLab\n\n- Weights and Biases 是一个必须联网使用的闭源 MLOps 平台\n\n- SwanLab 不仅支持联网使用，也支持开源、免费、自托管的版本\n\n\u003cbr\u003e\n\n## 👥 社区\n\n### 社区与支持\n\n- [GitHub Issues](https://github.com/SwanHubX/SwanLab/issues)：使用 SwanLab 时遇到的错误和问题\n- [电子邮件支持](mailto:zeyi.lin@swanhub.co)：反馈关于使用 SwanLab 的问题\n- \u003ca href=\"https://docs.swanlab.cn/guide_cloud/community/online-support.html\"\u003e微信交流群\u003c/a\u003e：交流使用 SwanLab 的问题、分享最新的 AI 技术\n\n### SwanLab README 徽章\n\n如果你喜欢在工作中使用 SwanLab，请将 SwanLab 徽章添加到你的 README 
中：\n\n[![][tracking-swanlab-shield]][tracking-swanlab-shield-link]、[![][visualize-swanlab-shield]][visualize-swanlab-shield-link]\n\n```\n[![](https://raw.githubusercontent.com/SwanHubX/assets/main/badge2.svg)](your experiment url)\n[![](https://raw.githubusercontent.com/SwanHubX/assets/main/badge1.svg)](your experiment url)\n```\n\n更多设计素材：[assets](https://github.com/SwanHubX/assets)\n\n### 在论文中引用 SwanLab\n\n如果您发现 SwanLab 对您的研究之旅有帮助，请考虑以下列格式引用：\n\n```bibtex\n@software{Zeyilin_SwanLab_2023,\n  author = {Zeyi Lin, Shaohong Chen, Kang Li, Qiushan Jiang, Zirui Cai,  Kaifang Ji and {The SwanLab team}},\n  doi = {10.5281/zenodo.11100550},\n  license = {Apache-2.0},\n  title = {{SwanLab}},\n  url = {https://github.com/swanhubx/swanlab},\n  year = {2023}\n}\n```\n\n### 为 SwanLab 做出贡献\n\n考虑为 SwanLab 做出贡献吗？首先，请花点时间阅读 [贡献指南](CONTRIBUTING.md)。\n\n同时，我们非常欢迎通过社交媒体、活动和会议的分享来支持 SwanLab，衷心感谢！\n\n\n\n\u003cbr\u003e\n\n**Contributors**\n\n\u003ca href=\"https://github.com/swanhubx/swanlab/graphs/contributors\"\u003e\n  \u003cimg src=\"https://contrib.rocks/image?repo=swanhubx/swanlab\" /\u003e\n\u003c/a\u003e\n\n\u003cbr\u003e\n\n## 📃 协议\n\n本仓库遵循 [Apache 2.0 License](https://github.com/SwanHubX/SwanLab/blob/main/LICENSE) 开源协议\n\n## Star History\n\n[![Star History Chart](https://api.star-history.com/svg?repos=swanhubx/swanlab\u0026type=Date)](https://star-history.com/#swanhubx/swanlab\u0026Date)\n\n\u003c!-- link --\u003e\n\n[release-shield]: https://img.shields.io/github/v/release/swanhubx/swanlab?color=369eff\u0026labelColor=black\u0026logo=github\u0026style=flat-square\n[release-link]: https://github.com/swanhubx/swanlab/releases\n\n[license-shield]: https://img.shields.io/badge/license-apache%202.0-white?labelColor=black\u0026style=flat-square\n[license-shield-link]: https://github.com/SwanHubX/SwanLab/blob/main/LICENSE\n\n[last-commit-shield]: 
https://img.shields.io/github/last-commit/swanhubx/swanlab?color=c4f042\u0026labelColor=black\u0026style=flat-square\n[last-commit-shield-link]: https://github.com/swanhubx/swanlab/commits/main\n\n[pypi-version-shield]: https://img.shields.io/pypi/v/swanlab?color=orange\u0026labelColor=black\u0026style=flat-square\n[pypi-version-shield-link]: https://pypi.org/project/swanlab/\n\n[pypi-downloads-shield]: https://static.pepy.tech/badge/swanlab?labelColor=black\u0026style=flat-square\n[pypi-downloads-shield-link]: https://pepy.tech/project/swanlab\n\n[swanlab-cloud-shield]: https://img.shields.io/badge/Product-SwanLab云端版-636a3f?labelColor=black\u0026style=flat-square\n[swanlab-cloud-shield-link]: https://swanlab.cn/\n\n[wechat-shield]: https://img.shields.io/badge/WeChat-微信-4cb55e?labelColor=black\u0026style=flat-square\n[wechat-shield-link]: https://docs.swanlab.cn/guide_cloud/community/online-support.html\n\n[colab-shield]: https://colab.research.google.com/assets/colab-badge.svg\n[colab-shield-link]: https://colab.research.google.com/drive/1RWsrY_1bS8ECzaHvYtLb_1eBkkdzekR3?usp=sharing\n\n[github-stars-shield]: https://img.shields.io/github/stars/swanhubx/swanlab?labelColor\u0026style=flat-square\u0026color=ffcb47\n[github-stars-link]: https://github.com/swanhubx/swanlab\n\n[github-issues-shield]: https://img.shields.io/github/issues/swanhubx/swanlab?labelColor=black\u0026style=flat-square\u0026color=ff80eb\n[github-issues-shield-link]: https://github.com/swanhubx/swanlab/issues\n\n[github-contributors-shield]: https://img.shields.io/github/contributors/swanhubx/swanlab?color=c4f042\u0026labelColor=black\u0026style=flat-square\n[github-contributors-link]: https://github.com/swanhubx/swanlab/graphs/contributors\n\n[demo-cats-dogs]: https://swanlab.cn/@ZeyiLin/Cats_Dogs_Classification/runs/jzo93k112f15pmx14vtxf/chart\n[demo-cats-dogs-image]: readme_files/example-catsdogs.png\n\n[demo-yolo]: 
https://swanlab.cn/@ZeyiLin/ultratest/runs/yux7vclmsmmsar9ear7u5/chart\n[demo-yolo-image]: readme_files/example-yolo.png\n\n[demo-qwen2-sft]: https://swanlab.cn/@ZeyiLin/Qwen2-fintune/runs/cfg5f8dzkp6vouxzaxlx6/chart\n[demo-qwen2-sft-image]: readme_files/example-qwen2.png\n\n[demo-google-stock]:https://swanlab.cn/@ZeyiLin/Google-Stock-Prediction/charts\n[demo-google-stock-image]: readme_files/example-lstm.png\n\n[demo-audio-classification]:https://swanlab.cn/@ZeyiLin/PyTorch_Audio_Classification/charts\n[demo-audio-classification-image]: readme_files/example-audio-classification.png\n\n[demo-qwen2-vl]:https://swanlab.cn/@ZeyiLin/Qwen2-VL-finetune/runs/pkgest5xhdn3ukpdy6kv5/chart\n[demo-qwen2-vl-image]: readme_files/example-qwen2-vl.jpg\n\n[tracking-swanlab-shield-link]:https://swanlab.cn\n[tracking-swanlab-shield]: https://raw.githubusercontent.com/SwanHubX/assets/main/badge2.svg\n\n[visualize-swanlab-shield-link]:https://swanlab.cn\n[visualize-swanlab-shield]: https://raw.githubusercontent.com/SwanHubX/assets/main/badge1.svg","funding_links":[],"categories":["Evaluation and Monitoring","其他_机器学习与深度学习"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FSwanHubX%2FSwanLab","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FSwanHubX%2FSwanLab","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FSwanHubX%2FSwanLab/lists"}