{"id":22068385,"url":"https://github.com/Drchip61/Dual_SAM","last_synced_at":"2025-07-24T06:31:04.964Z","repository":{"id":226765772,"uuid":"769591066","full_name":"Drchip61/Dual_SAM","owner":"Drchip61","description":null,"archived":false,"fork":false,"pushed_at":"2024-12-30T13:54:28.000Z","size":1730,"stargazers_count":36,"open_issues_count":8,"forks_count":2,"subscribers_count":2,"default_branch":"main","last_synced_at":"2024-12-30T14:41:23.260Z","etag":null,"topics":["cvpr2024","highlight"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/Drchip61.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2024-03-09T14:08:54.000Z","updated_at":"2024-12-30T13:54:32.000Z","dependencies_parsed_at":"2024-04-01T13:41:28.740Z","dependency_job_id":"97cda8ba-4a4b-4262-9281-0256bab63d37","html_url":"https://github.com/Drchip61/Dual_SAM","commit_stats":null,"previous_names":["drchip61/dual_sam"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/Drchip61/Dual_SAM","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Drchip61%2FDual_SAM","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Drchip61%2FDual_SAM/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Drchip61%2FDual_SAM/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Drchip61%2FDual_SAM/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/Drchip61","download_url":"https://codeload.gi
thub.com/Drchip61/Dual_SAM/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Drchip61%2FDual_SAM/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":266802637,"owners_count":23986384,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-07-24T02:00:09.469Z","response_time":99,"last_error":null,"robots_txt_status":null,"robots_txt_updated_at":null,"robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["cvpr2024","highlight"],"created_at":"2024-11-30T20:03:59.473Z","updated_at":"2025-07-24T06:31:04.952Z","avatar_url":"https://github.com/Drchip61.png","language":"Python","readme":"\u003cdiv align=\"center\"\u003e\n\u003ch1\u003eDual-SAM [CVPR2024-Highlight]\u003c/h1\u003e\n\u003ch3\u003eFantastic Animals and Where to Find Them: Segment Any Marine Animal with Dual SAM\u003c/h3\u003e\n\n[*Pingping Zhang✉️](http://faculty.dlut.edu.cn/zhangpingping/en/index.htm)\u003csup\u003e1\u003c/sup\u003e,[Tianyu Yan](https://github.com/Drchip61)\u003csup\u003e1\u003c/sup\u003e,[Yang Liu](http://faculty.dlut.edu.cn/liuyang1/zh_CN/index.htm)\u003csup\u003e1\u003c/sup\u003e, [Huchuan Lu](http://faculty.dlut.edu.cn/Huchuan_Lu/zh_CN/index.htm)\u003csup\u003e1\u003c/sup\u003e\n\n[Dalian University of Technology, IIAU-Lab](https://futureschool.dlut.edu.cn/IIAU.htm)\u003csup\u003e1\u003c/sup\u003e\n\n\n## Abstract\n\nAs an important pillar of underwater intelligence, Marine Animal Segmentation (MAS) involves segmenting animals within marine 
environments. Previous methods struggle to extract long-range contextual features and overlook the connectivity between pixels. Recently, the Segment Anything Model (SAM) has offered a universal framework for general segmentation tasks. Unfortunately, since it is trained on natural images, SAM lacks prior knowledge of marine images. In addition, the single-position prompt of SAM is insufficient for prior guidance. To address these issues, we propose a novel learning framework, named Dual-SAM, for high-performance MAS. To this end, we first introduce a dual structure with SAM's paradigm to enhance feature learning on marine images. Then, we propose a Multi-level Coupled Prompt (MCP) strategy to inject comprehensive underwater prior information, and enhance the multi-level features of SAM's encoder with adapters. Subsequently, we design a Dilated Fusion Attention Module (DFAM) to progressively integrate multi-level features from SAM's encoder. With dual decoders, it generates pseudo-labels and achieves mutual supervision for harmonious feature representations. Finally, instead of directly predicting the masks of marine animals, we propose a Criss-Cross Connectivity Prediction (C3P) paradigm to capture the inter-connectivity between pixels, which shows significant improvements over previous techniques. Extensive experiments show that our proposed method achieves state-of-the-art performance on five widely-used MAS datasets.\n## Overview\n\n* [**Dual-SAM**] is a novel learning framework for high-performance Marine Animal Segmentation (MAS). 
The framework inherits the ability of SAM and adaptively incorporates prior knowledge of underwater scenarios.\n\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"github_show/framework.png\" alt=\"accuracy\" width=\"80%\"\u003e\n\u003c/p\u003e\n\n* **Motivation of Our Proposed Method**\n\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"github_show/motivation.png\" alt=\"arch\" width=\"60%\"\u003e\n\u003c/p\u003e\n\n* **Multi-level Coupled Prompt**\n\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"github_show/MCP.png\" alt=\"arch\" width=\"60%\"\u003e\n\u003c/p\u003e\n\n* **Criss-Cross Connectivity Prediction**\n\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"github_show/c3p.png\" alt=\"arch\" width=\"60%\"\u003e\n\u003c/p\u003e\n\n* **Dilated Fusion Attention Module**\n\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"github_show/dfam.png\" alt=\"arch\" width=\"60%\"\u003e\n\u003c/p\u003e\n\n\n## Main Results\n\nWe rely on five public datasets and five evaluation metrics to thoroughly validate our model’s performance.\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"github_show/res1.png\" alt=\"arch\" width=\"60%\"\u003e\n\u003c/p\u003e\n\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"github_show/res2.png\" alt=\"arch\" width=\"60%\"\u003e\n\u003c/p\u003e\n\n\n\n## Getting Started\n\n\u003c/div\u003e\n\n### Installation\n\n**Step 1: Clone the Dual_SAM repository:**\n\nTo get started, first clone the Dual_SAM repository and navigate to the project directory:\n\n```bash\ngit clone https://github.com/Drchip61/Dual_SAM.git\ncd Dual_SAM\n```\n\n**Step 2: Environment Setup:**\n\nWe recommend setting up a conda environment and installing dependencies via pip. 
Use the following commands to set up your environment:\n#### Create and activate a new conda environment\n\n```bash\nconda create -n Dual_SAM\nconda activate Dual_SAM\n```\n#### Install dependencies\n```bash\npip install -r requirements.txt\n```\n\n#### Download the pretrained model\nPlease put the pretrained [SAM model](https://drive.google.com/file/d/1_oCdoEEu3mNhRfFxeWyRerOKt8OEUvcg/view?usp=share_link) in the Dual_SAM project directory.\n\n### Model Training and Testing\n\n**Training**\n```bash\n# Adjust the hyperparameters in train_s.py as needed\npython train_s.py\n```\n\n**Testing**\n```bash\n# Adjust the hyperparameters in test_y.py as needed\npython test_y.py\n```\n\n### Analysis Tools\n\n```bash\n# First threshold the prediction mask\npython bimap.py\n# Then evaluate the prediction mask\npython test_score.py\n```\n\n### Visual Results\nWe provide our [predicted results](https://pan.baidu.com/s/18p7qFOC_J3GGz1QherHyMQ?pwd=qi0f) via Baidu Netdisk (extraction code: qi0f).\n\nWe also provide the USOD10K dataset [predicted results](https://pan.baidu.com/s/1ZjV65Gh5_86y2nEdV6VBeA?pwd=ii60) via Baidu Netdisk (extraction code: ii60).\n\n\n### Contact\nIf you have any questions, please contact us at 2981431354@qq.com.\n\n## Citation\n\n```\n@inproceedings{anonymous2024fantastic,\ntitle={Fantastic Animals and Where to Find Them: Segment Any Marine Animal with Dual {SAM}},\nauthor={Pingping Zhang and Tianyu Yan and Yang Liu and Huchuan Lu},\nbooktitle={Conference on Computer Vision and Pattern Recognition 2024},\nyear={2024}\n}\n```\n","funding_links":[],"categories":["Paper List"],"sub_categories":["Follow-up Papers"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FDrchip61%2FDual_SAM","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FDrchip61%2FDual_SAM","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FDrchip61%2FDual_SAM/lists"}