{"id":13458344,"url":"https://github.com/yxKryptonite/RAM_code","last_synced_at":"2025-03-24T15:31:16.131Z","repository":{"id":247318028,"uuid":"825064088","full_name":"yxKryptonite/RAM_code","owner":"yxKryptonite","description":"Official implementation of RAM: Retrieval-Based Affordance Transfer for Generalizable Zero-Shot Robotic Manipulation","archived":false,"fork":false,"pushed_at":"2024-09-17T14:13:20.000Z","size":17004,"stargazers_count":29,"open_issues_count":0,"forks_count":2,"subscribers_count":3,"default_branch":"release","last_synced_at":"2024-09-17T17:45:11.398Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":"https://yxkryptonite.github.io/RAM/","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"other","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/yxKryptonite.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2024-07-06T17:02:23.000Z","updated_at":"2024-09-17T14:13:24.000Z","dependencies_parsed_at":"2024-10-26T09:24:43.465Z","dependency_job_id":"87cd149c-066e-4e2f-839c-04f7739ad097","html_url":"https://github.com/yxKryptonite/RAM_code","commit_stats":null,"previous_names":["yxkryptonite/ram_code"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/yxKryptonite%2FRAM_code","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/yxKryptonite%2FRAM_code/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/yxKryptonite%2FRAM_code/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/yxKry
ptonite%2FRAM_code/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/yxKryptonite","download_url":"https://codeload.github.com/yxKryptonite/RAM_code/tar.gz/refs/heads/release","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":245297993,"owners_count":20592513,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-07-31T09:00:51.214Z","updated_at":"2025-03-24T15:31:11.111Z","avatar_url":"https://github.com/yxKryptonite.png","language":"Python","readme":"\u003ch2 align=\"center\"\u003e\n  \u003cb\u003eRAM: Retrieval-Based Affordance Transfer for Generalizable Zero-Shot Robotic Manipulation\u003c/b\u003e\n\n  \u003cb\u003e\u003ci\u003eCoRL 2024 (Oral Presentation)\u003c/i\u003e\u003c/b\u003e\n\u003c/h2\u003e\n\nThis is the official code release of [RAM: Retrieval-Based Affordance Transfer for Generalizable Zero-Shot Robotic Manipulation](https://arxiv.org/abs/2407.04689).\n\n**[[paper]](https://arxiv.org/abs/2407.04689) [[project]](https://yxkryptonite.github.io/RAM/) [[code]](https://github.com/yxKryptonite/RAM_code)**\n\n\u003cdiv align=center\u003e\n    \u003cimg src=\"assets/img/teaser.png\" width=100%\u003e\n\u003c/div\u003e\n\n\n## Installation\n\n1. 
Create a conda environment and install PyTorch\n\n    This code is tested with Python 3.8.19 on Ubuntu 20.04, with PyTorch 2.0.1+cu118:\n\n    ```\n    conda create -n ram python=3.8\n    conda activate ram\n    # pytorch 2.0.1 with cuda 11.8\n    pip install torch==2.0.1 torchvision==0.15.2 torchaudio==2.0.2 --index-url https://download.pytorch.org/whl/cu118\n    ```\n\n2. Grounded-SAM\n\n    Install the dependencies and download the checkpoints:\n\n    ```\n    pip install -e vision/GroundedSAM/GroundingDINO\n    pip install -e vision/GroundedSAM/segment_anything\n    wget https://dl.fbaipublicfiles.com/segment_anything/sam_vit_h_4b8939.pth -P assets/ckpts/\n    wget https://github.com/IDEA-Research/GroundingDINO/releases/download/v0.1.0-alpha/groundingdino_swint_ogc.pth -P assets/ckpts/\n    ```\n\n3. GSNet\n\n    First, download the pretrained [checkpoints](https://drive.google.com/drive/folders/1iYTIxsLIvXOaYYj4SVxWYT47fpPGhgpm?usp=sharing) and put the `.tar` file into `assets/ckpts/`. We use `minkuresunet_kinect.tar` by default.\n\n    ```\n    # MinkowskiEngine, this may take a while\n    git clone git@github.com:NVIDIA/MinkowskiEngine.git\n    cd MinkowskiEngine\n    conda install openblas-devel -c anaconda\n    python setup.py install --blas_include_dirs=${CONDA_PREFIX}/include --blas=openblas\n\n    ## pointnet2 \u0026 graspnetAPI\n    cd graspness_implementation\n    pip install -r requirements.txt\n    cd pointnet2\n    python setup.py install\n    cd ..\n    cd graspnetAPI\n    pip install .\n    pip install \"numpy\u003c1.24\"\n    pip install pytorch-utils\n    ```\n\n    If you want to use the closed-source [AnyGrasp](https://github.com/graspnet/anygrasp_sdk) as an alternative, please follow [anygrasp_sdk](https://github.com/graspnet/anygrasp_sdk) to set up the SDK and put the `checkpoint_detection.tar` checkpoint into `assets/ckpts/`. `gsnet.so`, `lib_cxx.so`, and `license/` should also be placed in the project root directory.\n\n4. 
pointnet2_ops\n\n    ```\n    # this may take a while\n    git clone git@github.com:erikwijmans/Pointnet2_PyTorch.git\n    cd Pointnet2_PyTorch/pointnet2_ops_lib\n    pip install -e .\n    ```\n\n5. Other requirements\n\n    ```\n    pip install -r requirements.txt\n    ```\n\n6. (Optional) Retrieval data\n\n    If you want to use the retrieval pipeline, please download the retrieval data from [Google Drive](https://drive.google.com/file/d/1Ta4MJioAlvrQiczQo-8A-PELRynhsHs1/view?usp=sharing) and unzip it to `assets/data/`.\n\n## Inference and Visualization\n\nRun the commands below to launch the demo:\n\n```bash\nexport PYTHONPATH=$PWD\npython run_realworld/run.py --config configs/drawer_open.yaml # add --retrieve to enable retrieval\n```\n\nAfter it finishes, you will see the printed 3D affordance results with grasps, and visualizations saved at `run_realworld/gym_outputs/drawer_open/`, as shown below:\n\n\u003cdiv align=center\u003e\n    \u003cimg src=\"assets/img/transfer.png\" width=\"60%\"\u003e\n\u003c/div\u003e\n\u003cdiv align=center\u003e\n    \u003cimg src=\"assets/img/grasp.png\" width=\"30%\"\u003e\n    \u003cimg src=\"assets/img/direction.png\" width=\"30%\"\u003e\n\u003c/div\u003e\n\n## TODO\n\n- [x] Release the method code and demo.\n- [x] Release the retrieval pipeline and data.\n- [ ] More to come... 
(Feel free to open issues and PRs!)\n\n**Please stay tuned for updates to the dataset and code!**\n\n## Acknowledgments\n\nWe thank the authors of [dift](https://github.com/Tsingularity/dift), [GeoAware-SC](https://github.com/Junyi42/geoaware-sc), [graspness_implementation](https://github.com/rhett-chen/graspness_implementation), and [Grounded-Segment-Anything](https://github.com/IDEA-Research/Grounded-Segment-Anything) for their great work and open-source spirit!\n\n## Citation\n\nIf you find this work helpful, please consider citing:\n\n```\n@article{kuang2024ram,\n  title={RAM: Retrieval-Based Affordance Transfer for Generalizable Zero-Shot Robotic Manipulation},\n  author={Kuang, Yuxuan and Ye, Junjie and Geng, Haoran and Mao, Jiageng and Deng, Congyue and Guibas, Leonidas and Wang, He and Wang, Yue},\n  journal={arXiv preprint arXiv:2407.04689},\n  year={2024}\n}\n```\n\n","funding_links":[],"categories":["Robot Arm"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FyxKryptonite%2FRAM_code","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FyxKryptonite%2FRAM_code","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FyxKryptonite%2FRAM_code/lists"}