{"id":21739709,"url":"https://github.com/berkeleyautomation/fast_grasp_detect","last_synced_at":"2026-04-11T13:32:38.376Z","repository":{"id":75469257,"uuid":"152477440","full_name":"BerkeleyAutomation/fast_grasp_detect","owner":"BerkeleyAutomation","description":"For supporting our IL_ROS_HSR code","archived":false,"fork":false,"pushed_at":"2018-10-10T19:24:33.000Z","size":37,"stargazers_count":1,"open_issues_count":0,"forks_count":1,"subscribers_count":5,"default_branch":"master","last_synced_at":"2025-03-21T00:45:12.638Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/BerkeleyAutomation.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2018-10-10T19:22:14.000Z","updated_at":"2018-11-08T01:12:50.000Z","dependencies_parsed_at":null,"dependency_job_id":"7cf146db-7da7-4420-bf59-5a2655dff56b","html_url":"https://github.com/BerkeleyAutomation/fast_grasp_detect","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/BerkeleyAutomation/fast_grasp_detect","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/BerkeleyAutomation%2Ffast_grasp_detect","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/BerkeleyAutomation%2Ffast_grasp_detect/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/BerkeleyAutomation%2Ffast_grasp_detect/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/BerkeleyAutomation%2Ffast_grasp_detect/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/BerkeleyAutomation","download_url":"https://codeload.github.com/BerkeleyAutomation/fast_grasp_detect/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/BerkeleyAutomation%2Ffast_grasp_detect/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":31682953,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-04-11T13:07:20.380Z","status":"ssl_error","status_checked_at":"2026-04-11T13:06:47.903Z","response_time":54,"last_error":"SSL_connect returned=1 errno=0 peeraddr=140.82.121.5:443 state=error: unexpected eof while reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-11-26T06:10:17.766Z","updated_at":"2026-04-11T13:32:38.342Z","avatar_url":"https://github.com/BerkeleyAutomation.png","language":"Python","readme":"# fast_grasp_detect\n\nUsed to support our IL_ROS_HSR code.\n\nFor the bed-making project code, [see this snapshot in a separate repository][4].\n\n## install \n\n```\npython setup.py develop\n```\n\n(Or `pip install -e .` in this directory ... that might be better)\n\n## main\n\nContains scripts that can be run to train the networks. We use two networks, one\nfor the grasping (\"grasp network\") and one for the transition policy (\"success\nnetwork\").\n\n- `train_bed_grasp.py` - trains the grasping network ([hyperparameters here][1])\n- `train_bed_success.py` - trains the network that identifies successful\n  rollouts ([hyperparameters here][2])\n\nBoth of these call `solver = Solver(bed_grasp_options,yolo,pascal)` and then\n`solver.train()` to train the network, based on `core/train_network.py`. The\ndifference is that `yolo` can be the output from `networks/grasp_net_cs.py` or\n`networks/success_net.py`. The `pascal` portion handles the YOLO pre-trained\nfeatures, so that the two networks can simply fine-tune.\n\nThe networks use TF slim. [Documentation here][3].\n\n**Probably better to run for now**: use the scripts `main/grasp.sh` and `main/success.sh` since\nthese iterate through all indices in a cross validation set, and then put the output to a file, such\nas in:\n\n```\n./main/grasp.sh | tee logs/grasp.log\n```\n\nso that we can inspect the output later.\n\n\n## src\n\n(This is older stuff from Michael's documentation)\n\n\t• CONFIG\n\t\t○ Bed_grasp_config = grasp network (trains the hsr to use end-effector to make an effective grasp)\n\t\t○ Bed_success_config = success network (tells you whether the sequence of states and actions resulted in a successfully made bed)\n\t\t○ tl_experiments/ = contains training scripts for transfer learning experiments\n    ○ Other config files are for different bed-making experiments.\n\t• CORE\n\t\t○ Data_manager loads the yolo network and can be used to get the data as an array of images and labels\n\t\t○ Grasp_data.py is a data_manager for only the management of the grasp-net data\n\t\t○ Timer.py has 3 methods. Tic records the start time, toc returns the difference from the start time or the average time. Remain returns the time at which the timer runs out\n\t\t○ Train_fast.py trains the grasp network fast using a Solver class that takes in a network (yolo) and the dataset (pascal)\n\t\t○ Yolo_conv_features_cs.py makes the yolo CNN (builds or restores from a given weights file)\n\t\t\t§ Also extracts features from an image using a tensorflow feature extraction function\n\t\t○ Train_network.py = used to train the yolo network\n\t• DATA_AUG\n\t\t○ Augment_lighting has functions to add lighting noise to images that will later be used as training inputs\n\t\t○ Data_augment.py goes through the images and adds lighting-noisy and flipped images to the data set\n\t• DETECTORS\n\t\t○ Grasp_detector.py = Detects the optimal grasp point given an image of the bed\n\t\t\t§ Returns that grasp as a pose, which is a set of x,y coordinates that can be manipulated as necessary\n\t\t○ Tran_detector.py = detects whether the image of the bed is ready for a transition to the next robot position\n\t\t\t§ Detects if the robot has successfully done all it can from a spot on the floor\n\t• LABELERS\n\t\t○ Labeler.py = puts bounding boxes on an image upon loading it [offline]\n\t\t○ Online_labeler.py = recomputes bounding boxes after an action is completed\n\t• NETWORKS\n\t\t○ Grasp_net_cs.py = builds the entire grasp net\n\t\t○ Success_net.py = builds the success net (whether the bed is made or not)\n\t• VISUALIZERS\n\t\t○ Draw_cross_hair.py = used to draw an x on top of some location on the given image\n\n\n[1]:https://github.com/DanielTakeshi/fast_grasp_detect/blob/master/src/fast_grasp_detect/configs/bed_grasp_config.py\n[2]:https://github.com/DanielTakeshi/fast_grasp_detect/blob/master/src/fast_grasp_detect/configs/bed_success_config.py\n[3]:https://github.com/tensorflow/tensorflow/tree/master/tensorflow/contrib/slim\n[4]:https://github.com/DanielTakeshi/fast_grasp_detect\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fberkeleyautomation%2Ffast_grasp_detect","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fberkeleyautomation%2Ffast_grasp_detect","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fberkeleyautomation%2Ffast_grasp_detect/lists"}