{"id":19855233,"url":"https://github.com/leggedrobotics/delora","last_synced_at":"2025-04-09T16:08:10.128Z","repository":{"id":43290455,"uuid":"308324297","full_name":"leggedrobotics/delora","owner":"leggedrobotics","description":"Self-supervised Deep LiDAR Odometry for Robotic Applications","archived":false,"fork":false,"pushed_at":"2022-10-08T07:26:22.000Z","size":889,"stargazers_count":242,"open_issues_count":4,"forks_count":51,"subscribers_count":19,"default_branch":"main","last_synced_at":"2025-04-09T16:08:00.639Z","etag":null,"topics":["lidar-odometry","paper","robotic-applications","robotics"],"latest_commit_sha":null,"homepage":"https://arxiv.org/pdf/2011.05418.pdf","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"bsd-3-clause","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/leggedrobotics.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2020-10-29T12:40:22.000Z","updated_at":"2025-03-30T20:37:33.000Z","dependencies_parsed_at":"2023-01-19T15:47:42.121Z","dependency_job_id":null,"html_url":"https://github.com/leggedrobotics/delora","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/leggedrobotics%2Fdelora","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/leggedrobotics%2Fdelora/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/leggedrobotics%2Fdelora/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/leggedrobotics%2Fdelora/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/leggedrobotics","download_url":"https://cod
eload.github.com/leggedrobotics/delora/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":248065286,"owners_count":21041871,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["lidar-odometry","paper","robotic-applications","robotics"],"created_at":"2024-11-12T14:12:05.667Z","updated_at":"2025-04-09T16:08:10.090Z","avatar_url":"https://github.com/leggedrobotics.png","language":"Python","readme":"# DeLORA: Self-supervised Deep LiDAR Odometry for Robotic Applications\n\n## Overview\n\n* Paper: [link](https://arxiv.org/pdf/2011.05418.pdf)\n* Video: [link](https://youtu.be/rgeRseUTryA)\n* ICRA Presentation: [link](https://youtu.be/83Bn2zKi-bQ)\n\nThis is the corresponding code to the above paper (\"Self-supervised Learning of LiDAR Odometry for Robotic\nApplications\") which is published at the International Conference on Robotics and Automation (ICRA) 2021. 
The code is provided by the [Robotic Systems Lab](https://rsl.ethz.ch/) at ETH Zurich, Switzerland.

**Authors:** [Julian Nubert](https://juliannubert.com/) ([julian.nubert@mavt.ethz.ch](mailto:julian.nubert@mavt.ethz.ch?subject=[GitHub])), [Shehryar Khattak](https://www.linkedin.com/in/shehryar-khattak/), [Marco Hutter](https://rsl.ethz.ch/the-lab/people/person-detail.MTIxOTEx.TGlzdC8yNDQxLC0xNDI1MTk1NzM1.html)

![title_img](images/title_img.png)

*Copyright IEEE*

## Python Setup

We provide a conda environment for running our code.

### Conda

The conda environment is convenient to use with PyTorch because only the NVIDIA drivers are required; installation of suitable CUDA and cuDNN libraries is handled entirely by conda.

* Install conda: [link](https://docs.conda.io/projects/conda/en/latest/user-guide/install/index.html)
* To set up the conda environment, run the following command:

```bash
conda env create -f conda/DeLORA-py3.9.yml
```

This creates an environment with GPU-enabled PyTorch and the required CUDA and cuDNN dependencies.

* Activate the environment:

```bash
conda activate DeLORA-py3.9
```

* Install the package to set all paths correctly:

```bash
pip3 install -e .
```

## ROS Setup

For running the ROS code in the [./src/ros_utils/](./src/ros_utils/) folder you need to have ROS installed ([link](http://wiki.ros.org/ROS/Installation)). We recommend Ubuntu 20.04 and ROS Noetic for its native Python3 support.
To perform inference under Python 2.7, convert your PyTorch model with [./scripts/convert_pytorch_models.py](./scripts/convert_pytorch_models.py) and run an older PyTorch version (<1.3).

### ros-numpy

In any case, you need to install ros-numpy to make use of the provided rosnode:

```bash
sudo apt install ros-<distro>-ros-numpy
```

## Datasets and Preprocessing

Instructions on how to use and preprocess the datasets can be found in the [./datasets/](./datasets/) folder. We provide preprocessing scripts for:

1. general **rosbags** containing LiDAR scans,
2. the **KITTI dataset** in its own format.

### Example: KITTI Dataset

#### LiDAR Scans

Download the Velodyne laser data from the official KITTI odometry evaluation (80GB): [link](http://www.cvlibs.net/datasets/kitti/eval_odometry.php). Place it in ```<delora_ws>/datasets/kitti```, where ```kitti``` contains ```/data_odometry_velodyne/dataset/sequences/00..21```.

#### Groundtruth poses

Please also download the groundtruth poses [here](http://www.cvlibs.net/datasets/kitti/eval_odometry.php). Make sure the files are located at ```<delora_ws>/datasets/kitti```, where ```kitti``` contains ```/data_odometry_poses/dataset/poses/00..10.txt```.

#### Preprocessing

In the file ```./config/deployment_options.yaml```, make sure to set ```datasets: ["kitti"]```. Then run

```bash
preprocess_data.py
```

### Custom Dataset

If you want to add your own dataset, add its sensor specifications to [./config/config_datasets.yaml](./config/config_datasets.yaml). The required information comprises the dataset name, its sequences, and sensor specifications such as the vertical field of view and the number of rings.

## Deploy

After preprocessing, for each dataset we assume the hierarchical structure ```dataset_name/sequence/scan``` (see the dataset example above).
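This layout can be sanity-checked with a few lines of Python. The sketch below is illustrative only; the root path and file names used in it are placeholders, not part of the repository:

```python
from pathlib import Path

def list_scans(dataset_root):
    """Map each sequence directory to its sorted list of scan files.

    Assumes the preprocessed layout dataset_name/sequence/scan.
    """
    root = Path(dataset_root)
    return {
        seq.name: sorted(f.name for f in seq.iterdir() if f.is_file())
        for seq in sorted(root.iterdir())
        if seq.is_dir()
    }
```

For KITTI, for example, this would map sequence ```00``` to its preprocessed scan files.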
Our code natively supports training and/or testing on several datasets with several sequences at the same time.

### Training

Run the training with the following command:

```bash
run_training.py
```

Training is executed for the dataset(s) specified in [./config/deployment_options.yaml](./config/deployment_options.yaml). You will be prompted to enter a name for the training run, which is used for reference in the MLflow logging.

#### Custom Settings

For custom settings and hyper-parameters, please have a look at [./config/](./config/).

By default, loading from RAM is disabled. If you have enough memory, enable it in [./config/deployment_options.yaml](./config/deployment_options.yaml). When loading from disk, the first few iterations are sometimes slow due to I/O, but speed should improve quickly. Storing the entire KITTI training set in memory requires roughly 50GB of RAM.

#### Continuing Training

To continue training, pass the ```--checkpoint``` flag with a path to the model checkpoint to the script above.

### Visualizing progress and results

For visualizing progress we use MLflow, which allows for simple logging of parameters, metrics, images, and artifacts. Artifacts can, e.g., also be entire TensorBoard logfiles. To visualize the training progress, execute (from the DeLORA folder):

```bash
mlflow ui
```

The MLflow dashboard can then be viewed in your browser by following the link printed in the terminal.

### Testing

Testing can be run as follows:

```bash
run_testing.py --checkpoint <path_to_checkpoint>
```

The checkpoint path can be found in MLflow after training.
The script runs testing for the dataset specified in [./config/deployment_options.yaml](./config/deployment_options.yaml).

We provide an exemplary trained model in [./checkpoints/kitti_example.pth](./checkpoints/kitti_example.pth).

### ROS-Node

This ROS-node takes the pretrained model at location ```<model_location>``` and performs inference, i.e. it predicts and publishes the relative transformation between incoming point cloud scans. The variable ```<dataset>``` should contain the name of the dataset in the config files, e.g. kitti, so that the corresponding parameters are loaded. Topic and frame names can be specified as follows:

```bash
run_rosnode.py --checkpoint <model_location> --dataset <dataset> --lidar_topic=<name_of_lidar_topic> --lidar_frame=<name_of_lidar_frame>
```

The resulting odometry is published as a ```nav_msgs.msg.Odometry``` message under the topic ```/delora/odometry```.

#### Example: DARPA Dataset

For the DARPA dataset this could look as follows:

```bash
run_rosnode.py --checkpoint ~/Downloads/checkpoint_epoch_0.pth --dataset darpa --lidar_topic "/sherman/lidar_points" --lidar_frame sherman/ouster_link
```

### Convenience Functions

Additional functionality is provided in [./bin/](./bin/) and [./scripts/](./scripts/).

#### Visualization of Normals (mainly for debugging)

Located in [./bin/](./bin/); see the readme file [./datasets/README.md](./datasets/README.md) for more information.

#### Creation of Rosbags for the KITTI Dataset

After starting a roscore, conversion from the KITTI dataset format to a rosbag can be done with the following command:

```bash
python scripts/convert_kitti_to_rosbag.py
```

The point cloud scans are contained in the topic ```"/velodyne_points"```, located in the frame ```velodyne```.
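The rosnode publishes one relative transformation per scan pair; recovering a full trajectory means chaining these 4×4 homogeneous transforms. A minimal numpy sketch of the composition step (the conversion from the Odometry messages to matrices is omitted, and the function name is hypothetical):

```python
import numpy as np

def accumulate_poses(relative_transforms):
    """Chain per-scan-pair relative SE(3) transforms (4x4 homogeneous
    matrices) into absolute poses in the odometry frame."""
    pose = np.eye(4)
    poses = [pose.copy()]
    for t_rel in relative_transforms:
        pose = pose @ t_rel  # compose: previous pose times relative motion
        poses.append(pose.copy())
    return poses
```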
For the created rosbag, for example, our provided rosnode can be run with:

```bash
run_rosnode.py --checkpoint ~/Downloads/checkpoint_epoch_30.pth --lidar_topic "/velodyne_points" --lidar_frame "velodyne"
```

#### Convert a PyTorch Model for Older PyTorch Compatibility

Conversion of a new model ```<path_to_model>/model.pth``` to an old one (compatible with PyTorch < 1.3) at ```<path_to_model>/model_py27.pth``` can be done as follows:

```bash
python scripts/convert_pytorch_models.py --checkpoint <path_to_model>/model
```

Note that the checkpoint argument is passed without the ```.pth``` extension.

#### Time the Network

The execution time of the network can be measured with:

```bash
python scripts/time_network.py
```

## Paper

Thank you for citing [DeLORA (ICRA-2021)](https://arxiv.org/abs/2011.05418) if you use any of this code.

```
@inproceedings{nubert2021self,
  title={Self-supervised Learning of LiDAR Odometry for Robotic Applications},
  author={Nubert, Julian and Khattak, Shehryar and Hutter, Marco},
  booktitle={IEEE International Conference on Robotics and Automation (ICRA)},
  year={2021},
  organization={IEEE}
}
```

### Dependencies

Dependencies are specified in [./conda/DeLORA-py3.9.yml](./conda/DeLORA-py3.9.yml) and [./pip/requirements.txt](./pip/requirements.txt).

### Tuning

If the results do not achieve the desired performance, please have a look at the normal estimation, since the loss is usually dominated by the plane-to-plane loss, which is impacted by noisy normal estimates.
For the results presented in the paper we picked reasonable parameters without further fine-tuning, but we are convinced that less noisy normal estimates would lead to even better convergence.
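As a debugging aid for this, normals can be recomputed from local point neighborhoods and compared against the ones used during training: the plane normal of a neighborhood is the eigenvector of its covariance matrix with the smallest eigenvalue. A small numpy sketch (how neighborhoods are selected is up to you; this is not the repository's implementation):

```python
import numpy as np

def estimate_normal(neighborhood):
    """Estimate the plane normal of an (N, 3) array of neighboring points
    as the direction of least variance (smallest-eigenvalue eigenvector)."""
    centered = neighborhood - neighborhood.mean(axis=0)
    covariance = centered.T @ centered
    eigenvalues, eigenvectors = np.linalg.eigh(covariance)  # ascending order
    return eigenvectors[:, 0]  # unit-length normal (sign is arbitrary)
```

Small or near-collinear neighborhoods yield unstable normals, which is exactly the kind of noise that dominates the plane-to-plane loss.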