NatNet 3 ROS driver
- Host: GitHub
- URL: https://github.com/mje-nz/natnet_ros
- Owner: mje-nz
- License: bsd-3-clause
- Created: 2017-11-10T04:20:32.000Z (about 7 years ago)
- Default Branch: master
- Last Pushed: 2019-10-21T23:07:25.000Z (about 5 years ago)
- Last Synced: 2023-08-07T00:36:54.864Z (over 1 year ago)
- Language: Python
- Size: 49.8 KB
- Stars: 18
- Watchers: 3
- Forks: 9
- Open Issues: 7
- Metadata Files:
  - Readme: Readme.md
  - License: LICENSE
# NatNet 3 ROS driver
[![Build Status](https://travis-ci.org/mje-nz/natnet_ros.svg?branch=master)](https://travis-ci.org/mje-nz/natnet_ros)
[![codecov](https://codecov.io/gh/mje-nz/natnet_ros/branch/master/graph/badge.svg)](https://codecov.io/gh/mje-nz/natnet_ros)

This package contains a ROS driver for the NatNet protocol used by the NaturalPoint OptiTrack motion capture system.
It supports NatNet versions 3.0 and 3.1 (the versions used by Motive 2.0 and 2.1), which are a bit more efficient than older versions and add more accurate timing.
The actual NatNet implementation is in [mje-nz/python_natnet](https://github.com/mje-nz/python_natnet), which is included as a submodule and repackaged.

The NatNet protocol is used for streaming live motion capture data (rigid bodies, skeletons, etc.) across a network.
If you're just looking to record some motion capture data and you don't need to synchronise it with other sensors, the easiest method is to record it in Motive directly.

Only tested on Motive 2.0 and 2.1, although older versions (i.e., older NatNet protocol versions) probably mostly work.
Skeletons, force plates, and other peripherals probably mostly work in the underlying library but are not tested and are not published as ROS topics.

Only supported on ROS Melodic and Kinetic, but it works on Indigo and probably on newer distributions too.
The underlying library supports Python 2.7 and 3.4+ on Linux, Windows and macOS.
Both this package and the library have CI for all supported platforms.

**Note**: The `dev` branch is quite a bit ahead of `master` at the moment; if you run into problems then check if they're fixed there.
Features:
* Doesn't crash all the time, unlike mocap_optitrack
* Synchronizes clocks to get timestamps right, unlike vrpn_client_ros
* All topics are timestamped with the camera mid-exposure time (give or take a few tenths of a millisecond); see the sketch after this list
* Publishes rigid bodies as `geometry_msgs/PoseStamped`
* Publishes markers as `geometry_msgs/PointStamped`
* Publishes markers that aren't in a rigid body as `natnet_msgs/MarkerList`, because they change ID a lot
* Publishes all markers together as `visualization_msgs/Marker` (`SPHERE_LIST`) for Rviz
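
For a rough feel for the timestamping, here is a minimal sketch (not part of the package) of a node that prints the age of each incoming pose message. The node name `client` and the rigid body name `UAV` are assumptions; substitute the names from your own setup.

```
#!/usr/bin/env python
"""Print the age of each incoming rigid body pose message (sketch).

The topic name assumes the driver node is named /client and Motive is
streaming a rigid body named UAV; adjust both to match your setup.
"""
import rospy
from geometry_msgs.msg import PoseStamped


def callback(msg):
    # Age = receive time minus the camera mid-exposure stamp set by the driver.
    age_ms = (rospy.Time.now() - msg.header.stamp).to_sec() * 1000.0
    rospy.loginfo('Pose age: %.1f ms', age_ms)


rospy.init_node('pose_age_check')
rospy.Subscriber('/client/rigid_bodies/UAV/pose', PoseStamped, callback)
rospy.spin()
```

Run it alongside the driver; the reported age is roughly the network plus processing delay.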

## TODO

* Add duplicate marker ID check and report bug
* Document "solver replaces occlusion" behaviour and implement workaround
* Rename MocapFrame to FrameOfData
* Make clock sync optional
* Release python_natnet on PyPI
* Add CONTRIBUTING etc
* Make sure unknown message IDs, failures to decode messages, etc. don't result in a crash
* Make sure to use the same variable signedness as the SDK
* Use rospy.spin
* Add warning when unnecessary data is being streamed
* Add guidance for appropriate Motive settings

## Installation

From Kinetic (any metapackage) or Indigo (any metapackage plus pip):
```
mkdir -p catkin_ws/src/
cd catkin_ws/src
git clone --recursive https://github.com/mje-nz/natnet_ros.git
cd ..
rosdep install -y --from-paths src --ignore-src
catkin_make
source devel/setup.bash
```

Then, if you have a ROS Master running, you can test it with a fake server:
```
rosrun natnet_ros client _fake:=true
```

or run against your real Motive instance:
```
rosrun natnet_ros client _server:=(Motive IP)
```
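
To check that data is actually flowing, you can list the topics the client is advertising. This is a sketch under the assumption that the node was started as `rosrun natnet_ros client`, so its topics appear under `/client/`; check `rosnode list` if yours is named differently.

```
#!/usr/bin/env python
"""List the topics the natnet_ros client is advertising (sketch).

Assumes the driver node is named /client; adjust the prefix if yours differs.
"""
import rospy

rospy.init_node('natnet_topic_check', anonymous=True)
rospy.sleep(1.0)  # give the master a moment to register the client's topics
for topic, msg_type in rospy.get_published_topics():
    if topic.startswith('/client/'):
        print('%s [%s]' % (topic, msg_type))
```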

## ROS API

The `client` node connects to a NatNet server and publishes the data as ROS topics.
The data and descriptions are used directly, so make sure to give your rigid bodies unique streaming IDs and sensible names, and set the axis convention to z-up.
If you need Motive to use y-up for some reason, you can fix it with:
```
rosrun tf static_transform_publisher 0 0 0 0 0 1.57079632679 mocap_z_up mocap 100
```

### Published topics

* `~rigid_bodies/<name>/pose` (geometry_msgs/PoseStamped)
Rigid body pose.
The name is taken from the NatNet stream (i.e., from Motive), with a bit of an attempt to make sure it's a valid ROS name.
* `~rigid_bodies/<name>/marker<id>` (geometry_msgs/PointStamped)
Position of each marker of each rigid body.
The ID is taken from the NatNet stream (i.e., from Motive).
* `~rigid_bodies/<name>/markers` (natnet_msgs/MarkerList)
Position and ID of each marker of each rigid body as a list (see the subscriber sketch after this list).
* `~markers/leftovers` (natnet_msgs/MarkerList)
Position and ID of any markers that aren't in a rigid body.
* `~markers/vis` (visualization_msgs/Marker)
Position of all markers, for visualization with Rviz.
The size is set to the average estimated marker size.
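
As a rough illustration of consuming these topics, here is a minimal subscriber sketch (not part of the package). The node name `client` and the rigid body name `UAV` are assumptions, and the incoming `natnet_msgs/MarkerList` messages are simply logged rather than their fields being spelled out.

```
#!/usr/bin/env python
"""Subscribe to the marker list topics and log what arrives (sketch).

Topic names assume the driver node is named /client and a rigid body
named UAV; adjust both to match your setup.
"""
import rospy
from natnet_msgs.msg import MarkerList


def rigid_body_markers(msg):
    rospy.loginfo('Rigid body markers: %s', msg)


def leftover_markers(msg):
    rospy.loginfo('Leftover markers: %s', msg)


rospy.init_node('marker_listener')
rospy.Subscriber('/client/rigid_bodies/UAV/markers', MarkerList, rigid_body_markers)
rospy.Subscriber('/client/markers/leftovers', MarkerList, leftover_markers)
rospy.spin()
```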

### Parameters

* `~server` (`string`, optional)
NatNet server to connect to; will autodetect if not provided.
* `~fake` (`bool`, default: false)
Use fake data instead of connecting to a real server.
* `~debug` (`bool`, default: false)
Enable debug logging.
* `~rate` (`int`, default: 100)
If `fake` is true, the rate at which to publish the fake data (in Hz).
* `~mocap_frame` (`string`, default: 'mocap')
The name of the `tf` frame for the mocap data.
* `~tf_remap` (`dict`, optional)
A dictionary mapping rigid body names to the name of the `tf` frame they should be published as.
For example, `{'UAV': 'base_link'}` would cause the pose for the rigid body named `UAV` to be published as a transform from `mocap` to `base_link`.

Dict params are a bit tricky to use in Kinetic or earlier: a literal works with `rosrun`, but using a dict literal as a parameter value in a launch file doesn't work until Lunar.
Using a `rosparam` tag works though.
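
On Kinetic or earlier, another workaround is to set the dict on the parameter server from a small script before starting the node. This is a sketch that assumes the driver will run as `/client`; adjust the parameter path to your setup.

```
#!/usr/bin/env python
"""Set the tf_remap dict on the parameter server before launching the client (sketch).

Assumes the driver node will be named /client (e.g. started as
`rosrun natnet_ros client`); adjust the parameter path if yours differs.
"""
import rospy

rospy.init_node('set_natnet_params', anonymous=True)
rospy.set_param('/client/tf_remap', {'UAV': 'base_link'})
```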