7.1.5. PhysX Lidar Sensors

7.1.5.1. Learning Objectives

In this example, we will add a lidar sensor that matches the one on top of the Turtlebot3, and add the rostopics needed to publish the lidar sensor's data and info.

  • Add a lidar sensor to the robot.

  • Connect lidar sensor output to a ROS lidar publisher node to publish the data.

  • Visualize everything in RViz.

7.1.5.2. Getting Started

Prerequisite

  • Completed the URDF Import: Turtlebot tutorial so that Turtlebot is loaded and moving around.

  • ROS bridge is enabled, and roscore is running.

7.1.5.3. Adding a PhysX Lidar ROS Bridge

7.1.5.3.1. Adding a PhysX Lidar Sensor

First, we need to add a lidar sensor to the robot. The steps below use the UI; a scripted alternative is sketched after the list.

  1. Go to Create -> Isaac -> Sensors -> PhysX Lidar -> Rotating.

  2. To place the synthetic lidar sensor at the same place as the robot’s lidar unit, drag the lidar prim under /World/turtlebot3_burger/base_scan. Zero out any displacement in the Transform fields inside the Property tab. The lidar prim should now be overlapping with the scanning unit of the robot.

  3. Inside the RawUSDProperties tab for the lidar prim, set maxRange to 25. The lidar will then ignore anything beyond 25 meters, preventing it from reporting a hit everywhere in the room because of the walls.

  4. Check drawLines to visualize the lidar scans.

  5. Press Play to see the lidar come to life. Red scan lines indicate a hit and green lines indicate no hit; the color spectrum from green to yellow to red is proportional to the distance of the detected object.
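If you prefer to create the sensor from the Script Editor or a standalone script rather than the Create menu, the menu action corresponds roughly to the RangeSensorCreateLidar command shown below. This is a minimal sketch assuming a recent Isaac Sim ROS1 setup; parameter values other than max_range=25 and draw_lines=True are illustrative defaults and should be tuned to match the Turtlebot3's scanner.

    import omni.kit.commands

    # Create a rotating PhysX lidar under the robot's scan link.
    # Values other than max_range and draw_lines are illustrative defaults.
    result, prim = omni.kit.commands.execute(
        "RangeSensorCreateLidar",
        path="/Lidar",
        parent="/World/turtlebot3_burger/base_scan",
        min_range=0.4,
        max_range=25.0,          # ignore returns beyond 25 m (e.g. far walls)
        draw_points=False,
        draw_lines=True,         # visualize the scan lines while playing
        horizontal_fov=360.0,
        vertical_fov=30.0,
        horizontal_resolution=0.4,
        vertical_resolution=4.0,
        rotation_rate=0.0,
        high_lod=False,
        yaw_offset=0.0,
        enable_semantics=False,
    )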

7.1.5.3.2. Lidar ROS OG Graph

Once the lidar sensor is in place, we can add the corresponding OG nodes to stream the detection data to a rostopic. The OG nodes for the lidar publisher should match the image below; a Python sketch of the same graph follows the node descriptions.

Turtlebot Lidar Graph

7.1.5.3.3. Graph Explained

  • On Playback Tick Node: Produces a tick while the simulation is “Playing”. Nodes that receive ticks from this node execute their compute functions every simulation step.

  • Isaac Read Lidar Beam Node: Retrieves the lidar's configuration and scan data. For inputs:LidarPrim, add a target pointing to the lidar sensor we just added at /World/turtlebot3_burger/base_scan/Lidar.

  • ROS1 Publish Laser Scan: Publishes the laser scan data. Type /laser_scan into the Topic Name field.

  • Isaac Read Simulation Time: Uses simulation time to timestamp the /laser_scan messages.
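For reference, the same graph can be built from Python with og.Controller.edit. The sketch below is an approximation: the node type names ("omni.isaac.range_sensor.IsaacReadLidarBeams", "omni.isaac.ros_bridge.ROS1PublishLaserScan", etc.) and the assumption that the lidar data attributes share names on both nodes are based on the graph shown above and may differ between Isaac Sim versions.

    import omni.graph.core as og

    # Lidar data attributes assumed to pass straight through from the reader
    # node's outputs to the publisher node's inputs with matching names.
    LIDAR_ATTRS = [
        "azimuthRange", "depthRange", "horizontalFov", "horizontalResolution",
        "intensitiesData", "linearDepthData", "numCols", "numRows", "rotationRate",
    ]

    og.Controller.edit(
        {"graph_path": "/ActionGraph", "evaluator_name": "execution"},
        {
            og.Controller.Keys.CREATE_NODES: [
                ("OnPlaybackTick", "omni.graph.action.OnPlaybackTick"),
                ("ReadLidarBeams", "omni.isaac.range_sensor.IsaacReadLidarBeams"),
                ("ReadSimTime", "omni.isaac.core_nodes.IsaacReadSimulationTime"),
                ("PublishLaserScan", "omni.isaac.ros_bridge.ROS1PublishLaserScan"),
            ],
            og.Controller.Keys.CONNECT: [
                # Tick the lidar reader every simulation step, then the publisher.
                ("OnPlaybackTick.outputs:tick", "ReadLidarBeams.inputs:execIn"),
                ("ReadLidarBeams.outputs:execOut", "PublishLaserScan.inputs:execIn"),
                # Stamp outgoing messages with simulation time.
                ("ReadSimTime.outputs:simulationTime", "PublishLaserScan.inputs:timeStamp"),
            ]
            + [
                (f"ReadLidarBeams.outputs:{attr}", f"PublishLaserScan.inputs:{attr}")
                for attr in LIDAR_ATTRS
            ],
            og.Controller.Keys.SET_VALUES: [
                ("PublishLaserScan.inputs:topicName", "/laser_scan"),
            ],
        },
    )
    # The reader node's inputs:LidarPrim target still needs to point to
    # /World/turtlebot3_burger/base_scan/Lidar (for example via the Property panel).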

7.1.5.3.4. Verifying ROS connections

  1. Press Play to start ticking the graph and the physics simulation.

  2. In a separate ROS-sourced terminal, check that the associated rostopics exist with rostopic list. /laser_scan should be listed in addition to /rosout and /rosout_agg (a minimal Python subscriber for inspecting the scan data is sketched below).

  3. To visualize the laser scan data, open RViz by typing rviz on the command line and pressing Enter.

  4. Inside RViz, add a LaserScan display type. Make sure the Topic that the LaserScan display listens to matches the topic name inside the ROS1 Publish Laser Scan node, and that the Fixed Frame matches the frameId inside the ROS1 Publish Laser Scan node.

Turtlebot Lidar RViz
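As an alternative to eyeballing the scan in RViz, you can sanity-check the published data with a small rospy subscriber. This is a sketch; the node name laser_scan_check is arbitrary, and it assumes the topic name /laser_scan used above.

    #!/usr/bin/env python
    # Run in a ROS-sourced terminal while the simulation is playing.
    # Prints basic statistics for each incoming /laser_scan message.
    import rospy
    from sensor_msgs.msg import LaserScan


    def on_scan(msg):
        valid = [r for r in msg.ranges if msg.range_min <= r <= msg.range_max]
        rospy.loginfo(
            "frame=%s  beams=%d  hits=%d  closest=%.2f m",
            msg.header.frame_id,
            len(msg.ranges),
            len(valid),
            min(valid) if valid else float("inf"),
        )


    if __name__ == "__main__":
        rospy.init_node("laser_scan_check")
        rospy.Subscriber("/laser_scan", LaserScan, on_scan, queue_size=1)
        rospy.spin()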

7.1.5.4. Multiple Sensors in RViz

To display multiple sensors in RViz, a few things are important to make sure all the messages are synced up and timestamped correctly.

Simulation Timestamp

Use Isaac Read Simulation Time as the node that feeds the timestamp into all of the publishing nodes.

ROS clock

To publish the simulation time to the ROS clock topic, you can set up the graph as shown in the Running ROS Clock Publisher tutorial; a scripted sketch of the same graph follows the figure:

ROS Clock publisher
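A scripted version of that clock-publisher graph might look like the sketch below; the node type names follow the same ROS1 bridge naming as the lidar graph and may differ between Isaac Sim versions (see the Running ROS Clock Publisher tutorial for the authoritative setup). Keep in mind that ROS nodes such as RViz only follow /clock when the standard use_sim_time parameter is set to true.

    import omni.graph.core as og

    # Minimal clock-publisher graph: tick -> read simulation time -> publish /clock.
    og.Controller.edit(
        {"graph_path": "/ROS_Clock", "evaluator_name": "execution"},
        {
            og.Controller.Keys.CREATE_NODES: [
                ("OnPlaybackTick", "omni.graph.action.OnPlaybackTick"),
                ("ReadSimTime", "omni.isaac.core_nodes.IsaacReadSimulationTime"),
                ("PublishClock", "omni.isaac.ros_bridge.ROS1PublishClock"),
            ],
            og.Controller.Keys.CONNECT: [
                ("OnPlaybackTick.outputs:tick", "PublishClock.inputs:execIn"),
                ("ReadSimTime.outputs:simulationTime", "PublishClock.inputs:timeStamp"),
            ],
        },
    )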

frameId and topicName

  1. To visualize all the sensors as well as the tf tree at once inside RViz, the frameId and topicName fields must follow certain conventions for RViz to recognize them. The table below roughly describes these conventions. To see the multi-sensor example below, consult the USD asset Isaac/Samples/ROS/Scenario/turtlebot_tutorial.usd.

    Source        | frameId                   | nodeNamespace             | topicName      | type
    ------------- | ------------------------- | ------------------------- | -------------- | ----------
    Camera RGB    | (device_name):(data_type) | (device_name)/(data_type) | image_raw      | rgb
    Camera Depth  | (device_name):(data_type) | (device_name)/(data_type) | image_rect_raw | depth
    Lidar         | world                     |                           | laser_scan     | laser_scan
    TF            |                           |                           | tf             | tf

  2. To see the RViz image below, make sure the simulation is playing. In a ROS-sourced terminal, open RViz with the provided configuration using the command: rviz -d <noetic_ws>/src/isaac_tutorials/rviz/camera_lidar.rviz. Once the RViz window finishes loading, you can enable and disable the sensor streams inside the Displays panel on the left-hand side.

7.1.5.5. Summary

This tutorial covered:

  • Adding a PhysX Lidar sensor

  • Adding a PhysX Lidar Rostopic

  • Displaying multiple sensors in RViz.

7.1.5.5.1. Next Steps

Continue on to the next tutorial in our ROS Tutorials series, Transform Trees and Odometry, to learn how to add global and relative transforms to a TF tree.

7.1.5.5.2. Further Learning