Isaac Sensor Extension [omni.isaac.sensor]
The Isaac Sensor Extension provides a set of simulated, physics-based sensors, such as the contact sensor, inertial measurement unit (IMU) sensor, effort sensor, and RTX lidar, along with interfaces to access them in the simulator.
Contact Sensor
- class ContactSensor(prim_path: str, name: Optional[str] = 'contact_sensor', frequency: Optional[int] = None, dt: Optional[float] = None, translation: Optional[numpy.ndarray] = None, position: Optional[numpy.ndarray] = None, min_threshold: Optional[float] = None, max_threshold: Optional[float] = None, radius: Optional[float] = None)
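Example (a minimal sketch of creating and initializing a contact sensor; the parent rigid-body path /World/Cube and the frequency, threshold and radius values are illustrative assumptions, not defaults):
>>> import numpy as np
>>> from omni.isaac.sensor import ContactSensor
>>>
>>> # attach a contact sensor under a rigid body whose contacts should be monitored
>>> sensor = ContactSensor(
...     prim_path="/World/Cube/contact_sensor",   # hypothetical path under the rigid body
...     name="contact_sensor",
...     frequency=60,                             # readings per second (alternatively pass dt)
...     translation=np.array([0.0, 0.0, 0.0]),
...     min_threshold=0.0,
...     max_threshold=1e7,
...     radius=0.1,
... )
>>> sensor.initialize()   # or add it to the World scene and let world.reset() initialize it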
- add_raw_contact_data_to_frame() None
- apply_visual_material(visual_material: omni.isaac.core.materials.visual_material.VisualMaterial, weaker_than_descendants: bool = False) None
Apply visual material to the held prim and optionally its descendants.
- Parameters
visual_material (VisualMaterial) – visual material to be applied to the held prim. Currently supports PreviewSurface, OmniPBR and OmniGlass.
weaker_than_descendants (bool, optional) – True if the material shouldn’t override the descendants materials, otherwise False. Defaults to False.
Example:
>>> from omni.isaac.core.materials import OmniGlass
>>>
>>> # create a dark-red glass visual material
>>> material = OmniGlass(
...     prim_path="/World/material/glass",  # path to the material prim to create
...     ior=1.25,
...     depth=0.001,
...     thin_walled=False,
...     color=np.array([0.5, 0.0, 0.0])
... )
>>> prim.apply_visual_material(material)
- get_applied_visual_material() omni.isaac.core.materials.visual_material.VisualMaterial
Return the currently applied visual material, if it was applied using apply_visual_material or if it is one of the following materials that was already applied: PreviewSurface, OmniPBR, or OmniGlass.
- Returns
the current applied visual material if its type is currently supported.
- Return type
VisualMaterial
Example:
>>> # given a visual material applied >>> prim.get_applied_visual_material() <omni.isaac.core.materials.omni_glass.OmniGlass object at 0x7f36263106a0>
- get_current_frame() None
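Example (a hedged sketch of reading the latest contact frame; although the signature above is annotated None, Isaac Sim samples use the returned per-step dictionary, and the key names shown here, "time", "value", "in_contact" and "contacts", are assumptions to verify against your version):
>>> frame = sensor.get_current_frame()
>>> frame["time"]         # simulation time of the reading (assumed key)
>>> frame["value"]        # net contact force magnitude (assumed key)
>>> frame["in_contact"]   # whether a contact above min_threshold was detected (assumed key)
>>>
>>> # raw per-contact data only appears after opting in via add_raw_contact_data_to_frame()
>>> sensor.add_raw_contact_data_to_frame()
>>> sensor.get_current_frame()["contacts"]   # list of raw contact entries (assumed key)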
- get_default_state() omni.isaac.core.utils.types.XFormPrimState
Get the default prim states (spatial position and orientation).
- Returns
an object that contains the default state of the prim (position and orientation)
- Return type
XFormPrimState
Example:
>>> state = prim.get_default_state()
>>> state
<omni.isaac.core.utils.types.XFormPrimState object at 0x7f33addda650>
>>>
>>> state.position
[-4.5299529e-08 -1.8347054e-09 -2.8610229e-08]
>>> state.orientation
[1. 0. 0. 0.]
- get_dt() float
- get_frequency() int
- get_local_pose() Tuple[numpy.ndarray, numpy.ndarray]
Get prim’s pose with respect to the local frame (the prim’s parent frame)
- Returns
first index is the position in the local frame (with shape (3, )). Second index is quaternion orientation (with shape (4, )) in the local frame
- Return type
Tuple[np.ndarray, np.ndarray]
Example:
>>> # if the prim is in position (1.0, 0.5, 0.0) with respect to the world frame
>>> position, orientation = prim.get_local_pose()
>>> position
[0. 0. 0.]
>>> orientation
[0. 0. 0.]
- get_local_scale() numpy.ndarray
Get prim’s scale with respect to the local frame (the parent’s frame)
- Returns
scale applied to the prim’s dimensions in the local frame. shape is (3, ).
- Return type
np.ndarray
Example:
>>> prim.get_local_scale() [1. 1. 1.]
- get_max_threshold() float
- get_min_threshold() float
- get_radius() float
- get_visibility() bool
- Returns
True if the prim is visible in the stage, False otherwise.
- Return type
bool
Example:
>>> # get the visibility state of a visible prim on the stage
>>> prim.get_visibility()
True
- get_world_pose() Tuple[numpy.ndarray, numpy.ndarray]
Get prim’s pose with respect to the world’s frame
- Returns
first index is the position in the world frame (with shape (3, )). Second index is quaternion orientation (with shape (4, )) in the world frame
- Return type
Tuple[np.ndarray, np.ndarray]
Example:
>>> # if the prim is in position (1.0, 0.5, 0.0) with respect to the world frame
>>> position, orientation = prim.get_world_pose()
>>> position
[1. 0.5 0. ]
>>> orientation
[1. 0. 0. 0.]
- get_world_scale() numpy.ndarray
Get prim’s scale with respect to the world’s frame
- Returns
scale applied to the prim’s dimensions in the world frame. shape is (3, ).
- Return type
np.ndarray
Example:
>>> prim.get_world_scale() [1. 1. 1.]
- initialize(physics_sim_view=None) None
Create a physics simulation view if not passed and using PhysX tensor API
Note
If the prim has been added to the world scene (e.g., world.scene.add(prim)), it will be automatically initialized when the world is reset (e.g., world.reset()).
- Parameters
physics_sim_view (omni.physics.tensors.SimulationView, optional) – current physics simulation view. Defaults to None.
Example:
>>> prim.initialize()
- is_paused() bool
- is_valid() bool
Check if the prim path has a valid USD Prim at it
- Returns
True if the current prim path corresponds to a valid prim in the stage, False otherwise.
- Return type
bool
Example:
>>> # given an existing and valid prim
>>> prim.is_valid()
True
- is_visual_material_applied() bool
Check if there is a visual material applied
- Returns
True if there is a visual material applied. False otherwise.
- Return type
bool
Example:
>>> # given a visual material applied >>> prim.is_visual_material_applied() True
- property name: Optional[str]
Returns: str: name given to the prim when instantiating it. Otherwise None.
- property non_root_articulation_link: bool
Used to query if the prim is a non root articulation link
- Returns
True if the prim itself is a non root link
- Return type
bool
Example:
>>> # for a wrapped articulation (where the root prim has the Physics Articulation Root property applied) >>> prim.non_root_articulation_link False
- pause() None
- post_reset() None
Reset the prim to its default state (position and orientation).
Note
For an articulation, in addition to configuring the root prim's default position and spatial orientation (defined via the set_default_state method), the joints' positions, velocities, and efforts (defined via the set_joints_default_state method) are imposed.
Example:
>>> prim.post_reset()
- property prim: pxr.Usd.Prim
Returns: Usd.Prim: USD Prim object that this object holds.
- property prim_path: str
Returns: str: prim path in the stage
- remove_raw_contact_data_from_frame() None
- resume() None
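Example (a minimal sketch of the pause/resume cycle, assuming sensor is an initialized ContactSensor):
>>> sensor.pause()        # stop collecting new readings
>>> sensor.is_paused()
True
>>> sensor.resume()       # continue collecting readings
>>> sensor.is_paused()
False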
- set_default_state(position: Optional[Sequence[float]] = None, orientation: Optional[Sequence[float]] = None) None
Set the default state of the prim (position and orientation), that will be used after each reset.
- Parameters
position (Optional[Sequence[float]], optional) – position in the world frame of the prim. shape is (3, ). Defaults to None, which means left unchanged.
orientation (Optional[Sequence[float]], optional) – quaternion orientation in the world frame of the prim. quaternion is scalar-first (w, x, y, z). shape is (4, ). Defaults to None, which means left unchanged.
Example:
>>> # configure default state
>>> prim.set_default_state(position=np.array([1.0, 0.5, 0.0]), orientation=np.array([1, 0, 0, 0]))
>>>
>>> # set default states during post-reset
>>> prim.post_reset()
- set_dt(value: float) None
- set_frequency(value: float) None
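Example (a sketch of configuring the update rate; it assumes frequency and dt describe the same update period, i.e. dt = 1 / frequency, so setting one is expected to update the other):
>>> sensor.set_frequency(60)    # request 60 readings per second
>>> sensor.get_dt()             # expected to be ~0.0167 under the dt = 1/frequency assumption
>>> sensor.set_dt(0.1)          # equivalently, set the period directly (10 Hz)
>>> sensor.get_frequency()      # expected to be 10 under the same assumption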
- set_local_pose(translation: Optional[Sequence[float]] = None, orientation: Optional[Sequence[float]] = None) None
Set prim’s pose with respect to the local frame (the prim’s parent frame).
Warning
This method will change (teleport) the prim pose immediately to the indicated value
- Parameters
translation (Optional[Sequence[float]], optional) – translation in the local frame of the prim (with respect to its parent prim). shape is (3, ). Defaults to None, which means left unchanged.
orientation (Optional[Sequence[float]], optional) – quaternion orientation in the local frame of the prim. quaternion is scalar-first (w, x, y, z). shape is (4, ). Defaults to None, which means left unchanged.
Hint
This method belongs to the methods used to set the prim state
Example:
>>> prim.set_local_pose(translation=np.array([1.0, 0.5, 0.0]), orientation=np.array([1., 0., 0., 0.]))
- set_local_scale(scale: Optional[Sequence[float]]) None
Set prim’s scale with respect to the local frame (the prim’s parent frame).
- Parameters
scale (Optional[Sequence[float]]) – scale to be applied to the prim’s dimensions. shape is (3, ). Defaults to None, which means left unchanged.
Example:
>>> # scale prim 10 times smaller >>> prim.set_local_scale(np.array([0.1, 0.1, 0.1]))
- set_max_threshold(value: float) None
- set_min_threshold(value: float) None
- set_radius(value: float) None
- set_visibility(visible: bool) None
Set the visibility of the prim in stage
- Parameters
visible (bool) – flag to set the visibility of the usd prim in stage.
Example:
>>> # make prim not visible in the stage >>> prim.set_visibility(visible=False)
- set_world_pose(position: Optional[Sequence[float]] = None, orientation: Optional[Sequence[float]] = None) None
Set the prim’s pose with respect to the world’s frame
Warning
This method will change (teleport) the prim pose immediately to the indicated value
- Parameters
position (Optional[Sequence[float]], optional) – position in the world frame of the prim. shape is (3, ). Defaults to None, which means left unchanged.
orientation (Optional[Sequence[float]], optional) – quaternion orientation in the world frame of the prim. quaternion is scalar-first (w, x, y, z). shape is (4, ). Defaults to None, which means left unchanged.
Hint
This method belongs to the methods used to set the prim state
Example:
>>> prim.set_world_pose(position=np.array([1.0, 0.5, 0.0]), orientation=np.array([1., 0., 0., 0.]))
IMU Sensor
- class IMUSensor(prim_path: str, name: Optional[str] = 'imu_sensor', frequency: Optional[int] = None, dt: Optional[float] = None, translation: Optional[numpy.ndarray] = None, position: Optional[numpy.ndarray] = None, orientation: Optional[numpy.ndarray] = None, linear_acceleration_filter_size: Optional[int] = 1, angular_velocity_filter_size: Optional[int] = 1, orientation_filter_size: Optional[int] = 1)
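Example (a minimal sketch of creating and initializing an IMU sensor; the prim path, offsets and filter sizes are illustrative assumptions, not defaults):
>>> import numpy as np
>>> from omni.isaac.sensor import IMUSensor
>>>
>>> # attach the IMU under a rigid-body link of the robot
>>> imu = IMUSensor(
...     prim_path="/World/robot/chassis/imu_sensor",  # hypothetical path under a rigid-body link
...     name="imu_sensor",
...     frequency=60,
...     translation=np.array([0.0, 0.0, 0.02]),
...     orientation=np.array([1.0, 0.0, 0.0, 0.0]),   # scalar-first (w, x, y, z)
...     linear_acceleration_filter_size=10,           # rolling-average window over recent readings
...     angular_velocity_filter_size=10,
...     orientation_filter_size=10,
... )
>>> imu.initialize()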
- apply_visual_material(visual_material: omni.isaac.core.materials.visual_material.VisualMaterial, weaker_than_descendants: bool = False) None
Apply visual material to the held prim and optionally its descendants.
- Parameters
visual_material (VisualMaterial) – visual material to be applied to the held prim. Currently supports PreviewSurface, OmniPBR and OmniGlass.
weaker_than_descendants (bool, optional) – True if the material shouldn’t override the descendants materials, otherwise False. Defaults to False.
Example:
>>> from omni.isaac.core.materials import OmniGlass
>>>
>>> # create a dark-red glass visual material
>>> material = OmniGlass(
...     prim_path="/World/material/glass",  # path to the material prim to create
...     ior=1.25,
...     depth=0.001,
...     thin_walled=False,
...     color=np.array([0.5, 0.0, 0.0])
... )
>>> prim.apply_visual_material(material)
- get_applied_visual_material() omni.isaac.core.materials.visual_material.VisualMaterial
Return the currently applied visual material, if it was applied using apply_visual_material or if it is one of the following materials that was already applied: PreviewSurface, OmniPBR, or OmniGlass.
- Returns
the current applied visual material if its type is currently supported.
- Return type
VisualMaterial
Example:
>>> # given a visual material applied >>> prim.get_applied_visual_material() <omni.isaac.core.materials.omni_glass.OmniGlass object at 0x7f36263106a0>
- get_current_frame(read_gravity=True) dict
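Example (a hedged sketch of reading the IMU frame; the key names "time", "lin_acc", "ang_vel" and "orientation" follow what Isaac Sim IMU samples typically show and should be verified against your version):
>>> frame = imu.get_current_frame(read_gravity=True)  # include gravity in the linear acceleration
>>> frame["time"]         # simulation time of the reading (assumed key)
>>> frame["lin_acc"]      # filtered linear acceleration (assumed key)
>>> frame["ang_vel"]      # filtered angular velocity (assumed key)
>>> frame["orientation"]  # filtered orientation quaternion (assumed key)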
- get_default_state() omni.isaac.core.utils.types.XFormPrimState
Get the default prim states (spatial position and orientation).
- Returns
an object that contains the default state of the prim (position and orientation)
- Return type
XFormPrimState
Example:
>>> state = prim.get_default_state()
>>> state
<omni.isaac.core.utils.types.XFormPrimState object at 0x7f33addda650>
>>>
>>> state.position
[-4.5299529e-08 -1.8347054e-09 -2.8610229e-08]
>>> state.orientation
[1. 0. 0. 0.]
- get_dt() float
- get_frequency() int
- get_local_pose() Tuple[numpy.ndarray, numpy.ndarray]
Get prim’s pose with respect to the local frame (the prim’s parent frame)
- Returns
first index is the position in the local frame (with shape (3, )). Second index is quaternion orientation (with shape (4, )) in the local frame
- Return type
Tuple[np.ndarray, np.ndarray]
Example:
>>> # if the prim is in position (1.0, 0.5, 0.0) with respect to the world frame
>>> position, orientation = prim.get_local_pose()
>>> position
[0. 0. 0.]
>>> orientation
[0. 0. 0.]
- get_local_scale() numpy.ndarray
Get prim’s scale with respect to the local frame (the parent’s frame)
- Returns
scale applied to the prim’s dimensions in the local frame. shape is (3, ).
- Return type
np.ndarray
Example:
>>> prim.get_local_scale() [1. 1. 1.]
- get_visibility() bool
- Returns
True if the prim is visible in the stage, False otherwise.
- Return type
bool
Example:
>>> # get the visibility state of a visible prim on the stage
>>> prim.get_visibility()
True
- get_world_pose() Tuple[numpy.ndarray, numpy.ndarray]
Get prim’s pose with respect to the world’s frame
- Returns
first index is the position in the world frame (with shape (3, )). Second index is quaternion orientation (with shape (4, )) in the world frame
- Return type
Tuple[np.ndarray, np.ndarray]
Example:
>>> # if the prim is in position (1.0, 0.5, 0.0) with respect to the world frame
>>> position, orientation = prim.get_world_pose()
>>> position
[1. 0.5 0. ]
>>> orientation
[1. 0. 0. 0.]
- get_world_scale() numpy.ndarray
Get prim’s scale with respect to the world’s frame
- Returns
scale applied to the prim’s dimensions in the world frame. shape is (3, ).
- Return type
np.ndarray
Example:
>>> prim.get_world_scale() [1. 1. 1.]
- initialize(physics_sim_view=None) None
Create a physics simulation view if not passed and using PhysX tensor API
Note
If the prim has been added to the world scene (e.g., world.scene.add(prim)), it will be automatically initialized when the world is reset (e.g., world.reset()).
- Parameters
physics_sim_view (omni.physics.tensors.SimulationView, optional) – current physics simulation view. Defaults to None.
Example:
>>> prim.initialize()
- is_paused() bool
- is_valid() bool
Check if the prim path has a valid USD Prim at it
- Returns
True if the current prim path corresponds to a valid prim in the stage, False otherwise.
- Return type
bool
Example:
>>> # given an existing and valid prim
>>> prim.is_valid()
True
- is_visual_material_applied() bool
Check if there is a visual material applied
- Returns
True if there is a visual material applied. False otherwise.
- Return type
bool
Example:
>>> # given a visual material applied >>> prim.is_visual_material_applied() True
- property name: Optional[str]
Returns: str: name given to the prim when instantiating it. Otherwise None.
- property non_root_articulation_link: bool
Used to query if the prim is a non root articulation link
- Returns
True if the prim itself is a non root link
- Return type
bool
Example:
>>> # for a wrapped articulation (where the root prim has the Physics Articulation Root property applied) >>> prim.non_root_articulation_link False
- pause() None
- post_reset() None
Reset the prim to its default state (position and orientation).
Note
For an articulation, in addition to configuring the root prim's default position and spatial orientation (defined via the set_default_state method), the joints' positions, velocities, and efforts (defined via the set_joints_default_state method) are imposed.
Example:
>>> prim.post_reset()
- property prim: pxr.Usd.Prim
Returns: Usd.Prim: USD Prim object that this object holds.
- property prim_path: str
Returns: str: prim path in the stage
- resume() None
- set_default_state(position: Optional[Sequence[float]] = None, orientation: Optional[Sequence[float]] = None) None
Set the default state of the prim (position and orientation), that will be used after each reset.
- Parameters
position (Optional[Sequence[float]], optional) – position in the world frame of the prim. shape is (3, ). Defaults to None, which means left unchanged.
orientation (Optional[Sequence[float]], optional) – quaternion orientation in the world frame of the prim. quaternion is scalar-first (w, x, y, z). shape is (4, ). Defaults to None, which means left unchanged.
Example:
>>> # configure default state
>>> prim.set_default_state(position=np.array([1.0, 0.5, 0.0]), orientation=np.array([1, 0, 0, 0]))
>>>
>>> # set default states during post-reset
>>> prim.post_reset()
- set_dt(value: float) None
- set_frequency(value: int) None
- set_local_pose(translation: Optional[Sequence[float]] = None, orientation: Optional[Sequence[float]] = None) None
Set prim’s pose with respect to the local frame (the prim’s parent frame).
Warning
This method will change (teleport) the prim pose immediately to the indicated value
- Parameters
translation (Optional[Sequence[float]], optional) – translation in the local frame of the prim (with respect to its parent prim). shape is (3, ). Defaults to None, which means left unchanged.
orientation (Optional[Sequence[float]], optional) – quaternion orientation in the local frame of the prim. quaternion is scalar-first (w, x, y, z). shape is (4, ). Defaults to None, which means left unchanged.
Hint
This method belongs to the methods used to set the prim state
Example:
>>> prim.set_local_pose(translation=np.array([1.0, 0.5, 0.0]), orientation=np.array([1., 0., 0., 0.]))
- set_local_scale(scale: Optional[Sequence[float]]) None
Set prim’s scale with respect to the local frame (the prim’s parent frame).
- Parameters
scale (Optional[Sequence[float]]) – scale to be applied to the prim’s dimensions. shape is (3, ). Defaults to None, which means left unchanged.
Example:
>>> # scale prim 10 times smaller >>> prim.set_local_scale(np.array([0.1, 0.1, 0.1]))
- set_visibility(visible: bool) None
Set the visibility of the prim in stage
- Parameters
visible (bool) – flag to set the visibility of the usd prim in stage.
Example:
>>> # make prim not visible in the stage >>> prim.set_visibility(visible=False)
- set_world_pose(position: Optional[Sequence[float]] = None, orientation: Optional[Sequence[float]] = None) None
Set the prim’s pose with respect to the world’s frame
Warning
This method will change (teleport) the prim pose immediately to the indicated value
- Parameters
position (Optional[Sequence[float]], optional) – position in the world frame of the prim. shape is (3, ). Defaults to None, which means left unchanged.
orientation (Optional[Sequence[float]], optional) – quaternion orientation in the world frame of the prim. quaternion is scalar-first (w, x, y, z). shape is (4, ). Defaults to None, which means left unchanged.
Hint
This method belongs to the methods used to set the prim state
Example:
>>> prim.set_world_pose(position=np.array([1.0, 0.5, 0.0]), orientation=np.array([1., 0., 0., 0.]))
Effort Sensor
- class EffortSensor(prim_path: str, sensor_period: float = -1, use_latest_data: bool = False, enabled: bool = True)
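Example (a minimal sketch of creating an effort sensor on a joint; the joint prim path is a hypothetical example and the import path is the one commonly shown in Isaac Sim samples; verify both for your version):
>>> from omni.isaac.sensor.scripts.effort_sensor import EffortSensor
>>>
>>> effort_sensor = EffortSensor(
...     prim_path="/World/robot/panda_link1/panda_joint2",  # hypothetical path to the joint to measure
...     sensor_period=1.0 / 60.0,   # seconds between readings (-1 is assumed to mean every physics step)
...     use_latest_data=False,
...     enabled=True,
... )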
- apply_action(control_actions: omni.isaac.core.utils.types.ArticulationAction) None
Apply joint positions, velocities and/or efforts to control an articulation
- Parameters
control_actions (ArticulationAction) – actions to be applied for next physics step.
indices (Optional[Union[list, np.ndarray]], optional) – degree of freedom indices to apply actions to. Defaults to all degrees of freedom.
Hint
High stiffness makes the joints snap faster and harder to the desired target, and higher damping smoothes but also slows down the joint’s movement to target
For position control, set relatively high stiffness and low damping (to reduce vibrations)
For velocity control, stiffness must be set to zero with a non-zero damping
For effort control, stiffness and damping must be set to zero
Example:
>>> from omni.isaac.core.utils.types import ArticulationAction
>>>
>>> # move all the robot joints to the indicated position
>>> action = ArticulationAction(joint_positions=np.array([0.0, -1.0, 0.0, -2.2, 0.0, 2.4, 0.8, 0.04, 0.04]))
>>> prim.apply_action(action)
>>>
>>> # close the robot fingers: panda_finger_joint1 (7) and panda_finger_joint2 (8) to 0.0
>>> action = ArticulationAction(joint_positions=np.array([0.0, 0.0]), joint_indices=np.array([7, 8]))
>>> prim.apply_action(action)
- apply_visual_material(visual_material: omni.isaac.core.materials.visual_material.VisualMaterial, weaker_than_descendants: bool = False) None
Apply visual material to the held prim and optionally its descendants.
- Parameters
visual_material (VisualMaterial) – visual material to be applied to the held prim. Currently supports PreviewSurface, OmniPBR and OmniGlass.
weaker_than_descendants (bool, optional) – True if the material shouldn’t override the descendants materials, otherwise False. Defaults to False.
Example:
>>> from omni.isaac.core.materials import OmniGlass
>>>
>>> # create a dark-red glass visual material
>>> material = OmniGlass(
...     prim_path="/World/material/glass",  # path to the material prim to create
...     ior=1.25,
...     depth=0.001,
...     thin_walled=False,
...     color=np.array([0.5, 0.0, 0.0])
... )
>>> prim.apply_visual_material(material)
- change_buffer_size(new_buffer_size: int) None
- disable_gravity() None
Keep gravity from affecting the robot
Example:
>>> prim.disable_gravity()
- property dof_names: List[str]
List of prim names for each DOF.
- Returns
prim names
- Return type
list(string)
Example:
>>> prim.dof_names ['panda_joint1', 'panda_joint2', 'panda_joint3', 'panda_joint4', 'panda_joint5', 'panda_joint6', 'panda_joint7', 'panda_finger_joint1', 'panda_finger_joint2']
- property dof_properties: numpy.ndarray
Articulation DOF properties
Index  Property name  Description
0      type           DOF type: invalid/unknown/uninitialized (0), rotation (1), translation (2)
1      hasLimits      Whether the DOF has limits
2      lower          Lower DOF limit (in radians or meters)
3      upper          Upper DOF limit (in radians or meters)
4      driveMode      Drive mode for the DOF: force (1), acceleration (2)
5      maxVelocity    Maximum DOF velocity (in radians/s or stage_units/s)
6      maxEffort      Maximum DOF effort (in N or N*stage_units)
7      stiffness      DOF stiffness
8      damping        DOF damping
- Returns
named NumPy array of shape (num_dof, 9)
- Return type
np.ndarray
Example:
>>> # get properties for all DOFs
>>> prim.dof_properties
[(1, True, -2.8973, 2.8973, 1, 1.0000000e+01, 5220., 60000., 3000.)
 (1, True, -1.7628, 1.7628, 1, 1.0000000e+01, 5220., 60000., 3000.)
 (1, True, -2.8973, 2.8973, 1, 5.9390470e+36, 5220., 60000., 3000.)
 (1, True, -3.0718, -0.0698, 1, 5.9390470e+36, 5220., 60000., 3000.)
 (1, True, -2.8973, 2.8973, 1, 5.9390470e+36, 720., 25000., 3000.)
 (1, True, -0.0175, 3.7525, 1, 5.9390470e+36, 720., 15000., 3000.)
 (1, True, -2.8973, 2.8973, 1, 1.0000000e+01, 720., 5000., 3000.)
 (2, True, 0. , 0.04 , 1, 3.4028235e+38, 720., 6000., 1000.)
 (2, True, 0. , 0.04 , 1, 3.4028235e+38, 720., 6000., 1000.)]
>>>
>>> # property names
>>> prim.dof_properties.dtype.names
('type', 'hasLimits', 'lower', 'upper', 'driveMode', 'maxVelocity', 'maxEffort', 'stiffness', 'damping')
>>>
>>> # get DOF upper limits
>>> prim.dof_properties["upper"]
[ 2.8973 1.7628 2.8973 -0.0698 2.8973 3.7525 2.8973 0.04 0.04 ]
>>>
>>> # get the last DOF (panda_finger_joint2) upper limit
>>> prim.dof_properties["upper"][8]  # or prim.dof_properties[8][3]
0.04
- enable_gravity() None
Gravity will affect the robot
Example:
>>> prim.enable_gravity()
- get_angular_velocity() numpy.ndarray
Get the angular velocity of the root articulation prim
- Returns
3D angular velocity vector. Shape (3,)
- Return type
np.ndarray
Example:
>>> prim.get_angular_velocity() [0. 0. 0.]
- get_applied_action() omni.isaac.core.utils.types.ArticulationAction
Get the last applied action
- Returns
last applied action. Note: a dictionary is used as the object’s string representation
- Return type
ArticulationAction
Example:
>>> # last applied action: joint_positions -> [0.0, -1.0, 0.0, -2.2, 0.0, 2.4, 0.8, 0.04, 0.04]
>>> prim.get_applied_action()
{'joint_positions': [0.0, -1.0, 0.0, -2.200000047683716, 0.0, 2.4000000953674316, 0.800000011920929, 0.03999999910593033, 0.03999999910593033],
 'joint_velocities': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
 'joint_efforts': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]}
- get_applied_joint_efforts(joint_indices: Optional[Union[List, numpy.ndarray]] = None) numpy.ndarray
Get the efforts applied to the joints, as set by the set_joint_efforts method
- Parameters
joint_indices (Optional[Union[List, np.ndarray]], optional) – indices to specify which joints to read. Defaults to None (all joints)
- Raises
Exception – If the handlers are not initialized
- Returns
all or selected articulation joint applied efforts
- Return type
np.ndarray
Example:
>>> # get all applied joint efforts
>>> prim.get_applied_joint_efforts()
[ 0.  0.  0.  0.  0.  0.  0.  0.  0.]
>>>
>>> # get finger applied efforts: panda_finger_joint1 (7) and panda_finger_joint2 (8)
>>> prim.get_applied_joint_efforts(joint_indices=np.array([7, 8]))
[0. 0.]
- get_applied_visual_material() omni.isaac.core.materials.visual_material.VisualMaterial
Return the currently applied visual material, if it was applied using apply_visual_material or if it is one of the following materials that was already applied: PreviewSurface, OmniPBR, or OmniGlass.
- Returns
the current applied visual material if its type is currently supported.
- Return type
VisualMaterial
Example:
>>> # given a visual material applied >>> prim.get_applied_visual_material() <omni.isaac.core.materials.omni_glass.OmniGlass object at 0x7f36263106a0>
- get_articulation_body_count() int
Get the number of bodies (links) that make up the articulation
- Returns
amount of bodies
- Return type
int
Example:
>>> prim.get_articulation_body_count() 12
- get_articulation_controller() omni.isaac.core.controllers.articulation_controller.ArticulationController
Get the articulation controller
Note
If no articulation_controller was passed during class instantiation, a default controller of type ArticulationController (a Proportional-Derivative controller that can apply position targets, velocity targets and efforts) will be used.
- Returns
articulation controller
- Return type
ArticulationController
Example:
>>> prim.get_articulation_controller() <omni.isaac.core.controllers.articulation_controller.ArticulationController object at 0x7f04a0060190>
- get_default_state() omni.isaac.core.utils.types.XFormPrimState
Get the default prim states (spatial position and orientation).
- Returns
an object that contains the default state of the prim (position and orientation)
- Return type
XFormPrimState
Example:
>>> state = prim.get_default_state()
>>> state
<omni.isaac.core.utils.types.XFormPrimState object at 0x7f33addda650>
>>>
>>> state.position
[-4.5299529e-08 -1.8347054e-09 -2.8610229e-08]
>>> state.orientation
[1. 0. 0. 0.]
- get_dof_index(dof_name: str) int
Get a DOF index given its name
- Parameters
dof_name (str) – name of the DOF
- Returns
DOF index
- Return type
int
Example:
>>> prim.get_dof_index("panda_finger_joint2") 8
- get_enabled_self_collisions() int
Get the enable self collisions flag (physxArticulation:enabledSelfCollisions)
- Returns
self collisions flag (boolean interpreted as int)
- Return type
int
Example:
>>> prim.get_enabled_self_collisions() 0
- get_joint_positions(joint_indices: Optional[Union[List, numpy.ndarray]] = None) numpy.ndarray
Get the articulation joint positions
- Parameters
joint_indices (Optional[Union[List, np.ndarray]], optional) – indices to specify which joints to read. Defaults to None (all joints)
- Returns
all or selected articulation joint positions
- Return type
np.ndarray
Example:
>>> # get all joint positions
>>> prim.get_joint_positions()
[ 1.1999920e-02 -5.6962633e-01 1.3480479e-08 -2.8105433e+00 6.8284894e-06 3.0301569e+00 7.3234749e-01 3.9912373e-02 3.9999999e-02]
>>>
>>> # get finger positions: panda_finger_joint1 (7) and panda_finger_joint2 (8)
>>> prim.get_joint_positions(joint_indices=np.array([7, 8]))
[0.03991237 3.9999999e-02]
- get_joint_velocities(joint_indices: Optional[Union[List, numpy.ndarray]] = None) numpy.ndarray
Get the articulation joint velocities
- Parameters
joint_indices (Optional[Union[List, np.ndarray]], optional) – indices to specify which joints to read. Defaults to None (all joints)
- Returns
all or selected articulation joint velocities
- Return type
np.ndarray
Example:
>>> # get all joint velocities
>>> prim.get_joint_velocities()
[ 1.91603772e-06 -7.67638255e-03 -2.19138826e-07 1.10636465e-02 -4.63412944e-05 3.48245539e-02 8.84692147e-02 5.40335372e-04 1.02849208e-05]
>>>
>>> # get finger velocities: panda_finger_joint1 (7) and panda_finger_joint2 (8)
>>> prim.get_joint_velocities(joint_indices=np.array([7, 8]))
[5.4033537e-04 1.0284921e-05]
- get_joints_default_state() omni.isaac.core.utils.types.JointsState
Get the default joint states (positions and velocities).
- Returns
an object that contains the default joint positions and velocities
- Return type
JointsState
Example:
>>> state = prim.get_joints_default_state()
>>> state
<omni.isaac.core.utils.types.JointsState object at 0x7f04a0061240>
>>>
>>> state.positions
[ 0.012 -0.57000005 0. -2.81 0. 3.037 0.785398 0.04 0.04 ]
>>> state.velocities
[0. 0. 0. 0. 0. 0. 0. 0. 0.]
- get_joints_state() omni.isaac.core.utils.types.JointsState
Get the current joint states (positions and velocities)
- Returns
an object that contains the current joint positions and velocities
- Return type
JointsState
Example:
>>> state = prim.get_joints_state()
>>> state
<omni.isaac.core.utils.types.JointsState object at 0x7f02f6df57b0>
>>>
>>> state.positions
[ 1.1999920e-02 -5.6962633e-01 1.3480479e-08 -2.8105433e+00 6.8284894e-06 3.0301569e+00 7.3234749e-01 3.9912373e-02 3.9999999e-02]
>>> state.velocities
[ 1.91603772e-06 -7.67638255e-03 -2.19138826e-07 1.10636465e-02 -4.63412944e-05 3.48245539e-02 8.84692147e-02 5.40335372e-04 1.02849208e-05]
- get_linear_velocity() numpy.ndarray
Get the linear velocity of the root articulation prim
- Returns
3D linear velocity vector. Shape (3,)
- Return type
np.ndarray
Example:
>>> prim.get_linear_velocity() [0. 0. 0.]
- get_local_pose() Tuple[numpy.ndarray, numpy.ndarray]
Get prim’s pose with respect to the local frame (the prim’s parent frame)
- Returns
first index is the position in the local frame (with shape (3, )). Second index is quaternion orientation (with shape (4, )) in the local frame
- Return type
Tuple[np.ndarray, np.ndarray]
Example:
>>> # if the prim is in position (1.0, 0.5, 0.0) with respect to the world frame
>>> position, orientation = prim.get_local_pose()
>>> position
[0. 0. 0.]
>>> orientation
[0. 0. 0.]
- get_local_scale() numpy.ndarray
Get prim’s scale with respect to the local frame (the parent’s frame)
- Returns
scale applied to the prim’s dimensions in the local frame. shape is (3, ).
- Return type
np.ndarray
Example:
>>> prim.get_local_scale() [1. 1. 1.]
- get_measured_joint_efforts(joint_indices: Optional[Union[List, numpy.ndarray]] = None) numpy.ndarray
Get the efforts computed/measured by the physics solver, i.e., the joint forces projected in the DOF motion direction
- Parameters
joint_indices (Optional[Union[List, np.ndarray]], optional) – indices to specify which joints to read. Defaults to None (all joints)
- Raises
Exception – If the handlers are not initialized
- Returns
all or selected articulation joint measured efforts
- Return type
np.ndarray
Example:
>>> # get all joint efforts
>>> prim.get_measured_joint_efforts()
[ 2.7897308e-06 -6.9083519e+00 -3.6398471e-06 1.9158335e+01 -4.3552645e-06 1.1866090e+00 -4.7079347e-06 3.2339853e-04 -3.2044132e-04]
>>>
>>> # get finger efforts: panda_finger_joint1 (7) and panda_finger_joint2 (8)
>>> prim.get_measured_joint_efforts(joint_indices=np.array([7, 8]))
[ 0.0003234 -0.00032044]
- get_measured_joint_forces(joint_indices: Optional[Union[List, numpy.ndarray]] = None) numpy.ndarray
Get the measured joint reaction forces and torques (link incoming joint forces and torques) due to external loads
Note
Since the name->index map for joints has not been exposed yet, it is possible to access the joint names and their indices through the articulation metadata.
prim._articulation_view._metadata.joint_names    # list of names
prim._articulation_view._metadata.joint_indices  # dict of name: index
To retrieve a specific row for the link incoming joint force/torque, use joint_index + 1.
- Parameters
joint_indices (Optional[Union[List, np.ndarray]], optional) – indices to specify which joints to read. Defaults to None (all joints)
- Raises
Exception – If the handlers are not initialized
- Returns
measured joint forces and torques. Shape is (num_joint + 1, 6). Row index 0 is the incoming joint of the base link. For the last dimension the first 3 values are for forces and the last 3 for torques
- Return type
np.ndarray
Example:
>>> # get all measured joint forces and torques
>>> prim.get_measured_joint_forces()
[[ 0.0000000e+00  0.0000000e+00  0.0000000e+00  0.0000000e+00  0.0000000e+00  0.0000000e+00]
 [ 1.4995076e+02  4.2574748e-06  5.6364370e-04  4.8701895e-05 -6.9072924e+00  3.1881387e-05]
 [-2.8971717e-05 -1.0677823e+02 -6.8384506e+01 -6.9072924e+00 -5.4927128e-05  6.1222494e-07]
 [ 8.7120995e+01 -4.3871860e-05 -5.5795174e+01  5.3687054e-05 -2.4538563e+01  1.3333466e-05]
 [ 5.3519474e-05 -4.8109909e+01  6.0709282e+01  1.9157074e+01 -5.9258469e-05  8.2744418e-07]
 [-3.1691040e+01  2.3313689e-04  3.9990173e+01 -5.8968733e-05 -1.1863431e+00  2.2335558e-05]
 [-1.0809851e-04  1.5340537e+01 -1.5458489e+01  1.1863426e+00  6.1094368e-05 -1.5940281e-05]
 [-7.5418940e+00 -5.0814648e+00 -5.6512990e+00 -5.6385466e-05  3.8859999e-01 -3.4943256e-01]
 [ 4.7421460e+00 -3.1945827e+00  3.5528181e+00  5.5852943e-05  8.4794536e-03  7.6405057e-03]
 [ 4.0760727e+00  2.1640673e-01 -4.0513167e+00 -5.9565349e-04  1.1407082e-02  2.1432268e-06]
 [ 5.1680198e-03 -9.7754575e-02 -9.7093947e-02 -8.4155556e-12 -1.2910691e-12 -1.9347857e-11]
 [-5.1910793e-03  9.7588278e-02 -9.7106412e-02  8.4155573e-12  1.2910637e-12 -1.9347855e-11]]
>>>
>>> # get measured joint force and torque for the fingers
>>> metadata = prim._articulation_view._metadata
>>> joint_indices = 1 + np.array([
...     metadata.joint_indices["panda_finger_joint1"],
...     metadata.joint_indices["panda_finger_joint2"],
... ])
>>> joint_indices
[10 11]
>>> prim.get_measured_joint_forces(joint_indices)
[[ 5.1680198e-03 -9.7754575e-02 -9.7093947e-02 -8.4155556e-12 -1.2910691e-12 -1.9347857e-11]
 [-5.1910793e-03  9.7588278e-02 -9.7106412e-02  8.4155573e-12  1.2910637e-12 -1.9347855e-11]]
- get_sensor_reading(interpolation_function=None, use_latest_data=False) omni.isaac.sensor.scripts.effort_sensor.EsSensorReading
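Example (a sketch of polling the sensor; the EsSensorReading fields is_valid, time and value come from the class definition below, and interpolation_function is an optional callable assumed to interpolate between buffered readings):
>>> reading = effort_sensor.get_sensor_reading()
>>> if reading.is_valid:
...     print(reading.time, reading.value)  # simulation time and measured joint effort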
- get_sleep_threshold() float
Get the threshold for articulations to enter a sleep state
Search for Articulations and Sleeping in PhysX docs for more details
- Returns
sleep threshold
- Return type
float
Example:
>>> prim.get_sleep_threshold() 0.005
- get_solver_position_iteration_count() int
Get the solver (position) iteration count for the articulation
The solver iteration count determines how accurately contacts, drives, and limits are resolved. Search for Solver Iteration Count in PhysX docs for more details.
- Returns
position iteration count
- Return type
int
Example:
>>> prim.get_solver_position_iteration_count() 32
- get_solver_velocity_iteration_count() int
Get the solver (velocity) iteration count for the articulation
The solver iteration count determines how accurately contacts, drives, and limits are resolved. Search for Solver Iteration Count in PhysX docs for more details.
- Returns
velocity iteration count
- Return type
int
Example:
>>> prim.get_solver_velocity_iteration_count() 32
- get_stabilization_threshold() float
Get the mass-normalized kinetic energy below which the articulation may participate in stabilization
Search for Stabilization Threshold in PhysX docs for more details
- Returns
stabilization threshold
- Return type
float
Example:
>>> prim.get_stabilization_threshold() 0.0009999999
- get_visibility() bool
- Returns
True if the prim is visible in the stage, False otherwise.
- Return type
bool
Example:
>>> # get the visibility state of a visible prim on the stage
>>> prim.get_visibility()
True
- get_world_pose() Tuple[numpy.ndarray, numpy.ndarray]
Get prim’s pose with respect to the world’s frame
- Returns
first index is the position in the world frame (with shape (3, )). Second index is quaternion orientation (with shape (4, )) in the world frame
- Return type
Tuple[np.ndarray, np.ndarray]
Example:
>>> # if the prim is in position (1.0, 0.5, 0.0) with respect to the world frame
>>> position, orientation = prim.get_world_pose()
>>> position
[1. 0.5 0. ]
>>> orientation
[1. 0. 0. 0.]
- get_world_scale() numpy.ndarray
Get prim’s scale with respect to the world’s frame
- Returns
scale applied to the prim’s dimensions in the world frame. shape is (3, ).
- Return type
np.ndarray
Example:
>>> prim.get_world_scale() [1. 1. 1.]
- get_world_velocity() numpy.ndarray
Get the articulation root velocity
- Returns
current velocity of the root prim. Shape (3,).
- Return type
np.ndarray
- property handles_initialized: bool
Check if articulation handler is initialized
- Returns
whether the handler was initialized
- Return type
bool
Example:
>>> prim.handles_initialized True
- initialize(physics_sim_view: Optional[omni.physics.tensors.bindings._physicsTensors.SimulationView] = None) None
Create a physics simulation view if not passed and an articulation view using PhysX tensor API
Note
If the articulation has been added to the world scene (e.g., world.scene.add(prim)), it will be automatically initialized when the world is reset (e.g., world.reset()).
Warning
This method needs to be called after each hard reset (e.g., Stop + Play on the timeline) before interacting with any other class method.
- Parameters
physics_sim_view (omni.physics.tensors.SimulationView, optional) – current physics simulation view. Defaults to None.
Example:
>>> prim.initialize()
- initialize_callbacks() None
- is_valid() bool
Check if the prim path has a valid USD Prim at it
- Returns
True if the current prim path corresponds to a valid prim in the stage, False otherwise.
- Return type
bool
Example:
>>> # given an existing and valid prim
>>> prim.is_valid()
True
- is_visual_material_applied() bool
Check if there is a visual material applied
- Returns
True if there is a visual material applied. False otherwise.
- Return type
bool
Example:
>>> # given a visual material applied >>> prim.is_visual_material_applied() True
- lerp(start: float, end: float, time: float) float
- property name: Optional[str]
Returns: str: name given to the prim when instantiating it. Otherwise None.
- property non_root_articulation_link: bool
Used to query if the prim is a non root articulation link
- Returns
True if the prim itself is a non root link
- Return type
bool
Example:
>>> # for a wrapped articulation (where the root prim has the Physics Articulation Root property applied) >>> prim.non_root_articulation_link False
- property num_bodies: int
Number of articulation links
- Returns
number of links
- Return type
int
Example:
>>> prim.num_bodies 9
- property num_dof: int
Number of DOF of the articulation
- Returns
amount of DOFs
- Return type
int
Example:
>>> prim.num_dof 9
- post_reset() None
Reset the prim to its default state (position and orientation).
Note
For an articulation, in addition to configuring the root prim's default position and spatial orientation (defined via the set_default_state method), the joints' positions, velocities, and efforts (defined via the set_joints_default_state method) are imposed.
Example:
>>> prim.post_reset()
- property prim: pxr.Usd.Prim
Returns: Usd.Prim: USD Prim object that this object holds.
- property prim_path: str
Returns: str: prim path in the stage
- set_angular_velocity(velocity: numpy.ndarray) None
Set the angular velocity of the root articulation prim
Warning
This method will immediately set the articulation state
- Parameters
velocity (np.ndarray) – 3D angular velocity vector. Shape (3,)
Hint
This method belongs to the methods used to set the articulation kinematic state: set_linear_velocity, set_angular_velocity, set_joint_positions, set_joint_velocities, set_joint_efforts
Example:
>>> prim.set_angular_velocity(np.array([0.1, 0.0, 0.0]))
- set_default_state(position: Optional[Sequence[float]] = None, orientation: Optional[Sequence[float]] = None) None
Set the default state of the prim (position and orientation), that will be used after each reset.
- Parameters
position (Optional[Sequence[float]], optional) – position in the world frame of the prim. shape is (3, ). Defaults to None, which means left unchanged.
orientation (Optional[Sequence[float]], optional) – quaternion orientation in the world frame of the prim. quaternion is scalar-first (w, x, y, z). shape is (4, ). Defaults to None, which means left unchanged.
Example:
>>> # configure default state
>>> prim.set_default_state(position=np.array([1.0, 0.5, 0.0]), orientation=np.array([1, 0, 0, 0]))
>>>
>>> # set default states during post-reset
>>> prim.post_reset()
- set_enabled_self_collisions(flag: bool) None
Set the enable self collisions flag (physxArticulation:enabledSelfCollisions)
- Parameters
flag (bool) – whether to enable self collisions
Example:
>>> prim.set_enabled_self_collisions(True)
- set_joint_efforts(efforts: numpy.ndarray, joint_indices: Optional[Union[List, numpy.ndarray]] = None) None
Set the articulation joint efforts
Note
This method can be used for effort control. For this purpose, there must be no joint drive or the stiffness and damping must be set to zero.
- Parameters
efforts (np.ndarray) – articulation joint efforts
joint_indices (Optional[Union[list, np.ndarray]], optional) – indices to specify which joints to manipulate. Defaults to None (all joints)
Hint
This method belongs to the methods used to set the articulation kinematic state: set_linear_velocity, set_angular_velocity, set_joint_positions, set_joint_velocities, set_joint_efforts
Example:
>>> # set all the robot joint efforts to 0.0
>>> prim.set_joint_efforts(np.array([0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]))
>>>
>>> # set only the finger efforts: panda_finger_joint1 (7) and panda_finger_joint2 (8) to 10
>>> prim.set_joint_efforts(np.array([10, 10]), joint_indices=np.array([7, 8]))
- set_joint_positions(positions: numpy.ndarray, joint_indices: Optional[Union[List, numpy.ndarray]] = None) None
Set the articulation joint positions
Warning
This method will immediately set (teleport) the affected joints to the indicated value. Use the apply_action method to control the robot joints.
- Parameters
positions (np.ndarray) – articulation joint positions
joint_indices (Optional[Union[list, np.ndarray]], optional) – indices to specify which joints to manipulate. Defaults to None (all joints)
Hint
This method belongs to the methods used to set the articulation kinematic state: set_linear_velocity, set_angular_velocity, set_joint_positions, set_joint_velocities, set_joint_efforts
Example:
>>> # set all the robot joints
>>> prim.set_joint_positions(np.array([0.0, -1.0, 0.0, -2.2, 0.0, 2.4, 0.8, 0.04, 0.04]))
>>>
>>> # set only the fingers in closed position: panda_finger_joint1 (7) and panda_finger_joint2 (8) to 0.0
>>> prim.set_joint_positions(np.array([0.04, 0.04]), joint_indices=np.array([7, 8]))
- set_joint_velocities(velocities: numpy.ndarray, joint_indices: Optional[Union[List, numpy.ndarray]] = None) None
Set the articulation joint velocities
Warning
This method will immediately set the affected joints to the indicated value. Use the apply_action method to control the robot joints.
- Parameters
velocities (np.ndarray) – articulation joint velocities
joint_indices (Optional[Union[list, np.ndarray]], optional) – indices to specify which joints to manipulate. Defaults to None (all joints)
Hint
This method belongs to the methods used to set the articulation kinematic state: set_linear_velocity, set_angular_velocity, set_joint_positions, set_joint_velocities, set_joint_efforts
Example:
>>> # set all the robot joint velocities to 0.0
>>> prim.set_joint_velocities(np.array([0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]))
>>>
>>> # set only the finger velocities: panda_finger_joint1 (7) and panda_finger_joint2 (8) to -0.01
>>> prim.set_joint_velocities(np.array([-0.01, -0.01]), joint_indices=np.array([7, 8]))
- set_joints_default_state(positions: Optional[numpy.ndarray] = None, velocities: Optional[numpy.ndarray] = None, efforts: Optional[numpy.ndarray] = None) None
Set the joint default states (positions, velocities and/or efforts) to be applied after each reset.
Note
The default states will be set during post-reset (e.g., calling the .post_reset() or world.reset() methods).
- Parameters
positions (Optional[np.ndarray], optional) – joint positions. Defaults to None.
velocities (Optional[np.ndarray], optional) – joint velocities. Defaults to None.
efforts (Optional[np.ndarray], optional) – joint efforts. Defaults to None.
Example:
>>> # configure default joint states
>>> prim.set_joints_default_state(
...     positions=np.array([0.0, -1.0, 0.0, -2.2, 0.0, 2.4, 0.8, 0.04, 0.04]),
...     velocities=np.zeros(shape=(prim.num_dof,)),
...     efforts=np.zeros(shape=(prim.num_dof,))
... )
>>>
>>> # set default states during post-reset
>>> prim.post_reset()
- set_linear_velocity(velocity: numpy.ndarray) None
Set the linear velocity of the root articulation prim
Warning
This method will immediately set the articulation state
- Parameters
velocity (np.ndarray) – 3D linear velocity vector. Shape (3,).
Hint
This method belongs to the methods used to set the articulation kinematic state: set_linear_velocity, set_angular_velocity, set_joint_positions, set_joint_velocities, set_joint_efforts
Example:
>>> prim.set_linear_velocity(np.array([0.1, 0.0, 0.0]))
- set_local_pose(translation: Optional[Sequence[float]] = None, orientation: Optional[Sequence[float]] = None) None
Set prim’s pose with respect to the local frame (the prim’s parent frame).
Warning
This method will change (teleport) the prim pose immediately to the indicated value
- Parameters
translation (Optional[Sequence[float]], optional) – translation in the local frame of the prim (with respect to its parent prim). shape is (3, ). Defaults to None, which means left unchanged.
orientation (Optional[Sequence[float]], optional) – quaternion orientation in the local frame of the prim. quaternion is scalar-first (w, x, y, z). shape is (4, ). Defaults to None, which means left unchanged.
Hint
This method belongs to the methods used to set the prim state
Example:
>>> prim.set_local_pose(translation=np.array([1.0, 0.5, 0.0]), orientation=np.array([1., 0., 0., 0.]))
- set_local_scale(scale: Optional[Sequence[float]]) None
Set prim’s scale with respect to the local frame (the prim’s parent frame).
- Parameters
scale (Optional[Sequence[float]]) – scale to be applied to the prim’s dimensions. shape is (3, ). Defaults to None, which means left unchanged.
Example:
>>> # scale prim 10 times smaller >>> prim.set_local_scale(np.array([0.1, 0.1, 0.1]))
- set_sleep_threshold(threshold: float) None
Set the threshold for articulations to enter a sleep state
Search for Articulations and Sleeping in PhysX docs for more details
- Parameters
threshold (float) – sleep threshold
Example:
>>> prim.set_sleep_threshold(0.01)
- set_solver_position_iteration_count(count: int) None
Set the solver (position) iteration count for the articulation
The solver iteration count determines how accurately contacts, drives, and limits are resolved. Search for Solver Iteration Count in PhysX docs for more details.
Warning
Setting a higher number of iterations may improve the fidelity of the simulation, although it may affect its performance.
- Parameters
count (int) – position iteration count
Example:
>>> prim.set_solver_position_iteration_count(64)
- set_solver_velocity_iteration_count(count: int)
Set the solver (velocity) iteration count for the articulation
The solver iteration count determines how accurately contacts, drives, and limits are resolved. Search for Solver Iteration Count in PhysX docs for more details.
Warning
Setting a higher number of iterations may improve the fidelity of the simulation, although it may affect its performance.
- Parameters
count (int) – velocity iteration count
Example:
>>> prim.set_solver_velocity_iteration_count(64)
- set_stabilization_threshold(threshold: float) None
Set the mass-normalized kinetic energy below which the articulation may participate in stabilization
Search for Stabilization Threshold in PhysX docs for more details
- Parameters
threshold (float) – stabilization threshold
Example:
>>> prim.set_stabilization_threshold(0.005)
- set_visibility(visible: bool) None
Set the visibility of the prim in stage
- Parameters
visible (bool) – flag to set the visibility of the usd prim in stage.
Example:
>>> # make prim not visible in the stage >>> prim.set_visibility(visible=False)
- set_world_pose(position: Optional[Sequence[float]] = None, orientation: Optional[Sequence[float]] = None) None
Set the prim’s pose with respect to the world’s frame
Warning
This method will change (teleport) the prim pose immediately to the indicated value
- Parameters
position (Optional[Sequence[float]], optional) – position in the world frame of the prim. shape is (3, ). Defaults to None, which means left unchanged.
orientation (Optional[Sequence[float]], optional) – quaternion orientation in the world frame of the prim. quaternion is scalar-first (w, x, y, z). shape is (4, ). Defaults to None, which means left unchanged.
Hint
This method belongs to the methods used to set the prim state
Example:
>>> prim.set_world_pose(position=np.array([1.0, 0.5, 0.0]), orientation=np.array([1., 0., 0., 0.]))
- set_world_velocity(velocity: numpy.ndarray)
Set the articulation root velocity
- Parameters
velocity (np.ndarray) – linear and angular velocity to set the root prim to. Shape (6,).
- update_dof_name(dof_name: str) None
- class EsSensorReading(is_valid: bool = False, time: float = 0, value: float = 0)
Lidar RTX Sensor
- class LidarRtx(prim_path: str, name: str = 'lidar_rtx', position: Optional[numpy.ndarray] = None, translation: Optional[numpy.ndarray] = None, orientation: Optional[numpy.ndarray] = None, config_file_name: Optional[str] = None, firing_frequency: Optional[int] = None, firing_dt: Optional[float] = None, rotation_frequency: Optional[int] = None, rotation_dt: Optional[float] = None, valid_range: Optional[Tuple[float, float]] = None, scan_type: Optional[str] = None, elevation_range: Optional[Tuple[float, float]] = None, range_resolution: Optional[float] = None, range_accuracy: Optional[float] = None, avg_power: Optional[float] = None, wave_length: Optional[float] = None, pulse_time: Optional[float] = None)
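Example (a minimal sketch of creating and initializing an RTX lidar; the prim path and the config name "Example_Rotary" are illustrative assumptions; use a config file shipped with, or registered in, your Isaac Sim installation):
>>> import numpy as np
>>> from omni.isaac.sensor import LidarRtx
>>>
>>> lidar = LidarRtx(
...     prim_path="/World/robot/chassis/lidar",      # hypothetical path where the RTX lidar prim is created
...     name="lidar_rtx",
...     translation=np.array([0.0, 0.0, 0.2]),
...     orientation=np.array([1.0, 0.0, 0.0, 0.0]),  # scalar-first (w, x, y, z)
...     config_file_name="Example_Rotary",           # assumed name of a bundled RTX lidar config
... )
>>> lidar.initialize()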
- add_azimuth_data_to_frame()
- add_azimuth_range_to_frame()
- add_elevation_data_to_frame()
- add_horizontal_resolution_to_frame()
- add_intensities_data_to_frame()
- add_linear_depth_data_to_frame()
- add_point_cloud_data_to_frame()
- add_range_data_to_frame()
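Example (a hedged sketch of how the add_*_to_frame methods above opt extra buffers into the dictionary returned by get_current_frame; the key names mirror the method names but are assumptions to verify against your version):
>>> lidar.add_point_cloud_data_to_frame()
>>> lidar.add_intensities_data_to_frame()
>>> lidar.add_linear_depth_data_to_frame()
>>>
>>> frame = lidar.get_current_frame()
>>> frame["point_cloud_data"]   # (N, 3) hit points (assumed key)
>>> frame["intensities_data"]   # per-return intensity (assumed key)
>>> frame["linear_depth_data"]  # per-return range in meters (assumed key)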
- apply_visual_material(visual_material: omni.isaac.core.materials.visual_material.VisualMaterial, weaker_than_descendants: bool = False) None
Apply visual material to the held prim and optionally its descendants.
- Parameters
visual_material (VisualMaterial) – visual material to be applied to the held prim. Currently supports PreviewSurface, OmniPBR and OmniGlass.
weaker_than_descendants (bool, optional) – True if the material shouldn’t override the descendants materials, otherwise False. Defaults to False.
Example:
>>> from omni.isaac.core.materials import OmniGlass
>>>
>>> # create a dark-red glass visual material
>>> material = OmniGlass(
...     prim_path="/World/material/glass",  # path to the material prim to create
...     ior=1.25,
...     depth=0.001,
...     thin_walled=False,
...     color=np.array([0.5, 0.0, 0.0])
... )
>>> prim.apply_visual_material(material)
- disable_visualization()
- enable_visualization()
- get_applied_visual_material() omni.isaac.core.materials.visual_material.VisualMaterial
Return the currently applied visual material, if it was applied using apply_visual_material or if it is one of the following materials that was already applied: PreviewSurface, OmniPBR, or OmniGlass.
- Returns
the current applied visual material if its type is currently supported.
- Return type
VisualMaterial
Example:
>>> # given a visual material applied >>> prim.get_applied_visual_material() <omni.isaac.core.materials.omni_glass.OmniGlass object at 0x7f36263106a0>
- get_azimuth_range() Tuple[float, float]
- get_current_frame() dict
- get_default_state() omni.isaac.core.utils.types.XFormPrimState
Get the default prim states (spatial position and orientation).
- Returns
an object that contains the default state of the prim (position and orientation)
- Return type
XFormPrimState
Example:
>>> state = prim.get_default_state()
>>> state
<omni.isaac.core.utils.types.XFormPrimState object at 0x7f33addda650>
>>>
>>> state.position
[-4.5299529e-08 -1.8347054e-09 -2.8610229e-08]
>>> state.orientation
[1. 0. 0. 0.]
- get_depth_range() Tuple[float, float]
- get_horizontal_fov() float
- get_horizontal_resolution() float
- get_local_pose() Tuple[numpy.ndarray, numpy.ndarray]
Get prim’s pose with respect to the local frame (the prim’s parent frame)
- Returns
first index is the position in the local frame (with shape (3, )). Second index is quaternion orientation (with shape (4, )) in the local frame
- Return type
Tuple[np.ndarray, np.ndarray]
Example:
>>> # if the prim is in position (1.0, 0.5, 0.0) with respect to the world frame
>>> position, orientation = prim.get_local_pose()
>>> position
[0. 0. 0.]
>>> orientation
[0. 0. 0.]
- get_local_scale() numpy.ndarray
Get prim’s scale with respect to the local frame (the parent’s frame)
- Returns
scale applied to the prim’s dimensions in the local frame. shape is (3, ).
- Return type
np.ndarray
Example:
>>> prim.get_local_scale() [1. 1. 1.]
- get_num_cols() int
- get_num_rows() int
- get_render_product_path() str
- Returns
gets the path to the render product used by the lidar
- Return type
string
- get_rotation_frequency() float
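Example (a sketch of querying the scan parameters through the getters above, and of feeding the render product to a replicator writer; the writer name "RtxLidarDebugDrawPointCloudBuffer" is an assumption based on the debug-draw writers shipped with Isaac Sim):
>>> lidar.get_azimuth_range()            # (min, max) azimuth
>>> lidar.get_depth_range()              # (min, max) valid range
>>> lidar.get_horizontal_fov(), lidar.get_horizontal_resolution()
>>> lidar.get_rotation_frequency()
>>> lidar.get_num_rows(), lidar.get_num_cols()
>>>
>>> # visualize the scan by attaching a debug-draw writer to the lidar's render product
>>> import omni.replicator.core as rep
>>> writer = rep.writers.get("RtxLidarDebugDrawPointCloudBuffer")  # assumed writer name
>>> writer.attach([lidar.get_render_product_path()])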
- get_visibility() bool
- Returns
True if the prim is visible in the stage, False otherwise.
- Return type
bool
Example:
>>> # get the visibility state of a visible prim on the stage
>>> prim.get_visibility()
True
- get_world_pose() Tuple[numpy.ndarray, numpy.ndarray]
Get prim’s pose with respect to the world’s frame
- Returns
first index is the position in the world frame (with shape (3, )). Second index is quaternion orientation (with shape (4, )) in the world frame
- Return type
Tuple[np.ndarray, np.ndarray]
Example:
>>> # if the prim is in position (1.0, 0.5, 0.0) with respect to the world frame
>>> position, orientation = prim.get_world_pose()
>>> position
[1. 0.5 0. ]
>>> orientation
[1. 0. 0. 0.]
- get_world_scale() numpy.ndarray
Get prim’s scale with respect to the world’s frame
- Returns
scale applied to the prim’s dimensions in the world frame. shape is (3, ).
- Return type
np.ndarray
Example:
>>> prim.get_world_scale() [1. 1. 1.]
- initialize(physics_sim_view=None) None
Create a physics simulation view if not passed and using PhysX tensor API
Note
If the prim has been added to the world scene (e.g., world.scene.add(prim)), it will be automatically initialized when the world is reset (e.g., world.reset()).
- Parameters
physics_sim_view (omni.physics.tensors.SimulationView, optional) – current physics simulation view. Defaults to None.
Example:
>>> prim.initialize()
- is_paused() bool
- is_valid() bool
Check if the prim path has a valid USD Prim at it
- Returns
True if the current prim path corresponds to a valid prim in the stage, False otherwise.
- Return type
bool
Example:
>>> # given an existing and valid prim
>>> prim.is_valid()
True
- is_visual_material_applied() bool
Check if there is a visual material applied
- Returns
True if there is a visual material applied. False otherwise.
- Return type
bool
Example:
>>> # given a visual material applied >>> prim.is_visual_material_applied() True
- property name: Optional[str]
Returns: str: name given to the prim when instantiating it. Otherwise None.
- property non_root_articulation_link: bool
Used to query if the prim is a non root articulation link
- Returns
True if the prim itself is a non root link
- Return type
bool
Example:
>>> # for a wrapped articulation (where the root prim has the Physics Articulation Root property applied) >>> prim.non_root_articulation_link False
- pause() None
- post_reset() None
Reset the prim to its default state (position and orientation).
Note
For an articulation, in addition to configuring the root prim's default position and spatial orientation (defined via the set_default_state method), the joints' positions, velocities, and efforts (defined via the set_joints_default_state method) are imposed.
Example:
>>> prim.post_reset()
- property prim: pxr.Usd.Prim
Returns: Usd.Prim: USD Prim object that this object holds.
- property prim_path: str
Returns: str: prim path in the stage
- remove_azimuth_data_from_frame()
- remove_azimuth_range_from_frame()
- remove_elevation_data_from_frame()
- remove_horizontal_resolution_from_frame()
- remove_intensities_data_from_frame()
- remove_linear_depth_data_from_frame()
- remove_point_cloud_data_from_frame()
- remove_range_data_from_frame()
- resume() None
- set_default_state(position: Optional[Sequence[float]] = None, orientation: Optional[Sequence[float]] = None) None
Set the default state of the prim (position and orientation), that will be used after each reset.
- Parameters
position (Optional[Sequence[float]], optional) – position in the world frame of the prim. shape is (3, ). Defaults to None, which means left unchanged.
orientation (Optional[Sequence[float]], optional) – quaternion orientation in the world frame of the prim. quaternion is scalar-first (w, x, y, z). shape is (4, ). Defaults to None, which means left unchanged.
Example:
>>> # configure default state >>> prim.set_default_state(position=np.array([1.0, 0.5, 0.0]), orientation=np.array([1, 0, 0, 0])) >>> >>> # set default states during post-reset >>> prim.post_reset()
- set_local_pose(translation: Optional[Sequence[float]] = None, orientation: Optional[Sequence[float]] = None) None
Set prim’s pose with respect to the local frame (the prim’s parent frame).
Warning
This method will change (teleport) the prim pose immediately to the indicated value
- Parameters
translation (Optional[Sequence[float]], optional) – translation in the local frame of the prim (with respect to its parent prim). shape is (3, ). Defaults to None, which means left unchanged.
orientation (Optional[Sequence[float]], optional) – quaternion orientation in the local frame of the prim. quaternion is scalar-first (w, x, y, z). shape is (4, ). Defaults to None, which means left unchanged.
Hint
This method belongs to the methods used to set the prim state
Example:
>>> prim.set_local_pose(translation=np.array([1.0, 0.5, 0.0]), orientation=np.array([1., 0., 0., 0.]))
- set_local_scale(scale: Optional[Sequence[float]]) None
Set prim’s scale with respect to the local frame (the prim’s parent frame).
- Parameters
scale (Optional[Sequence[float]]) – scale to be applied to the prim’s dimensions. shape is (3, ). Defaults to None, which means left unchanged.
Example:
>>> # scale prim 10 times smaller >>> prim.set_local_scale(np.array([0.1, 0.1, 0.1]))
- set_visibility(visible: bool) None
Set the visibility of the prim in stage
- Parameters
visible (bool) – flag to set the visibility of the usd prim in stage.
Example:
>>> # make prim not visible in the stage >>> prim.set_visibility(visible=False)
- set_world_pose(position: Optional[Sequence[float]] = None, orientation: Optional[Sequence[float]] = None) None
Sets prim’s pose with respect to the world’s frame
Warning
This method will change (teleport) the prim pose immediately to the indicated value
- Parameters
position (Optional[Sequence[float]], optional) – position in the world frame of the prim. shape is (3, ). Defaults to None, which means left unchanged.
orientation (Optional[Sequence[float]], optional) – quaternion orientation in the world frame of the prim. quaternion is scalar-first (w, x, y, z). shape is (4, ). Defaults to None, which means left unchanged.
Hint
This method belongs to the methods used to set the prim state
Example:
>>> prim.set_world_pose(position=np.array([1.0, 0.5, 0.0]), orientation=np.array([1., 0., 0., 0.]))
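For example (a short, hedged sketch tying a few of the methods above together; as in the method examples, prim stands for an already-created instance of this lidar wrapper class):
import numpy as np

# place the sensor in the world and query the render product feeding its data
prim.set_world_pose(position=np.array([0.0, 0.0, 1.0]), orientation=np.array([1.0, 0.0, 0.0, 0.0]))
render_product_path = prim.get_render_product_path()
print("render product:", render_product_path)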
Rotating Lidar PhysX sensor
- class RotatingLidarPhysX(prim_path: str, name: str = 'rotating_lidar_physX', rotation_frequency: Optional[float] = None, rotation_dt: Optional[float] = None, position: Optional[numpy.ndarray] = None, translation: Optional[numpy.ndarray] = None, orientation: Optional[numpy.ndarray] = None, fov: Optional[Tuple[float, float]] = None, resolution: Optional[Tuple[float, float]] = None, valid_range: Optional[Tuple[float, float]] = None)
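Example (a minimal usage sketch; it assumes a stage with a physics scene, that /World/Lidar is a free or existing PhysX lidar prim path, and that the simulation is being stepped; the "point_cloud" frame key is an assumption inferred from add_point_cloud_data_to_frame):
from omni.isaac.sensor import RotatingLidarPhysX

# create (or wrap) a rotating PhysX lidar spinning at 20 Hz
lidar = RotatingLidarPhysX(prim_path="/World/Lidar", name="lidar", rotation_frequency=20.0)
lidar.initialize()

# request point cloud data in each frame and draw the beams for debugging
lidar.add_point_cloud_data_to_frame()
lidar.enable_visualization(high_lod=False, draw_points=True, draw_lines=True)

# after stepping the simulation, read the latest frame (a dict of the requested data)
frame = lidar.get_current_frame()
point_cloud = frame["point_cloud"]  # key name is an assumption, see note above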
- add_azimuth_data_to_frame() None
- add_depth_data_to_frame() None
- add_intensity_data_to_frame() None
- add_linear_depth_data_to_frame() None
- add_point_cloud_data_to_frame() None
- add_semantics_data_to_frame() None
- add_zenith_data_to_frame() None
- apply_visual_material(visual_material: omni.isaac.core.materials.visual_material.VisualMaterial, weaker_than_descendants: bool = False) None
Apply visual material to the held prim and optionally its descendants.
- Parameters
visual_material (VisualMaterial) – visual material to be applied to the held prim. Currently supports PreviewSurface, OmniPBR and OmniGlass.
weaker_than_descendants (bool, optional) – True if the material shouldn’t override the descendants materials, otherwise False. Defaults to False.
Example:
>>> from omni.isaac.core.materials import OmniGlass >>> >>> # create a dark-red glass visual material >>> material = OmniGlass( ... prim_path="/World/material/glass", # path to the material prim to create ... ior=1.25, ... depth=0.001, ... thin_walled=False, ... color=np.array([0.5, 0.0, 0.0]) ... ) >>> prim.apply_visual_material(material)
- disable_semantics() None
- disable_visualization() None
- enable_semantics() None
- enable_visualization(high_lod: bool = False, draw_points: bool = True, draw_lines: bool = True) None
- get_applied_visual_material() omni.isaac.core.materials.visual_material.VisualMaterial
Return the current applied visual material in case it was applied using apply_visual_material or it’s one of the following materials that was already applied before: PreviewSurface, OmniPBR and OmniGlass.
- Returns
the current applied visual material if its type is currently supported.
- Return type
Example:
>>> # given a visual material applied >>> prim.get_applied_visual_material() <omni.isaac.core.materials.omni_glass.OmniGlass object at 0x7f36263106a0>
- get_current_frame() dict
- get_default_state() omni.isaac.core.utils.types.XFormPrimState
Get the default prim states (spatial position and orientation).
- Returns
an object that contains the default state of the prim (position and orientation)
- Return type
Example:
>>> state = prim.get_default_state() >>> state <omni.isaac.core.utils.types.XFormPrimState object at 0x7f33addda650> >>> >>> state.position [-4.5299529e-08 -1.8347054e-09 -2.8610229e-08] >>> state.orientation [1. 0. 0. 0.]
- get_fov() Tuple[float, float]
- get_local_pose() Tuple[numpy.ndarray, numpy.ndarray]
Get prim’s pose with respect to the local frame (the prim’s parent frame)
- Returns
first index is the position in the local frame (with shape (3, )). Second index is quaternion orientation (with shape (4, )) in the local frame
- Return type
Tuple[np.ndarray, np.ndarray]
Example:
>>> # if the prim is in position (1.0, 0.5, 0.0) with respect to the world frame >>> position, orientation = prim.get_local_pose() >>> position [0. 0. 0.] >>> orientation [1. 0. 0. 0.]
- get_local_scale() numpy.ndarray
Get prim’s scale with respect to the local frame (the parent’s frame)
- Returns
scale applied to the prim’s dimensions in the local frame. shape is (3, ).
- Return type
np.ndarray
Example:
>>> prim.get_local_scale() [1. 1. 1.]
- get_num_cols() int
- get_num_cols_in_last_step() int
- get_num_rows() int
- get_resolution() float
- get_rotation_frequency() int
- get_valid_range() Tuple[float, float]
- get_visibility() bool
- Returns
true if the prim is visible in stage. false otherwise.
- Return type
bool
Example:
>>> # get the visible state of a visible prim on the stage >>> prim.get_visibility() True
- get_world_pose() Tuple[numpy.ndarray, numpy.ndarray]
Get prim’s pose with respect to the world’s frame
- Returns
first index is the position in the world frame (with shape (3, )). Second index is quaternion orientation (with shape (4, )) in the world frame
- Return type
Tuple[np.ndarray, np.ndarray]
Example:
>>> # if the prim is in position (1.0, 0.5, 0.0) with respect to the world frame >>> position, orientation = prim.get_world_pose() >>> position [1. 0.5 0. ] >>> orientation [1. 0. 0. 0.]
- get_world_scale() numpy.ndarray
Get prim’s scale with respect to the world’s frame
- Returns
scale applied to the prim’s dimensions in the world frame. shape is (3, ).
- Return type
np.ndarray
Example:
>>> prim.get_world_scale() [1. 1. 1.]
- initialize(physics_sim_view=None) None
Create a physics simulation view if not passed and using PhysX tensor API
Note
If the prim has been added to the world scene (e.g., world.scene.add(prim)), it will be automatically initialized when the world is reset (e.g., world.reset()).
- Parameters
physics_sim_view (omni.physics.tensors.SimulationView, optional) – current physics simulation view. Defaults to None.
Example:
>>> prim.initialize()
- is_paused() bool
- is_semantics_enabled() bool
- is_valid() bool
Check if the prim path has a valid USD Prim at it
- Returns
True if the current prim path corresponds to a valid prim in stage. False otherwise.
- Return type
bool
Example:
>>> # given an existing and valid prim >>> prim.is_valid() True
- is_visual_material_applied() bool
Check if there is a visual material applied
- Returns
True if there is a visual material applied. False otherwise.
- Return type
bool
Example:
>>> # given a visual material applied >>> prim.is_visual_material_applied() True
- property name: Optional[str]
Returns: str: name given to the prim when instantiating it. Otherwise None.
- property non_root_articulation_link: bool
Used to query if the prim is a non root articulation link
- Returns
True if the prim itself is a non root link
- Return type
bool
Example:
>>> # for a wrapped articulation (where the root prim has the Physics Articulation Root property applied) >>> prim.non_root_articulation_link False
- pause() None
- post_reset() None
Reset the prim to its default state (position and orientation).
Note
For an articulation, in addition to configuring the root prim’s default position and spatial orientation (defined via the set_default_state method), the joints’ positions, velocities, and efforts (defined via the set_joints_default_state method) are imposed.
Example:
>>> prim.post_reset()
- property prim: pxr.Usd.Prim
Returns: Usd.Prim: USD Prim object that this object holds.
- property prim_path: str
Returns: str: prim path in the stage
- remove_azimuth_data_from_frame() None
- remove_depth_data_from_frame() None
- remove_intensity_data_from_frame() None
- remove_linear_depth_data_from_frame() None
- remove_point_cloud_data_from_frame() None
- remove_semantics_data_from_frame() None
- remove_zenith_data_from_frame() None
- resume() None
- set_default_state(position: Optional[Sequence[float]] = None, orientation: Optional[Sequence[float]] = None) None
Set the default state of the prim (position and orientation), that will be used after each reset.
- Parameters
position (Optional[Sequence[float]], optional) – position in the world frame of the prim. shape is (3, ). Defaults to None, which means left unchanged.
orientation (Optional[Sequence[float]], optional) – quaternion orientation in the world frame of the prim. quaternion is scalar-first (w, x, y, z). shape is (4, ). Defaults to None, which means left unchanged.
Example:
>>> # configure default state >>> prim.set_default_state(position=np.array([1.0, 0.5, 0.0]), orientation=np.array([1, 0, 0, 0])) >>> >>> # set default states during post-reset >>> prim.post_reset()
- set_fov(value: Tuple[float, float]) None
- set_local_pose(translation: Optional[Sequence[float]] = None, orientation: Optional[Sequence[float]] = None) None
Set prim’s pose with respect to the local frame (the prim’s parent frame).
Warning
This method will change (teleport) the prim pose immediately to the indicated value
- Parameters
translation (Optional[Sequence[float]], optional) – translation in the local frame of the prim (with respect to its parent prim). shape is (3, ). Defaults to None, which means left unchanged.
orientation (Optional[Sequence[float]], optional) – quaternion orientation in the local frame of the prim. quaternion is scalar-first (w, x, y, z). shape is (4, ). Defaults to None, which means left unchanged.
Hint
This method belongs to the methods used to set the prim state
Example:
>>> prim.set_local_pose(translation=np.array([1.0, 0.5, 0.0]), orientation=np.array([1., 0., 0., 0.]))
- set_local_scale(scale: Optional[Sequence[float]]) None
Set prim’s scale with respect to the local frame (the prim’s parent frame).
- Parameters
scale (Optional[Sequence[float]]) – scale to be applied to the prim’s dimensions. shape is (3, ). Defaults to None, which means left unchanged.
Example:
>>> # scale prim 10 times smaller >>> prim.set_local_scale(np.array([0.1, 0.1, 0.1]))
- set_resolution(value: float) None
- set_rotation_frequency(value: int) None
- set_valid_range(value: Tuple[float, float]) None
- set_visibility(visible: bool) None
Set the visibility of the prim in stage
- Parameters
visible (bool) – flag to set the visibility of the usd prim in stage.
Example:
>>> # make prim not visible in the stage >>> prim.set_visibility(visible=False)
- set_world_pose(position: Optional[Sequence[float]] = None, orientation: Optional[Sequence[float]] = None) None
Sets prim’s pose with respect to the world’s frame
Warning
This method will change (teleport) the prim pose immediately to the indicated value
- Parameters
position (Optional[Sequence[float]], optional) – position in the world frame of the prim. shape is (3, ). Defaults to None, which means left unchanged.
orientation (Optional[Sequence[float]], optional) – quaternion orientation in the world frame of the prim. quaternion is scalar-first (w, x, y, z). shape is (4, ). Defaults to None, which means left unchanged.
Hint
This method belongs to the methods used to set the prim state
Example:
>>> prim.set_world_pose(position=np.array([1.0, 0.5, 0.0]), orientation=np.array([1., 0., 0., 0.]))
Camera sensor
- class Camera(prim_path: str, name: str = 'camera', frequency: Optional[int] = None, dt: Optional[str] = None, resolution: Optional[Tuple[int, int]] = None, position: Optional[numpy.ndarray] = None, orientation: Optional[numpy.ndarray] = None, translation: Optional[numpy.ndarray] = None, render_product_path: Optional[str] = None)
Provides high level functions to deal with a camera prim and its attributes/properties. If there is a camera prim present at the path, it will be used. Otherwise, a new camera prim will be created at the specified prim path.
- Parameters
prim_path (str) – prim path of the Camera Prim to encapsulate or create.
name (str, optional) – shortname to be used as a key by Scene class. Note: needs to be unique if the object is added to the Scene. Defaults to “camera”.
frequency (Optional[int], optional) – Frequency of the sensor (i.e., how often the data frame is updated). Defaults to None.
dt (Optional[str], optional) – dt of the sensor (i.e., the period at which the data frame is updated). Defaults to None.
resolution (Optional[Tuple[int, int]], optional) – resolution of the camera (width, height). Defaults to None.
position (Optional[Sequence[float]], optional) – position in the world frame of the prim. shape is (3, ). Defaults to None, which means left unchanged.
translation (Optional[Sequence[float]], optional) – translation in the local frame of the prim (with respect to its parent prim). shape is (3, ). Defaults to None, which means left unchanged.
orientation (Optional[Sequence[float]], optional) – quaternion orientation in the world/ local frame of the prim (depends if translation or position is specified). quaternion is scalar-first (w, x, y, z). shape is (4, ). Defaults to None, which means left unchanged.
render_product_path (str) – path to an existing render product to use instead of creating a new one; the resolution and camera attached to this render product will be set based on the input arguments. Note: using the same render product path on two Camera objects with different camera prims or resolutions is not supported. Defaults to None.
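Example (a minimal creation sketch; it assumes a running, rendering simulation and that /World/camera is a free prim path):
import numpy as np
from omni.isaac.sensor import Camera

camera = Camera(
    prim_path="/World/camera",                    # a new camera prim will be created here
    resolution=(640, 480),                        # (width, height)
    position=np.array([0.0, 0.0, 2.0]),
    orientation=np.array([1.0, 0.0, 0.0, 0.0]),   # scalar-first (w, x, y, z)
)
camera.initialize()   # call after the world has been reset and before using the camera

# once rendering has produced a frame
rgba = camera.get_rgba()   # RGBA color data for the latest rendered frame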
- add_bounding_box_2d_loose_to_frame() None
Attach the bounding_box_2d_loose annotator to this camera. The bounding_box_2d_loose annotator returns:
np.array shape: (num_objects, 1) dtype: np.dtype([("semanticId", "<u4"), ("x_min", "<i4"), ("y_min", "<i4"), ("x_max", "<i4"), ("y_max", "<i4"), ("occlusionRatio", "<f4")])
See more details: https://docs.omniverse.nvidia.com/extensions/latest/ext_replicator/annotators_details.html#bounding-box-2d-loose
- add_bounding_box_2d_tight_to_frame() None
Attach the bounding_box_2d_tight annotator to this camera. The bounding_box_2d_tight annotator returns:
np.array shape: (num_objects, 1) dtype: np.dtype([("semanticId", "<u4"), ("x_min", "<i4"), ("y_min", "<i4"), ("x_max", "<i4"), ("y_max", "<i4"), ("occlusionRatio", "<f4")])
See more details: https://docs.omniverse.nvidia.com/extensions/latest/ext_replicator/annotators_details.html#bounding-box-2d-tight
- add_bounding_box_3d_to_frame() None
- add_distance_to_camera_to_frame() None
Attach the distance_to_camera annotator to this camera. The distance_to_camera annotator returns:
np.array shape: (width, height, 1) dtype: np.float32
See more details: https://docs.omniverse.nvidia.com/extensions/latest/ext_replicator/annotators_details.html#distance-to-camera
- add_distance_to_image_plane_to_frame() None
Attach the distance_to_image_plane annotator to this camera. The distance_to_image_plane annotator returns:
np.array shape: (width, height, 1) dtype: np.float32
See more details: https://docs.omniverse.nvidia.com/extensions/latest/ext_replicator/annotators_details.html#distance-to-image-plane
- add_instance_id_segmentation_to_frame() None
Attach the instance_id_segmentation annotator to this camera. The instance_id_segmentation annotator returns:
np.array shape: (width, height, 1) or (width, height, 4) if colorize is set to true dtype: np.uint32 or np.uint8 if colorize is set to true
See more details: https://docs.omniverse.nvidia.com/extensions/latest/ext_replicator/annotators_details.html#instance-id-segmentation
- add_instance_segmentation_to_frame() None
Attach the instance_segmentation annotator to this camera. The main difference between instance id segmentation and instance segmentation is that the instance segmentation annotator goes down the hierarchy to the lowest-level prim that has semantic labels, whereas instance id segmentation always goes down to the leaf prim. The instance_segmentation annotator returns:
np.array shape: (width, height, 1) or (width, height, 4) if colorize is set to true dtype: np.uint32 or np.uint8 if colorize is set to true
See more details: https://docs.omniverse.nvidia.com/extensions/latest/ext_replicator/annotators_details.html#instance-segmentation
- add_motion_vectors_to_frame() None
Attach the motion vectors annotator to this camera. The motion vectors annotator returns:
np.array shape: (width, height, 4) dtype: np.float32
See more details: https://docs.omniverse.nvidia.com/extensions/latest/ext_replicator/annotators_details.html#motion-vectors
- add_normals_to_frame() None
Attach the normals annotator to this camera. The normals annotator returns:
np.array shape: (width, height, 4) dtype: np.float32
See more details: https://docs.omniverse.nvidia.com/extensions/latest/ext_replicator/annotators_details.html#normals
- add_occlusion_to_frame() None
Attach the occlusion annotator to this camera. The occlusion annotator returns:
np.array shape: (num_objects, 1) dtype: np.dtype([("instanceId", "<u4"), ("semanticId", "<u4"), ("occlusionRatio", "<f4")])
- add_pointcloud_to_frame(include_unlabelled: bool = False)
Attach the pointcloud annotator to this camera. The pointcloud annotator returns:
np.array shape: (num_points, 3) dtype: np.float32
See more details: https://docs.omniverse.nvidia.com/extensions/latest/ext_replicator/annotators_details.html#point-cloud
- add_semantic_segmentation_to_frame() None
Attach the semantic_segmentation annotator to this camera. The semantic_segmentation annotator returns:
np.array shape: (width, height, 1) or (width, height, 4) if colorize is set to true dtype: np.uint32 or np.uint8 if colorize is set to true
See more details: https://docs.omniverse.nvidia.com/extensions/latest/ext_replicator/annotators_details.html#semantic-segmentation
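For instance, annotators can be attached and their results read back from the current frame (a hedged sketch; the dictionary keys "motion_vectors" and "distance_to_image_plane" are assumptions inferred from the annotator names):
camera.add_motion_vectors_to_frame()
camera.add_distance_to_image_plane_to_frame()

# after the next rendered frame(s)
frame = camera.get_current_frame()
motion_vectors = frame["motion_vectors"]            # (width, height, 4), np.float32 per the annotator description
plane_depth = frame["distance_to_image_plane"]      # (width, height, 1), np.float32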
- apply_visual_material(visual_material: omni.isaac.core.materials.visual_material.VisualMaterial, weaker_than_descendants: bool = False) None
Apply visual material to the held prim and optionally its descendants.
- Parameters
visual_material (VisualMaterial) – visual material to be applied to the held prim. Currently supports PreviewSurface, OmniPBR and OmniGlass.
weaker_than_descendants (bool, optional) – True if the material shouldn’t override the descendants materials, otherwise False. Defaults to False.
Example:
>>> from omni.isaac.core.materials import OmniGlass >>> >>> # create a dark-red glass visual material >>> material = OmniGlass( ... prim_path="/World/material/glass", # path to the material prim to create ... ior=1.25, ... depth=0.001, ... thin_walled=False, ... color=np.array([0.5, 0.0, 0.0]) ... ) >>> prim.apply_visual_material(material)
- get_applied_visual_material() omni.isaac.core.materials.visual_material.VisualMaterial
Return the current applied visual material in case it was applied using apply_visual_material or it’s one of the following materials that was already applied before: PreviewSurface, OmniPBR and OmniGlass.
- Returns
the current applied visual material if its type is currently supported.
- Return type
Example:
>>> # given a visual material applied >>> prim.get_applied_visual_material() <omni.isaac.core.materials.omni_glass.OmniGlass object at 0x7f36263106a0>
- get_aspect_ratio() float
- Returns
ratio between width and height
- Return type
float
- get_clipping_range() Tuple[float, float]
- Returns
near_distance and far_distance respectively.
- Return type
Tuple[float, float]
- get_current_frame(clone=False) dict
- Parameters
clone (bool, optional) – if True, returns a deepcopy of the current frame. Defaults to False.
- Returns
returns the current frame of data
- Return type
dict
- get_default_state() omni.isaac.core.utils.types.XFormPrimState
Get the default prim states (spatial position and orientation).
- Returns
an object that contains the default state of the prim (position and orientation)
- Return type
Example:
>>> state = prim.get_default_state() >>> state <omni.isaac.core.utils.types.XFormPrimState object at 0x7f33addda650> >>> >>> state.position [-4.5299529e-08 -1.8347054e-09 -2.8610229e-08] >>> state.orientation [1. 0. 0. 0.]
- get_depth() numpy.ndarray
- Returns
(n x m x 1) depth data for each point.
- Return type
depth (np.ndarray)
- get_dt() float
- Returns
gets the dt to acquire new data frames
- Return type
float
- get_fisheye_polynomial_properties() Tuple[float, float, float, float, float, List]
- Returns
nominal_width, nominal_height, optical_centre_x, optical_centre_y, max_fov and polynomial respectively.
- Return type
Tuple[float, float, float, float, float, List]
- get_focal_length() float
- Returns
Longer lens lengths give a narrower FOV; shorter lens lengths give a wider FOV
- Return type
float
- get_focus_distance() float
- Returns
Distance from the camera to the focus plane (in stage units).
- Return type
float
- get_frequency() float
- Returns
gets the frequency to acquire new data frames
- Return type
float
- get_horizontal_aperture() float
- Returns
Emulates sensor/film width on a camera
- Return type
float
- get_horizontal_fov() float
- Returns
horizontal field of view in pixels
- Return type
float
- get_image_coords_from_world_points(points_3d: numpy.ndarray) numpy.ndarray
Using pinhole perspective projection, this method projects 3d points in the world frame to the image plane, giving the pixel coordinates [[0, width], [0, height]]
- Parameters
points_3d (np.ndarray) – 3d points (X, Y, Z) in world frame. shape is (n, 3) where n is the number of points.
- Returns
2d points (u, v) corresponds to the pixel coordinates. shape is (n, 2) where n is the number of points.
- Return type
np.ndarray
- get_intrinsics_matrix() numpy.ndarray
- Returns
the intrinsics of the camera (used for calibration)
- Return type
np.ndarray
- get_lens_aperture() float
- Returns
controls lens aperture (i.e., focusing). 0 turns off focusing.
- Return type
float
- get_local_pose(camera_axes: str = 'world') None
Gets prim’s pose with respect to the local frame (the prim’s parent frame in the world axes).
- Parameters
camera_axes (str, optional) – camera axes, world is (+Z up, +X forward), ros is (+Y up, +Z forward) and usd is (+Y up and -Z forward). Defaults to “world”.
- Returns
first index is position in the local frame of the prim. shape is (3, ). Second index is quaternion orientation in the local frame of the prim. quaternion is scalar-first (w, x, y, z). shape is (4, ).
- Return type
Tuple[np.ndarray, np.ndarray]
- get_local_scale() numpy.ndarray
Get prim’s scale with respect to the local frame (the parent’s frame)
- Returns
scale applied to the prim’s dimensions in the local frame. shape is (3, ).
- Return type
np.ndarray
Example:
>>> prim.get_local_scale() [1. 1. 1.]
- get_pointcloud() numpy.ndarray
- Returns
(N x 3) 3d points (X, Y, Z) in camera frame. Shape is (N x 3) where N is the number of points.
- Return type
pointcloud (np.ndarray)
Note
This currently uses the depth annotator to generate the pointcloud. In the future, this will be switched to use the pointcloud annotator.
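Example (a short sketch; it assumes the camera has been rendering so that depth data is available):
points = camera.get_pointcloud()   # (N, 3) points in the camera frame, derived from the depth annotator
depth = camera.get_depth()         # per-pixel depth data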
- get_projection_mode() str
- Returns
perspective or orthographic.
- Return type
str
- get_projection_type() str
- Returns
pinhole, fisheyeOrthographic, fisheyeEquidistant, fisheyeEquisolid, fisheyePolynomial or fisheyeSpherical
- Return type
str
- get_render_product_path() str
- Returns
gets the path to the render product attached to this camera
- Return type
string
- get_resolution() Tuple[int, int]
- Returns
width and height respectively.
- Return type
Tuple[int, int]
- get_rgb() numpy.ndarray
- Returns
(N x 3) RGB color data for each point.
- Return type
rgb (np.ndarray)
- get_rgba() numpy.ndarray
- Returns
(N x 4) RGBA color data for each point.
- Return type
rgba (np.ndarray)
- get_shutter_properties() Tuple[float, float]
- Returns
delay_open and delay_close respectively.
- Return type
Tuple[float, float]
- get_stereo_role() str
- Returns
mono, left or right.
- Return type
str
- get_vertical_aperture() float
- Returns
Emulates sensor/film height on a camera.
- Return type
float
- get_vertical_fov() float
- Returns
vertical field of view in pixels
- Return type
float
- get_view_matrix_ros()
3D points in World Frame -> 3D points in Camera Ros Frame
- Returns
the view matrix that transforms 3d points in the world frame to 3d points in the camera axes with ros camera convention.
- Return type
np.ndarray
- get_visibility() bool
- Returns
true if the prim is visible in stage. false otherwise.
- Return type
bool
Example:
>>> # get the visible state of a visible prim on the stage >>> prim.get_visibility() True
- get_world_points_from_image_coords(points_2d: numpy.ndarray, depth: numpy.ndarray)
Using pinhole perspective projection, this method does the inverse projection given the depth of the pixels
- Parameters
points_2d (np.ndarray) – 2d points (u, v) corresponds to the pixel coordinates. shape is (n, 2) where n is the number of points.
depth (np.ndarray) – depth corresponds to each of the pixel coords. shape is (n,)
- Returns
(n, 3) 3d points (X, Y, Z) in world frame. shape is (n, 3) where n is the number of points.
- Return type
np.ndarray
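Example (a hedged sketch of the two projection helpers; the sample points and depths are arbitrary values chosen for illustration):
import numpy as np

# project two world-frame points onto the image plane
points_3d = np.array([[1.0, 0.0, 0.5], [2.0, -0.3, 0.7]])
points_2d = camera.get_image_coords_from_world_points(points_3d)   # (2, 2) pixel coordinates (u, v)

# back-project pixel coordinates with per-point depths into world-frame points
depths = np.array([1.0, 2.0])
points_world = camera.get_world_points_from_image_coords(points_2d, depths)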
- get_world_pose(camera_axes: str = 'world') Tuple[numpy.ndarray, numpy.ndarray]
Gets prim’s pose with respect to the world frame (always at [0, 0, 0] with unit quaternion, not to be confused with the /World prim)
- Parameters
camera_axes (str, optional) – camera axes, world is (+Z up, +X forward), ros is (+Y up, +Z forward) and usd is (+Y up and -Z forward). Defaults to “world”.
- Returns
first index is position in the world frame of the prim. shape is (3, ). Second index is quaternion orientation in the world frame of the prim. quaternion is scalar-first (w, x, y, z). shape is (4, ).
- Return type
Tuple[np.ndarray, np.ndarray]
- get_world_scale() numpy.ndarray
Get prim’s scale with respect to the world’s frame
- Returns
scale applied to the prim’s dimensions in the world frame. shape is (3, ).
- Return type
np.ndarray
Example:
>>> prim.get_world_scale() [1. 1. 1.]
- initialize(physics_sim_view=None) None
To be called before using this class after a reset of the world
- Parameters
physics_sim_view (omni.physics.tensors.SimulationView, optional) – current physics simulation view. Defaults to None.
- is_paused() bool
- Returns
is data collection paused.
- Return type
bool
- is_valid() bool
Check if the prim path has a valid USD Prim at it
- Returns
True if the current prim path corresponds to a valid prim in stage. False otherwise.
- Return type
bool
Example:
>>> # given an existing and valid prim >>> prim.is_valid() True
- is_visual_material_applied() bool
Check if there is a visual material applied
- Returns
True if there is a visual material applied. False otherwise.
- Return type
bool
Example:
>>> # given a visual material applied >>> prim.is_visual_material_applied() True
- property name: Optional[str]
Returns: str: name given to the prim when instantiating it. Otherwise None.
- property non_root_articulation_link: bool
Used to query if the prim is a non root articulation link
- Returns
True if the prim itself is a non root link
- Return type
bool
Example:
>>> # for a wrapped articulation (where the root prim has the Physics Articulation Root property applied) >>> prim.non_root_articulation_link False
- pause() None
pauses data collection and updating the data frame
- post_reset() None
Reset the prim to its default state (position and orientation).
Note
For an articulation, in addition to configuring the root prim’s default position and spatial orientation (defined via the set_default_state method), the joints’ positions, velocities, and efforts (defined via the set_joints_default_state method) are imposed.
Example:
>>> prim.post_reset()
- property prim: pxr.Usd.Prim
Returns: Usd.Prim: USD Prim object that this object holds.
- property prim_path: str
Returns: str: prim path in the stage
- remove_bounding_box_2d_loose_from_frame() None
- remove_bounding_box_2d_tight_from_frame() None
- remove_bounding_box_3d_from_frame() None
- remove_distance_to_camera_from_frame() None
- remove_distance_to_image_plane_from_frame() None
- remove_instance_id_segmentation_from_frame() None
- remove_instance_segmentation_from_frame() None
- remove_motion_vectors_from_frame() None
- remove_normals_from_frame() None
- remove_occlusion_from_frame() None
- remove_pointcloud_from_frame() None
- remove_semantic_segmentation_from_frame() None
- resume() None
resumes data collection and updating the data frame
- set_clipping_range(near_distance: Optional[float] = None, far_distance: Optional[float] = None) None
Clips the view outside of both near and far range values.
- Parameters
near_distance (Optional[float], optional) – value to be used for near clipping. Defaults to None.
far_distance (Optional[float], optional) – value to be used for far clipping. Defaults to None.
- set_default_state(position: Optional[Sequence[float]] = None, orientation: Optional[Sequence[float]] = None) None
Set the default state of the prim (position and orientation), that will be used after each reset.
- Parameters
position (Optional[Sequence[float]], optional) – position in the world frame of the prim. shape is (3, ). Defaults to None, which means left unchanged.
orientation (Optional[Sequence[float]], optional) – quaternion orientation in the world frame of the prim. quaternion is scalar-first (w, x, y, z). shape is (4, ). Defaults to None, which means left unchanged.
Example:
>>> # configure default state >>> prim.set_default_state(position=np.array([1.0, 0.5, 0.0]), orientation=np.array([1, 0, 0, 0])) >>> >>> # set default states during post-reset >>> prim.post_reset()
- set_dt(value: float) None
- Parameters
value (float) – sets the dt to acquire new data frames
- set_fisheye_polynomial_properties(nominal_width: Optional[float], nominal_height: Optional[float], optical_centre_x: Optional[float], optical_centre_y: Optional[float], max_fov: Optional[float], polynomial: Optional[Sequence[float]]) None
- Parameters
nominal_width (Optional[float]) – Rendered Width (pixels)
nominal_height (Optional[float]) – Rendered Height (pixels)
optical_centre_x (Optional[float]) – Horizontal Render Position (pixels)
optical_centre_y (Optional[float]) – Vertical Render Position (pixels)
max_fov (Optional[float]) – maximum field of view (pixels)
polynomial (Optional[Sequence[float]]) – polynomial equation coefficients (sequence of 5 numbers) starting from A0, A1, A2, A3, A4
- set_focal_length(value: float)
- Parameters
value (float) – Longer lens lengths give a narrower FOV; shorter lens lengths give a wider FOV
- set_focus_distance(value: float)
The distance at which perfect sharpness is achieved.
- Parameters
value (float) – Distance from the camera to the focus plane (in stage units).
- set_frequency(value: int) None
- Parameters
value (int) – sets the frequency to acquire new data frames
- set_horizontal_aperture(value: float) None
- Parameters
value (Optional[float], optional) – Emulates sensor/film width on a camera. Defaults to None.
- set_kannala_brandt_properties(nominal_width: float, nominal_height: float, optical_centre_x: float, optical_centre_y: float, max_fov: Optional[float], distortion_model: Sequence[float]) None
Approximates kannala brandt distortion with ftheta fisheye polynomial coefficients.
- Parameters
nominal_width (float) – Rendered Width (pixels)
nominal_height (float) – Rendered Height (pixels)
optical_centre_x (float) – Horizontal Render Position (pixels)
optical_centre_y (float) – Vertical Render Position (pixels)
max_fov (Optional[float]) – maximum field of view (pixels)
distortion_model (Sequence[float]) – kannala brandt generic distortion model coefficients (k1, k2, k3, k4)
- set_lens_aperture(value: float)
Controls distance blurring. Lower numbers decrease the focus range, larger numbers increase it.
- Parameters
value (float) – controls lens aperture (i.e., focusing). 0 turns off focusing.
- set_local_pose(translation: Optional[Sequence[float]] = None, orientation: Optional[Sequence[float]] = None, camera_axes: str = 'world') None
Sets prim’s pose with respect to the local frame (the prim’s parent frame in the world axes).
- Parameters
translation (Optional[Sequence[float]], optional) – translation in the local frame of the prim (with respect to its parent prim). shape is (3, ). Defaults to None, which means left unchanged.
orientation (Optional[Sequence[float]], optional) – quaternion orientation in the local frame of the prim. quaternion is scalar-first (w, x, y, z). shape is (4, ). Defaults to None, which means left unchanged.
camera_axes (str, optional) – camera axes, world is (+Z up, +X forward), ros is (+Y up, +Z forward) and usd is (+Y up and -Z forward). Defaults to “world”.
- set_local_scale(scale: Optional[Sequence[float]]) None
Set prim’s scale with respect to the local frame (the prim’s parent frame).
- Parameters
scale (Optional[Sequence[float]]) – scale to be applied to the prim’s dimensions. shape is (3, ). Defaults to None, which means left unchanged.
Example:
>>> # scale prim 10 times smaller >>> prim.set_local_scale(np.array([0.1, 0.1, 0.1]))
- set_matching_fisheye_polynomial_properties(nominal_width: float, nominal_height: float, optical_centre_x: float, optical_centre_y: float, max_fov: Optional[float], distortion_model: Sequence[float], distortion_fn: Callable) None
Approximates given distortion with ftheta fisheye polynomial coefficients.
- Parameters
nominal_width (float) – Rendered Width (pixels)
nominal_height (float) – Rendered Height (pixels)
optical_centre_x (float) – Horizontal Render Position (pixels)
optical_centre_y (float) – Vertical Render Position (pixels)
max_fov (Optional[float]) – maximum field of view (pixels)
distortion_model (Sequence[float]) – distortion model coefficients
distortion_fn (Callable) – distortion function that takes points and returns distorted points
- set_projection_mode(value: str) None
Sets camera to perspective or orthographic mode.
- Parameters
value (str) – perspective or orthographic.
- set_projection_type(value: str) None
- Parameters
value (str) – pinhole: Standard Camera Projection (Disable Fisheye); fisheyeOrthographic: Full Frame using Orthographic Correction; fisheyeEquidistant: Full Frame using Equidistant Correction; fisheyeEquisolid: Full Frame using Equisolid Correction; fisheyePolynomial: 360 Degree Spherical Projection; fisheyeSpherical: 360 Degree Full Frame Projection
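Example (a hedged sketch of switching the camera to the fisheye polynomial model; the numeric values are illustrative only, not calibrated parameters):
camera.set_projection_type("fisheyePolynomial")
camera.set_fisheye_polynomial_properties(
    nominal_width=1936.0,
    nominal_height=1216.0,
    optical_centre_x=968.0,
    optical_centre_y=608.0,
    max_fov=200.0,
    polynomial=[0.0, 0.0015, 0.0, 0.0, 0.0],   # A0..A4 coefficients, illustrative
)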
- set_rational_polynomial_properties(nominal_width: float, nominal_height: float, optical_centre_x: float, optical_centre_y: float, max_fov: Optional[float], distortion_model: Sequence[float]) None
Approximates rational polynomial distortion with ftheta fisheye polynomial coefficients.
- Parameters
nominal_width (float) – Rendered Width (pixels)
nominal_height (float) – Rendered Height (pixels)
optical_centre_x (float) – Horizontal Render Position (pixels)
optical_centre_y (float) – Vertical Render Position (pixels)
max_fov (Optional[float]) – maximum field of view (pixels)
distortion_model (Sequence[float]) – rational polynomial distortion model coefficients (k1, k2, p1, p2, k3, k4, k5, k6)
- set_resolution(value: Tuple[int, int]) None
- Parameters
value (Tuple[int, int]) – width and height respectively.
- set_shutter_properties(delay_open: Optional[float] = None, delay_close: Optional[float] = None) None
- Parameters
delay_open (Optional[float], optional) – Used with Motion Blur to control blur amount, increased values delay shutter opening. Defaults to None.
delay_close (Optional[float], optional) – Used with Motion Blur to control blur amount, increased values forward the shutter close. Defaults to None.
- set_stereo_role(value: str) None
- Parameters
value (str) – mono, left or right.
- set_vertical_aperture(value: float) None
- Parameters
value (Optional[float], optional) – Emulates sensor/film height on a camera. Defaults to None.
- set_visibility(visible: bool) None
Set the visibility of the prim in stage
- Parameters
visible (bool) – flag to set the visibility of the usd prim in stage.
Example:
>>> # make prim not visible in the stage >>> prim.set_visibility(visible=False)
- set_world_pose(position: Optional[Sequence[float]] = None, orientation: Optional[Sequence[float]] = None, camera_axes: str = 'world') None
Sets prim’s pose with respect to the world frame (always at [0, 0, 0] with unit quaternion, not to be confused with the /World prim).
- Parameters
position (Optional[Sequence[float]], optional) – position in the world frame of the prim. shape is (3, ). Defaults to None, which means left unchanged.
orientation (Optional[Sequence[float]], optional) – quaternion orientation in the world frame of the prim. quaternion is scalar-first (w, x, y, z). shape is (4, ). Defaults to None, which means left unchanged.
camera_axes (str, optional) – camera axes, world is (+Z up, +X forward), ros is (+Y up, +Z forward) and usd is (+Y up and -Z forward). Defaults to “world”.
- property supported_annotators: List[str]
Returns: List[str]: annotators supported by the camera
- distort_point_kannala_brandt(camera_matrix, distortion_model, x, y)
This helper function distorts point(s) using Kannala Brandt fisheye model. It should be equivalent to the following reference that uses OpenCV:
def distort_point_kannala_brandt2(camera_matrix, distortion_model, x, y):
    import cv2
    import numpy as np
    ((fx, _, cx), (_, fy, cy), (_, _, _)) = camera_matrix
    pt_x, pt_y, pt_z = (x - cx) / fx, (y - cy) / fy, np.full(x.shape, 1.0)
    points3d = np.stack((pt_x, pt_y, pt_z), axis=-1)
    rvecs, tvecs = np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 0.0])
    cameraMatrix, distCoeffs = np.array(camera_matrix), np.array(distortion_model)
    points, jac = cv2.fisheye.projectPoints(np.expand_dims(points3d, 1), rvecs, tvecs, cameraMatrix, distCoeffs)
    return np.array([points[:, 0, 0], points[:, 0, 1]])
- distort_point_rational_polynomial(camera_matrix, distortion_model, x, y)
This helper function distorts point(s) using rational polynomial model. It should be equivalent to the following reference that uses OpenCV:
def distort_point_rational_polynomial(camera_matrix, distortion_model, x, y):
    import cv2
    import numpy as np
    ((fx, _, cx), (_, fy, cy), (_, _, _)) = camera_matrix
    pt_x, pt_y, pt_z = (x - cx) / fx, (y - cy) / fy, np.full(x.shape, 1.0)
    points3d = np.stack((pt_x, pt_y, pt_z), axis=-1)
    rvecs, tvecs = np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 0.0])
    cameraMatrix, distCoeffs = np.array(camera_matrix), np.array(distortion_model)
    points, jac = cv2.projectPoints(points3d, rvecs, tvecs, cameraMatrix, distCoeffs)
    return np.array([points[:, 0, 0], points[:, 0, 1]])
- get_all_camera_objects(root_prim: str = '/')
Retrieve omni.isaac.sensor Camera objects for each camera in the scene.
- Parameters
root_prim (str) – Root prim where the world exists.
- Returns
A list of omni.isaac.sensor Camera objects
- Return type
Camera[]
- point_to_theta(camera_matrix, x, y)
This helper function returns the theta angle of the point.
Contact Sensor Interface
This submodule provides an interface to a simulated contact sensor. A simplified command is provided to create a contact sensor in the stage:
- commands.IsaacSensorCreateContactSensor(parent: str = None, min_threshold: float = 0, max_threshold: float = 100000, color: pxr.Gf.Vec4f = Gf.Vec4f(1.0, 1.0, 1.0, 1.0), radius: float = -1, sensor_period: float = -1, translation: pxr.Gf.Vec3d = Gf.Vec3d(0.0, 0.0, 0.0))
Base class for all Commands.
Once the contact sensor is created, you must first acquire the contact sensor interface; you can then use it to access the contact sensor. Note that the offset of the contact sensor is also affected by the parent’s transformations.
from omni.isaac.sensor import _sensor
_cs = _sensor.acquire_contact_sensor_interface()
Note
If the contact sensor is not initially created under a valid rigid body parent, it will not output any valid data even if it is later attached to a valid rigid body parent.
Acquiring Extension Interface
- _sensor.acquire_contact_sensor_interface(plugin_name: str = None, library_path: str = None) omni::isaac::sensor::ContactSensorInterface
Acquire Contact Sensor interface. This is the base object that all of the Contact Sensor functions are defined on
- _sensor.release_contact_sensor_interface(arg0: omni::isaac::sensor::ContactSensorInterface) None
Release Contact Sensor interface. Generally this does not need to be called, the Contact Sensor interface is released on extension shutdown
To collect the most recent reading, call the interface get_sensor_reading(/path/to/sensor, use_latest_data=True). The result will be the most recent sensor reading.
reading = _cs.get_sensor_reading("/World/Cube/Contact_Sensor", use_latest_data=True)
To collect the reading at the last sensor measurement time based on the sensor period, call the interface get_sensor_reading(/path/to/sensor). This will give you the physics step data closest to the sensor measurement time.
reading = _cs.get_sensor_reading("/World/Cube/Contact_Sensor")
To collect the raw readings, call the interface get_contact_sensor_raw_data(/path/to/sensor). The result will be a list of raw contact data for that body.
raw_Contact = _cs.get_contact_sensor_raw_data("/World/Cube/Contact_Sensor")
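The raw entries can then be iterated and their body handles decoded back into prim names (a hedged sketch; accessing the fields by name assumes the structured-array layout described under Output Types below):
raw_contacts = _cs.get_contact_sensor_raw_data("/World/Cube/Contact_Sensor")
for contact in raw_contacts:
    body0 = _cs.decode_body_name(int(contact["body0"]))
    body1 = _cs.decode_body_name(int(contact["body1"]))
    print(body0, body1, contact["impulse"], contact["normal"])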
Output Types
- class CsSensorReading
Contact Sensor Reading
- property inContact
Deprecation alert: inContact will be renamed to in_contact. Boolean that flags if the sensor registers a contact. (bool)
- property in_contact
boolean that flags if the sensor registers a contact. (bool)
- property is_valid
validity of the data. (bool)
- property time
timestamp of the reading, in seconds. (float)
- property value
sensor force reading value. (float)
- class CsRawData
Contact Raw Data
- property body0
Body 0 name handle. (int)
- property body1
Body 1 name handle. (int)
- property dt
timestep during this contact report. (float)
- property impulse
impulse, global coordinates. (carb.Float3)
- property normal
normal, global coordinates. (carb.Float3)
- property position
position, global coordinates. (carb.Float3)
- property time
simulation timestamp. (float)
Interface Methods
- class ContactSensorInterface
- decode_body_name(self: omni.isaac.sensor._sensor.ContactSensorInterface, arg0: int) str
Decodes the body name pointers from the contact raw data into a string
- Parameters
arg0 (int) – body name handle
- Returns
The body name.
- Return type
str
- get_contact_sensor_raw_data(self: omni.isaac.sensor._sensor.ContactSensorInterface, arg0: str) object
- Parameters
arg0 (str) – USD Path to contact sensor as string
- Returns
The list of contact raw data that contains the specified body that the contact sensor is attached to.
- Return type
numpy.array
- get_rigid_body_raw_data(self: omni.isaac.sensor._sensor.ContactSensorInterface, arg0: str) object
Get raw data from a rigid body that has the contact report API enabled
- Parameters
arg0 (str) – USD Path to rigid body as string
- Returns
The list of contact raw data that contains the specified body.
- Return type
numpy.array
- get_sensor_reading(self: omni.isaac.sensor._sensor.ContactSensorInterface, sensor_path: str, use_latest_data: bool = False) omni.isaac.sensor._sensor.CsSensorReading
- Parameters
sensor_path (char*) – the sensor path
use_latest_data (bool) – True to use the latest physics data, False to use the data at the last sensor measurement time
- Returns
The reading for the current sensor period.
- Return type
CsSensorReading
- is_contact_sensor(self: omni.isaac.sensor._sensor.ContactSensorInterface, arg0: str) bool
- Parameters
arg0 (str) – USD Path to sensor as string
- Returns
True if the prim is a contact sensor, False otherwise.
- Return type
bool
IMU sensor Interface
This submodule provides an interface to a simulated IMU sensor, which provides ground truth linear acceleration, angular velocity, and orientation data.
A simplified command is provided to create an IMU sensor:
- commands.IsaacSensorCreateImuSensor(parent: str = None, sensor_period: float = -1, translation: pxr.Gf.Vec3d = Gf.Vec3d(0.0, 0.0, 0.0), orientation: pxr.Gf.Quatd = Gf.Quatd(1.0, Gf.Vec3d(0.0, 0.0, 0.0)), linear_acceleration_filter_size: int = 1, angular_velocity_filter_size: int = 1, orientation_filter_size: int = 1)
Base class for all Commands.
Similarly, once an IMU sensor is created, you can use this interface to interact with the simulated IMU sensor. You must first call acquire_imu_sensor_interface.
from omni.isaac.sensor import _sensor
_is = _sensor.acquire_imu_sensor_interface()
Note
If the IMU sensor is not initially created under a valid rigid body parent, it will not output any valid data even if it is later attached to a valid rigid body parent. The offset and orientation of the IMU sensor are also affected by the parent’s transformations.
Acquiring Extension Interface
- _sensor.acquire_imu_sensor_interface(plugin_name: str = None, library_path: str = None) omni::isaac::sensor::ImuSensorInterface
Acquire IMU Sensor interface. This is the base object that all of the IMU Sensor functions are defined on
- _sensor.release_imu_sensor_interface(arg0: omni::isaac::sensor::ImuSensorInterface) None
Release IMU Sensor interface. Generally this does not need to be called, the IMU Sensor interface is released on extension shutdown
To collect the most recent reading, call the interface get_sensor_reading(/path/to/sensor, use_latest_data=True). The result will be the most recent sensor reading.
reading = _is.get_sensor_reading("/World/Cube/Imu_Sensor", use_latest_data = True)
To collect the reading at the last sensor measurement time based on the sensor period, call the interface get_sensor_reading(/path/to/sensor).
reading = _is.get_sensor_reading("/World/Cube/Imu_Sensor")
Since the sensor reading time usually falls between two physics steps, linear interpolation is used by default to compute the reading at the sensor time. However, get_sensor_reading can also accept a custom function in case a different interpolation strategy is preferred.
from typing import List
# Input Param: List of past IsSensorReadings, time of the expected sensor reading
def interpolation_function(data: List[_sensor.IsSensorReading], time: float) -> _sensor.IsSensorReading:
    interpolated_reading = _sensor.IsSensorReading()
    # do interpolation
    return interpolated_reading
reading = _is.get_sensor_reading("/World/Cube/Imu_Sensor", interpolation_function = interpolation_function)
Note
The interpolation function will only be used if the sensor frequency is lower than the physics frequency and the use_latest_data flag is not enabled.
Output Types
- class IsSensorReading
Imu Sensor Reading
- property ang_vel_x
Gyroscope reading value x axis, in rad/s. (float)
- property ang_vel_y
Gyroscope reading value y axis, in rad/s. (float)
- property ang_vel_z
Gyroscope reading value z axis, in rad/s. (float)
- property is_valid
validity of sensor reading. (bool)
- property lin_acc_x
Accelerometer reading value x axis, in m/s^2. (float)
- property lin_acc_y
Accelerometer reading value y axis, in m/s^2. (float)
- property lin_acc_z
Accelerometer reading value z axis, in m/s^2. (float)
- property orientation
Orientation quaternion reading (x, y, z, w). (carb.Float4)
- property time
timestamp of the reading, in seconds. (float)
Interface Methods
- class ImuSensorInterface
- get_sensor_reading(self: omni.isaac.sensor._sensor.ImuSensorInterface, sensor_path: str, interpolation_function: Callable[[List[omni.isaac.sensor._sensor.IsSensorReading], float], omni.isaac.sensor._sensor.IsSensorReading] = None, use_latest_data: bool = False, read_gravity: bool = True) object
- Parameters
sensor_path (char*) – the sensor path
interpolation_function (std::function<IsReading(std::vector<IsReading>, float)>&) – custom interpolation function, used when the reading falls between physics steps
use_latest_data (bool) – True to use the latest physics data, False to use the data at the last sensor measurement time
read_gravity (bool) – True to include gravitational acceleration in the reading, False to ignore it
- Returns
The reading for the current sensor period.
- Return type
IsSensorReading
- is_imu_sensor(self: omni.isaac.sensor._sensor.ImuSensorInterface, arg0: str) bool
- Parameters
arg0 (str) – USD Path to sensor as string
- Returns
True if the prim is an IMU sensor, False otherwise.
- Return type
bool
Effort Sensor
The Effort sensor is a Python class for reading ground truth joint effort measurements. It can be created directly in Python using the path of the joint of interest.
from omni.isaac.sensor.scripts.effort_sensor import EffortSensor
import numpy as np

sensor = EffortSensor(prim_path="/World/Robot/revolute_joint")
Note
If the sensor was created with the incorrect prim path, simply delete the sensor and recreate it. If the measured joint needs to be changed and the new joint has the same parent, the update_dof_name(dof_name: str) function may be used.
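For example (a one-line sketch; the joint name is hypothetical):
sensor.update_dof_name("revolute_joint_2")   # hypothetical sibling joint under the same parent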
Acquiring Sensor data
To collect the most recent reading, call the interface get_sensor_reading(use_latest_data=True). The result will be the most recent sensor reading.
reading = sensor.get_sensor_reading(use_latest_data = True)
To collect the reading at the last sensor measurement time based on the sensor period, call the interface get_sensor_reading().
reading = sensor.get_sensor_reading()
Since the sensor reading time usually falls between two physics steps, linear interpolation is used by default to compute the reading at the sensor time. However, get_sensor_reading can also accept a custom function in case a different interpolation strategy is preferred.
from omni.isaac.sensor.scripts.effort_sensor import EsSensorReading

# Input Param: List of past EsSensorReading, time of the expected sensor reading
def interpolation_function(data, time):
    interpolated_reading = EsSensorReading()
    # do interpolation
    return interpolated_reading

reading = sensor.get_sensor_reading(interpolation_function)
Note
The interpolation function will only be used if the sensor frequency is lower than the physics frequency and use_latest_data flag is not enabled.
Output Types
EsSensorReading
time (float): The time of the sensor reading.
value (float): The measured effort on the joint.
is_valid (boolean): The validity of the sensor measurement.
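For example (a short sketch using the fields above):
reading = sensor.get_sensor_reading()
if reading.is_valid:
    print("time:", reading.time, "effort:", reading.value)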
LightBeam sensor Interface
This submodule provides an interface to simulate a LightBeam sensor, which provides linear depth and hit positions of each raycast.
A simplified command is provided to create a LightBeam sensor:
- commands.IsaacSensorCreateLightBeamSensor(parent: str = None, translation: pxr.Gf.Vec3d = Gf.Vec3d(0.0, 0.0, 0.0), orientation: pxr.Gf.Quatd = Gf.Quatd(1.0, Gf.Vec3d(0.0, 0.0, 0.0)), num_rays: int = 1, curtain_length: float = 0.0, forward_axis: pxr.Gf.Vec3d = Gf.Vec3d(1.0, 0.0, 0.0), curtain_axis: pxr.Gf.Vec3d = Gf.Vec3d(0.0, 0.0, 1.0), min_range: float = 0.4, max_range: float = 100.0)
Base class for all Commands.
Similarly, once a LightBeam sensor is created, you can use this interface to interact with the simulated LightBeam sensor. You must first call acquire_lightbeam_sensor_interface.
from omni.isaac.sensor import _sensor
_ls = _sensor.acquire_lightbeam_sensor_interface()
Acquiring Extension Interface
- _sensor.acquire_lightbeam_sensor_interface(plugin_name: str = None, library_path: str = None) omni::isaac::sensor::LightBeamSensorInterface
Acquire LightBeam Sensor interface. This is the base object that all of the LightBeam Sensor functions are defined on
- _sensor.release_lightbeam_sensor_interface(arg0: omni::isaac::sensor::LightBeamSensorInterface) None
Release LightBeam Sensor interface. Generally this does not need to be called, the LightBeam Sensor interface is released on extension shutdown
To collect the most recent data, call the following interface methods:
# This will return a vector of uint8_t (0 false, 1 true)
beam_hit = _ls.get_beam_hit_data("/World/LightBeam_Sensor")
# This will return the number of rays in the light curtain
num_rays = _ls.get_num_rays("/World/LightBeam_Sensor")
# This will return a vector of floats of the linear depth of each raycast
linear_depth_data = _ls.get_linear_depth_data("/World/LightBeam_Sensor")
# This will return a vector of xyz points of hit positions of each raycast
hit_pos_data = _ls.get_hit_pos_data("/World/LightBeam_Sensor")
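For example, the per-ray outputs can be combined to report which beams are interrupted (a hedged sketch):
import numpy as np

sensor_path = "/World/LightBeam_Sensor"
beam_hit = np.array(_ls.get_beam_hit_data(sensor_path), dtype=bool)
linear_depth = np.array(_ls.get_linear_depth_data(sensor_path))

for i in range(_ls.get_num_rays(sensor_path)):
    if beam_hit[i]:
        print(f"beam {i} blocked at {linear_depth[i]:.3f} m")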
Interface Methods
- class LightBeamSensorInterface
- get_beam_hit_data(self: omni.isaac.sensor._sensor.LightBeamSensorInterface, arg0: str) object
- Parameters
arg0 (str) – USD path to sensor as a string
- Returns
True for a light beam detecting a raycast hit, False for no hit
- Return type
numpy.ndarray
- get_hit_pos_data(self: omni.isaac.sensor._sensor.LightBeamSensorInterface, arg0: str) object
- Parameters
arg0 (str) – USD path to sensor as a string
- Returns
Hit positions in xyz for each light beam relative to sensor origin
- Return type
numpy.ndarray
- get_linear_depth_data(self: omni.isaac.sensor._sensor.LightBeamSensorInterface, arg0: str) object
- Parameters
arg0 (str) – USD path to sensor as a string
- Returns
The distance from the sensor to the hit for each light beam in meters
- Return type
numpy.ndarray
- get_num_rays(self: omni.isaac.sensor._sensor.LightBeamSensorInterface, arg0: str) int
- Parameters
arg0 (str) – USD Path to sensor as string
- Returns
The number of rays in the light curtain.
- Return type
int
- is_lightbeam_sensor(self: omni.isaac.sensor._sensor.LightBeamSensorInterface, arg0: str) bool
- Parameters
arg0 (str) – USD Path to sensor as string
- Returns
True if the prim is a light beam sensor, False otherwise.
- Return type
bool
Omnigraph Nodes
IsaacReadEffortSensor
Node that reads out joint effort values
- Inputs
execIn (execution): The input execution port.
prim (target): Path to the joint getting measured.
useLatestData (bool): True to use the latest data from the physics step, False to use the data measured by the sensor. Defaults to False.
enabled (bool): True to enable the sensor, False to disable the sensor. Defaults to True.
sensorPeriod (float): Downtime between sensor readings. Defaults to 0.
- Outputs
execOut (execution): Output execution triggers when sensor has data.
sensorTime (float): Timestamp of the sensor reading.
value (float): Effort value reading.
IsaacPrintRTXSensorInfo
Print raw RTX sensor data to console. Example of using omni.sensors Python bindings in an OmniGraph node.
- Inputs
exec (execution): The input execution port.
dataPtr (uint64): Pointer to RTX sensor render result.
IsaacComputeRTXRadarPointCloud
This node reads from an RTX Radar sensor and holds point cloud data buffers; a usage sketch follows the output list below.
- Inputs
exec (execution): The input execution port.
dataPtr (uint64): Pointer to Radar render result.
renderProductPath (token): Path of the renderProduct to wait on until it has been rendered.
- Outputs
exec (execution): Output execution triggers when Radar sensor has data.
transform (matrixd[4]): The input matrix transformed from Radar to World.
sensorID (uchar): Sensor Id for sensor that generated the scan.
scanIdx (uchar): Scan index for sensors with multi scan support.
timeStampNS (uint64): Scan timestamp in nanoseconds.
cycleCnt (uint64): Scan cycle count.
maxRangeM (float): The max unambiguous range for the scan.
minVelMps (float): The min unambiguous velocity for the scan.
maxVelMps (float): The max unambiguous velocity for the scan.
minAzRad (float): The min unambiguous azimuth for the scan.
maxAzRad (float): The max unambiguous azimuth for the scan.
minElRad (float): The min unambiguous elevation for the scan.
maxElRad (float): The max unambiguous elevation for the scan.
numDetections (uint): The number of valid detections in the array.
dataPtr (uint64): Buffer of 3d points containing point cloud data in Radar coordinates.
cudaDeviceIndex (int): Index of the device where the data lives (-1 for host data).
bufferSize (uint64): Size (in bytes) of the buffer (0 if the input is a texture).
height (uint): Height of point cloud buffer, will always return 1.
width (uint): 3 x Width or number of points in point cloud buffer.
radialDistance (float[]): Radial distance (m).
radialVelocity (float[]): Radial velocity (m/s).
azimuth (float[]): Azimuth angle (radians).
elevation (float[]): Angle of elevation (radians).
rcs (float[]): Radar cross section in decibels referenced to a square meter (dBsm).
semanticId (uint[]): semantic ID.
materialId (uint[]): material ID.
objectId (uint[]): object ID.
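A typical way to exercise this node is to create an RTX radar, render it through a render product, and read the point cloud back through a replicator annotator that wraps the node. The sketch below is an assumption-laden outline: the IsaacSensorCreateRtxRadar command arguments, the config name, and the annotator name are not documented in this section and should be treated as placeholders.
import omni.kit.commands
import omni.replicator.core as rep

# Sketch: create an RTX radar prim (path and config name are placeholders)
_, radar_prim = omni.kit.commands.execute(
    "IsaacSensorCreateRtxRadar",
    path="/World/RtxRadar",
    parent=None,
    config="Example",
)

# Render the sensor off-screen through a 1x1 render product
render_product = rep.create.render_product(radar_prim.GetPath(), [1, 1])

# Attach the annotator that wraps IsaacComputeRTXRadarPointCloud (name assumed)
annotator = rep.AnnotatorRegistry.get_annotator("RtxSensorCpuIsaacComputeRTXRadarPointCloud")
annotator.attach([render_product])

# After stepping the simulation, annotator.get_data() returns the node outputs
# (point buffer plus radialDistance, radialVelocity, azimuth, elevation, ...)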
IsaacReadIMU
Node that reads out IMU linear acceleration, angular velocity and orientation data
- Inputs
execIn (execution): The input execution port.
imuPrim (target): Usd prim reference to the IMU prim.
useLatestData (bool): True to use the latest data from the physics step, False to use the data measured by the sensor. Defaults to False.
readGravity (bool): True to include gravitational acceleration in the measurement, False to ignore it. Defaults to True.
- Outputs
execOut (execution): Output execution triggers when sensor has data.
sensorTime (float): Timestamp of the sensor reading.
linAcc (vectord[3]): Linear acceleration IMU reading.
angVel (vectord[3]): Angular velocity IMU reading.
orientation (quatd[4]): Sensor orientation as quaternion.
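The IMU node can be wired the same way as the other read nodes. In this sketch the graph path, IMU prim path, node type name (omni.isaac.sensor.IsaacReadIMU) and the target-input syntax are assumptions.
import omni.graph.core as og
import usdrt.Sdf

# Sketch: tick an IsaacReadIMU node each playback frame against an existing IMU prim
og.Controller.edit(
    {"graph_path": "/World/ImuGraph", "evaluator_name": "execution"},
    {
        og.Controller.Keys.CREATE_NODES: [
            ("OnTick", "omni.graph.action.OnPlaybackTick"),
            ("ReadIMU", "omni.isaac.sensor.IsaacReadIMU"),
        ],
        og.Controller.Keys.CONNECT: [
            ("OnTick.outputs:tick", "ReadIMU.inputs:execIn"),
        ],
        og.Controller.Keys.SET_VALUES: [
            # Target input pointing at the IMU sensor prim (placeholder path)
            ("ReadIMU.inputs:imuPrim", [usdrt.Sdf.Path("/World/Imu_Sensor")]),
        ],
    },
)

# Latest readings can be pulled from the node outputs while the simulation plays
lin_acc = og.Controller.get(og.Controller.attribute("/World/ImuGraph/ReadIMU.outputs:linAcc"))
ang_vel = og.Controller.get(og.Controller.attribute("/World/ImuGraph/ReadIMU.outputs:angVel"))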
IsaacCreateRTXLidarScanBuffer
This node creates a full scan buffer for an RTX Lidar sensor; a usage sketch follows the output list below.
- Inputs
exec (execution): The input execution port.
dataPtr (uint64): Pointer to LiDAR render result.
cudaDeviceIndex (int): Index of the device where the data lives (-1 for host data). Defaults to -1.
keepOnlyPositiveDistance (bool): Keep points only if the return distance is > 0. Defaults to True.
transformPoints (bool): Transform point cloud to world coordinates. Defaults to False.
accuracyErrorAzimuthDeg (float): Accuracy error of azimuth in degrees applied to all points equally.
accuracyErrorElevationDeg (float): Accuracy error of elevation in degrees applied to all points equally.
accuracyErrorPosition (float[3]): Position offset applied to all points equally.
renderProductPath (token): Path of the renderProduct from which the sensor configuration is read.
outputIntensity (bool): Create an output array for the Intensity. Defaults to True.
outputDistance (bool): Create an output array for the Distance. Defaults to True.
outputObjectId (bool): Create an output array for the ObjectId. Defaults to False.
outputVelocity (bool): Create an output array for the Velocity. Defaults to False.
outputAzimuth (bool): Create an output array for the Azimuth. Defaults to False.
outputElevation (bool): Create an output array for the Elevation. Defaults to False.
outputNormal (bool): Create an output array for the Normals. Defaults to False.
outputTimestamp (bool): Create an output array for the Timestamp. Defaults to False.
outputEmitterId (bool): Create an output array for the EmitterId. Defaults to False.
outputBeamId (bool): Create an output array for the BeamId. Defaults to False.
outputMaterialId (bool): Create an output array for the MaterialId. Defaults to False.
- Outputs
exec (execution): Output execution triggers when lidar sensor has data.
dataPtr (uint64): Pointer to LiDAR render result.
cudaDeviceIndex (int): Index of the device where the data lives (-1 for host data).
bufferSize (uint64): Size (in bytes) of the buffer (0 if the input is a texture).
transform (matrixd[4]): The transform matrix from lidar to world coordinates.
intensityPtr (uint64): intensity [0,1].
intensityDataType (float): type.
intensityBufferSize (uint64): size.
distancePtr (uint64): range in m.
distanceDataType (float): type.
distanceBufferSize (uint64): size.
azimuthPtr (uint64): azimuth in rad [-pi,pi].
azimuthDataType (float): type.
azimuthBufferSize (uint64): size.
elevationPtr (uint64): elevation in rad [-pi/2, pi/2].
elevationDataType (float): type.
elevationBufferSize (uint64): size.
velocityPtr (uint64): velocity at the hit point (m/s).
velocityDataType (float[3]): type.
velocityBufferSize (uint64): size.
objectIdPtr (uint64): ObjectId for getting usd prim information.
objectIdDataType (uint): type.
objectIdBufferSize (uint64): size.
normalPtr (uint64): Normal at the hit location.
normalDataType (float[3]): type.
normalBufferSize (uint64): size.
timestampPtr (uint64): timestamp in ns.
timestampDataType (uint64): type.
timestampBufferSize (uint64): size.
emitterIdPtr (uint64): emitterId.
emitterIdDataType (uint): type.
emitterIdBufferSize (uint64): size.
beamIdPtr (uint64): beamId.
beamIdDataType (uint): type.
beamIdBufferSize (uint64): size.
materialIdPtr (uint64): materialId at hit location.
materialIdDataType (uint): type.
materialIdBufferSize (uint64): size.
indexPtr (uint64): Index into the full array if keepOnlyPositiveDistance ((startTick+tick)*numChannels*numEchos + channel*numEchos + echo).
indexDataType (uint): type.
indexBufferSize (uint64): size.
numReturnsPerScan (uint): Number of returns in the full scan.
ticksPerScan (uint): Number of ticks in a full scan.
numChannels (uint): Number of channels of the lidar.
numEchos (uint): Number of echos of the lidar.
renderProductPath (token): Path of the renderProduct from which the sensor configuration is read.
height (uint): Height of point cloud buffer, will always return 1.
width (uint): 3 x Width or number of points in point cloud buffer.
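In practice this node is usually attached to a lidar render product through a replicator annotator rather than built by hand. The sketch below assumes the IsaacSensorCreateRtxLidar command, the Example_Rotary config, and the RtxSensorCpuIsaacCreateRTXLidarScanBuffer annotator name; treat all three as placeholders.
import omni.kit.commands
import omni.replicator.core as rep

# Sketch: create an RTX lidar prim (path and config name are placeholders)
_, lidar_prim = omni.kit.commands.execute(
    "IsaacSensorCreateRtxLidar",
    path="/World/RtxLidar",
    parent=None,
    config="Example_Rotary",
)
render_product = rep.create.render_product(lidar_prim.GetPath(), [1, 1])

# Attach the annotator that wraps IsaacCreateRTXLidarScanBuffer (name assumed)
annotator = rep.AnnotatorRegistry.get_annotator("RtxSensorCpuIsaacCreateRTXLidarScanBuffer")
annotator.attach([render_product])

# Once a full scan has accumulated, annotator.get_data() returns the buffered
# point cloud along with whichever per-return arrays were enabled above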
IsaacReadContactSensor
Node that reads out contact sensor data
- Inputs
execIn (execution): The input execution port.
csPrim (target): USD prim reference to contact sensor prim.
useLatestData (bool): True to use the latest data from the physics step, False to use the data measured by the sensor. Defaults to False.
- Outputs
execOut (execution): Output execution triggers when sensor has data.
sensorTime (float): Sensor reading timestamp.
inContact (bool): Bool that registers current sensor contact.
value (float): Contact force value reading (N).
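A wiring sketch for this node, following the same assumed pattern as the other read nodes (graph path, contact sensor prim path and node type name are placeholders):
import omni.graph.core as og
import usdrt.Sdf

# Sketch: drive an IsaacReadContactSensor node from the playback tick
og.Controller.edit(
    {"graph_path": "/World/ContactGraph", "evaluator_name": "execution"},
    {
        og.Controller.Keys.CREATE_NODES: [
            ("OnTick", "omni.graph.action.OnPlaybackTick"),
            ("ReadContact", "omni.isaac.sensor.IsaacReadContactSensor"),
        ],
        og.Controller.Keys.CONNECT: [
            ("OnTick.outputs:tick", "ReadContact.inputs:execIn"),
        ],
        og.Controller.Keys.SET_VALUES: [
            # Target input pointing at the contact sensor prim (placeholder path)
            ("ReadContact.inputs:csPrim", [usdrt.Sdf.Path("/World/Cube/Contact_Sensor")]),
        ],
    },
)

# Contact state and force (N) can then be read back from the node outputs
in_contact = og.Controller.get(og.Controller.attribute("/World/ContactGraph/ReadContact.outputs:inContact"))
force_n = og.Controller.get(og.Controller.attribute("/World/ContactGraph/ReadContact.outputs:value"))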
IsaacReadLightBeam
Node that reads out light beam sensor data
- Inputs
execIn (execution): The input execution port.
lightbeamPrim (target): Usd prim reference to the light beam prim.
- Outputs
execOut (execution): Output execution triggers when sensor has data.
numRays (int): The number of rays in light curtain.
beamHitData (bool[]): Array of bools that registers if a light beam is broken.
linearDepthData (float[]): Array containing linear depth data.
hitPosData (pointf[3][]): Array containing hit position data.
beamOrigins (pointf[3][]): Array containing origins of each beam.
beamEndPoints (pointf[3][]): Array containing end points of each beam.
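Finally, a wiring sketch for the light beam read node, reusing the sensor path from the earlier snippets (node type name and target-input syntax are assumptions):
import omni.graph.core as og
import usdrt.Sdf

# Sketch: evaluate an IsaacReadLightBeam node each playback frame
og.Controller.edit(
    {"graph_path": "/World/LightBeamGraph", "evaluator_name": "execution"},
    {
        og.Controller.Keys.CREATE_NODES: [
            ("OnTick", "omni.graph.action.OnPlaybackTick"),
            ("ReadBeam", "omni.isaac.sensor.IsaacReadLightBeam"),
        ],
        og.Controller.Keys.CONNECT: [
            ("OnTick.outputs:tick", "ReadBeam.inputs:execIn"),
        ],
        og.Controller.Keys.SET_VALUES: [
            # Target input pointing at the light beam sensor prim (placeholder path)
            ("ReadBeam.inputs:lightbeamPrim", [usdrt.Sdf.Path("/World/LightBeam_Sensor")]),
        ],
    },
)

# beamHitData then reports, per ray, whether the corresponding beam is broken
beam_hits = og.Controller.get(og.Controller.attribute("/World/LightBeamGraph/ReadBeam.outputs:beamHitData"))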
IsaacComputeRTXLidarFlatScan
Accumulates a full scan from the lowest elevation emitter on an RTX Lidar.
- Inputs
exec (execution): The input execution port.
dataPtr (uint64): Pointer to LiDAR render result.
bufferSize (uint64): Size (in bytes) of the buffer (0 if the input is a texture).
renderProductPath (token): Used to retrieve lidar configuration.
timeStamp (double): Input timestamp (s).
- Outputs
exec (execution): Output execution triggers when lidar sensor has accumulated a full scan.
horizontalFov (float): Horizontal Field of View (deg).
horizontalResolution (float): Increment between horizontal rays (deg).
depthRange (float[2]): Range for sensor to detect a hit [min, max] (m).
rotationRate (float): Rotation rate of sensor in Hz.
linearDepthData (float[]): Linear depth measurements from full scan, ordered by increasing azimuth (m).
intensitiesData (uchar[]): Intensity measurements from full scan, ordered by increasing azimuth.
numRows (int): Number of rows in buffers.
numCols (int): Number of columns in buffers.
azimuthRange (float[2]): The azimuth range [min, max] (deg).
timeStamp (double): Timestamp of first beam in scan.
IsaacComputeRTXLidarPointCloud
This node reads from an RTX Lidar sensor and holds point cloud data buffers.
- Inputs
exec (execution): The input execution port.
dataPtr (uint64): Pointer to LiDAR render result.
keepOnlyPositiveDistance (bool): Keep points only if the return distance is > 0. Defaults to True.
accuracyErrorAzimuthDeg (float): Accuracy error of azimuth in degrees applied to all points equally.
accuracyErrorElevationDeg (float): Accuracy error of elevation in degrees applied to all points equally.
accuracyErrorPosition (float[3]): Position offset applied to all points equally.
renderProductPath (token): Path of the renderProduct to wait on until it has been rendered.
- Outputs
exec (execution): Output execution triggers when lidar sensor has data.
transform (matrixd[4]): The transform matrix from lidar to world coordinates.
dataPtr (uint64): Buffer of points containing point cloud data in Lidar coordinates.
cudaDeviceIndex (int): Index of the device where the data lives (-1 for host data).
bufferSize (uint64): Size (in bytes) of the buffer (0 if the input is a texture).
intensity (float[]): intensity [0,1].
range (float[]): range in m.
azimuth (float[]): azimuth in rad [-pi,pi].
elevation (float[]): elevation in rad [-pi/2, pi/2].
height (uint): Height of point cloud buffer, will always return 1.
width (uint): 3 x Width or number of points in point cloud buffer.
IsaacReadRTXLidarData
This node reads the data straight from an RTX Lidar sensor.
- Inputs
exec (execution): The input execution port.
bufferSize (uint64): Number of bytes in dataPtr. Defaults to 0.
dataPtr (uint64): Pointer to LiDAR render result.
cudaDeviceIndex (int): Index of the device where the data lives (-1 for host data). Defaults to -1.
cudaStream (uint64): Cuda Stream dataPtr is on if cudaDeviceIndex > -1. Defaults to 0.
keepOnlyPositiveDistance (bool): Keep points only if the return distance is > 0. Defaults to False.
renderProductPath (token): Path of the renderProduct from which the sensor configuration is read.
- Outputs
exec (execution): Output execution triggers when lidar sensor has data.
numBeams (uint64): The number of lidar beams being output.
frameId (uint64): The frameId of the current render.
timestampNs (uint64): The time in nanoseconds of the start of frame.
transformStart (matrixd[4]): The transform matrix from lidar to world coordinates at the start of the frame.
transform (matrixd[4]): The transform matrix from lidar to world coordinates at the end of the frame.
depthRange (float[2]): The min and max range for sensor to detect a hit [min, max].
azimuths (float[]): azimuth in deg [0, 360).
elevations (float[]): elevation in deg [-90, 90].
distances (float[]): distance in m.
intensities (float[]): intensity [0,1].
velocities (pointf[3][]): velocity at hit point in sensor coordinates [m/s].
flags (uchar[]): flags.
hitPointNormals (pointf[3][]): hit point Normal.
deltaTimes (uint[]): delta time in ns from the head (relative to tick timestamp).
emitterIds (uint[]): beam/laser detector id.
materialIds (uint[]): hit point material id.
objectIds (uint[]): hit point object id.
ticks (uint[]): tick of point.
tickStates (uchar[]): emitter state the tick belongs to.
channels (uint[]): channel of point.
echos (uchar[]): echo id in ascending order.