Materials#
Introduction#
The design considerations for non-visual materials stem from the requirement that different sensor models need to compute return strength from ray-traced intersections. The concept of a material must therefore be a generic definition that operates from common executable code, i.e. a behavior, with complete and extensible inputs, outputs, and context, operable on both host and device hardware domains.
Non-visual material support for different sensor models (i.e. Lidar, Radar, and Ultrasonic) can be segmented into two domains: behavior and properties. The behavior is synonymous with a BSDF (Bidirectional Scattering Distribution Function), which defines how reflection and transmission subcomponents are calculated and combined into a specific light-matter interaction. The properties characterize the data that drive how each component of the BSDF behavior is determined as a function of the non-visual sensor wavelength and wave type. Upon return, the output structure contains a complete set of reflection and/or transmission coefficients that are agnostic to the source unit space set by the sensor model; these can be directed back to a receiver or continued through continuation-ray methods for advanced ray-tracing techniques. The agnostic unit space simply means that the BSDF behavior does not need to know what the input state is: it can be a normalized intensity, an absolute power/field measure, radiance, or any other radiometric space a sensor model is designed to run in.
The material system, which consists of the properties and the behaviors, must conform to the following design requirements in order to operate in the most generic sense for sensor models:
Complete
Layered concept
Ease of use for content developers
Ease of switching
Various domains
A system that is fully complete entails the ability for BSDFs to compute boundary reflections (air/material, material/air, and material/material), perform volumetric transmission and attenuation, handle inter-reflections between boundary surfaces, and offer phase and polarization options for advanced coherent light-source tracking. The layered concept provides the ability to add additional material sheets (such as paints and coatings) for advanced material-layer interactions in an "out of the box" experience, enabled simply by the non-visual material attribution on the content and by system parameters that enable or disable the advanced features. Ease of use for content developers stems from the fact that non-visual materials can be daunting and require domain expertise that not everyone has. By abstracting these intricate details behind semantic labeling, content developers can follow the same workflow with only minimal effort, encoding a non-visual material with the look and feel of visual material labeling. Ease of switching provides fast and easy remapping of material assignments and behaviors; this can be performed at the power-user level, where everything can be redefined, or at a more basic level of remapping existing materials to fit a specific need. Finally, various domains must be accounted for to provide a generic material system: the system must support different wave types and spectral information, meaning the materials must be spectrally dependent, providing accurate data for all wavelengths of the sensor models as well as for both electromagnetic and mechanical wave types.
The NVIDIA material infrastructure supports all of these requirements for Lidar, Radar, and Ultrasonic sensor models and wave types. To realize this, NvMaterials is an internal plugin system that encompasses an API for infrastructure execution within any of the currently supported sensor models. The semantic labeling, shown in the table below, is the backbone of ease of use for content developers to map non-visual materials in assets and content maps, with the added option to remap properties and behaviors. The remappings can fully re-implement the workflow by remapping existing BSDF behaviors and properties with a simple command-line argument (or carb setting).
The NVIDIA non-visual material labeling workflow can be found in the Current Materials section. This system provides the "ease of use" requirement with no intervention from the user. The semantic labeling for the non-visual materials is set in the USD asset and maps directly to a non-visual material. This provides a much easier content-based workflow that is less error prone. The procedure for how this is accomplished is discussed in the USD section below.
To summarize:
Non-visual materials stem from a visual domain counterpart through the semantic mapping defined below.
Non-visual materials are defined through USD attribution.
Materials are easy to use and map.
Materials can be easily re-mapped and re-defined for maximum extendability.
This is the full feature list of the NVIDIA BSDF subcomponents defining the behaviors derived from the input properties:
Simple BSDF behaviors like Constant and Default lambertian
Complex BSDF behaviors for full radiometric accountability respecting the material's physical and spectral properties
BSDF response function for reflection/transmission for material boundaries, volumes, and layers
This includes one-sided (air-to-material and vice versa) as well as double-sided (air-material-air) boundary types
Allow sampling to define the next-event contribution (i.e. continuation rays)
Ready params for various material bases, wave types, and spectral ranges
Deterministic
Optionally usable by users in combination with NV utils
Optional Phase and Polarization
Optional coating and layers boundary interactions (multilayer inter-reflection and inter-transmission functions)
Optional curvature estimation for augmenting the source ray divergence reflection and transmission calculations
NVIDIA Non-Visual Materials#
There is a limited but fairly complete set of materials in the current sensor material infrastructure. The details of the material behaviors and properties, and how their implementations come together, are not discussed here, but the materials themselves can be identified. Here is the list of existing materials for the current release:
Material | Description
---|---
Constant | Simple constant value that is returned from every surface. No complex boundary interactions are computed with this BSDF type.
DefaultMaterial | Simple Lambertian returning a cosine-weighted response with a set factor that has a default value of 0.15.
CoreMaterial | More complex BSDF computing responses for diffuse and specular coefficients based upon the material properties, resulting in reflection and transmission elements.
 | BSDF specific to ultrasonic sensor models that uses physical properties, rather than optical properties, for computing reflections and transmissions for mechanical wave types.
CompositeMaterial | A collection of the other BSDF behaviors.
Current Materials#
The non-visual materials and associated attributes are defined in USD, and these attributes produce an encoded material ID that is decoded in the BSDF processing to define the base and additional BSDF component behaviors. Below are the USD attributes for non-visual materials. The definition of the cube is not required; it is shown here to illustrate how the material:binding USD framework ties into a /Looks scope that contains the USD material schema. It is that material definition where the essential mapping defines which base material is chosen, along with any additional details for more advanced BSDF behavior modeling.
```usda
def Cube "Cube3" (
    prepend apiSchemas = []
)
{
    float3[] extent = [(-1, -1, -1), (1, 1, 1)]
    rel material:binding = </Looks/new_enc> (
        bindMaterialAs = "weakerThanDescendants"
    )
    vector3f physics:angularVelocity = (0, 0, 0)
    bool physics:collisionEnabled = 0
    bool physics:kinematicEnabled = 0
    bool physics:rigidBodyEnabled = 1
    vector3f physics:velocity = (0, 0, 0)
    bool physxRigidBody:disableGravity = 1 (
        allowedTokens = []
    )
    double size = 1.1
    quatf xformOp:orient = (0.707, 0.0, 0.0, -0.707)
    double3 xformOp:scale = (0.000299999888241291, 1, 1)
    double3 xformOp:translate = (-16, -55.25, 1.25)
    uniform token[] xformOpOrder = ["xformOp:translate", "xformOp:orient", "xformOp:scale"]
}

def Scope "Looks"
{
    def Material "new_enc"
    {
        token outputs:mdl:displacement.connect = </Looks/new_enc/Shader.outputs:out>
        token outputs:mdl:surface.connect = </Looks/new_enc/Shader.outputs:out>
        token outputs:mdl:volume.connect = </Looks/new_enc/Shader.outputs:out>
        custom string omni:simready:nonvisual:base = "aluminum"
        custom string omni:simready:nonvisual:coating = "clearcoat"
        custom string omni:simready:nonvisual:attributes = "retroreflective"

        def Shader "Shader"
        {
            uniform token info:implementationSource = "sourceAsset"
            uniform asset info:mdl:sourceAsset = @OmniPBR.mdl@
            uniform token info:mdl:sourceAsset:subIdentifier = "OmniPBR"
            color3f inputs:diffuse_color_constant = (0.67, 0.67, 0.97) (
                customData = {
                    float3 default = (0.2, 0.2, 0.2)
                }
                displayGroup = "Albedo"
                displayName = "Albedo Color"
                doc = "This is the albedo base color"
                hidden = false
            )
            token outputs:out
        }
    }
}
```
The example cube above has a material binding that corresponds to the /Looks definition for the applied material. Within the Material block, the base, coating, and attributes strings define the base material and additional information for more detailed sub-component BSDF behavior. The base material is required, but the coating and attributes fields are optional; when unused, they can be set to none. The prefix for the USD custom strings is shown in the assumed default state of omni:simready:nonvisual. Given that other content initiatives previously existed and modeled USD content differently for non-visual materials, the runtime can also support a different prefix, inputs:nonvisual. The carb setting /rtx/materialDb/nonVisualMaterialSemantics/prefix defines which of the two prefix strings to use.
In addition, the Material shown above defines an input diffuse_color_constant. This is typically defined in the content and would not need to be set from a user-defined asset perspective. However, if the base material is a calibration_lambertion type, then the diffuse_color_constant defined here (along with the carb setting for enabling reflectance information) will fill diffuseRefl in the NvMatInput struct. In the case of the calibration_lambertion base material type, the red and green channels define the Lambertian reflectance factor and the blue channel defines the roughness of the surface for normal-vector perturbation. This only applies to DefaultMaterial BSDF behavior types. The more complex BSDF behaviors (such as CoreMaterial and CompositeMaterial) will use this information for more complex behavior indexing and for specular and diffuse computations.
The list below details all the supported base materials, coatings, and attributes.
Materials, Coatings, and Attributes#
Base Materials#
Material Name | Index | Description
---|---|---
none | 0 | Default, unlabeled, or unspecified
Metals#
Material Name | Index | Description
---|---|---
aluminum | 1 | Signs, poles, etc.
steel | 2 | Heavy construction metals
oxidized_steel | 3 | Rusted steel
iron | 4 | Manhole covers, drainage grills, etc.
oxidized_iron | 5 | Rusted iron
silver | 6 | Shiny metals
brass | 7 | Architecture
bronze | 8 | Statues, etc.
oxidized_Bronze_Patina | 9 | Old statues
tin | 10 | Food cans, etc.
Polymers#
Material Name | Index | Description
---|---|---
plastic | 11 | Generic plastics, etc.
fiberglass | 12 | Car bumpers, etc.
carbon_fiber | 13 | Car parts, bicycle parts, etc.
vinyl | 14 | Car interiors, etc.
plexiglass | 15 | Light covers, etc.
pvc | 16 | Water tubing, etc.
nylon | 17 | Plastic bags, etc.
polyester | 18 | Some clothing, etc.
Glass#
Material Name | Index | Description
---|---|---
clear_glass | 19 | Glass that is clear with no contaminants
frosted_glass | 20 | Glass that has volumetric particulates/imperfections
one_way_mirror | 21 | Building glass panels
mirror | 22 | Side mirrors, etc.
ceramic_glass | 23 | Heavy-duty glass in construction
Other#
Material Name | Index | Description
---|---|---
asphalt | 24 | Roads, etc.
concrete | 25 | Construction
leaf_grass | 26 | Live vegetation
dead_leaf_grass | 27 | Dead vegetation
rubber | 28 | Tires, etc.
wood | 29 | Construction
bark | 30 | Trees, vegetation
cardboard | 31 | Boxes, etc.
paper | 32 | Newspapers, paper bags, writing paper, etc.
fabric | 33 | Clothing
skin | 34 | Human, pig, etc.
fur_hair | 35 | Human head, animal, etc.
leather | 36 | Clothing, car interiors, etc.
marble | 37 | Construction
brick | 38 | Construction
stone | 39 | Nature, stones that have structure
gravel | 40 | Nature, finer stones such as pebbles
dirt | 41 | Nature, very fine granules such as sand/dust
mud | 42 | Nature, wet dirt
water | 43 | Nature, water fountains, lakes, rivers, etc.
salt_water | 44 | Nature, oceans and seas, free from biologics
snow | 45 | Nature, frozen water droplets (crystalline)
ice | 46 | Nature, frozen water, larger bodies
calibration_lambertion | 47 | Special material with defined diffuse reflectance, such as target panels with known reflectance
All the base material IDs have a default mapping to a CompositeMaterial behavior except for none and calibration_lambertion. These two materials map to a DefaultMaterial exhibiting Lambertian-type reflectance behavior.
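Expressed as a lookup, the default base-to-behavior assignment can be sketched as follows (the names come from the tables and text above; the helper function itself is illustrative, not part of the NvMaterials API):

```python
# Every base material defaults to CompositeMaterial except these two,
# which map to the Lambertian DefaultMaterial behavior.
LAMBERTIAN_BASES = {"none", "calibration_lambertion"}

def default_behavior(base_name: str) -> str:
    return "DefaultMaterial" if base_name in LAMBERTIAN_BASES else "CompositeMaterial"

assert default_behavior("aluminum") == "CompositeMaterial"
assert default_behavior("calibration_lambertion") == "DefaultMaterial"
```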
Coatings#
Coating Name | Index | Description
---|---|---
none | 0 | Default, unlabeled, or unspecified coating
paint | 1 | Painted
clearcoat | 2 | Clear-coated
paint_clearcoat | 3 | Painted and clear-coated
TBD | 4 | Unused, reserved for future
TBD | 5 | Unused, reserved for future
TBD | 6 | Unused, reserved for future
TBD | 7 | Unused, reserved for future
Attributes#
Attribute Name | Index | Description
---|---|---
none | 0 | Unspecified attribute
emissive | 1 | Energy-emitting surface
retroreflective | 2 | Retroreflective surface
single_sided | 4 | Single-sided surface (non-thin geometry)
visually_transparent | 8 | Material is visually transparent
TBD | 16 | Unused, reserved for future
The specification tables above show the options and material types that can be used with the USD attribution schemes. These mappings are read in and then used to encode a final material ID. The BSDF decodes the materialId and uses the fields and base ID to control how it computes the reflectance and transmittance values. Note that the semantic material mappings to non-visual materials are done automatically for ease of use. For more advanced usage, the base material ID to non-visual material mapping can be remapped, as described below.
The material ID is a uint16_t that uniquely encodes the base material, coatings, and attributes. The lower byte of the material ID identifies the base material index. The upper byte encodes the coatings and attributes: the lower 3 bits of the upper byte encode the paints and coatings, and the upper 5 bits encode the attributes. The attributes are encoded as a bit field, with each bit uniquely mapping to one of the 5 attributes defined in the table above:
```
attributes  coatings  base material
xxxxx       xxx       xxxxxxxx
```
For example, a base material of steel with a paint coating and a retroreflective attribute produces the following encoded materialId:
```
attribute  coating  base material  Final materialId generated
00010      001      00000010       0b0001000100000010 = 0x1102 = 4354
```
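The packing above can be sketched in a few lines of Python (the function names are illustrative, not part of the NvMaterials API):

```python
def encode_material_id(base: int, coating: int = 0, attributes: int = 0) -> int:
    """Pack base (lower byte), coating (3 bits), and the 5-bit attribute bit field."""
    assert 0 <= base < 256 and 0 <= coating < 8 and 0 <= attributes < 32
    return (attributes << 11) | (coating << 8) | base

def decode_material_id(material_id: int) -> tuple[int, int, int]:
    """Return (base, coating, attributes) from a packed 16-bit materialId."""
    return material_id & 0xFF, (material_id >> 8) & 0x07, (material_id >> 11) & 0x1F

# steel (base 2) + paint (coating 1) + retroreflective (attribute bit value 2)
assert encode_material_id(2, coating=1, attributes=2) == 0x1102 == 4354
assert decode_material_id(0x1102) == (2, 1, 2)
```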
It is the responsibility of the BSDF to decode the materialId and perform the specific sub-actions defining the desired behavior derived from the data and the attribution. Despite this requirement, adhering to the material ID specification and leveraging USD for the mapping provides an easy method for getting non-visual materials from source content to sensor-specific behavior. The Material Manager provides additional insight into the specifics of the behavioral controls.
Material management is summed up in just two actions that are simple and less error prone:
1. Provide command-line arguments (or carb settings) to enable USD-based materials and any optional remappings
2. Fill the material map for the Material Manager to process
Additional arguments exist per sensor modality to provide a means to remap BSDF behaviors and non-visual materials per material ID. The remapping only needs to be performed for the material IDs requiring the desired change. For example, the following settings remap the BSDF behavior and non-visual material for material indices 5 and 6 of a lidar sensor:
```
--/app/sensors/nv/lidar/matBehaviorToIdOverrides="CompositeMaterial:5;CoreMaterial:6"
--/app/sensors/nv/lidar/matNameToIdMapOverrides="asphalt:5;aluminum:6"
```
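Both override settings use the same `Name:ID;Name:ID` string format. A small illustrative parser shows how such a string decomposes (this is a sketch of the format, not the actual runtime parser):

```python
def parse_override_string(overrides: str) -> dict[int, str]:
    """Split 'Name:ID;Name:ID' pairs into a {material_id: name} mapping."""
    result = {}
    for pair in filter(None, overrides.split(";")):
        name, material_id = pair.split(":")
        result[int(material_id)] = name
    return result

assert parse_override_string("asphalt:5;aluminum:6") == {5: "asphalt", 6: "aluminum"}
assert parse_override_string("") == {}
```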
In this example, material IDs 5 and 6 have been remapped to the asphalt and aluminum properties from the default mapping of oxidized_iron and silver, as indicated in the table above. Additionally, the BSDF behavior has been remapped to a CompositeMaterial and a CoreMaterial, respectively. Such changes can be made at startup or during runtime. If runtime changes are desired, one additional setting is required, which can be set via Python using the carb interface:
```
--/app/sensors/nv/materials/resetMaterials=true
```
The matBehaviorToIdOverrides and matNameToIdMapOverrides settings must be set before the resetMaterials flag is set for runtime changes, because the remappings must be defined before the materials can be reset. Note that the remappings can be null, which results in an automatic mapping from the materials identified in the table above to the non-visual materials.
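From Python inside a running Kit application, the runtime remap sequence described above might look like the following. This is a sketch assuming the standard carb.settings interface is available; note that the overrides are set first and the reset flag last:

```python
import carb.settings

settings = carb.settings.get_settings()

# Define the remappings first...
settings.set("/app/sensors/nv/lidar/matBehaviorToIdOverrides",
             "CompositeMaterial:5;CoreMaterial:6")
settings.set("/app/sensors/nv/lidar/matNameToIdMapOverrides",
             "asphalt:5;aluminum:6")

# ...then request the material reset so the runtime reprocesses them.
settings.set("/app/sensors/nv/materials/resetMaterials", True)
```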
A final point that a sensor modality needs to account for is communicating with the rtxSensor infrastructure to request additional data for additional features in the BSDF behavior, such as visible-band diffuse reflectance and roughness. These optional requests to rtxSensor are enabled by two additional flags, as shown in the following C++ code for a sensor model implementation:
```cpp
// result is a rtx::rtxsensor::RtxSensorRequirements* type coming from the rtxSensor call to getModelRequirements
result->returnDataConfig |= rtx::rtxsensor::RtxSensorReturnData::RTXSENSOR_RETURN_MATERIAL_REFLECTANCE;
result->returnDataConfig |= rtx::rtxsensor::RtxSensorReturnData::RTXSENSOR_RETURN_MATERIAL_ADDITIONAL;
```
These RtxSensorRequirements flags are enabled in the per-modality settings. Here is an example enabling these requirements for a lidar sensor:
```
--/app/sensors/nv/lidar/enableRtxReflectanceInformation=true
--/app/sensors/nv/lidar/enableAdditionalRtxReturnInformation=true
```
The additional information comes in the form of a roughness parameter for visual-material surface normal variances. Though the non-visual material properties include such information, the visible-band information provides another variable to augment the non-visual material roughness, giving a roughness scale similar to the visual one. In addition, the visible-band diffuse reflectance information can be accessed when a paint is defined in the non-visual material schema. The diffuse color provides a lookup to retrieve spectral paint data to augment the base material return in a layered fashion for the given sensor modality wavelength.
The NVIDIA sensor models can conditionally enable or disable these features through two carb settings per modality. The following settings show each of the sensor modalities requesting additional information from rtxSensor:
```
--/app/sensors/nv/lidar/enableRtxReflectanceInformation=true
--/app/sensors/nv/lidar/enableAdditionalRtxReturnInformation=true
--/app/sensors/nv/radar/enableRtxReflectanceInformation=true
--/app/sensors/nv/radar/enableAdditionalRtxReturnInformation=true
--/app/sensors/nv/ultrasonic/enableRtxReflectanceInformation=true
--/app/sensors/nv/ultrasonic/enableAdditionalRtxReturnInformation=true
```
Below is the full list of carb settings for controlling material properties and BSDF behaviors. The values shown are their defaults:
```
--/app/sensors/nv/lidar/enableRtxReflectanceInformation=false
--/app/sensors/nv/lidar/enableAdditionalRtxReturnInformation=false
--/app/sensors/nv/lidar/enableRtxSensorGeometry=false
--/app/sensors/nv/lidar/enablePolarization=false
--/app/sensors/nv/lidar/matBehaviorToIdOverrides=""
--/app/sensors/nv/lidar/matNameToIdMapOverrides=""
--/app/sensors/nv/radar/enableRtxReflectanceInformation=false
--/app/sensors/nv/radar/enableAdditionalRtxReturnInformation=false
--/app/sensors/nv/radar/enableRtxSensorGeometry=false
--/app/sensors/nv/radar/enablePolarization=false
--/app/sensors/nv/radar/matBehaviorToIdOverrides=""
--/app/sensors/nv/radar/matNameToIdMapOverrides=""
--/app/sensors/nv/ultrasonic/enableRtxReflectanceInformation=false
--/app/sensors/nv/ultrasonic/enableAdditionalRtxReturnInformation=false
--/app/sensors/nv/ultrasonic/enableRtxSensorGeometry=false
--/app/sensors/nv/ultrasonic/enablePolarization=false
--/app/sensors/nv/ultrasonic/matBehaviorToIdOverrides=""
--/app/sensors/nv/ultrasonic/matNameToIdMapOverrides=""
--/app/sensors/nv/materials/resetMaterials=false
--/app/sensors/nv/materials/enableMaterialInterLayerContribution=false
--/app/sensors/nv/materials/preserveMaterialFlags=0xff
--/rtx/materialDb/nonVisualMaterialSemantics/prefix=omni:simready:nonvisual
```
The only parameters that have not been discussed in this document are the enableRtxSensorGeometry, enablePolarization, and enableMaterialInterLayerContribution flags. They enable the model to perform additional logic for per-vertex normals and vertices in the rtxSensorReturns (in support of curvature and divergence computations), polarization tracking of material interactions, and inter-layer reflectance/transmittance calculations, respectively. The resetMaterials flag will reprocess the materials that are redefined via matBehaviorToIdOverrides and matNameToIdMapOverrides. The preserveMaterialFlags setting is a bit field for filtering the upper byte of the material ID, which is reserved for paint, coating, and attribute specifications; the default of 0xFF means that all flags are applicable. The last setting selects the non-visual material prefix naming scheme, which can be either inputs:nonvisual or omni:simready:nonvisual.
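As a concrete reading of the preserveMaterialFlags filtering, the mask can be applied to the upper byte while leaving the base-material byte intact. This is an assumed interpretation based on the bit layout described earlier, not the actual runtime code:

```python
def apply_preserve_mask(material_id: int, preserve_flags: int) -> int:
    """Mask the coating/attribute byte of a materialId while keeping the base byte."""
    upper = ((material_id >> 8) & preserve_flags) & 0xFF
    return (upper << 8) | (material_id & 0xFF)

# 0x1102: steel base (2) + paint coating + retroreflective attribute.
assert apply_preserve_mask(0x1102, 0xFF) == 0x1102  # default: all flags preserved
assert apply_preserve_mask(0x1102, 0x00) == 0x0002  # base material only
```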
Plugins#
This section describes the plugins used to deliver the non-visual material framework discussed above.
MaterialReader#
The material reader provides an easy-to-use layer for loading material property data, organizing material property and behavior mappings, and overriding/remapping properties and/or behaviors. It is an internal plugin leveraged by all sensor modalities to interface between the sensor and the Material Manager, which ultimately links to the hit shader for processing the returns from the RtxSensor ray tracer. It handles the details of parsing the spectral material data and organizing the default (or remapped) BSDF behaviors, as well as the spectral data for the given materialId. The output from the material reader is then passed to the Material Manager, which defines the device-side kernels that store the material data and behavior details.
Troubleshooting#
Of all the settings defined above, the most important to be aware of is --/rtx/materialDb/nonVisualMaterialSemantics/prefix. If a simulation is run and the non-visual materials are observed to be the same for all the content, it is likely that the content is using the inputs:nonvisual prefix while the default setting is omni:simready:nonvisual.
If --/app/sensors/nv/materials/preserveMaterialFlags is set to 0, only the base material is used; the coatings and additional BSDF attributions are ignored. If the materialId exhibits only the base index but additional flags are expected, check the state of the preserveMaterialFlags setting. If it is set as expected, then the likely culprits are the string tokens set in the USD content.
One other useful feature in Kit is the ability to view the non-visual materials in a false-color rendering mode in a viewport, called the non-visual material debugger. Under the RenderSettings->Common tab in Kit, the View->DebugView tab has an entry for Non-Visual Material ID. This renders the non-visual materials and additionally allows picking in the viewport to evaluate the materialId. This is a useful tool to help identify the content materialId mappings.