Set Up Omniverse SDK#
To develop a new Kit XR app, you start by setting up the required settings and dependencies, using the Kit app template as the foundation for your XR-specific application. The Kit app template can be forked, customized, and used to develop, package, and share extensions. It is the core toolkit for building GPU-accelerated applications within the NVIDIA Omniverse ecosystem, and it includes pre-configured templates, tools, and sample code designed to simplify and speed up the creation of high-performance applications using OpenUSD.
Clone the Kit Repository#
Choose a folder on your local machine to clone the repository into, keeping the directory structure shallow with minimal nested subdirectories when building your application. Be sure to create a new folder rather than using a location where other templates reside, as this could lead to configuration errors or conflicts when the template is updated. If you clone the repository into a folder containing other repositories and encounter issues after a template change, follow the instructions on the Troubleshooting page to resolve the problem.
Copy the GitHub repository URL; you will need it in a later step:
https://github.com/NVIDIA-Omniverse/kit-app-template
Open a new terminal in your code editor.
Clone the GitHub repository.
git clone https://github.com/NVIDIA-Omniverse/kit-app-template.git
Navigate into the newly cloned directory.
cd kit-app-template
In the terminal, check out the specific Kit version used in this guide with the provided tag. Do not worry about the “You are in ‘detached HEAD’ state” message; this is expected because you’re checking out a specific tag, not a branch.
git checkout kit-106.0.3
In the terminal, create a new Kit application using the template new command.
.\repo.bat template new
This command creates an application containing the Kit SDK.
You can give your application a title and version number if you’d like. You must accept the Omniverse EULA during the template creation process. Use the arrow keys on your keyboard to choose an option, or type a response where appropriate, and press Enter for each selection.
Make a selection for the type of application you want to create:
Select USD Composer to create a Kit app for authoring, editing, and testing your USD data in XR. You can keep the default name and version if you’d like; these values are written into the generated .kit file (see the sketch after this list).
Select Kit Base Editor if you’d like to create a smaller-footprint application just for streaming your USD.
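A minimal sketch of where those prompts end up, assuming the standard Kit app layout in which the title and version live in the generated .kit file's [package] table; the values shown are illustrative, not taken from this guide:

[package]
# Illustrative values; your generated .kit file contains whatever you
# entered (or accepted as defaults) during the template new prompts.
title = "My AVP XR Experience"
version = "0.1.0"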
Modify your Kit App Extensions & Settings#
Open \kit-app-template\source\apps\{templateFileName}.kit in your code editor. Add extension dependencies at the end of the [dependencies] list. The extensions you need differ depending on whether you created the template from Kit Base Editor or USD Composer. One variant of the template needs only the XR, simulator, and environment extensions:

# XR extensions for AVP development
"omni.kit.xr.ogn" = {}
"omni.kit.xr.profile.ar" = {}

# XR Simulator Extensions
"omni.kit.xr.system.simulatedxr" = {}

# Environment Core
"omni.kit.environment.core" = {}

The other variant also needs the GDN publishing and Action Graph extensions:

# XR extensions for AVP development
"omni.kit.xr.ogn" = {}
"omni.kit.xr.profile.ar" = {}

# XR Simulator Extensions
"omni.kit.xr.system.simulatedxr" = {}

# Environment Core
"omni.kit.environment.core" = {}

# GDN Publishing Extension
"omni.configurator_publisher" = {}

# Action Graph
"omni.graph.bundle.action" = {}
"omni.graph.window.action" = {}
These Omniverse XR extensions provide tools for XR development, AR support, and XR simulation, enabling efficient creation and testing of immersive experiences. The GDN publishing extension streamlines asset preparation for the Graphics Delivery Network, while the environment core extension manages virtual settings. Action graph extensions enable visual scripting, simplifying workflow automation and the creation of complex interactions.
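For context, here is a minimal sketch of how the end of the [dependencies] table might look after the XR extensions are appended; the entry above the appended block is a hypothetical placeholder standing in for whatever the template generated:

[dependencies]
# ...extensions generated by the template (placeholder below; yours will differ)...
"omni.kit.uiapp" = {}

# Appended XR extensions for AVP development
"omni.kit.xr.ogn" = {}
"omni.kit.xr.profile.ar" = {}
"omni.kit.xr.system.simulatedxr" = {}
"omni.kit.environment.core" = {}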
Append the following settings to the end of the [settings] list. Use the set that matches the template you picked (USD Composer or Kit Base Editor):

# XR Settings for AVP development
xr.cloudxr.version = 4.1
xr.depth.aov = "GBufferDepth"
xr.simulatedxr.enabled = true

# Performance Settings
persistent.renderer.raytracingOmm.enabled = true
rtx-transient.resourcemanager.enableTextureStreaming = false
xr.ui.enabled = false

# Default AR mode to "Stage" render quality
defaults.xr.profile.ar.renderQuality = "off"

# Enable CloudXR 4.1 by default
defaults.xr.profile.ar.system.display = "CloudXR41"

# Set near clipping plane for AR to 0.15 meters
persistent.xr.profile.ar.render.nearPlane = 0.15

# Avoid some glass artifacts
persistent.rtx.sceneDb.allowDuplicateAhsInvocation = false

The second set is identical except that it also enables the Fabric Scene Delegate at the end:

# XR Settings for AVP development
xr.cloudxr.version = 4.1
xr.depth.aov = "GBufferDepth"
xr.simulatedxr.enabled = true

# Performance Settings
persistent.renderer.raytracingOmm.enabled = true
rtx-transient.resourcemanager.enableTextureStreaming = false
xr.ui.enabled = false

# Default AR mode to "Stage" render quality
defaults.xr.profile.ar.renderQuality = "off"

# Enable CloudXR 4.1 by default
defaults.xr.profile.ar.system.display = "CloudXR41"

# Set near clipping plane for AR to 0.15 meters
persistent.xr.profile.ar.render.nearPlane = 0.15

# Avoid some glass artifacts
persistent.rtx.sceneDb.allowDuplicateAhsInvocation = false

# Enable the Fabric Scene Delegate
useFabricSceneDelegate = true
The XR settings for AVP development primarily focus on configuring and optimizing the virtual and augmented reality experience. They enable specific technologies like CloudXR, set up depth and rendering parameters, and adjust performance-related options. These settings also define default configurations for AR mode, establish display preferences, and activate necessary components like the Fabric Scene Delegate. Overall, these configurations aim to create a smooth, efficient, and high-quality XR experience while balancing performance and visual fidelity for AVP (Apple Vision Pro) development.
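A short note on the TOML syntax used in these snippets (this is general TOML behavior, not something specific to this guide): unquoted dotted keys under [settings] expand into a nested settings path, whereas the quoted extension names under [dependencies] keep their dots as part of a single key. The two forms below are equivalent alternatives; do not use both for the same key in one file:

# Dotted-key form, as used in the snippets above
[settings]
xr.simulatedxr.enabled = true

# Equivalent nested-table form (an alternative to the above, not in addition to it)
[settings.xr.simulatedxr]
enabled = true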
Save your .kit file.
Build and Launch the Kit Application#
Build the App#
To be able to run the application, you need to build it first. The build process compiles your application and its extensions, preparing them for launch.
In the terminal window, enter the following command to build your app:
.\repo.bat build
Wait a few moments while the application builds.
The repo tools will also ask you to build a “_streaming” version of your Kit app; you can accept the defaults and ignore this additional Kit app that is created.
When the build process is finished, a message appears:
BUILD (RELEASE) SUCCEEDED (<time taken>)
In the terminal, launch your application with the developer bundle enabled by entering the following command:
.\repo.bat launch -d
When asked in the terminal window, press Enter to select the application and start the launch process.
Again, ignore the “_streaming” version of your Kit app template. The first time a Kit application is launched, it compiles and caches the RTX shaders; this can take several minutes.
When the launch process is complete, the application opens in a new window.
Once your application is built, you’re ready to open your dataset.
Download the USD dataset from here.
Open the visionpro_purse_example.usd file and allow all the files to load. At this point, if you need to get more familiar with Omniverse, its default interface and UI, or OpenUSD, head to the USD Composer documentation.
Note
Depending on the app template you select, you may end up adding duplicate settings and dependencies to your .kit file. If you receive an error like Encountered exception TOMLKitError: Redefinition of an existing table, double-check your .kit file for duplicate entries and remove the unwanted ones.
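As an illustration of the kind of duplication that triggers this error (a hypothetical excerpt, not taken from the template), TOML does not allow the same table header to be declared twice in one file:

[settings]
xr.simulatedxr.enabled = true

# ...further down in the same file...

[settings]   # Declaring [settings] a second time causes the TOML error;
xr.cloudxr.version = 4.1   # move these keys up into the existing [settings] table instead.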
Launch App in Simulated XR#
Let’s run this stage with Simulated XR, our XR output system that runs inside Omniverse and emulates the resolution and stereo output of an immersive device without requiring a headset.
Locate the Panel named AR.
Change the Output Plugin from CloudXR41 to Simulated XR.
Click Start AR.
The Viewport should show both eyes being rendered with the correct resolution. You can use the normal Viewport controls to move around the scene. For the Purse, you can use the debug keys we’ve set up in our logic graph: 1 cycles through the colors, 2 cycles through the styles, 3 cycles through the cameras, 4 toggles context mode, and 5 cycles through the environments.
Note the FPS in the Viewport. With Simulated XR you should be getting around 45 FPS on our recommended DevKit hardware. Keep in mind that when you stop Simulated XR, the FPS displayed in the default Viewport HUD may be significantly higher. This is because DLSS Frame Generation is not supported in XR, and the regular viewport renders only one eye instead of two.
Later, you’ll start to manipulate the ActionGraph logic in the scene, but first, let’s set up our Xcode environment.