Review Client Features and UI#
Now that the client is running, it’s a great time to familiarize yourself with some of the features you can build with the SDK.
Portals & Volumes#
Omniverse renders on visionOS can be displayed in Portal mode, which acts as a view into another environment, or in Volume mode, where objects are placed directly into the user's space.
Portal mode renders the scene in stereo with head tracking, so objects appear just beyond the portal opening. It is ideal for art-directed environments, or for digital twins that are too large to view comfortably in volume mode.
To switch between portal and volume modes, select the Cube icon in the top left corner of the Configuration window.
Gestures#
Gestures are available in the Volume context. In volume mode, you can pinch and hold with one hand while moving that hand left or right to rotate the Purse; this is a common "click and drag" gesture in visionOS. Likewise, you can pinch and hold with both hands and move your hands closer together or farther apart to scale the Purse. These gestures are calculated locally on the Vision Pro using the built-in visionOS gesture system, and the desired rotation and scale values are sent back to Omniverse to update the stage.
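The gesture flow above can be sketched in SwiftUI. This is a hedged illustration, not the sample's actual code: `PurseGestureView`, the `"Purse"` asset name, the `0.25` drag-to-degrees factor, and `sendToOmniverse` are all hypothetical placeholders standing in for the client's real messaging layer. On visionOS, a one-handed pinch-drag arrives as a `DragGesture` and a two-handed pinch as a `MagnifyGesture`.

```swift
import SwiftUI
import RealityKit

// Hedged sketch: one-handed pinch-drag rotates the model, a two-handed
// pinch scales it. `sendToOmniverse` is a placeholder for the message the
// client would send back to Omniverse to update the stage.
struct PurseGestureView: View {
    @State private var rotationDegrees: Double = 0
    @State private var scale: Double = 1

    var body: some View {
        Model3D(named: "Purse")  // hypothetical asset name
            .gesture(
                DragGesture()
                    .onChanged { value in
                        // Map horizontal pinch-drag distance to a yaw angle.
                        sendToOmniverse(rotation: rotationDegrees + value.translation.width * 0.25,
                                        scale: scale)
                    }
                    .onEnded { value in
                        rotationDegrees += value.translation.width * 0.25
                    }
            )
            .gesture(
                MagnifyGesture()
                    .onChanged { value in
                        // Two-handed pinch: hands moving apart increases magnification.
                        sendToOmniverse(rotation: rotationDegrees,
                                        scale: scale * value.magnification)
                    }
                    .onEnded { value in
                        scale *= value.magnification
                    }
            )
    }

    // Placeholder: the sample sends the desired values to the Omniverse session.
    private func sendToOmniverse(rotation: Double, scale: Double) { }
}
```

Note that only the desired values are sent to the server; the authoritative transform lives in the Omniverse stage, which keeps the client and any other connected sessions consistent.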
Placement Tool#
The Placement Tool on Apple Vision Pro (AVP) allows you to position and anchor a virtual asset onto flat, horizontal surfaces within an AR environment. To use it, ensure the configurator project is running on a physical Vision Pro device, as it won't work in the visionOS Simulator. Activate placement mode by pressing the "Place" button, which brings up a placement puck that follows your headset's direction. Move the puck to the desired location, then use a pinch gesture to anchor the asset in place. Ensure both the client app and the server-side configurator USD file have active and connected sessions for successful placement.
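The horizontal-surface detection that the placement puck relies on can be sketched with visionOS ARKit plane detection. This is an assumption-laden outline, not the sample's implementation: it only shows how an app might receive horizontal plane anchors that a puck could snap to, and it requires a physical device plus the NSWorldSensingUsageDescription key covered in the next section.

```swift
import ARKit  // visionOS ARKit

// Hedged sketch: run plane detection for horizontal surfaces so a placement
// puck can follow them. Unavailable in the visionOS Simulator.
let session = ARKitSession()
let planeDetection = PlaneDetectionProvider(alignments: [.horizontal])

func startPlaneDetection() async throws {
    try await session.run([planeDetection])
    for await update in planeDetection.anchorUpdates {
        // Each PlaneAnchor carries a world-space transform; the puck would
        // follow these surfaces, and a pinch would anchor the asset at the
        // puck's current transform.
        let surfaceTransform = update.anchor.originFromAnchorTransform
        _ = surfaceTransform
    }
}
```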
Xcode Project Configuration for the Placement Tool#
In Xcode, select Configurator > Info > Targets > Configurator.
Verify that the following keys exist:
NSWorldSensingUsageDescription: Needed to track the model's position in the world
NSHandsTrackingUsageDescription: Required to accurately place models and to stream hand tracking data to a remote server
These keys are required for AR placement functionality. They inform the user why the app needs access to world sensing and hand tracking capabilities.
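In the target's Info.plist source, the two entries look like the following sketch; the usage strings here are examples taken from the descriptions above and can be reworded to suit your app.

```xml
<!-- Sketch of the Info.plist entries; string values are example wording. -->
<key>NSWorldSensingUsageDescription</key>
<string>Needed to track the model's position in the world.</string>
<key>NSHandsTrackingUsageDescription</key>
<string>Required to accurately place models and to stream hand tracking data to a remote server.</string>
```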