CloudXR Cookbook: Common Feature Snippets#
Note
Applies to: Spatial Extensions, Kit 109.0.3+, CloudXR 6
This cookbook provides ready-to-use code snippets for common features you may want to add to your CloudXR client application. Each snippet builds on top of the Generic Viewer and the CloudXRKit SDK.
Note
All snippets assume you have a connected CloudXRSession with at least one available MessageChannel. The Generic Viewer’s ServerActionsView.swift demonstrates the basics of channel selection and raw string messaging; the snippets below show how to build structured, typed interactions on top of that foundation.
Snippet Overview#
- Type-safe JSON messages to the server.
- Switch colors, styles, and materials.
- Switch between scene cameras.
- Rotate the scene with hand gestures.
- Pinch to scale the scene.
- Control lighting with a SwiftUI slider.
- Show/hide objects using variants.
- Smooth fade in/out transitions.
- Listen for server command results.
- Display content in a visionOS portal.
Sending Structured JSON Messages to the Server#
Send typed, structured JSON messages to your Omniverse application instead of raw strings.
Prerequisites
A connected CloudXRSession with an available MessageChannel.
import Foundation
import CloudXRKit

// A protocol for structured messages that encode to JSON with a "type" key.
protocol ServerMessage: Encodable {
    associatedtype Payload: Encodable
    var type: String { get }
    var message: [String: Payload] { get }
}

// Encode any ServerMessage to Data for sending over a MessageChannel.
private let jsonEncoder = JSONEncoder()

func encodeMessage<T: ServerMessage>(_ msg: T) -> Data? {
    // Returns nil instead of crashing if encoding ever fails.
    try? jsonEncoder.encode(msg)
}

// Send a structured message over the first available channel.
func sendToServer<T: ServerMessage>(_ msg: T, session: Session) -> Bool {
    guard let data = encodeMessage(msg),
          let channelInfo = session.availableMessageChannels.first,
          let channel = session.getMessageChannel(channelInfo) else {
        return false
    }
    return channel.sendServerMessage(data)
}
How It Works#
Define a ServerMessage struct for each command your Omniverse app expects. The type property becomes the "type" key in the encoded JSON, which your server-side extension dispatches on. Calling sendToServer encodes the struct and sends it over the first available opaque data channel.
The ServerMessage protocol gives you type safety at compile time: each command is its own struct, so field names and value types are checked before anything is sent.
Your Omniverse extension should register a listener on the opaque data channel and dispatch based on the "type" field in the received JSON.
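On the Kit side, that listener can be sketched in Python roughly as follows. The handler-registry shape is an assumption for illustration; wire the raw bytes in from whatever opaque-channel callback your extension registers.

```python
import json


def dispatch_client_message(raw: bytes, handlers: dict) -> bool:
    """Parse an incoming opaque-channel message and dispatch on its "type" key.

    `handlers` maps a type string to a callable that receives the "message"
    dict. Returns True if a handler was found and invoked.
    """
    try:
        msg = json.loads(raw)
    except ValueError:
        return False  # ignore anything that isn't valid JSON
    handler = handlers.get(msg.get("type"))
    if handler is None:
        return False
    handler(msg.get("message", {}))
    return True
```

This mirrors the client-side shape exactly: one struct per command on the client, one entry in `handlers` per command on the server.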
Switching a USD Variant (Color, Style, Material)#
Tell the server to switch a USD variant set – the most common configurator interaction.
Prerequisites
Snippet 1 (structured messaging), an Omniverse scene with variant sets configured.
// Define the message the server expects for variant switching.
struct SetVariantMessage: ServerMessage {
    let type = "setVariantSelection"
    let message: [String: String]

    init(variantSetName: String, variantName: String) {
        message = [
            "variantSetName": variantSetName,
            "variantName": variantName
        ]
    }
}

// Define your variant options as a Swift enum for type safety.
enum MaterialColor: String, CaseIterable, Identifiable {
    case red = "Red"
    case blue = "Blue"
    case black = "Black"
    case white = "White"

    var id: String { rawValue }
}

// Usage: switch the "color" variant set to "Red"
let msg = SetVariantMessage(variantSetName: "color", variantName: MaterialColor.red.rawValue)
_ = sendToServer(msg, session: session)
How It Works#
The SetVariantMessage struct produces JSON like:
{"type": "setVariantSelection", "message": {"variantSetName": "color", "variantName": "Red"}}
On the server, your Omniverse extension receives this and calls the USD variant-switching API for the specified prim. The enum gives you compile-time safety and makes it easy to build picker UI with ForEach(MaterialColor.allCases).
Server Side
Your Omniverse extension should handle messages of type "setVariantSelection" and apply the variant using prim.GetVariantSets().SetSelection(variantSetName, variantName).
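A Python sketch of that handler is below. The stage lookup and the /World/Product prim path are placeholders; only GetVariantSets().SetSelection() comes from the USD API the note above names.

```python
TARGET_PRIM_PATH = "/World/Product"  # hypothetical prim path; use your scene's


def handle_set_variant_selection(stage, message: dict) -> bool:
    """Apply a client-requested variant selection to the target prim."""
    prim = stage.GetPrimAtPath(TARGET_PRIM_PATH)
    if not prim:
        return False
    variant_sets = prim.GetVariantSets()
    return variant_sets.SetSelection(
        message["variantSetName"], message["variantName"]
    )
```

Register this under the "setVariantSelection" type in your channel dispatcher; the `message` dict is exactly what SetVariantMessage encodes on the client.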
Switching the Active Camera#
Switch between predefined camera positions in your Omniverse scene.
Prerequisites
Snippet 1 (structured messaging), camera prims set up in your USD scene.
struct SetActiveCameraMessage: ServerMessage {
    let type = "setActiveCamera"
    let message: [String: String]

    /// cameraPath: the full USD prim path to the camera, e.g.
    /// "/World/Cameras/Front" or "/World/Cameras/TopDown"
    init(cameraPath: String) {
        message = ["cameraPath": cameraPath]
    }
}

// Usage: switch to the front camera
let cameraPrefix = "/World/Cameras/"
let msg = SetActiveCameraMessage(cameraPath: "\(cameraPrefix)Front")
_ = sendToServer(msg, session: session)
How It Works#
The message tells your Omniverse application to switch its active viewport camera to the specified prim path. Camera paths are fully qualified USD prim paths. You can build a camera picker UI by defining an array of camera names and constructing the full path from a shared prefix.
Server Side
Your Omniverse extension should handle "setActiveCamera" messages and set the active camera on the viewport using the provided cameraPath.
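Sketched in Python, with the viewport object injected so the snippet stays independent of any particular viewport API (Kit's viewport utilities expose a writable camera path, but treat the attribute name here as an assumption):

```python
def handle_set_active_camera(viewport, message: dict) -> bool:
    """Point the active viewport at the camera prim the client requested."""
    path = message.get("cameraPath")
    if not path:
        return False
    viewport.camera_path = path  # assumed writable attribute on the viewport
    return True
```

As with the variant handler, register this under "setActiveCamera" in your dispatcher.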
Rotating the Streamed Scene with a Gesture (visionOS)#
Allow the user to rotate the streamed CloudXR content around the Y-axis using visionOS hand gestures.
Prerequisites
A CloudXRSessionComponent entity in your RealityView.
import SwiftUI
import RealityKit

// State to track rotation across gesture updates
@Observable
class GestureState {
    var lastRotation: Float = 0
    var totalRotation: Float = 0
}

// Build a RotateGesture3D that rotates the session entity around the Y-axis.
@MainActor
func makeRotationGesture(sessionEntity: Entity, state: GestureState) -> some Gesture {
    RotateGesture3D(constrainedToAxis: .z, minimumAngleDelta: .degrees(2))
        .onChanged { value in
            // RotateGesture3D has limited range; multiply by a factor for usability
            let rotationFactor: Float = 3.0
            let axisSign: Float = value.rotation.axis.z >= 0 ? 1 : -1
            let radians = Float(value.rotation.angle.radians) * -axisSign
            let amplified = radians * rotationFactor

            // Compute the delta since last update
            let delta = amplified - state.lastRotation
            state.lastRotation = amplified

            // Apply cumulative Y-axis rotation to the session entity
            state.totalRotation += delta
            sessionEntity.setOrientation(
                simd_quatf(angle: state.totalRotation, axis: simd_float3(0, 1, 0)),
                relativeTo: nil
            )
        }
        .onEnded { _ in
            state.lastRotation = 0
        }
}
How It Works#
RotateGesture3D on visionOS tracks a two-handed twist gesture. Its raw rotation range is small, so a 3x multiplier is applied for usability. The gesture provides cumulative angles, so we compute the delta between updates and accumulate it into a total rotation. The session entity (which parents all streamed content) is rotated around the Y-axis, causing the entire remote scene to visually rotate in place. Resetting lastRotation on gesture end ensures clean deltas for the next gesture.
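The delta bookkeeping is language-agnostic; this Python sketch mirrors the same logic so you can see why resetting lastRotation between gestures matters (the 3.0 factor comes from the snippet; the class itself is illustrative):

```python
ROTATION_FACTOR = 3.0


class RotationAccumulator:
    """Convert cumulative per-gesture angles into a running total rotation."""

    def __init__(self):
        self.last = 0.0   # cumulative angle seen so far in this gesture
        self.total = 0.0  # rotation applied to the entity across gestures

    def on_changed(self, cumulative_radians: float) -> float:
        amplified = cumulative_radians * ROTATION_FACTOR
        delta = amplified - self.last
        self.last = amplified
        self.total += delta
        return self.total

    def on_ended(self) -> None:
        # Without this reset, the next gesture's first update would
        # subtract a stale cumulative angle and jump the scene.
        self.last = 0.0
```

Two consecutive gestures of 0.1 rad each leave the scene rotated by (0.1 + 0.1) * 3.0 = 0.6 rad, exactly because each gesture's deltas start from zero.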
Scaling the Streamed Scene with a Pinch Gesture (visionOS)#
Allow the user to scale the streamed content using a visionOS pinch (magnify) gesture.
Prerequisites
A CloudXRSessionComponent entity in your RealityView.
import SwiftUI
import RealityKit

// Track the scale at the start of each gesture
@Observable
class ScaleState {
    var baseScale: Float = 1.0
}

// Clamp and snap scale values for a good user experience.
func correctedScale(factor: Float, baseScale: Float) -> Float {
    let raw = factor * baseScale
    if raw > 5.0 { return 5.0 }              // Maximum scale
    if raw < 0.2 { return 0.2 }              // Minimum scale
    if raw > 0.9 && raw < 1.1 { return 1.0 } // Snap to 1.0 near identity
    return raw
}

@MainActor
func makeScaleGesture(sessionEntity: Entity, state: ScaleState) -> some Gesture {
    MagnifyGesture(minimumScaleDelta: 0.075)
        .onChanged { value in
            let scale = correctedScale(factor: Float(value.magnification), baseScale: state.baseScale)
            sessionEntity.scale = .one * scale
        }
        .onEnded { value in
            let scale = correctedScale(factor: Float(value.magnification), baseScale: state.baseScale)
            sessionEntity.scale = .one * scale
            state.baseScale = scale
        }
}
How It Works#
MagnifyGesture reports a magnification factor relative to the gesture start (1.0 = no change). We multiply it by the baseScale captured at the previous gesture’s end to get cumulative scaling. The correctedScale function clamps to a 0.2–5.0 range and snaps to exactly 1.0 when the user is close to the original size, preventing the frustrating “almost but not quite at 1x” problem. On gesture end, we store the final scale as the new baseline.
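The clamp-and-snap rule is worth seeing in isolation. This Python port of correctedScale is purely illustrative; the 0.2–5.0 bounds and the ±0.1 snap window are the ones from the snippet:

```python
def corrected_scale(factor: float, base_scale: float) -> float:
    """Clamp cumulative scale to [0.2, 5.0] and snap to 1.0 near identity."""
    raw = factor * base_scale
    if raw > 5.0:
        return 5.0   # maximum scale
    if raw < 0.2:
        return 0.2   # minimum scale
    if 0.9 < raw < 1.1:
        return 1.0   # snap to exactly 1x near the original size
    return raw
```

Note the ordering: clamping runs before the snap test, so extreme inputs can never land inside the snap window by accident.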
Controlling Scene Lighting with a Slider#
Send a lighting intensity value from a SwiftUI slider to the Omniverse scene.
Prerequisites
Snippet 1 (structured messaging), a server extension that handles lighting commands.
import SwiftUI
import CloudXRKit

// Message to set scene light intensity on the server.
struct SetLightIntensityMessage: ServerMessage {
    let type = "setLightSlider"
    let message: [String: Float]

    init(intensity: Float, range: ClosedRange<Float> = 0.0...2.0) {
        let clamped = min(max(intensity, range.lowerBound), range.upperBound)
        message = ["intensity": clamped]
    }
}

// A SwiftUI view with a lighting slider.
struct LightingSliderView: View {
    @State private var intensity: Float = 1.0
    let session: Session
    let range: ClosedRange<Float> = 0.0...2.0

    var body: some View {
        VStack {
            Text("Lighting")
            Slider(value: $intensity, in: range)
                .onChange(of: intensity) {
                    let msg = SetLightIntensityMessage(intensity: intensity, range: range)
                    _ = sendToServer(msg, session: session)
                }
        }
        .padding()
    }
}
How It Works#
The slider’s value is bound to intensity. On each change, a SetLightIntensityMessage is sent to the server with the clamped intensity value. The server-side extension applies this to the scene’s light source(s). The range can be adjusted to match your scene’s lighting setup.
Server Side
Your Omniverse extension should handle "setLightSlider" messages and set the intensity attribute on the relevant UsdLux light prim(s).
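A Python sketch of that handler is below. The light prim path and the inputs:intensity attribute name are assumptions for illustration (the attribute name varies with the UsdLux schema version in use):

```python
LIGHT_PRIM_PATH = "/World/Lights/KeyLight"  # hypothetical light prim path


def handle_set_light_slider(stage, message: dict) -> bool:
    """Write the client-supplied intensity onto the scene's key light."""
    prim = stage.GetPrimAtPath(LIGHT_PRIM_PATH)
    if not prim:
        return False
    attr = prim.GetAttribute("inputs:intensity")  # assumed UsdLux input name
    return attr.Set(float(message["intensity"]))
```

Because the client already clamps the value, the server can apply it directly; add a second server-side clamp if untrusted clients may connect.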
Toggling Object Visibility#
Show or hide objects in the Omniverse scene by switching a visibility variant.
Prerequisites
Snippet 1 (structured messaging), a visibility variant set on the target prim.
// Reuses the same SetVariantMessage from Snippet 2.
enum ObjectVisibility: String {
    case visible = "Visible"
    case hidden = "Hidden"
}

func setObjectVisibility(_ visibility: ObjectVisibility, session: Session) {
    let msg = SetVariantMessage(
        variantSetName: "Visibility",
        variantName: visibility.rawValue
    )
    _ = sendToServer(msg, session: session)
}

// Usage in a SwiftUI Toggle
struct VisibilityToggleView: View {
    @State private var isVisible = true
    let session: Session

    var body: some View {
        Toggle("Show Object", isOn: $isVisible)
            .onChange(of: isVisible) {
                setObjectVisibility(isVisible ? .visible : .hidden, session: session)
            }
    }
}
How It Works#
This reuses the variant-switching pattern from Snippet 2 with a "Visibility" variant set. The SwiftUI Toggle provides a clean on/off control. When toggled, it sends either "Visible" or "Hidden" as the variant name. This is a simple pattern you can replicate for any boolean feature in your scene.
Server Side
Configure a variant set named "Visibility" on the target prim with "Visible" and "Hidden" variants that control the prim’s visibility attribute.
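In USDA, such a variant set can be authored roughly like this; the prim name and structure are illustrative, while "inherited"/"invisible" are the standard values of the USD visibility attribute:

```usda
def Xform "Widget" (
    append variantSets = "Visibility"
    variants = {
        string Visibility = "Visible"
    }
)
{
    variantSet "Visibility" = {
        "Visible" {
            token visibility = "inherited"
        }
        "Hidden" {
            token visibility = "invisible"
        }
    }
}
```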
Animating Entity Opacity (Fade In/Out)#
Smoothly animate the opacity of any RealityKit entity – useful for transitions, loading states, or visual feedback.
Prerequisites
RealityKit, Combine. No CloudXR dependency – this is a pure RealityKit utility.
Full implementation – Entity opacity extension
import RealityKit
import Combine

// File-level storage for animation completion subscriptions.
private var opacitySubscriptions: Set<AnyCancellable> = .init()

extension Entity {
    /// The opacity of this entity and its descendants.
    /// Automatically adds an OpacityComponent if one doesn't exist.
    var opacity: Float {
        get { components[OpacityComponent.self]?.opacity ?? 1 }
        set {
            if components.has(OpacityComponent.self) {
                components[OpacityComponent.self]?.opacity = newValue
            } else {
                components[OpacityComponent.self] = OpacityComponent(opacity: newValue)
            }
        }
    }

    /// Animate opacity from one value to another.
    /// - Parameters:
    ///   - target: The target opacity (0.0 to 1.0).
    ///   - from: The starting opacity. Defaults to the entity's current opacity.
    ///   - duration: Animation duration in seconds.
    ///   - delay: Delay before the animation starts.
    ///   - completion: Called when the animation finishes.
    func animateOpacity(
        to target: Float,
        from: Float? = nil,
        duration: TimeInterval = 0.5,
        delay: TimeInterval = 0,
        completion: (() -> Void)? = nil
    ) {
        let startValue = from ?? self.opacity

        if !components.has(OpacityComponent.self) {
            components[OpacityComponent.self] = OpacityComponent(opacity: startValue)
        }

        let animation = FromToByAnimation(
            name: "opacityFade",
            from: startValue,
            to: target,
            duration: duration,
            timing: .linear,
            isAdditive: false,
            bindTarget: .opacity,
            delay: delay
        )

        do {
            let resource = try AnimationResource.generate(with: animation)
            let controller = playAnimation(resource)

            if let completion {
                scene?.publisher(for: AnimationEvents.PlaybackCompleted.self)
                    .filter { $0.playbackController == controller }
                    .sink { _ in completion() }
                    .store(in: &opacitySubscriptions)
            }
        } catch {
            assertionFailure("Could not generate opacity animation: \(error.localizedDescription)")
        }
    }
}
Usage#
// Fade out over 0.25 seconds
entity.animateOpacity(to: 0.0, duration: 0.25)
// Fade in over 0.5 seconds with a completion handler
entity.animateOpacity(to: 1.0, duration: 0.5) {
print("Fade-in complete")
}
How It Works#
This extension uses RealityKit’s FromToByAnimation bound to the .opacity target. It ensures an OpacityComponent exists on the entity before animating. The animation is linear and affects the entity and all its descendants. An optional completion handler is connected through Combine by subscribing to AnimationEvents.PlaybackCompleted and filtering to the specific playback controller.
Listening for Server Completion Notifications#
Detect when the server finishes processing a command (e.g., a variant switch) so you can update your UI.
Prerequisites
A connected MessageChannel, JSON messages from the server.
import CloudXRKit
import Foundation

/// Start listening for messages on a channel and call the handler for each parsed message.
func listenForServerMessages(
    channel: MessageChannel,
    onMessage: @escaping ([String: Any]) -> Void
) -> Task<Void, Never> {
    Task {
        for await messageData in channel.receivedMessageStream {
            if let json = try? JSONSerialization.jsonObject(with: messageData) as? [String: Any] {
                onMessage(json)
            }
        }
    }
}

// Example: detect when a variant switch completes
func handleServerMessage(_ json: [String: Any]) {
    guard let type = json["Type"] as? String else { return }

    switch type {
    case "switchVariantComplete":
        // The server finished applying a variant change.
        if let variantSetName = json["variantSetName"] as? String {
            print("Variant switch completed for set: \(variantSetName)")
            // Update your UI state here, e.g. re-enable buttons or dismiss a spinner.
        }
    default:
        break
    }
}
Usage#
// Start listening when the channel becomes available
let listenerTask = listenForServerMessages(channel: channel, onMessage: handleServerMessage)
// Cancel when done:
listenerTask.cancel()
How It Works#
The receivedMessageStream on a MessageChannel is an AsyncSequence that yields Data for each incoming server message. We parse each message as JSON and dispatch based on the "Type" field. For variant switches, the server sends a "switchVariantComplete" message when it finishes applying the change, which you can use to update loading indicators or re-enable UI controls.
Server Side
After processing a command, your Omniverse extension should send a completion notification back through the same opaque data channel with a "Type" field your client can match on.
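On the Kit side, building that notification can be as small as the Python sketch below. The capitalized "Type" key is the one the client-side handler above checks; how the bytes are handed to the channel depends on your extension's API and is left out here:

```python
import json


def make_completion_message(msg_type: str, **fields) -> bytes:
    """Build a completion notification to send back over the opaque channel."""
    payload = {"Type": msg_type}
    payload.update(fields)
    return json.dumps(payload).encode("utf-8")


# Example: notify the client that a variant switch finished.
notification = make_completion_message(
    "switchVariantComplete", variantSetName="color"
)
```

Keeping the builder generic means every command handler can report completion with one call, using whatever extra fields its client-side counterpart expects.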
Creating a Portal Window (visionOS)#
Display the streamed CloudXR content inside a portal – a window into another world that exists within the user’s physical space.
Prerequisites
RealityKit, visionOS, a CloudXRSessionComponent entity.
Full implementation – Portal factory function
import RealityKit

/// Create a portal that shows a separate world entity through a window-like plane.
/// Returns (container, portalEntity) -- add `container` to your scene.
func makePortal(
    world: Entity,
    width: Float = 3.25,
    height: Float = 2.5,
    cornerRadius: Float = 0.2,
    position: simd_float3 = simd_float3(0, 0.25, -2.5)
) -> (container: Entity, portal: Entity) {

    // The portal plane -- renders the target world through a "window"
    let portal = Entity()
    portal.components[ModelComponent.self] = .init(
        mesh: .generatePlane(width: width, height: height, cornerRadius: cornerRadius),
        materials: [PortalMaterial()]
    )
    portal.components[PortalComponent.self] = .init(target: world)

    // A small draggable bar below the portal for repositioning,
    // matching the visionOS window-bar convention
    let barWidth = width / 9
    let barHeight: Float = 0.02
    let barOffset = (height / 2) + (barHeight * 3)

    let bar = ModelEntity()
    bar.name = "portalBar"
    bar.components[ModelComponent.self] = .init(
        mesh: .generatePlane(width: barWidth, height: barHeight, cornerRadius: barHeight / 2),
        materials: [UnlitMaterial(color: .white)]
    )
    bar.components.set(HoverEffectComponent())
    bar.generateCollisionShapes(recursive: true)
    bar.components.set(InputTargetComponent())

    // Raise the portal so the bar (left at the origin) sits below it
    portal.position = simd_float3(0, barOffset, 0)

    // Assemble the hierarchy: container -> parent -> { portal, bar }
    let parent = Entity()
    parent.addChild(portal)
    parent.addChild(bar)

    let container = Entity()
    container.name = "portalContainer"
    container.addChild(parent)
    container.position = position

    return (container, portal)
}
Usage#
// 1. Create a world entity and mark it as a portal world
let portalWorld = Entity()
portalWorld.components[WorldComponent.self] = .init()
// 2. Move your session entity (CloudXR content) into the portal world
portalWorld.addChild(sessionEntity)
// 3. Create the portal
let (container, portal) = makePortal(world: portalWorld)
// 4. Add both the world and the portal container to your scene
sceneEntity.addChild(portalWorld)
sceneEntity.addChild(container)
How It Works#
A visionOS portal uses three key components:
- WorldComponent
Defines a separate “world” – content parented under this entity is only visible through a portal.
- PortalMaterial
Applied to a plane mesh, this material acts as a window into the target world.
- PortalComponent
Links the portal plane to the world entity, creating the see-through effect.
The drag bar below the portal uses HoverEffectComponent for visual feedback and InputTargetComponent with collision shapes to support drag gestures for repositioning.
End-to-End: Hand Gesture Triggers a Server Action#
This pattern connects a visionOS hand gesture on the Apple Vision Pro to a server-side Kit action via the CloudXR opaque data channel. The gesture (a tap) sends a selectPrimsRequest message to the Kit server, which selects the specified prim in the USD stage.
Prerequisites
This snippet assumes you have a working CloudXRKit session and a MessageChannel reference. See Snippet 1: Sending Structured JSON Messages for the messaging setup.
import SwiftUI
import RealityKit

struct InteractiveImmersiveView: View {
    let channel: MessageChannel

    var body: some View {
        RealityView { content in
            // Your session entity setup
        }
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    let tappedEntity = value.entity
                    let primPath = tappedEntity.name

                    let message: [String: Any] = [
                        "type": "selectPrimsRequest",
                        "payload": ["paths": [primPath]]
                    ]

                    if let data = try? JSONSerialization.data(withJSONObject: message) {
                        _ = channel.sendServerMessage(data)
                    }
                }
        )
    }
}
How It Works#
- SpatialTapGesture detects when the user taps an entity using visionOS hand tracking.
- The tapped entity's name is used as the USD prim path (this works when entity names match prim paths in the streamed scene).
- A selectPrimsRequest JSON payload is constructed using the "payload" field format (preferred over "message").
- The message is sent over the CloudXR opaque data channel to the Kit server.
- The USD Viewer's built-in selectPrimsRequest handler receives the message and updates the prim selection on the stage.
This pattern generalizes to any gesture-to-server-action flow: replace SpatialTapGesture with RotateGesture3D, MagnifyGesture, or DragGesture, and replace selectPrimsRequest with any custom event type your Kit extension handles.