Connecting to the WebRTC stream#
Once you have a 200 status indicating that the stream is ready and a response body with:
- ID of the streaming session
- IP of the streaming Kit App
- source signaling port of the WebRTC stream
- source media port of the WebRTC stream
You are ready to connect your client application to the WebRTC stream.
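How you retrieve those values depends on the streaming service you are calling, and the exact field names of the response body are not prescribed here. As a minimal sketch, assuming a JSON response with hypothetical field names, the client might read it like this:

// Hypothetical response shape -- the field names below are illustrative,
// not the actual contract of your streaming service.
interface StreamSessionInfo {
    sessionId: string;      // ID of the streaming session
    appIp: string;          // IP of the streaming Kit App
    signalingPort: number;  // source signaling port of the WebRTC stream
    mediaPort: number;      // source media port of the WebRTC stream
}

async function fetchSessionInfo(endpoint: string): Promise<StreamSessionInfo> {
    const response = await fetch(endpoint);
    if (response.status !== 200) {
        throw new Error(`Stream not ready (status ${response.status})`);
    }
    return (await response.json()) as StreamSessionInfo;
}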
AppStreamer#
We have created a React Component, AppStreamer, to allow web developers to embed interactive WebRTC streams from Kit Apps into their applications. It also supports limited bi-directional messaging between the Kit App and the client, alongside traditional methods such as REST, WebSockets, and message queues.
The AppStreamer component is included in the omniverse-webrtc-streaming-library, distributed from NVIDIA’s npm repository.
.npmrc settings to access NVIDIA’s npm repository#
registry=https://registry.npmjs.org
@nvidia:registry=https://edge.urm.nvidia.com:443/artifactory/api/npm/omniverse-client-npm/
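With that registry entry in place, the library installs and imports like any other scoped npm package. The package and export names below are assumptions based on the @nvidia scope shown above; confirm them against the Web Viewer Sample’s package.json and imports:

// Install (assumed package name):
//   npm install @nvidia/omniverse-webrtc-streaming-library
// Import the component in your client code (assumed export name):
import { AppStreamer } from '@nvidia/omniverse-webrtc-streaming-library';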
Full documentation on how to use the AppStreamer component is outside the scope of this guide. However, the Web Viewer Sample Git repository walks through a sample client web application that uses the AppStreamer component to connect to a Kit App stream.
Here are some pointers into the sample project to augment the provided instructions.
Using the Web Viewer Sample#
The AppStreamer React Component is called by the AppStream React Component defined in /src/AppStream.tsx. It is the AppStream component that is embedded in the client app.
To configure an AppStreamer you must call its setup() method, passing a properly configured streamConfig. You can see how this is configured in /src/AppStream.tsx, in particular in the componentDidMount() method.
Follow the flow for the “local” streamConfig.source.
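For orientation, the overall wiring looks roughly like the sketch below: the client app renders AppStream, and AppStream configures AppStreamer once it has mounted. The props and component names here are hypothetical placeholders, not the sample’s exact interface; the sample’s own top-level component wires the real ones.

// Illustrative only -- the callback prop is a hypothetical placeholder.
import React from 'react';
import AppStream from './AppStream';

export default function App() {
    return (
        <AppStream
            // Hypothetical prop; the sample wires its own handlers and state.
            onStarted={() => console.log('stream started')}
        />
    );
}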
You must configure the URL for the Kit App stream properly:
/* The arguments for specifying the stream location */
let url =
    `signalingserver=${serverIP}&signalingport=${signalingData.source_port}` +
    `&mediaserver=${serverIP}&mediaport=${mediaData.source_port}&sessionid=${sessionId}`;

/* Plus the remaining arguments. "backend" is the streaming endpoint */
url +=
    `&mic=0&cursor=free&server=&nucleus=""` +
    `&resolution=${width}x${height}&fps=${fps}&autolaunch=true&backendurl=${backend}&terminateVerb=DELETE`;
This URL must then be set as the search value of streamConfig.urlLocation:
streamConfig = {
    source: 'local',
    videoElementId: 'remote-video',
    audioElementId: 'remote-audio',
    messageElementId: 'message-display',
    urlLocation: { search: url }
};
StreamConfig: {
    audioElementId: string;
    messageElementId: string;
    source: "local";
    videoElementId: string;
    urlLocation: Location;
}
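Note that videoElementId, audioElementId, and messageElementId refer to elements that must exist in the rendered page; they are where the incoming video, audio, and status messages are presented. A minimal sketch of such markup, using the IDs from the configuration above (the component name and attributes are illustrative; the Web Viewer Sample renders its own versions of these in AppStream.tsx):

import React from 'react';

// Hypothetical component showing the elements the configured IDs point at.
function StreamElements() {
    return (
        <div>
            <video id="remote-video" playsInline muted autoPlay />
            <audio id="remote-audio" muted />
            <h3 id="message-display" style={{ visibility: 'hidden' }} />
        </div>
    );
}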
The streamConfig is part of StreamProps:
AppStreamer.setup({
    streamConfig: streamConfig,
    onUpdate: (message: any) => this._onUpdate(message),
    onStart: (message: any) => this._onStart(message),
    onCustomEvent: (message: any) => this._onCustomEvent(message)
});
StreamProps: {
    authenticate: boolean;
    onCustomEvent: ((message) => void);
    onISSOUpdate: ((message) => void);
    onStart: ((message) => void);
    onStop: ((message) => void);
    onTerminate: ((message) => void);
    onUpdate: ((message) => void);
    streamConfig: StreamConfig;
}
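The on* callbacks receive plain message objects from the stream. As a minimal sketch of the handlers referenced in the setup() call above, written as methods on the AppStream class, they could be as simple as the following; the sample’s own handlers likely do more, such as updating component state:

// Sketch of handler methods inside the AppStream component class.
// These simply log what arrives.
private _onStart(message: any): void {
    console.info('AppStreamer started:', message);
}

private _onUpdate(message: any): void {
    console.debug('AppStreamer update:', message);
}

private _onCustomEvent(message: any): void {
    // Custom events carry the limited bi-directional messages exchanged
    // with the Kit App.
    console.log('Custom event from Kit App:', message);
}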