The Audio2Face Live Link Plugin allows creators to stream animated facial blendshape weights and audio into Unreal Engine to be played on a character. Either Audio2Face or Avatar Cloud Engine (ACE) can stream facial animation and audio. With very little configuration a MetaHuman character can be set up to receive streamed facial animation, and the plugin may be used with many other character types given the correct mapping Pose Assets.
Live Link Setup
This section assumes that a MetaHuman project is the starting point. This ensures that there’s a valid character, Blueprints, and Live Link plugin enabled.
Install the Plugin
Navigate to your Audio2Face installation folder (found within your Omniverse Library) and find the “ACE” directory.
Copy the “ACE” directory into the Plugins directory of your UE project.
Enable the Plugin
The Omniverse Audio2Face Live Link plugin must be enabled. Do this from the Plugins tab in Unreal Engine:
(Optional) Update the AR Kit mapping Pose and Animation Sequence
The Audio2Face team has modified the existing Pose and Animation Sequence assets from the MetaHuman project; they are included in the Audio2Face Live Link plugin’s Content folder. To use them, open the Face_AnimBP Blueprint: select the MetaHuman Blueprint, then select Face in the Details tab. Click the magnifying glass next to the Face_AnimBP_C Anim Class. This will select that Blueprint in the Content Browser. Double-click it to open.

Select the mh_arkit_mapping_pose node and change the Pose Asset to the modified asset from the plugin’s Content folder.

If this asset is not visible, change the Content Browser visibility to Show Plugin Content:
Add Live Link Skeletal Animation Component
First, open or create a level containing a MetaHuman Blueprint. The following change can be made either in the level or in the MetaHuman Blueprint.

Next, add a Live Link Skeletal Animation Component to the MetaHuman Blueprint. By doing this within the Blueprint Editor, the MetaHuman will always carry this component in the future. Click Compile to compile the Blueprint, click Save to save it, and then close the Blueprint Editor.
Create an NVIDIA Omniverse Live Link Source
Open the Live Link window (from the Window > Virtual Production > Live Link menu) and add the NVIDIA Omniverse LiveLink Source:
- The listening port that receives the streamed animation blendshape weights.
- The listening port that receives the streamed wave audio.

Audio Sample Rate

The sample rate of the incoming audio stream. ACE is currently configured to send 16 kHz wave audio to Unreal, so that is the default.

- The plugin does not perform any audio resampling (from one sample rate to another), so it is imperative that this setting matches the sample rate of the wave files streamed to the plugin.
- Resampling support was removed because audio playback tended to drift during resampling.
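Because the plugin does no resampling, audio should be converted to the configured rate before streaming. As a rough offline sketch (a hypothetical helper, not part of the plugin; assumes 16-bit mono PCM WAV input and uses simple linear interpolation):

```python
import struct
import wave

def resample_wav_to_16k(in_path: str, out_path: str, target_rate: int = 16000) -> None:
    """Resample a 16-bit mono PCM WAV file to target_rate via linear interpolation."""
    with wave.open(in_path, "rb") as src:
        assert src.getsampwidth() == 2, "expected 16-bit PCM"
        assert src.getnchannels() == 1, "expected mono audio"
        src_rate = src.getframerate()
        frames = src.readframes(src.getnframes())
    samples = struct.unpack("<%dh" % (len(frames) // 2), frames)

    # Step through the source at the rate ratio, interpolating between samples.
    ratio = src_rate / target_rate
    out_len = int(len(samples) / ratio)
    out = []
    for i in range(out_len):
        pos = i * ratio
        j = int(pos)
        frac = pos - j
        a = samples[j]
        b = samples[min(j + 1, len(samples) - 1)]
        out.append(int(a + (b - a) * frac))

    with wave.open(out_path, "wb") as dst:
        dst.setnchannels(1)
        dst.setsampwidth(2)
        dst.setframerate(target_rate)
        dst.writeframes(struct.pack("<%dh" % len(out), *out))
```

For production audio, a proper resampling tool with anti-aliasing filtering is preferable; the point here is only that the file handed to the plugin must already be at the configured sample rate.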
Once the source starts streaming data correctly, you should see an Audio2Face subject like the one below. Note that Audio2Face is the default name for the subject and is configurable in both the Audio2Face application and ACE:
Animation Delay Time

Animation blendshape-weight replay is delayed by this many milliseconds to help align audio and animation.

Audio replay is likewise delayed by this many milliseconds to help align audio and animation.
Update Face_AnimBP Blueprint with the Audio2Face Subject Name
Now that the subject is streaming, the Face_AnimBP Blueprint must be updated with the new subject name. Open the Face_AnimBP Blueprint and change the Default value of LLink Face Subj to Audio2Face. Also enable LLink Face Head. You should see the MetaHuman update in your Anim Blueprint preview window. Click Compile to compile the Blueprint, click Save to save it, and then close the Blueprint Editor.
Live Link Setup for Runtime Deployment
If the goal is to use the Audio2Face Live Link plugin in a packaged experience, or even in -game mode using the editor executable, a few extra steps are required because the Live Link Source window isn’t available in those configurations.
Create a Live Link Source Preset
To access the NVIDIA Omniverse Live Link Source in a runtime experience, a LiveLinkPreset asset must be saved. To do this, create the source and save it as a preset: