This project provides a Warudo Node for generating OVR Lip Sync blendshape animation data using the OVRLipSync.dll provided by Oculus.
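The node itself is a Warudo Playground script that drives the native `OVRLipSync.dll`. For orientation only, the sketch below shows roughly what a custom Playground node looks like; the attribute and base-class names here are assumptions about the Warudo SDK rather than an excerpt from the actual code, and the real `OVRLipSyncNode.cs` is considerably more involved (character selection, viseme mapping, microphone input).

```csharp
using Warudo.Core.Attributes;
using Warudo.Core.Graphs;

// Rough sketch of a Warudo Playground node; this is not the shipped OVRLipSyncNode.cs.
// The attribute and base-class names are assumptions about the Warudo SDK.
[NodeType(Id = "example.ovr-lip-sync-sketch", Title = "OVR Lip Sync (Sketch)")]
public class OvrLipSyncSketchNode : Node
{
    // Hypothetical output: one weight per OVR viseme (15 in total).
    [DataOutput]
    public float[] VisemeWeights() => new float[15];
}
```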
- Download the `OVRLipSync.dll` file from the Oculus Lip Sync Unity Integration. (You will need to extract the DLL from the unitypackage; it is located in the `Assets/Oculus/LipSync/Plugins/Win64/` folder.)
- Place the `OVRLipSync.cs` and `OVRLipSyncNode.cs` files into your `Steam/steamapps/common/Warudo/Warudo_Data/StreamingAssets/Playground` folder.
- Create a folder named `MonoBleedingEdge` within `Steam/steamapps/common/Warudo/Warudo_Data/`.
- Place the downloaded `OVRLipSync.dll` file into the `MonoBleedingEdge` folder (see the sketch after this list for why the DLL needs to be somewhere the runtime can find it).
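Why the last step matters: the managed wrapper in `OVRLipSync.cs` reaches the native library through P/Invoke, so `OVRLipSync.dll` has to sit somewhere the Warudo process searches for native libraries, which is presumably what the `MonoBleedingEdge` folder provides. The sketch below only illustrates that binding pattern; the export names and signatures are simplified assumptions, and the authoritative declarations are in `OVRLipSync.cs`.

```csharp
using System;
using System.Runtime.InteropServices;

// Illustration of the P/Invoke pattern used to reach OVRLipSync.dll.
// The signatures below are simplified assumptions; the real declarations
// live in Oculus's OVRLipSync.cs.
internal static class OvrLipSyncNativeSketch
{
    // DllImport("OVRLipSync") makes the runtime look for OVRLipSync.dll on the
    // process's native library search path when the first call is made.
    [DllImport("OVRLipSync")]
    private static extern int ovrLipSyncDll_Initialize(int sampleRate, int bufferSize);

    [DllImport("OVRLipSync")]
    private static extern int ovrLipSyncDll_Shutdown();

    // Hypothetical helper: true when the native library was found and
    // initialized (0 is treated as the success code here).
    public static bool TryInitialize(int sampleRate = 48000, int bufferSize = 1024)
    {
        try
        {
            return ovrLipSyncDll_Initialize(sampleRate, bufferSize) == 0;
        }
        catch (DllNotFoundException)
        {
            return false; // OVRLipSync.dll is not on the search path
        }
    }
}
```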
To utilize the `Generate OVR Lip Sync Animations` node in your blueprint, follow these steps:

- Replace the existing `Generate Lip Sync Animations` node in your blueprint with the `Generate OVR Lip Sync Animations` node.
- From the dropdown menu provided by the node, select your character.
- Press the `Auto Map Visemes` button within the node. The node will attempt to map the correct blendshapes to the visemes automatically (the standard OVR viseme set is listed in the sketch after these steps).
- Verify that the visemes are mapped correctly to your model.
- Choose the microphone you wish to use for generating the viseme data.
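For reference when verifying the mapping: OVR Lip Sync outputs weights for a fixed set of 15 visemes. The snippet below lists them in the order used by the Oculus SDK; whether `Auto Map Visemes` matches them purely by blendshape name is an assumption about the node's behavior, so always double-check the result on your model.

```csharp
// The 15 visemes produced by OVR Lip Sync, in the Oculus SDK order.
public static class OvrVisemes
{
    public static readonly string[] Names =
    {
        "sil", "PP", "FF", "TH", "DD",
        "kk", "CH", "SS", "nn", "RR",
        "aa", "E", "ih", "oh", "ou"
    };
}
```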
For any inquiries or troubleshooting assistance, feel free to reach out to me.
- Email: [email protected]
- Discord: @Ximmer
- Twitch: https://www.twitch.tv/ximmer_vr
- Bluesky: https://bsky.app/profile/ximmer.dev
- Carrd: https://ximmer.carrd.co/
- This project is licensed under the MIT License.
- This project utilizes the `OVRLipSync.dll` and `OVRLipSync.cs` files provided by Oculus under the Oculus Audio SDK License Version 3.3.