Oculus Mixed Reality Capture is a plugin that lets content creators make amazing mixed reality videos of a VR game for the Oculus Quest. There are many good tutorials out there that explain how to create a video for an Oculus Quest experience that implements it, but there is very little documentation on how a developer can actually implement this plugin in their VR experience. Since I had to make some tests with Oculus MRC for our game HitMotion: Reloaded (download it now on SideQuest!) and learn everything by trial and error, I have decided to write this long tutorial for you, so you don’t have to waste the same time I did! Are you ready to implement Oculus MRC like a boss?
Oculus MRC in Unity – Video Tutorial
I have made for you a video tutorial where I explain everything I learned about Oculus Mixed Reality Capture in Unity, including the complicated problems with XR Interaction Toolkit, URP, and object transparency. You can watch it here:
Otherwise, keep reading for the usual textual version!
What is Oculus Mixed Reality Capture?
Oculus Mixed Reality Capture (MRC) is an important plugin for Unity and Unreal Engine that developers can integrate into their projects. When the MRC plugin is integrated, all users can easily create mixed reality videos of that experience using just a laptop and the Quest, with no cables in between. Think about the cool videos that you see about Beat Saber, with the real person playing on top of the virtual environment: with MRC, content creators playing your Quest game will be able to create this kind of video very easily in two ways:
- Using a green screen and a PC running the popular application OBS, to create professional-looking mixed reality videos
- Without a green screen, using just a phone app like the one that LIV has just released for iPhone, to easily shoot mixed reality videos on the fly
The problem with Oculus MRC is that it is really poorly documented. In particular, I have found almost no resources talking about edge cases like the use of this plugin in applications implementing the XR Interaction Toolkit or running the Universal Render Pipeline. I am going to cover all of this for you, so you can implement this cool feature in your application and let content creators make cool videos of your games!
Prerequisites
This tutorial assumes that you already have some knowledge of Unity: you know how to create an Oculus Quest project in Unity, either using the official Oculus Plugin or the XR Interaction Toolkit. It also assumes you already know how to shoot mixed reality videos with OBS and Oculus MRC. If you lack these basic skills, you can have a look at these useful articles:
How to create a Quest app in Unity for App Lab: https://skarredghost.com/2021/02/04/how-to-submit-app-lab-oculus-quest/
How to create a cross-platform app in Unity using the XR Interaction Toolkit: https://skarredghost.com/2021/02/24/how-to-unity-cross-platform-vr-xr/
How to create a mixed reality video of an application that implements the MRC: https://skarredghost.com/2021/04/22/oculus-quest-mixed-reality-capture/
Full tutorial on Mixed Reality Capture by Mike VRO: https://www.youtube.com/watch?v=tQnYsWj2ePo
LIV now offers the possibility of recording mixed reality videos with just an iPhone: https://www.roadtovr.com/liv-mobile-beta-ios-app-mixed-reality-quest/
Project Setup
Let’s start with the basic setup of the Unity project in which to implement Oculus Mixed Reality Capture:
- Create a Unity project
- Switch to Android build target
- Integrate the XR Plugin Management System from the Project Settings
- Select Oculus as the plugin to use both for PC and Android
- Import the XR Interaction Toolkit from the Package Manager (only if you intend to use it)
At this point, import the official Oculus Plugin from the Unity Asset Store. When Unity prompts you about what to import, keep just the “VR” directory. The other directories are not important: if you need them for your own reasons, import them; otherwise, avoid importing them so as not to make the project too big.
Build the sample
First things first, build the MRC sample that Oculus has embedded into its plugin: open the scene Assets\Oculus\VR\Scenes\MRC.unity, build it into an APK, and deploy it on your device. Then put on your Quest and test that mixed reality capture works with OBS.
You may wonder why I’m giving this piece of advice: well, this “MRC” scene is made to work out of the box, so if you just build it, the resulting application MUST WORK. If it doesn’t, it means that there is something wrong either with your project or with your MRC calibration.
Two common errors may be:
- You are using URP, and so you see in OBS that the resulting mixed reality video is weird, with the virtual objects shown with a strange projection. In this case, don’t worry: I show you how to solve the issues with URP later on in the article;
- The calibration between the Quest and OBS is not working anymore. The result is a video where your virtual controllers appear in a position or orientation different from the real ones. In this case, perform the calibration again, as instructed in the tutorials about how to shoot mixed reality videos linked above.
I strongly suggest you do this test. Today I lost 3 hours because I couldn’t understand what was not working in my project… I kept modifying and building my own scene, thinking that the problem was in something I was doing wrong. After some hours, I decided to make this test, and I realized that even the MRC scene was not working, so I understood I had to look for the problem elsewhere. In the end, I found that the calibration I had performed the evening before was weirdly not working anymore. Don’t be stupid like me, and make this quick test 🙂
When you are at a stage where everything works when building the MRC scene, you can go on and implement your own scene.
MRC with Oculus Plugin
Implementing Mixed Reality Capture when you are using the official Oculus Plugin for Unity and its related prefabs (OVRCameraRig, OVRPlayerController, etc…) is as easy as doing nothing. Because actually… you have to do nothing: if you have an OVRCameraRig in your scene, everything already works. Just check that the OVRManager on it has the “Mixed Reality Capture for Quest” option with ActivationMode set to Automatic, and everything will work out of the box.
Facebook has done a great job in making this incredibly easy: all experiences implementing the Oculus Plugin are immediately compatible with MRC, so people can easily create mixed reality videos out of them.
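By the way, if you want to double-check this setting from code (for instance, to be sure no one in the team has changed it), here is a minimal sketch; I am assuming here that OVRManager exposes the activation mode as the mrcActivationMode field, as in the current version of the Oculus Integration:

```csharp
using UnityEngine;

// Minimal sketch (assuming the current Oculus Integration, where OVRManager
// exposes the activation mode as the mrcActivationMode field): warn at
// startup if MRC cannot auto-activate.
public class MrcSanityCheck : MonoBehaviour
{
    void Start()
    {
        if (OVRManager.instance != null &&
            OVRManager.instance.mrcActivationMode != OVRManager.MrcActivationMode.Automatic)
        {
            Debug.LogWarning("MRC is not set to Automatic: OBS will not be able to start the capture.");
            // To opt out of MRC entirely, you would set MrcActivationMode.Disabled instead.
        }
    }
}
```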
MRC with XR Interaction Toolkit
Using the Oculus Plugin is good if you plan to ship your game only for the Quest. But using Unity facilities like the XR Interaction Toolkit lets you create games that are cross-platform out of the box. The problem is that the XR Interaction Toolkit has no automated way to trigger the Mixed Reality Capture plugin, because it is an Oculus-specific feature.
The good news is that with a little modification to your app, you can use MRC with the XR Interaction Toolkit. And I have made things even easier by creating a little package for you.
The steps are:
- Remember the prerequisites that I have written above: you have to import the official Oculus Plugin into your project even if you intend to use the XR Interaction Toolkit
- Head to this GitHub repo of mine and download the package by clicking on the release listed in the right column of the page
- Import the MrcXrtHelpers package you have just downloaded into your Unity project
- Add the prefab Oculus_MRC_XRT_Manager_Full.prefab to your scene
- Inside the prefab, you can find an object called OVRManager, which contains the OVRManager script (you don’t say!) plus two scripts called CopyTransform and RemoveMRCamerasTracking. Drag the transform of your XR Rig into the OriginalTransform field of the CopyTransform script
- That’s it! Build, run, and you have your mixed reality working!
In case you don’t want to use my prefab, or it is not working for some reason, you can replace steps 4-5 with the following (a code sketch of this manual setup follows the list):
- Add an OVRManager script to your XR Rig gameobject
- Make sure the OVRManager has its tracking origin set to Floor or Stage (personally, I always prefer Stage) and that “Mixed Reality Capture for Quest” has ActivationMode set to Automatic
- Add the RemoveMRCamerasTracking script from my package to the same gameobject that has the XR Rig and the OVRManager
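Here is a rough code sketch of this manual setup, in case you prefer to wire it up from a script rather than in the Inspector (same assumptions as before about the OVRManager API, plus the RemoveMRCamerasTracking script from my package):

```csharp
using UnityEngine;

// Rough sketch of the manual setup above: attach this to the XR Rig gameobject.
// Assumes the Oculus Integration (OVRManager) and the RemoveMRCamerasTracking
// script from my MrcXrtHelpers package are both in the project.
public class MrcXrtManualSetup : MonoBehaviour
{
    void Awake()
    {
        // Step 1: add an OVRManager to the same gameobject as the XR Rig
        var ovrManager = gameObject.AddComponent<OVRManager>();

        // Step 2: MRC wants a floor-level tracking origin (I always prefer Stage),
        // and the capture should start automatically when OBS connects
        ovrManager.trackingOriginType = OVRManager.TrackingOrigin.Stage;
        ovrManager.mrcActivationMode = OVRManager.MrcActivationMode.Automatic;

        // Step 3: strip the TrackedPoseDriver that Unity may add to the MR cameras
        gameObject.AddComponent<RemoveMRCamerasTracking>();
    }
}
```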
What does my package do? Nothing special, just two things (you will find a simplified sketch of both scripts right after this list):
- CopyTransform makes sure that the OVRManager always has the same transform pose as the XR Rig, so that if you implement locomotion in your game, the mixed reality capture keeps working well. The Oculus plugin creates the two cameras that provide the MR capture as children of the OVRManager, so the OVRManager must move together with the XR Rig. Let me explain better why: if you have implemented smooth locomotion in your game, the XR Rig (that is, your virtual avatar) moves when you push the thumbstick of your controller, but your physical position in the real world is not changing, and of course, the physical camera that you use for recording is not moving either. This means that the relative position of your virtual MR recording cameras with respect to your virtual avatar must remain the same, so that the resulting video is coherent (remember that virtual and real counterparts should match for the MR video to work well). And the only way to obtain this is to make sure that the virtual cameras move together with the rig when there is locomotion.
- RemoveMRCamerasTracking removes the TrackedPoseDriver scripts that Unity may add automatically to the MR cameras. If you don’t remove them, the MR cameras move together with your head, and you don’t want this to happen, because they should stay fixed in the Unity world, in the same pose the physical camera has in real life.
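Here are the simplified sketches of the two scripts I promised above (the real ones are in the GitHub repo, so treat these as illustrations of the idea rather than the actual code):

```csharp
using UnityEngine;
using UnityEngine.SpatialTracking;

// Sketch of CopyTransform: every frame, copy the pose of the XR Rig onto the
// object holding the OVRManager, so the MRC cameras (created as its children)
// follow the rig when locomotion moves it.
public class CopyTransform : MonoBehaviour
{
    public Transform OriginalTransform; // drag the XR Rig transform here

    void LateUpdate()
    {
        transform.SetPositionAndRotation(OriginalTransform.position,
                                         OriginalTransform.rotation);
    }
}

// Sketch of RemoveMRCamerasTracking: look for the TrackedPoseDriver components
// that Unity may auto-add to the MRC cameras and destroy them, so those cameras
// stay fixed in the pose of the calibrated physical camera instead of
// following the headset.
public class RemoveMRCamerasTracking : MonoBehaviour
{
    void Update()
    {
        foreach (Camera childCamera in GetComponentsInChildren<Camera>())
        {
            // Spare the main (head) camera: here I use the MainCamera tag as
            // a simple heuristic; the real script identifies the MRC cameras
            // more precisely.
            if (childCamera.CompareTag("MainCamera"))
                continue;

            TrackedPoseDriver driver = childCamera.GetComponent<TrackedPoseDriver>();
            if (driver != null)
                Destroy(driver);
        }
    }
}
```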
Try it yourself… easy peasy: you can have MRC with the XR Interaction Toolkit too! In the video linked above, I also made some tests with locomotion systems, and while I moved inside the virtual scene, the resulting mixed reality video remained coherent!
MRC with URP
What if you build your project with URP and the mixed reality capture produces weird results? Some months ago, I lost a full afternoon with my buddy Max trying to make MRC work with a URP project of ours, so I know the pain of it.
Well, the solution is actually very simple if you know it: go to the Project Settings and select the Oculus section inside XR Plug-In Management. There you can find various options for the Oculus plugin, and one of them is the rendering mode: in the Android tab, set the “Stereo Rendering Mode” option to “Multiview”. If it is “Multipass”, everything is going to break; if it is “Multiview”, everything works.
Multiview also adds some performance benefits to your application, so it’s a good thing to enable anyway. But keep in mind that it can also mess with some of your shaders, so test your app after you have activated it.
(If you want some explanation of what Multiview and Multipass mean, you can have a look at this old post on the Unity blog)
MRC issue with semi-transparent objects
Look at this quick test video I made with HitMotion and MRC… do you see those horrible black quads? They are not part of the game, of course, but they appear in my mixed reality video. How is it possible?
It took me a while to realize it, but those are the semi-transparent objects of our game. And semi-transparent objects in the foreground do not work well with MRC and OBS. Let me explain how MRC+OBS work, so that you can understand where the problem is.
The classical OBS scene to be used with Oculus Mixed Reality Capture is made of three layers:
- The background layer is the one with the full rendering of the Unity game from the point of view of your physical camera
- The middle layer, which goes on top of the background one, is your chroma-keyed physical camera stream
- The foreground layer shows only the foreground objects and stays on top of the other two
The MRC system uses the positions of your headset and of the physical camera to understand which objects are “foreground”, that is, which should be drawn on top of your real camera stream: if an object lies between your head and the camera, then it is in front of you, so it’s foreground; otherwise, it is background. Only foreground objects are drawn on the foreground layer, so only they can appear in front of the real video stream, that is, in front of you. This is enough to give the illusion of depth.
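Just to make the idea concrete, here is a toy version of that “foreground test”. This is purely illustrative (it is not Oculus’s actual code): an object counts as foreground if it is closer to the virtual camera than your head is.

```csharp
using UnityEngine;

// Toy illustration of the "foreground test" described above (NOT the actual
// Oculus implementation): an object is foreground if it sits between the
// (virtual) MR camera and the headset.
public static class ForegroundTest
{
    public static bool IsForeground(Transform mrCamera, Transform headset, Vector3 objectPosition)
    {
        float objectDistance = Vector3.Distance(mrCamera.position, objectPosition);
        float headDistance = Vector3.Distance(mrCamera.position, headset.position);

        return objectDistance < headDistance; // closer to the camera than your head
    }
}
```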
The problem with this trick is how these objects are drawn. The foreground camera renders only the foreground objects, and it must provide a background for them… and the choice is the black color: a camera can’t render a “transparent” frame, so the result must be fully opaque. The camera uses a black clear color and then renders all foreground objects on top of it; all the other parts of the scene are simply ignored, because it renders only the objects that pass the “foreground test”. When this frame gets passed to OBS, OBS applies a chroma key on the exact black color (so it removes only the pixels with exact color [0, 0, 0]), effectively making you see only the foreground objects.
But if one of these foreground objects is semi-transparent, its color gets blended on top of the black, slightly altering it. This means that the chroma key no longer removes those pixels, because they are not perfectly black anymore, and what you see is a lot of black regions in your video.
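To see this in numbers, take a single pixel of a 50% transparent red object and blend it over the black clear color with standard alpha blending (a hypothetical helper, just to show the math):

```csharp
using UnityEngine;

// Why semi-transparent foreground pixels survive the chroma key: standard
// alpha blending over the black clear color gives src.rgb * src.a, which
// is not pure black anymore.
public static class TransparencyExample
{
    public static Color BlendOverBlack(Color src)
    {
        // result = src.rgb * src.a + black * (1 - src.a) = src.rgb * src.a
        return new Color(src.r * src.a, src.g * src.a, src.b * src.a, 1f);
    }
}
```

A 50% transparent red pixel (1, 0, 0, 0.5) becomes (0.5, 0, 0): not the exact [0, 0, 0] that OBS keys out, so it survives as a dark region in the final video.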
What can you do? Well, I’m sorry to say that there is no perfect solution for this, because OBS receives an image that is already assembled, and you can’t remove the black from the semi-transparent regions. You can do one of these things:
- Adjust the chroma-keying tolerance on the Foreground layer in OBS and see if the result is more acceptable this way: if you remove a wider range of blacks, maybe you remove all the bad regions from your final video;
- Re-design your graphical assets so as to avoid transparency in foreground objects;
- Put all the semi-transparent elements on a dedicated layer, and then prevent the MRC cameras from rendering it. For instance, let’s suppose that your issues are caused by a particle effect with many transparencies. You can put the particle effect on a dedicated layer called, for instance, “NoMRC”; then open your OVRManager and, at the end of it, you will find a section called “Mixed Reality Capture”. Click on Show Properties and you will find some interesting configuration properties. The one that may interest you is extraHiddenLayers, which specifies what layers to exclude from the capture: in this case, you can specify NoMRC, and the mixed reality video won’t show that particle effect anymore. You can also add layers to the extraVisibleLayers section to specify layers that should be seen only by the MRC cameras… for instance, if you want to substitute that particle system with another visual effect that must only be visible in the mixed reality capture and not in the actual game.
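If you prefer to set this up from code instead of the Inspector, something like this should work; I am assuming here that extraHiddenLayers is exposed by OVRManager as a LayerMask field, and that you have created a layer named “NoMRC” in your project settings:

```csharp
using UnityEngine;

// Sketch: hide everything on the "NoMRC" layer from the MRC cameras.
// Assumes OVRManager exposes extraHiddenLayers as a LayerMask field and
// that a layer named "NoMRC" exists in the project's Tags and Layers.
public class HideLayerFromMrc : MonoBehaviour
{
    void Start()
    {
        int noMrcLayer = LayerMask.NameToLayer("NoMRC");

        if (noMrcLayer >= 0 && OVRManager.instance != null)
            OVRManager.instance.extraHiddenLayers |= (1 << noMrcLayer);
    }
}
```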
Performance issues
MRC adds two additional cameras, so using it will of course consume some resources. Oculus has made sure that these cameras don’t render every frame (they render one frame every two), and I think they are not used at all if no application is requesting MRC to start, but you have to consider that there is anyway a performance overhead when using this system. If your experience can barely sustain 72 FPS, maybe it’s better not to use MRC at all and to disable it in the OVRManager.