By following the official tutorial, I tried a compositing workflow based on Nuke's real-time streaming of Unreal Engine scenes. Using the camera property of the UnrealReader node, the camera acts as a bridge that combines the Unreal Engine scene with green-screen footage in real time. As for the materials: the Unreal Engine version is 5.0; the scene assets (terrain materials, trees, buildings, and other models) are all from the marketplace; and the green-screen footage was shot by myself.
I have tried the UnrealReader node in a variety of situations. The compositing work in this film falls roughly into three types: static-camera green screen, camera-tracked green screen, and CG model.





This part is similar to traditional compositing of green-screen footage over a 3D scene, except that the 3D scene is replaced by a real-time rendered Unreal Engine scene. For a static shot like this, the camera can be either an Unreal Engine camera or a Nuke camera, since UnrealReader is compatible with both.
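At the heart of this setup is the standard "over" merge: the keyed, premultiplied green-screen foreground is placed over the Unreal Engine background stream. A minimal sketch of that math in plain Python (the pixel values are made up for illustration; in Nuke this is simply a Merge node set to "over"):

```python
def over(fg_rgba, bg_rgb):
    """Premultiplied 'over' merge: out = FG + BG * (1 - FG.alpha).

    fg_rgba: (r, g, b, a) premultiplied foreground pixel (keyed green screen).
    bg_rgb:  (r, g, b) background pixel (e.g. the UnrealReader render).
    """
    r, g, b, a = fg_rgba
    br, bgr, bb = bg_rgb
    return (r + br * (1 - a),
            g + bgr * (1 - a),
            b + bb * (1 - a))

# A half-transparent grey foreground over a pure red background:
print(over((0.25, 0.25, 0.25, 0.5), (1.0, 0.0, 0.0)))  # → (0.75, 0.25, 0.25)
```

Where the foreground alpha is zero (fully keyed out), the Unreal background shows through unchanged; where alpha is one, only the foreground survives.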

BG_UNREAL ENGINE

GS_STATIC CAMERA

POSITION TO POINTS

COMPOSITING

NODES

This part differs from the previous one in the camera. Since it is a moving shot, the workflow has an extra step: camera tracking. In this case, you first solve the camera path in Nuke, then import the Nuke camera track into the Unreal scene through UnrealReader.
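Nuke can also hand a solved camera to other tools as a plain-text .chan file: one line per frame with translate and rotate channels (an extra field-of-view column is supported as well). A hedged sketch of writing such a file from solved keyframes; the keyframe values below are invented for illustration:

```python
def write_chan(path, keys):
    """Write camera keyframes as a Nuke-style .chan file.

    keys: list of (frame, tx, ty, tz, rx, ry, rz) tuples.
    Each line is whitespace-separated, one frame per line; Nuke's
    Camera/Axis nodes can export and import this layout. An optional
    FOV column is omitted here for simplicity.
    """
    with open(path, "w") as f:
        for frame, tx, ty, tz, rx, ry, rz in keys:
            f.write(f"{frame}\t{tx:.6f}\t{ty:.6f}\t{tz:.6f}"
                    f"\t{rx:.6f}\t{ry:.6f}\t{rz:.6f}\n")

# Two made-up frames of a slow dolly-in:
track = [(1, 0.0, 1.7, 10.0, 0.0, 0.0, 0.0),
         (2, 0.0, 1.7,  9.8, 0.0, 0.5, 0.0)]
write_chan("camera_track.chan", track)
```

This is only a sketch of the file layout; in practice the export is done from the Camera node's file menu rather than by hand.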

BG_UNREAL ENGINE

GS_CAMERA TRACKING

POSITION TO POINTS

COLOR ID

NODES

UnrealReader can read and export camera paths from within Unreal Engine. This part is based on compositing with an Unreal Engine camera. The situation I envision is this: if a CG animation needs to be rendered in third-party 3D software, because of the renderer's look or for other reasons, the user can animate the camera in Unreal Engine and export the camera data from Nuke through UnrealReader. The camera data can then be brought into any package, such as 3ds Max, Maya, or Blender. Here I imported the camera data into Maya and used Arnold to render an animation sequence of the spaceship taking off.
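One detail when moving a camera between Unreal, Nuke, and Maya is keeping the framing consistent: the same focal length only gives the same field of view if the filmback (aperture) matches in both packages. The standard relation is FOV = 2·atan(aperture / (2·focal)). A small sketch, assuming a 36 mm horizontal filmback (that default is my assumption, not a value from the project):

```python
import math

def focal_to_fov(focal_mm, aperture_mm=36.0):
    """Horizontal field of view in degrees from focal length and filmback width."""
    return math.degrees(2.0 * math.atan(aperture_mm / (2.0 * focal_mm)))

def fov_to_focal(fov_deg, aperture_mm=36.0):
    """Inverse: the focal length (mm) that gives the requested FOV."""
    return aperture_mm / (2.0 * math.tan(math.radians(fov_deg) / 2.0))

fov = focal_to_fov(50.0)   # a 50 mm lens on a 36 mm filmback, ~39.6 degrees
print(round(fov, 2))
```

If the filmback differs between packages, converting through FOV like this keeps the rendered framing identical even though the focal length value changes.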

BG_UNREAL ENGINE

FG_ARNOLD RENDER

HDRI FROM UNREAL ENGINE

COLOR ID

NODES
I think applying Nuke to this virtual production process through UnrealReader has two benefits. First, by combining a green screen, Nuke, and Unreal, this workflow may help with the current high cost of LED-wall virtual production, and the picture output through UnrealReader does not suffer the physical color loss of photographing a wall. Second, compared with traditional green-screen compositing, this process makes real-time background previews faster and easier to modify. It can also be combined with features of Unreal Engine 5 for many interesting experiments. For example, with 3D scanning of large environments: I once watched a video about importing Google Earth into Unreal Engine 5. In this way a huge, detailed scene can be obtained very quickly, and through UnrealReader the production of large scenes should become even more convenient.