
3D Match Moving

Week 01: Nuke 3D and Camera Projection

During the first session of the module, we were introduced to the 3D workspace and tools in Nuke. We learnt how to add 3D objects, textures and cameras to the 3D workspace. We then explored texturing and transforming the objects and animating cameras, and finally rendered them out as a 2D scene. Ultimately, we made a few camera projections and applied those techniques to 2D images. Camera projection can turn a 2D matte painting into a 3D scene: an image is "projected" onto 3D geometry through a camera. 
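The core idea behind camera projection can be sketched in a few lines of Python. This only illustrates the pinhole-camera maths, not how Nuke's Project3D node is implemented internally; the focal length and point values are made up.

```python
# Minimal pinhole-camera sketch of the idea behind camera projection.
# A 3D point in camera space is mapped to the image plane by perspective
# division; projecting an image back onto geometry runs the same mapping
# in reverse along each pixel's ray.

def project_point(point, focal=50.0):
    """Project a camera-space 3D point (x, y, z), z > 0, to the 2D image plane."""
    x, y, z = point
    return (focal * x / z, focal * y / z)

def unproject_pixel(pixel, depth, focal=50.0):
    """Cast a 2D pixel back into 3D at a given depth (the 'projection' step)."""
    u, v = pixel
    return (u * depth / focal, v * depth / focal, depth)

p2d = project_point((1.0, 2.0, 100.0))  # -> (0.5, 1.0)
p3d = unproject_pixel(p2d, 100.0)       # recovers (1.0, 2.0, 100.0)
```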

Screen Shot 2016-11-14 at 11.42.34 (1).png
Exercise 1: A basic 3D setup

This is a basic 3D setup with a few 3D nodes. All the elements in a 3D workspace (geometric objects, cameras and lights) can be combined with a Scene node. I animated the camera's 'translate' and 'rotate' attributes. 

Screenshot (921).png

Final node graph. 

Screenshot (922).png

Screen recording. 

Exercise 2: Projecting a tunnel to a cylinder 
Method #1:

I made a basic setup with Project3D, Cylinder, Scene, ScanlineRender, render camera and projection camera nodes. Note: rotate the cylinder to mimic the tunnel while the camera travels through it. 

Screen Shot 2022-02-16 at 10.01.28.png

I scaled up the cylinder along the Y-axis to mimic the tunnel. 

Screen Shot 2022-02-16 at 10.03.55.png

Final node graph. 

Screenshot (923).png

Screen recording. Check out the workflow in the video above. 

Method #2:

I used a FrameHold node instead of a projection camera here. 

Screen Shot 2022-02-16 at 10.07.14.png

Node graph. 

Screenshot (924).png

Screen capture. 

Exercise 3: Projecting different parts of an image onto cards

This is an exciting and useful workflow that can be used to build a 3D environment from a 2D image. 

We had already set up the render and projection cameras. I chose 'Fill' as the resize type of the Reformat node. 

Screen Shot 2022-02-16 at 10.54.36.png

I started with the back wall, roto'ed it and used a Project3D node for the projection.

Screen Shot 2022-02-16 at 11.12.53.png
Screen Shot 2022-02-16 at 11.31.04.png

I projected various parts of the image onto several cards, then moved them to build up the room. 

Screen Shot 2022-02-16 at 11.39.21.png

Note: set the number of samples to render per pixel to produce motion blur and antialiasing.
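As a rough illustration of what the samples control does, here is a toy Python sketch that averages several jittered sub-pixel samples. The shading function and values are hypothetical, and a real renderer jitters in time as well for motion blur.

```python
import random

def supersample(shade, px, py, samples=8, seed=0):
    """Average several jittered sub-pixel evaluations of shade(x, y).
    A toy version of raising the per-pixel sample count for antialiasing."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        total += shade(px + rng.random(), py + rng.random())
    return total / samples

# A hard vertical edge at x = 9.5 (hypothetical shading function):
edge = lambda x, y: 1.0 if x >= 9.5 else 0.0
value = supersample(edge, 9, 0)  # the boundary pixel gets a softened value
```

Pixels well inside or outside the edge still return 0 or 1; only the boundary pixel is blended, which is exactly the antialiasing effect.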

Screen Shot 2022-02-16 at 11.42.26.png

Note: I used the Remove node to strip unneeded channels from the input clip, making rendering smoother and faster. 

Screen Shot 2022-02-16 at 12.00.26.png
Screenshot (925).png
Screenshot (926).png
Screenshot (927).png
Workflow (screen capture)
Render
Week 02: Nuke 3D Tracking

We looked at 3D Camera Tracking this week. 3D Match Moving, as previously discussed, is the process of combining filmed sequences (live backplates) with CGI in such a way that they match one another. The goal is to turn live backplate footage into a 3D environment and simulate parallax movement in the scene. In practice, the tracker detects a pattern in a set of pixels and follows it across the screen; the software then uses those tracked points to solve a three-dimensional camera and recreate its motion. One of the problems we will confront is lens distortion, so we need to straighten out the curvature introduced by the lens to do 3D tracking correctly. 
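The "detect a pattern and follow it" idea can be sketched as a toy sum-of-squared-differences matcher in Python. Real trackers use normalised cross-correlation and sub-pixel refinement, so this is only a simplified illustration with made-up pixel values.

```python
def ssd(patch_a, patch_b):
    """Sum of squared differences between two equal-size patches."""
    return sum((a - b) ** 2
               for row_a, row_b in zip(patch_a, patch_b)
               for a, b in zip(row_a, row_b))

def track(frame, pattern):
    """Find the (row, col) offset in `frame` where `pattern` matches best."""
    ph, pw = len(pattern), len(pattern[0])
    best, best_pos = None, None
    for r in range(len(frame) - ph + 1):
        for c in range(len(frame[0]) - pw + 1):
            window = [row[c:c + pw] for row in frame[r:r + ph]]
            score = ssd(window, pattern)
            if best is None or score < best:
                best, best_pos = score, (r, c)
    return best_pos

frame = [[0, 0, 0, 0],
         [0, 9, 8, 0],
         [0, 7, 6, 0],
         [0, 0, 0, 0]]
pattern = [[9, 8], [7, 6]]
print(track(frame, pattern))  # -> (1, 1)
```

Running this search on every frame, and triangulating the resulting 2D tracks, is conceptually what the camera solver builds on.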

Exercise 1
Screen Shot 2022-02-23 at 09.41.22.png

I first brought in a CameraTracker node to track the scene. Water and moving people are not suitable for tracking, so I roto'ed them out. 

Screen Shot 2022-02-23 at 09.41.55.png

Turn on Preview Features in the camera tracker settings. 

Screen Shot 2022-02-23 at 09.42.13.png

Increased the number of features to track in each frame. 

Screen Shot 2022-02-23 at 09.44.06.png

Selected Source Alpha to apply the previously made mask (roto). 

Screen Shot 2022-02-23 at 09.45.15.png

Hit Track to start tracking. 

Screen Shot 2022-02-23 at 09.48.37.png

After tracking, we need to hit Solve to check the tracking quality. 

Screen Shot 2022-02-23 at 09.49.07.png

Deleted the invalid tracks (amber and red marks). 

Screen Shot 2022-02-23 at 09.49.49.png

Update Solve to check the tracks again. We may need to repeat these steps until all the invalid tracks are deleted. 

Screen Shot 2022-02-23 at 09.53.12.png

At this stage, the CameraTracker node created a scene, a camera tracker point cloud and camera nodes for me. 

Screen Shot 2022-02-23 at 10.02.24.png

First, I pressed W on the keyboard to see the two inputs in the viewer. Next, I reduced the opacity of the scene to be able to see the dots. Finally, I chose Vertex Selection to be able to select the dots. 

Screen Shot 2022-02-23 at 10.02.55.png

While the specific dots were selected, I brought in a card and pressed Match Selection Points. 

Screen Shot 2022-02-23 at 10.09.27.png

I then tried to align the card with the dots.

Screen Shot 2022-02-23 at 10.26.17.png

Final result. 

Screen Shot 2022-02-23 at 10.27.47.png

Node graph. 

Exercise 2
Screen Shot 2022-02-23 at 11.32.10.png

In this exercise, instead of using a checkerboard card, we projected a patch to a card and snapped the card to a part of the scene using a camera tracker. 

Screen Shot 2022-02-23 at 11.32.34.png

Node graph. 

3D Clean up Script layout
Screen Shot 2019-02-06 at 11.30.36.png
Week 3: 3D Equalizer

We were introduced to 3D Equaliser this week. 3D Equaliser is one of the most powerful 3D tracking packages, widely used by major studios around the globe. During the session, we got our heads around the software's user interface and practised a few essential tracking workflows. 

To simplify the way we interact with the software, we set up a bunch of shortcuts as below: 

GUI_Shortcut Keys.jpg
Tracking in 3D Equaliser (important steps and tips)
Screen Shot 2022-03-02 at 10.35.04.png

Don't forget to set the frame rate when importing your footage!

Screenshot (977).png

First things first, head to Playback -> Export Buffer Compression File if playback is slow. If you already have the buffer file, you need to import it. 

Screen Shot 2022-03-02 at 10.51.48.png

Set Filmback Width, then Pixel Aspect and finally Focal Length. Also, put an Identifier in the related field. 
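Filmback width and focal length matter because together they fix the field of view the solver assumes. A small Python sketch of the standard pinhole relationship; the example values are illustrative, not taken from this shot.

```python
import math

def horizontal_fov(filmback_width_mm, focal_length_mm):
    """Horizontal field of view in degrees from filmback width and focal length:
    fov = 2 * atan(filmback / (2 * focal))."""
    return math.degrees(2.0 * math.atan(filmback_width_mm / (2.0 * focal_length_mm)))

# e.g. a hypothetical 24 mm filmback paired with a 24 mm lens:
fov = horizontal_fov(24.0, 24.0)  # about 53.13 degrees
```

If either value is entered wrongly, the solver compensates with a wrong camera, which is why these fields deserve care.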

Screen Shot 2022-03-02 at 10.56.25.png

Ctrl-click to create a tracking point. When done, Alt-click to deselect the point. To reselect a point, Alt-drag around it. Press G to gauge and T to track. 

Screen Shot 2022-03-02 at 11.08.01.png

When a point goes off screen, the tracking process stops. This also happens when the image is very blurry, so we need to go to the relevant frame and make the tracking area smaller. 

Screen Shot 2022-03-02 at 11.12.10.png

We may change the tracking mode of specific points to "Marker". 

Screen Shot 2022-03-02 at 11.15.03.png

Adding more tracks helps us have a more accurate result. 

Screen Shot 2022-03-02 at 11.22.34.png

3D Orientation view. 

Screen Shot 2022-03-02 at 11.24.22.png

Activate Deviation Browser to be able to see the tracking quality. 

Screen Shot 2022-03-02 at 12.31.08.png

Go to Calc -> Calc All From Scratch. You can also press Alt + C
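The deviation the browser plots is essentially a reprojection error measured in pixels. A minimal Python sketch of the idea, using hypothetical per-point error values:

```python
import math

def rms_deviation(errors):
    """Root-mean-square of per-point reprojection errors, in pixels.
    A rough stand-in for the per-frame curve the Deviation Browser plots."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# Hypothetical reprojection errors for one frame, in pixels:
deviation = rms_deviation([0.3, 0.5, 0.4])  # roughly 0.41 px
```

Lower is better; a large spike on the curve usually points to a single bad track on that frame.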

Screen Shot 2022-03-02 at 12.31.26.png

Tick the box for "All Points" to be able to see all tracking points in the deviation browser. 

Screen Shot 2022-03-02 at 12.33.58.png

Bring up Parameter Adjustment Window and adjust the focal length. 

Screenshot (978).png

Double-click on the lens to open up the attribute editor. Then go to Lens Distortion and select 3DE Classic LD Model. Tick the boxes for Distortion and Quartic Distortion
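The Distortion and Quartic Distortion parameters correspond to the r² and r⁴ terms of a radial polynomial applied to each pixel's distance from the lens centre. A simplified Python sketch of that mapping; 3DE's actual model has more terms and parameters, and the coefficients here are made up.

```python
def distort_radius(r, k2=0.0, k4=0.0):
    """Radial polynomial at the heart of a classic lens-distortion model:
    the 'distortion' and 'quartic distortion' coefficients scale r^2 and r^4.
    (Illustrative only; not the full 3DE Classic LD Model.)"""
    return r * (1.0 + k2 * r ** 2 + k4 * r ** 4)

# With both coefficients at zero the mapping is the identity:
r_out = distort_radius(0.8)           # -> 0.8
r_barrel = distort_radius(1.0, k2=0.1)  # pushed outward by the r^2 term
```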

Screenshot (979).png

Then again head to Parameter Adjustment and this time press Adaptive All and then Adjust...

Screen capture
Week 04: Lenses and Cameras

We practised the previously learnt 3D Equaliser tracking workflows this week. We also looked at setting camera constraints, changing the camera's height according to the survey data, and creating distance constraints. We put the survey data into 3D Equaliser to get a better and more accurate track. 

Survey data
Survey_image_01.jpg
Practice #1: Nodal shot (fixed camera position)
Setting camera constraint
Screen Shot 2022-03-09 at 10.45.36.png

In this scene, the camera only rotates, so its position is fixed. However, you can see that 3D Equaliser assumed that the camera had movement. We need to fix this! 

Screen Shot 2022-03-09 at 10.47.17.png

To fix that, we set up a camera constraint. 

Screen Shot 2022-03-09 at 10.47.34.png

You can see that the camera is now fixed and has only rotation. 

Changing camera's height
Screenshot (980).png
Creating distance constraint
Screenshot (981).png
Practice #2: Free shot (parallax view)
Screenshot (999).png

Selected the tracking points on the ground. 

Screenshot (1000).png

Aligned the points to the XZ plane so that all the points are placed above the grid.

Screenshot (1001).png

Selected one of the ground points and moved it to the origin. 

Screenshot (1003).png

Created locators from all the points. 

Screenshot (1006).png

Scaled up the locators so we can clearly see them in the scene. 

Screenshot (1009).png

Made a cone to snap it to a locator positioned on the ground. 

Screenshot (1011).png

Cone snapped to a ground locator. 

Screenshot (1012).png

Result. 

Screen capture
Week 05: 3DE Freeflow and Nuke

This week we learnt how to bake the scene in 3DE and export the assets for use in Nuke. Overall, we export the camera, locators, 3D objects and lens distortion, and use a specific workflow for lens distortion in Nuke. As another exercise, we created a patch based on the data we brought into Nuke from 3DE. 

3DE: Exporting assets for Nuke
Screen Shot 2022-03-16 at 10.25.28.png

Continuing from where I left off last week: I had a cylinder snapped to a point on the ground and many tracking points in the scene. 

Screen Shot 2022-03-16 at 10.25.51.png

In order to export the assets from 3DE, I needed to bake the scene first. 

Screen Shot 2022-03-16 at 10.30.00.png

Export Project exports the camera. Browse to the file location and put it in MATCHMOVING -> CAMERA. 

Screen Shot 2022-03-16 at 10.31.42.png

Selected all the points, and created locators from them. Then chose all the locators and headed to Geo -> Export OBJ... Put the output file to MATCHMOVING -> GEO.

Screen Shot 2022-03-16 at 10.34.18.png

To export the lens distortion, I headed to File -> Export -> Export Nuke LD_3DE Lens Distortion Node. Put the exported file to MATCHMOVING -> UNDISTORT

Screen Shot 2022-03-16 at 10.35.26.png

Selected the cylinder and exported it by going to 3D Models -> Export Obj File... Put it to MATCHMOVING -> GEO

Nuke: using 3DE data
Screen Shot 2022-03-16 at 10.42.27.png

The camera output file dragged to Nuke. 

Screen Shot 2022-03-16 at 10.46.05.png

I dragged and dropped all the 3DE exported files, namely the lens distortion, 3D object, camera and locators. I kept only the camera node and deleted the other extra camera nodes. 

Screen Shot 2022-03-16 at 11.00.08.png

Important: UD and RD should be set up as above. 

Screen Shot 2022-03-16 at 11.08.05.png

Important: as I scaled up the footage by 1.1 with the Overscan (Reformat) node, I need to multiply the Horiz Aperture and Vert Aperture each by 1.1. 
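The aperture scaling can be sketched as a small Python helper. The aperture values below are hypothetical; only the 1.1 overscan factor comes from my setup.

```python
def apply_overscan(width, height, h_aperture, v_aperture, overscan=1.1):
    """When the plate is padded by an overscan factor, the camera's
    horizontal and vertical apertures must be scaled by the same factor
    so the field of view still matches the larger image."""
    return (round(width * overscan), round(height * overscan),
            h_aperture * overscan, v_aperture * overscan)

# e.g. an HD plate with hypothetical 24.576 x 13.824 mm apertures:
w, h, ha, va = apply_overscan(1920, 1080, 24.576, 13.824)
# -> 2112 x 1188 pixels, apertures scaled to match
```

If the apertures are left unscaled, the redistorted render no longer lines up with the original plate.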

Screen Shot 2022-03-16 at 11.14.21.png

The nodes and setup I used to be able to see the cylinder and locators in the scene. 

Screen Shot 2022-03-16 at 11.46.35.png

Here I wanted to use a patch and remove the text on the plate. I then needed to use 3DE data to track the plate. 

Screen Shot 2022-03-16 at 11.49.19.png

Roto'ed the plate using the above nodes and setup. 

Screen Shot 2022-03-16 at 11.52.17.png

Projected the roto to a card and aligned the card with the plate in 3D. 

Screen Shot 2022-03-16 at 12.19.50.png

Result. 

Screen Shot 2022-03-16 at 12.20.09.png

Node graph. 

Week 06: Surveys

This week, we looked at how to set survey points within a scene. We learned how to use all the provided survey data and various measurements to track elements more precisely in our scene in 3DE. 

Survey image
decking_survey_data_02.jpg
Using the measurements and making survey points 
Screen Shot 2022-03-23 at 09.38.23.png

In the 3D Orientation view, go to Point Groups and right-click on Points to add new points. 

Screen Shot 2022-03-23 at 09.39.46.png

Select Exactly Surveyed for Survey Type.

Screen Shot 2022-03-23 at 09.40.48.png

Put the exact measurements in Position XYZ fields. 

Screen Shot 2022-03-23 at 09.42.21.png

Go to Preferences to change the Regular Units to the unit you have your measurements in. 

Screen Shot 2022-03-23 at 09.49.53.png

I first started to make survey points for the ground. 

Screen Shot 2022-03-23 at 09.52.24.png

We can duplicate our survey points as above. 

Screen Shot 2022-03-23 at 10.17.19.png

All the survey points made. 

Screen Shot 2022-03-23 at 10.21.45.png

After making the survey points, it's time to track them. Simply select a point, then Ctrl-click on the corresponding area and track in Manual Tracking mode. Head to the Image Controls window to brighten up the scene so that we can see the dark spots.

Screen Shot 2022-03-23 at 10.35.29.png

All points tracked. 

Screen Shot 2022-03-23 at 11.10.29.png

As before, adjust the lens distortion.

Screen Shot 2022-03-23 at 11.10.43.png

Selected Brute Force for the lens distortion. 

Screen Shot 2022-03-23 at 11.11.30.png

Select Adaptive All for focal length. 

Screen Shot 2022-03-23 at 11.19.23.png

To check the tracks, go to the Lineup view and see whether the red dot sits in the middle of the green cross. If not, rectify the track. 

Screen Shot 2022-03-23 at 11.42.49.png

I made three tracking points on the wall to make the tracking even more precise. Then I needed to align those points with the survey points I had already made for the wall. 

Screen Shot 2022-03-23 at 11.43.03.png

Converted the wall tracking points to survey points. 

Screen Shot 2022-03-23 at 11.50.27.png

Used Push Points Tool to align the points. 

Assignment 01: 3D Match Moving Project

For the first assignment, I was given a video and a survey image to work on and do 3D match moving by using 3D Equaliser and Nuke.


For the assignment, I had to:

1- have a minimum of 40 tracked points

2- get a low deviation

3- use survey data

4- add locators and 3D Cube

5- export camera, LD data, locator geo, and 3d models to Nuke

6- undistort lens and reformat

7- place a cleanup patch on the fire exit sign


I completed everything the assignment asked for, then took it a step further and added a TV screen (colour bars) and a sign above the door using the tracking data I had brought into Nuke from 3DE. Below you can see my development work. 

Development #1: 

Here I did everything the assignment asked for. 

Development #2:

I put a colour bar image on the left-hand TV and used the 3DE info I'd brought into Nuke.

Development #3:

Added a sign above the door.

Node graph
Screenshot (1031).png
Screen recording (3DE)
Screen recording (Nuke)
Week 07: Shooting footage for assignment 02

We went to the studio this week to shoot footage for the second assignment. 

Week 08: Surveys (part 2)

We continued to explore how to use various survey data in 3DE to solidify our tracking process this week. We had a look at how to import multiple reference images into 3DE and track points across the reference images and our footage. 

Screen Shot 2022-04-06 at 09.22.57.png

3DE4 -> File -> Import -> Import Multiple Reference Frames... v1.1

Screen Shot 2022-04-06 at 09.23.30.png

Navigated to where the reference images are and imported them into 3DE.

Screen Shot 2022-04-06 at 09.25.07.png

Deleted the DS_Store camera, as we don't need it. 

Screen Shot 2022-04-06 at 09.32.55.png

We need to have a lens for our footage and another for our ref images. 

Screen Shot 2022-04-06 at 09.50.09.png

Reference lens set up. 

Screen Shot 2022-04-06 at 09.50.13.png

Footage lens set up. 

Screen Shot 2022-04-06 at 09.37.25.png

Assigning lenses to the cameras. 

Screen Shot 2022-04-06 at 11.19.10.png

I made more than 40 tracking points and shared them across the ref images and the main footage. 

Week 09: Lens Distortion and Grids

This week, we learnt how to use a grid to have a more precise lens distortion set up in 3D Equaliser. First, we were given a grid shot that was filmed for the purpose of lens distortion. Then, we brought the grid shot into 3DE as a reference camera and calculated the lens distortion based on that. 

Screen Shot 2022-04-20 at 09.30.08.png

Lens attributes set up.

Screen Shot 2022-04-20 at 09.45.21.png

About 30 tracking points added. 

Screen Shot 2022-04-20 at 09.49.10.png

Made a camera constraint since the shot is nodal, not free-move. 

Screen Shot 2022-04-20 at 09.49.19.png

Used a camera constraint.

Screen Shot 2022-04-20 at 09.57.14.png

Chose 3DE Radial as the lens distortion model. 

Screen Shot 2022-04-20 at 09.58.22.png

Brought in the grid shot as a reference camera. 

Screen Shot 2022-04-20 at 09.59.00.png

Grid shot selected. 

Screen Shot 2022-04-20 at 09.59.57.png

Changed the viewer to Distortion Grid Controls

Screen Shot 2022-04-20 at 10.01.05.png

Aligned some points with the checkerboard, then pressed Snap to have 3DE align the other points for us. 

Screen Shot 2022-04-20 at 10.02.37.png

Expanded the points to cover the whole grid.

Screen Shot 2022-04-20 at 10.02.49.png

Finally calculated the lens distortion based on the grid we brought in. 

Screen Shot 2022-04-20 at 10.03.40.png

Calc lens distortion window. 

Week 10: Nuke Maya Pipeline

I looked at how to take tracked footage from 3DE to Maya this week. In Maya, I learned how to prep the software to bring in my assets, set up the size of the image sequence correctly and add a 3D object to the scene. Before that, I applied lens distortion data to the image sequence in Nuke in order to undistort my footage for Maya. 

Screen Shot 2022-04-27 at 10.19.48.png

In 3DE, I tracked the footage. Next, I needed to export Mel script and lens distortion files. Note: Make sure to set the start frame to 1001 when exporting. 

Screen Shot 2022-04-27 at 10.26.39.png

Undistorted the sequence as above. 

Screen Shot 2022-04-27 at 10.29.07.png

Named the undistorted sequence appropriately. 

Screen Shot 2022-04-27 at 10.48.18.png

Tick "Use Image Sequence". Change the sequence to the undistorted footage. 

Screen Shot 2022-04-27 at 10.50.58.png

I needed to scale up the image plane as well as the film back size using a simple Python command, as shown above.

Screen Shot 2022-04-27 at 10.53.01.png

Film back size set up. 

Screen Shot 2022-04-27 at 11.11.45.png

Here I wanted to add a sphere to the scene. To have the shadows of the sphere, I made a plane and assigned aiShadowMatte to it. 

Screen Shot 2022-04-27 at 11.17.03.png

Added an Area Light and turned off Normalize

Screen Shot 2022-04-27 at 11.19.04.png

Set up the image size in the Render Settings window. The resolution of my footage needed to be multiplied by 1.1. 

Screen Shot 2022-04-27 at 11.19.39.png

I could also find the correct image size (after scaling up) by going back to Nuke and clicking on the Overscan node. 

Screen Shot 2022-04-27 at 11.22.36.png

I disabled the image plane when rendering the sequence as I wanted to comp the sequence in Nuke. 

Screen Shot 2022-04-27 at 11.25.22.png

Render Settings set up. 

Screen Shot 2022-04-27 at 11.29.03.png

Rendered out beauty (RGBA), specular and shadow passes so I could composite them in Nuke. 
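Recombining passes in the comp boils down to a couple of per-pixel operations. A minimal Python sketch of the premultiplied "over" and a shadow-matte multiply; the pixel values are made up and the real comp runs these per channel across the whole frame.

```python
def over(fg_rgb, fg_alpha, bg_rgb):
    """Premultiplied 'over' for one pixel: the core of merging a CG
    beauty pass (RGBA) back over the live plate."""
    return tuple(f + b * (1.0 - fg_alpha) for f, b in zip(fg_rgb, bg_rgb))

def apply_shadow(bg_rgb, shadow):
    """Darken the plate by a rendered shadow-matte value (0 = no shadow)."""
    return tuple(b * (1.0 - shadow) for b in bg_rgb)

plate = (0.5, 0.4, 0.3)                      # hypothetical plate pixel
shadowed = apply_shadow(plate, 0.5)          # shadow pass darkens the plate
comp = over((0.2, 0.2, 0.2), 0.8, shadowed)  # CG premultiplied RGB over it
```

Applying the shadow before the over keeps the CG element from being darkened by its own shadow.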

Comping CG back in Nuke (2 methods)
Screenshot (1089).png
Using Shuffle node
Screenshot (1090).png
Assignment 02

Above is the original footage I shot for the second assignment. 

I tracked the scene in 3DE using various techniques, and then exported and brought the tracking data into Nuke. 

All markers are cleaned. Scene is ready for green screen removal and adding 3D assets in Maya. 

AS_MECH_CHAR_PER_ANIM_04.jpg

Above is my mech character, which I am going to put in the scene in place of the blue cube.

Note: the whole process of modelling, texturing, rigging and so on was done by me. 

I removed the green screen and used different motion capture data for my mech characters in the scene. I used multiple lights as well as 3D assets in the scene, and a bit of Python code to animate my main spotlight. 

Here I added smoke to the scene and, most importantly, colour-graded the lights and the reflections on the characters. I also added lens dirt and used red throughout the scene to emulate the feeling of being in a nightclub. However, there are still a few problems I will work on. For instance, the table needs to be roto'ed so it sits in front of the 3D background. 

Next, I played around with light groups and AOVs. I also used a bit of Python code to make flashing lights in the scene. Different render passes were used to add more life and appeal to the sequence. For instance, I defocused the background using the Z-depth channel. I also changed the colour and intensity of some of the lights in the scene.


Note: my Maya sequence was rendered with relatively low sampling rates because my computer could not handle higher sampling quality. As a result, the rendered sequence looks noisy. I will try to render my Maya file on another machine as soon as possible and put it here. 

I used a better-quality Arnold-rendered sequence here. I also made a few adjustments and changes, such as refining the roto edges and redoing the green-screen removal. 

Final result

Video with sound

FINAL DEVELOPMENT WITH SOUND. 


NOTE: While I tried very hard to be as creative as possible and am happy with the result, I believe there is still room for improvement. Render quality still needs to be improved, and I will further work on a few details of my scene during the summer.

Screen recording from inside Maya and Nuke
VFX Breakdown

Video with sound. 
