3D Match Moving
Week 01: Nuke 3D and Camera Projection
During the first session of the module, we were introduced to the 3D workspace and tools in Nuke. We learnt how to add 3D objects, textures and cameras to the 3D workspace, then explored texturing the objects, transforming them and animating cameras, and finally rendering everything back out as a 2D scene. We ended by building a few camera projections and applying the technique to 2D images. Camera projection can turn a 2D matte painting into a 3D scene: an image is "projected" onto 3D geometry through a camera.
![Screen Shot 2016-11-14 at 11.42.34 (1).png](https://static.wixstatic.com/media/27f617_0e7619b56f7c4e9f88374b7c1b64e574~mv2.png/v1/fill/w_858,h_429,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202016-11-14%20at%2011_42_34%20(1).png)
Exercise 1: A basic 3D setup
This is a basic 3D setup with a few 3D nodes. All the elements in a 3D workspace - the geometric objects, cameras and lights - can be combined with a Scene node. I animated the camera's 'translate' and 'rotate' attributes.
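As a rough illustration, the same kind of setup can be built from Nuke's Script Editor. This is only a sketch of the idea (the node classes, input indices and values are my own placeholders, not the exact script from class):

```python
# Sketch of the Exercise 1 setup: geometry + light + camera combined in a
# Scene node, rendered back to 2D by a ScanlineRender, with an animated camera.
import nuke

card  = nuke.nodes.Card()       # simple geometry to look at
light = nuke.nodes.Light()      # a light for the scene
cam   = nuke.nodes.Camera2()    # the render camera
scene = nuke.nodes.Scene()      # combines all the 3D elements

scene.setInput(0, card)
scene.setInput(1, light)

render = nuke.nodes.ScanlineRender()   # turns the 3D scene back into a 2D image
render.setInput(1, scene)              # 'obj/scn' input
render.setInput(2, cam)                # 'cam' input

# Animate the camera's translate and rotate over frames 1-100 (example values).
cam['translate'].setAnimated()
cam['rotate'].setAnimated()
cam['translate'].setValueAt(0.0, 1, 2)     # z = 0 at frame 1   (index 2 = z)
cam['translate'].setValueAt(5.0, 100, 2)   # z = 5 at frame 100
cam['rotate'].setValueAt(0.0, 1, 1)        # y rotation 0 at frame 1
cam['rotate'].setValueAt(45.0, 100, 1)     # y rotation 45 at frame 100
```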
![Screenshot (921).png](https://static.wixstatic.com/media/27f617_fb668a78725b43f3b8ded804842d675d~mv2.png/v1/fill/w_599,h_386,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(921).png)
Final node graph.
![Screenshot (922).png](https://static.wixstatic.com/media/27f617_02bcb9ddddef402a98294494d0cfa74a~mv2.png/v1/fill/w_600,h_504,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(922).png)
Screen recording.
Exercise 2: Projecting a tunnel onto a cylinder
Method #1:
I made a basic setup with Project3D, Cylinder, Scene, ScanlineRender, render camera and projection camera nodes. Note: rotate the cylinder to mimic the tunnel while the camera travels through it.
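A Python sketch of roughly the same graph, just to make the connections explicit (the file path, rotation and scale values are placeholder assumptions):

```python
# Sketch of the tunnel projection: a projection camera "beams" the 2D tunnel
# plate onto a cylinder, and a separate render camera flies through it.
import nuke

plate      = nuke.nodes.Read(file='tunnel_plate.####.exr')   # hypothetical path
proj_cam   = nuke.nodes.Camera2(name='projection_cam')
render_cam = nuke.nodes.Camera2(name='render_cam')

project = nuke.nodes.Project3D()
project.setInput(0, plate)       # the image to project
project.setInput(1, proj_cam)    # the camera to project through

cylinder = nuke.nodes.Cylinder()
cylinder.setInput(0, project)                # projected image becomes the texture
cylinder['rotate'].setValue([90, 0, 0])      # lay the cylinder along the camera path (example)
cylinder['scaling'].setValue([1, 4, 1])      # stretch it along its length to mimic the tunnel

scene = nuke.nodes.Scene()
scene.setInput(0, cylinder)

render = nuke.nodes.ScanlineRender()
render.setInput(1, scene)
render.setInput(2, render_cam)   # animate render_cam to fly through the tunnel
```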
![Screen Shot 2022-02-16 at 10.01.28.png](https://static.wixstatic.com/media/27f617_8d786e537a1a4ce9a43572224f91af3c~mv2.png/v1/fill/w_600,h_668,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-16%20at%2010_01_28.png)
I scaled up the cylinder along the Y-axis to mimic the tunnel.
![Screen Shot 2022-02-16 at 10.03.55.png](https://static.wixstatic.com/media/27f617_7efe5036e56c45dfacffbf489fc0fb9b~mv2.png/v1/fill/w_599,h_455,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-16%20at%2010_03_55.png)
Final node graph.
![Screenshot (923).png](https://static.wixstatic.com/media/27f617_30b78b6d26e2401398dc71def194264c~mv2.png/v1/fill/w_599,h_552,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(923).png)
Screen recording. Check out the workflow in the video above.
Method #2:
Here I used a FrameHold node instead of a separate projection camera.
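In script form, the only change from Method #1 is roughly this (a sketch; the held frame number is arbitrary):

```python
# Sketch of Method #2: instead of a second, static projection camera, freeze the
# animated camera at one frame with a FrameHold and project through that.
import nuke

cam  = nuke.nodes.Camera2(name='anim_cam')   # the animated camera
hold = nuke.nodes.FrameHold(first_frame=1)   # hold it at frame 1 (arbitrary choice)
hold.setInput(0, cam)

project = nuke.nodes.Project3D()
project.setInput(1, hold)                    # project through the held camera
```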
![Screen Shot 2022-02-16 at 10.07.14.png](https://static.wixstatic.com/media/27f617_7668934662214b33bea4bdcc321e16c0~mv2.png/v1/fill/w_599,h_646,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-16%20at%2010_07_14.png)
Node graph.
![Screenshot (924).png](https://static.wixstatic.com/media/27f617_36dd514368b2420ebbe26a0c6c3116d4~mv2.png/v1/fill/w_599,h_521,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(924).png)
Screen capture.
Exercise 3: Projecting different parts of an image onto cards
This is an exciting and useful workflow that can be used to build a 3D environment from a 2D image.
The render and projection cameras are already set up. I chose 'Fill' as the resize type of the Reformat node.
![Screen Shot 2022-02-16 at 10.54.36.png](https://static.wixstatic.com/media/27f617_ae1a48f8e7ae4ac0b6ffbdac9b903522~mv2.png/v1/fill/w_624,h_442,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-16%20at%2010_54_36.png)
I started from the back wall, roto'ed it and used a Project3D node for the projection.
![Screen Shot 2022-02-16 at 11.12.53.png](https://static.wixstatic.com/media/27f617_9df5ba1ee7dc46079946c2514d2c9837~mv2.png/v1/fill/w_624,h_546,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-16%20at%2011_12_53.png)
![Screen Shot 2022-02-16 at 11.31.04.png](https://static.wixstatic.com/media/27f617_0883adb8273e4bf796987a2608a72c8e~mv2.png/v1/fill/w_833,h_469,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-16%20at%2011_31_04.png)
Projected various parts of the image onto several cards, then moved them around to build up the room.
![Screen Shot 2022-02-16 at 11.39.21.png](https://static.wixstatic.com/media/27f617_4a8a70cda2cf4d34b67a0f58ab8c225a~mv2.png/v1/fill/w_613,h_513,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-16%20at%2011_39_21.png)
Note: Set the number of samples to render per pixel, to produce motion blur and antialiasing.
![Screen Shot 2022-02-16 at 11.42.26.png](https://static.wixstatic.com/media/27f617_ced710bd9df2462d8ae0a3c88212ff00~mv2.png/v1/fill/w_613,h_461,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-16%20at%2011_42_26.png)
Note: I used a Remove node to strip unneeded channels from the input clip and make the rendering process lighter and faster.
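Both tweaks can also be set from Python. A minimal sketch, assuming the render node is called ScanlineRender1 and only rgba is needed downstream:

```python
# Sketch: raise ScanlineRender samples for motion blur / antialiasing, and use a
# Remove node to keep only the channels we actually need before rendering.
import nuke

render = nuke.toNode('ScanlineRender1')   # assumption: the render node's name
render['samples'].setValue(16)            # more samples = smoother result, slower render

trim = nuke.nodes.Remove(operation='keep', channels='rgba')   # drop everything else
```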
![Screen Shot 2022-02-16 at 12.00.26.png](https://static.wixstatic.com/media/27f617_a7d2c7751b384a939d01e96eea4212df~mv2.png/v1/fill/w_612,h_431,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-16%20at%2012_00_26.png)
![Screenshot (925).png](https://static.wixstatic.com/media/27f617_ee1b9999f16c4099bbc2ff349ed088d1~mv2.png/v1/fill/w_817,h_372,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(925).png)
![Screenshot (926).png](https://static.wixstatic.com/media/27f617_4fae70e8e10047f2ad6fcd464e03762c~mv2.png/v1/fill/w_817,h_517,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(926).png)
![Screenshot (927).png](https://static.wixstatic.com/media/27f617_c9ba2f7ec4934dd395ba6afd83e96caa~mv2.png/v1/fill/w_821,h_478,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(927).png)
Workflow (screen capture)
Render
Week 02: Nuke 3D Tracking
We looked at 3D Camera Tracking this week. 3D Match Moving, as previously discussed, is the process of mixing filmed sequences (live backplates) with CGI in such a way that they match one another. The goal is to turn live backplate footage into a 3D environment and simulate parallax movement in the scene. In practice, 3D Match Moving is the act of detecting a pattern in a set of pixels and following it across the screen; the software then uses the tracked points to solve a three-dimensional camera and recreate its motion. One of the problems we will confront is lens distortion, so we need to straighten out the curvature introduced by the lens before the 3D track can be solved correctly.
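Under the hood, the solve step is essentially a bundle adjustment: it searches for camera parameters and 3D point positions that minimise the reprojection error. This is the standard formulation rather than anything Nuke-specific:

$$\min_{\{P_j\},\,\{X_i\}} \; \sum_{i}\sum_{j} \big\| x_{ij} - P_j(X_i) \big\|^2$$

where $x_{ij}$ is the tracked 2D position of feature $i$ in frame $j$, $X_i$ is its reconstructed 3D position, and $P_j$ is the projection through the solved camera at frame $j$.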
Exercise 1
![Screen Shot 2022-02-23 at 09.41.22.png](https://static.wixstatic.com/media/27f617_0fd22c535b664a25a2e69607163e97db~mv2.png/v1/fill/w_701,h_434,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-23%20at%2009_41_22.png)
I first brought in a CameraTracker node to track the scene. The water and the moving people are not suitable for tracking, so I roto'ed them out.
![Screen Shot 2022-02-23 at 09.41.55.png](https://static.wixstatic.com/media/27f617_0a5fffb39e3d427fad31c0313229e5a9~mv2.png/v1/fill/w_701,h_463,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-23%20at%2009_41_55.png)
Turn on Preview Features in the camera tracker settings.
![Screen Shot 2022-02-23 at 09.42.13.png](https://static.wixstatic.com/media/27f617_6c662e49fc0a4f13ba88eb092e67e5a8~mv2.png/v1/crop/x_390,y_0,w_1497,h_1039/fill/w_700,h_486,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-23%20at%2009_42_13.png)
Increased the number of features to track in each frame.
![Screen Shot 2022-02-23 at 09.44.06.png](https://static.wixstatic.com/media/27f617_9eef88cf73434c6c85ffe670d31860eb~mv2.png/v1/fill/w_701,h_496,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-23%20at%2009_44_06.png)
Selected Source Alpha to apply the previously made mask (roto).
![Screen Shot 2022-02-23 at 09.45.15.png](https://static.wixstatic.com/media/27f617_e50406daa0c94fd48ff20f8ebe926d9c~mv2.png/v1/fill/w_701,h_474,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-23%20at%2009_45_15.png)
Hit Track to start tracking.
![Screen Shot 2022-02-23 at 09.48.37.png](https://static.wixstatic.com/media/27f617_efd1aa579de148e497c8b980d3763b76~mv2.png/v1/fill/w_700,h_501,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-23%20at%2009_48_37.png)
After tracking, we need to hit Solve to check the tracking quality.
![Screen Shot 2022-02-23 at 09.49.07.png](https://static.wixstatic.com/media/27f617_536ffcd9ebb447eeaf91cbe871532af9~mv2.png/v1/fill/w_701,h_525,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-23%20at%2009_49_07.png)
Deleted the invalid tracks (amber and red marks)
![Screen Shot 2022-02-23 at 09.49.49.png](https://static.wixstatic.com/media/27f617_4c46d29367c64086b6fa425136253e67~mv2.png/v1/fill/w_700,h_540,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-23%20at%2009_49_49.png)
Hit Update Solve to check the tracks again. We may need to repeat these steps until all the invalid tracks are deleted.
![Screen Shot 2022-02-23 at 09.53.12.png](https://static.wixstatic.com/media/27f617_7be3a656ef7e4550b4f2750e3f028a8b~mv2.png/v1/fill/w_701,h_490,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-23%20at%2009_53_12.png)
At this stage, the CameraTracker node created a Scene, a camera tracker point cloud and a Camera node for me.
![Screen Shot 2022-02-23 at 10.02.24.png](https://static.wixstatic.com/media/27f617_01b56b17d9c24f199b1a3386ea747f9e~mv2.png/v1/fill/w_701,h_536,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-23%20at%2010_02_24.png)
First, I pressed W on the keyboard to view the two inputs together in the viewer. Next, I reduced the opacity of the scene so I could see the tracked points. Finally, I chose Vertex Selection mode so I could select them.
![Screen Shot 2022-02-23 at 10.02.55.png](https://static.wixstatic.com/media/27f617_132db33e08c547c39cce3c5917078079~mv2.png/v1/fill/w_701,h_499,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-23%20at%2010_02_55.png)
With the relevant points selected, I brought in a Card and pressed Match Selection Points.
![Screen Shot 2022-02-23 at 10.09.27.png](https://static.wixstatic.com/media/27f617_120094d2e357423084c8a120d660e04f~mv2.png/v1/fill/w_701,h_486,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-23%20at%2010_09_27.png)
I then tried to align the card with the dots.
![Screen Shot 2022-02-23 at 10.26.17.png](https://static.wixstatic.com/media/27f617_13882aa094904780a95aa49ab168f6f4~mv2.png/v1/fill/w_701,h_568,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-23%20at%2010_26_17.png)
Final result.
![Screen Shot 2022-02-23 at 10.27.47.png](https://static.wixstatic.com/media/27f617_94149bfc17e14ab58048cefd3c1350d6~mv2.png/v1/fill/w_701,h_584,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-23%20at%2010_27_47.png)
Node graph.
Exercise 2
![Screen Shot 2022-02-23 at 11.32.10.png](https://static.wixstatic.com/media/27f617_a9dd358642c74a62a90b888b941a5593~mv2.png/v1/fill/w_701,h_639,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-23%20at%2011_32_10.png)
In this exercise, instead of using a checkerboard card, we projected a patch onto a card and snapped the card to a part of the scene using the camera track.
![Screen Shot 2022-02-23 at 11.32.34.png](https://static.wixstatic.com/media/27f617_96b422e4217c43dd9221b529c4f5861c~mv2.png/v1/fill/w_703,h_515,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-02-23%20at%2011_32_34.png)
Node graph.
3D Clean up Script layout
![Screen Shot 2019-02-06 at 11.30.36.png](https://static.wixstatic.com/media/27f617_a3e1055512574e308ed7a923b3b852c2~mv2.png/v1/fill/w_703,h_522,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202019-02-06%20at%2011_30_36.png)
Week 03: 3D Equalizer
We were introduced to 3D Equaliser this week. 3D Equaliser is one of the most powerful 3D tracking packages and is widely used by most of the major studios around the globe. During the session, we got our heads around the software's user interface and practised a few essential tracking workflows.
To simplify the way we interact with the software, we set up a bunch of shortcuts as below:
![GUI_Shortcut Keys.jpg](https://static.wixstatic.com/media/27f617_4a05165cb0064fc9ac24169acc065d28~mv2.jpg/v1/fill/w_840,h_473,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/GUI_Shortcut%20Keys.jpg)
Tracking in 3D Equaliser (important steps and tips)
![Screen Shot 2022-03-02 at 10.35.04.png](https://static.wixstatic.com/media/27f617_61422190037f4ee5ad7fec52bffdad52~mv2.png/v1/fill/w_693,h_511,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-02%20at%2010_35_04.png)
Don't forget to set the frame rate when importing your footage!
![Screenshot (977).png](https://static.wixstatic.com/media/27f617_9232d5b15fe7458b811593235d375836~mv2.png/v1/fill/w_694,h_457,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(977).png)
First things first, head to Playback -> Export Buffer Compression File if playback is slow. If you already have the buffer file, you need to import it.
![Screen Shot 2022-03-02 at 10.51.48.png](https://static.wixstatic.com/media/27f617_1598ef0cc7e7469eba3407b3b156c7db~mv2.png/v1/fill/w_694,h_560,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-02%20at%2010_51_48.png)
Set Filmback Width, then Pixel Aspect and finally Focal Length. Also, put an Identifier in the related field.
![Screen Shot 2022-03-02 at 10.56.25.png](https://static.wixstatic.com/media/27f617_b3b99d6678be467391ebc8fe3cd1f514~mv2.png/v1/fill/w_704,h_520,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-02%20at%2010_56_25.png)
Ctrl-click to make a tracking point. When done, Alt-click to deselect the point. To reselect a point, Alt-drag around it. To gauge, press G. To track press T.
![Screen Shot 2022-03-02 at 11.08.01.png](https://static.wixstatic.com/media/27f617_dce005b243f146e7be081591f00c14d1~mv2.png/v1/fill/w_703,h_499,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-02%20at%2011_08_01.png)
When a point goes off screen, the tracking process stops. This also happens when the image is too blurry. In that case, we need to go to the relevant frame and make the tracking area smaller.
![Screen Shot 2022-03-02 at 11.12.10.png](https://static.wixstatic.com/media/27f617_7180d6e5c030402bbc717e85fe76a0aa~mv2.png/v1/fill/w_704,h_515,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-02%20at%2011_12_10.png)
We may change the tracking mode of specific points to "Marker".
![Screen Shot 2022-03-02 at 11.15.03.png](https://static.wixstatic.com/media/27f617_aa7608fce1fb4f108dc52bb87f954072~mv2.png/v1/fill/w_704,h_511,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-02%20at%2011_15_03.png)
Adding more tracks helps us get a more accurate result.
![Screen Shot 2022-03-02 at 11.22.34.png](https://static.wixstatic.com/media/27f617_c2225fe5c3ce499882b399471eace759~mv2.png/v1/fill/w_703,h_460,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-02%20at%2011_22_34.png)
3D Orientation view.
![Screen Shot 2022-03-02 at 11.24.22.png](https://static.wixstatic.com/media/27f617_703d11d4b61f4b89bebcea5785258c79~mv2.png/v1/fill/w_704,h_472,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-02%20at%2011_24_22.png)
Activate the Deviation Browser to see the tracking quality.
![Screen Shot 2022-03-02 at 12.31.08.png](https://static.wixstatic.com/media/27f617_52b1a1b31aae4d8b8c68cbf28e663d7e~mv2.png/v1/fill/w_704,h_536,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-02%20at%2012_31_08.png)
Go to Calc -> Calc All From Scratch. You can also press Alt + C.
![Screen Shot 2022-03-02 at 12.31.26.png](https://static.wixstatic.com/media/27f617_64eb0e729a90437697b361ccf439ba1b~mv2.png/v1/fill/w_703,h_469,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-02%20at%2012_31_26.png)
Tick the box for "All Points" to be able to see all tracking points in the deviation browser.
![Screen Shot 2022-03-02 at 12.33.58.png](https://static.wixstatic.com/media/27f617_bc872a3632e74d35a4fa9d59bf0df802~mv2.png/v1/fill/w_704,h_500,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-02%20at%2012_33_58.png)
Bring up Parameter Adjustment Window and adjust the focal length.
![Screenshot (978).png](https://static.wixstatic.com/media/27f617_dbea0fb118aa4b83b62dc2c20cd403c0~mv2.png/v1/fill/w_704,h_471,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(978).png)
Double-click on the lens to open up the attribute editor. Then go to Lens Distortion and select 3DE Classic LD Model. Tick the boxes for Distortion and Quartic Distortion.
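The two tick-boxes correspond to the low-order radial terms of the model. In a simplified reading (ignoring the model's anamorphic squeeze and curvature parameters, which are not being adjusted here), the mapping behaves roughly like

$$r_d = r_u \left( 1 + c_2\, r_u^{2} + c_4\, r_u^{4} \right)$$

where $r_u$ is a point's distance from the distortion centre before distortion, $r_d$ its distance after, $c_2$ the Distortion parameter and $c_4$ the Quartic Distortion parameter.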
![Screenshot (979).png](https://static.wixstatic.com/media/27f617_6abcd756f2434fb89fe575adf2abbe32~mv2.png/v1/fill/w_704,h_494,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(979).png)
Then again head to Parameter Adjustment and this time press Adaptive All and then Adjust...
Screen capture
Week 04: Lenses and Cameras
We practised the tracking workflows we learnt previously in 3D Equaliser this week. We also looked at setting camera constraints, changing the camera's height according to the survey data, and creating distance constraints. In short, we feed the survey data into 3D Equaliser to get better, more accurate tracking.
Survey data
![Survey_image_01.jpg](https://static.wixstatic.com/media/27f617_58ea80d27e3648198599f0695622073e~mv2.jpg/v1/fill/w_744,h_496,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Survey_image_01.jpg)
Practice #1: Nodal shot (fixed camera position)
Setting camera constraint
![Screen Shot 2022-03-09 at 10.45.36.png](https://static.wixstatic.com/media/27f617_f556c9c543164c9eb1c33a3249de96a7~mv2.png/v1/fill/w_712,h_489,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-09%20at%2010_45_36.png)
In this scene, the camera only rotates, so its position is fixed. However, you can see that 3D Equaliser assumed that the camera had movement. We need to fix this!
![Screen Shot 2022-03-09 at 10.47.17.png](https://static.wixstatic.com/media/27f617_00431bc91be84f48b8621da029f7829e~mv2.png/v1/fill/w_707,h_489,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-09%20at%2010_47_17.png)
To fix that, we set up a camera constraint.
![Screen Shot 2022-03-09 at 10.47.34.png](https://static.wixstatic.com/media/27f617_a1be6203c493431e828ce86a18a07aa9~mv2.png/v1/fill/w_707,h_470,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-09%20at%2010_47_34.png)
You can see that the camera is now fixed and has only rotation.
Changing camera's height
![Screenshot (980).png](https://static.wixstatic.com/media/27f617_0a748048b0ef4788b1f9942728aa8ff0~mv2.png/v1/fill/w_706,h_450,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(980).png)
Creating distance constraint
![Screenshot (981).png](https://static.wixstatic.com/media/27f617_7f3a9889edf644db9240b0397f29fb19~mv2.png/v1/fill/w_707,h_436,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(981).png)
Practice #2: Free shot (parallax view)
![Screenshot (999).png](https://static.wixstatic.com/media/27f617_45f0593f721d4f6f89d2105ff36d167d~mv2.png/v1/fill/w_702,h_471,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(999).png)
Selected the tracking points on the ground.
![Screenshot (1000).png](https://static.wixstatic.com/media/27f617_40cd9cfa0bda46fead5ff6bb93bf9219~mv2.png/v1/fill/w_702,h_480,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(1000).png)
Aligned the points to the XZ plane so that all the ground points sit on the grid.
![Screenshot (1001).png](https://static.wixstatic.com/media/27f617_9c3631272567484285e3d5be6927b500~mv2.png/v1/fill/w_702,h_467,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(1001).png)
Selected one of the ground points and moved it to the origin.
![Screenshot (1003).png](https://static.wixstatic.com/media/27f617_3ad84550bd5b404fb918c4e1cec287fd~mv2.png/v1/fill/w_701,h_461,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(1003).png)
Created locators from all the points.
![Screenshot (1006).png](https://static.wixstatic.com/media/27f617_43a45185704542ae9dcf3b4d2761a635~mv2.png/v1/fill/w_702,h_452,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(1006).png)
Scaled up the locators so we can see them clearly in the scene.
![Screenshot (1009).png](https://static.wixstatic.com/media/27f617_337d45bb963e4b7b9cfda30da58783f7~mv2.png/v1/fill/w_702,h_491,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(1009).png)
Made a cone to snap to a locator positioned on the ground.
![Screenshot (1011).png](https://static.wixstatic.com/media/27f617_2f5bcb9a1cfd4ee88cf4a50e79553475~mv2.png/v1/fill/w_699,h_457,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(1011).png)
Cone snapped to a ground locator.
![Screenshot (1012).png](https://static.wixstatic.com/media/27f617_f8222d0ead474b59a60a68206072f126~mv2.png/v1/fill/w_702,h_500,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(1012).png)
Result.
Screen capture
Week 05: 3DE Freeflow and Nuke
This week we learnt how to bake the scene in 3DE and export the assets so we can use them in Nuke. Overall, we export the camera, locators, 3D objects and lens distortion, and use a specific workflow for the lens distortion in Nuke. As another exercise, we created a patch based on the information we brought into Nuke from 3DE.
3DE: Exporting assets for Nuke
![Screen Shot 2022-03-16 at 10.25.28.png](https://static.wixstatic.com/media/27f617_98a23e03db254812917466bf29b386b8~mv2.png/v1/fill/w_733,h_473,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-16%20at%2010_25_28.png)
Continuing from where I left off last week, I had a cylinder snapped to a point on the ground and many tracking points in the scene.
![Screen Shot 2022-03-16 at 10.25.51.png](https://static.wixstatic.com/media/27f617_18850ad598c44ed78358e973cea79eda~mv2.png/v1/fill/w_733,h_539,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-16%20at%2010_25_51.png)
In order to export the assets from 3DE, I needed to bake the scene first.
![Screen Shot 2022-03-16 at 10.30.00.png](https://static.wixstatic.com/media/27f617_897b918116a14b7a8d1c161eeb718dbd~mv2.png/v1/fill/w_733,h_525,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-16%20at%2010_30_00.png)
Export Project exports the camera. Browse to an output path and save the file in MATCHMOVING -> CAMERA.
![Screen Shot 2022-03-16 at 10.31.42.png](https://static.wixstatic.com/media/27f617_56b8761bbed548d99ba5705225a20ddb~mv2.png/v1/fill/w_733,h_480,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-16%20at%2010_31_42.png)
Selected all the points and created locators from them. Then chose all the locators and headed to Geo -> Export OBJ... Put the output file in MATCHMOVING -> GEO.
![Screen Shot 2022-03-16 at 10.34.18.png](https://static.wixstatic.com/media/27f617_93e51c73bf1f452b8ec374e8cecacb5f~mv2.png/v1/fill/w_733,h_544,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-16%20at%2010_34_18.png)
To export the lens distortion, I headed to File -> Export -> Export Nuke LD_3DE Lens Distortion Node. Put the exported file to MATCHMOVING -> UNDISTORT
![Screen Shot 2022-03-16 at 10.35.26.png](https://static.wixstatic.com/media/27f617_ba59bfff07e245d9a5f6225d0deff748~mv2.png/v1/fill/w_733,h_499,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-16%20at%2010_35_26.png)
Selected the cylinder and exported it by going to 3D Models -> Export Obj File... Put it to MATCHMOVING -> GEO
Nuke: using 3DE data
![Screen Shot 2022-03-16 at 10.42.27.png](https://static.wixstatic.com/media/27f617_e8117c3067c04328a45ff18d0ac4e64d~mv2.png/v1/fill/w_733,h_480,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-16%20at%2010_42_27.png)
The camera output file dragged to Nuke.
![Screen Shot 2022-03-16 at 10.46.05.png](https://static.wixstatic.com/media/27f617_e54f546d6d7c43bd9373fda7f7a0f82b~mv2.png/v1/fill/w_737,h_549,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-16%20at%2010_46_05.png)
I dragged and dropped all the files exported from 3DE, namely the lens distortion, 3D object, camera and locators. For the camera, I only kept the Camera node itself and deleted the other extra nodes.
![Screen Shot 2022-03-16 at 11.00.08.png](https://static.wixstatic.com/media/27f617_b7606b46e86b40d2a4be3a746c724679~mv2.png/v1/fill/w_731,h_498,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-16%20at%2011_00_08.png)
Important: the UD (undistort) and RD (redistort) nodes should be set up as above.
![Screen Shot 2022-03-16 at 11.08.05.png](https://static.wixstatic.com/media/27f617_0c5eefcbb18247b181d60e74b0fe95f8~mv2.png/v1/fill/w_731,h_512,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-16%20at%2011_08_05.png)
Important: as I scaled the footage up by 1.1 with the Overscan (Reformat) node, I need to multiply both Horiz Aperture and Vert Aperture by 1.1 as well.
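A quick way to apply that from the Script Editor (a sketch; it assumes the imported 3DE camera is called Camera1 and the overscan factor is 1.1):

```python
# Sketch: scale the 3DE camera's film back to match the 1.1 overscan Reformat.
import nuke

overscan = 1.1
cam = nuke.toNode('Camera1')   # assumption: the name of the camera imported from 3DE
cam['haperture'].setValue(cam['haperture'].value() * overscan)
cam['vaperture'].setValue(cam['vaperture'].value() * overscan)
```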
![Screen Shot 2022-03-16 at 11.14.21.png](https://static.wixstatic.com/media/27f617_e514fdc90e8d465aa14e350fe7058d8a~mv2.png/v1/fill/w_731,h_576,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-16%20at%2011_14_21.png)
The nodes and setup I used to be able to see the cylinder and locators in the scene.
![Screen Shot 2022-03-16 at 11.46.35.png](https://static.wixstatic.com/media/27f617_22729d85229c48b3ba1087ba82c63fcf~mv2.png/v1/fill/w_730,h_561,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-16%20at%2011_46_35.png)
Here I wanted to use a patch and remove the text on the plate. I then needed to use 3DE data to track the plate.
![Screen Shot 2022-03-16 at 11.49.19.png](https://static.wixstatic.com/media/27f617_36eb1185651e4125a236b576ee688a93~mv2.png/v1/fill/w_731,h_566,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-16%20at%2011_49_19.png)
Roto'ed the plate using the above nodes and setup.
![Screen Shot 2022-03-16 at 11.52.17.png](https://static.wixstatic.com/media/27f617_12cba2cf86da4598bba0ec96b449e9fc~mv2.png/v1/fill/w_731,h_600,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-16%20at%2011_52_17.png)
Projected the roto to a card and aligned the card with the plate in 3D.
![Screen Shot 2022-03-16 at 12.19.50.png](https://static.wixstatic.com/media/27f617_b49d45f45cb74c58a979c721cc575aac~mv2.png/v1/fill/w_730,h_608,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-16%20at%2012_19_50.png)
Result.
![Screen Shot 2022-03-16 at 12.20.09.png](https://static.wixstatic.com/media/27f617_6885491cea1c4ae1977898e4e84624ab~mv2.png/v1/fill/w_730,h_475,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-16%20at%2012_20_09.png)
Node graph.
Week 06: Surveys
This week, we looked at how to set survey points within a scene. We learned how to use all the provided survey data and various measurements to track elements more precisely in our scene in 3DE.
Survey image
![decking_survey_data_02.jpg](https://static.wixstatic.com/media/27f617_6097f6ecd52447a284c7e2963fe34076~mv2.jpg/v1/fill/w_805,h_537,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/decking_survey_data_02.jpg)
Using the measurements and making survey points
![Screen Shot 2022-03-23 at 09.38.23.png](https://static.wixstatic.com/media/27f617_6408b6c7072a4ff9b99c0f95cb131229~mv2.png/v1/fill/w_694,h_521,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-23%20at%2009_38_23.png)
In the 3D Orientation view, go to Points Groups and R-click on Points to add new points.
![Screen Shot 2022-03-23 at 09.39.46.png](https://static.wixstatic.com/media/27f617_e32dbf1d3caa424e94d04ee139538403~mv2.png/v1/fill/w_694,h_542,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-23%20at%2009_39_46.png)
Select Exactly Surveyed for Survey Type.
![Screen Shot 2022-03-23 at 09.40.48.png](https://static.wixstatic.com/media/27f617_f45e541dc6334269a5985c12f6d8b96b~mv2.png/v1/fill/w_693,h_529,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-23%20at%2009_40_48.png)
Put the exact measurements in Position XYZ fields.
![Screen Shot 2022-03-23 at 09.42.21.png](https://static.wixstatic.com/media/27f617_f16d601d917a4e1197d2b2e58ae4a7d9~mv2.png/v1/fill/w_692,h_512,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-23%20at%2009_42_21.png)
Go to Preferences to change the Regular Units to the unit you have your measurements in.
![Screen Shot 2022-03-23 at 09.49.53.png](https://static.wixstatic.com/media/27f617_6f007b85a60344cfbeea641fb2b3f871~mv2.png/v1/fill/w_690,h_478,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-23%20at%2009_49_53.png)
I first started to make survey points for the ground.
![Screen Shot 2022-03-23 at 09.52.24.png](https://static.wixstatic.com/media/27f617_47cd2edb03f340a38ba665c9ad04b555~mv2.png/v1/fill/w_689,h_523,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-23%20at%2009_52_24.png)
We can duplicate our survey points as above.
![Screen Shot 2022-03-23 at 10.17.19.png](https://static.wixstatic.com/media/27f617_8cea5025d2484efdacd590f6a21276da~mv2.png/v1/fill/w_689,h_454,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-23%20at%2010_17_19.png)
All the survey points made.
![Screen Shot 2022-03-23 at 10.21.45.png](https://static.wixstatic.com/media/27f617_06e6890d12fb4a4ab39c23e0c1589421~mv2.png/v1/fill/w_689,h_507,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-23%20at%2010_21_45.png)
After making the survey points, it's time to track them. Simply select a point, then Ctrl-click on the corresponding area and track it in Manual Tracking mode. Head to the Image Controls window to brighten up the image so that we can see the darker spots.
![Screen Shot 2022-03-23 at 10.35.29.png](https://static.wixstatic.com/media/27f617_abca8bd02d4c44df857e3bcabef650df~mv2.png/v1/fill/w_689,h_473,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-23%20at%2010_35_29.png)
All points tracked.
![Screen Shot 2022-03-23 at 11.10.29.png](https://static.wixstatic.com/media/27f617_a3d5b276d3574c1aa4a957a3671e69fe~mv2.png/v1/fill/w_689,h_498,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-23%20at%2011_10_29.png)
As before, adjust the lens distortion.
![Screen Shot 2022-03-23 at 11.10.43.png](https://static.wixstatic.com/media/27f617_88ad50528af048a9b7382572d9da6f9a~mv2.png/v1/fill/w_689,h_506,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-23%20at%2011_10_43.png)
Select Brute Force for the lens distortion.
![Screen Shot 2022-03-23 at 11.11.30.png](https://static.wixstatic.com/media/27f617_604cb656e4a34bda8d49e4418e883765~mv2.png/v1/fill/w_691,h_506,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-23%20at%2011_11_30.png)
Select Adaptive All for focal length.
![Screen Shot 2022-03-23 at 11.19.23.png](https://static.wixstatic.com/media/27f617_18a60e5740064416a511b49acdb54de6~mv2.png/v1/fill/w_691,h_472,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-23%20at%2011_19_23.png)
To check the tracks, go to Lineup view and see if the red dot is placed in the middle of the green cross. If not, rectify the track.
![Screen Shot 2022-03-23 at 11.42.49.png](https://static.wixstatic.com/media/27f617_55bc089efa9a491bacde65eaa95d68e7~mv2.png/v1/fill/w_690,h_484,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-23%20at%2011_42_49.png)
I made three tracking points on the wall to make the tracking even more precise. Then I needed to align those points with the survey points I already made for the wall.
![Screen Shot 2022-03-23 at 11.43.03.png](https://static.wixstatic.com/media/27f617_e6c3f4e13d0a433b87ca6d8b9145079e~mv2.png/v1/fill/w_691,h_454,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-23%20at%2011_43_03.png)
Converted the wall tracking points to survey points.
![Screen Shot 2022-03-23 at 11.50.27.png](https://static.wixstatic.com/media/27f617_f791d5495b0d472fa920332e32b03780~mv2.png/v1/fill/w_691,h_508,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-03-23%20at%2011_50_27.png)
Used Push Points Tool to align the points.
Assignment 01: 3D Match Moving Project
For the first assignment, I was given a video and a survey image to work on, and had to do 3D match moving using 3D Equaliser and Nuke.
For the assignment, I had to:
1- have a minimum of 40 tracked points
2- get a low deviation
3- use survey data
4- add locators and 3D Cube
5- export camera, LD data, locator geo, and 3d models to Nuke
6- undistort lens and reformat
7- place a cleanup patch on the fire exit sign
I did everything the assignment asked for, then took it a step further and added a TV screen (colour bars) and a sign above the door using the tracking data I had brought into Nuke from 3DE. Below you can see my development work.
Development #1:
Here I did everything the assignment asked for.
Development #2:
I put a colour bar image on the left-hand TV and used the 3DE info I'd brought into Nuke.
Development #3:
Added a sign above the door.
Node graph
![Screenshot (1031).png](https://static.wixstatic.com/media/27f617_961d2746c4ed4ab0a079683f78811f36~mv2.png/v1/fill/w_740,h_516,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(1031).png)
Screen recording (3DE)
Screen recording (Nuke)
Week 07: Shooting footage for assignment 02
We went to the studio this week to shoot footage for the second assignment.
Week 08: Surveys (part 2)
We continued to explore how to use various survey data in 3DE to solidify our tracking process this week. We had a look at how to import multiple reference images into 3DE and track points across the reference images and our footage.
![Screen Shot 2022-04-06 at 09.22.57.png](https://static.wixstatic.com/media/27f617_c1af12cd62a6416e9e3b4380c4e6f72e~mv2.png/v1/fill/w_698,h_495,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-06%20at%2009_22_57.png)
3DE4 -> File -> Import -> Import Multiple Reference Frames... v1.1
![Screen Shot 2022-04-06 at 09.23.30.png](https://static.wixstatic.com/media/27f617_c779b2cafb6a422c80b25fc3beae1ed1~mv2.png/v1/fill/w_699,h_456,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-06%20at%2009_23_30.png)
Navigated to where the reference images are and imported them into 3DE.
![Screen Shot 2022-04-06 at 09.25.07.png](https://static.wixstatic.com/media/27f617_3b8580a7e50540b6a11995dde146db8b~mv2.png/v1/fill/w_699,h_498,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-06%20at%2009_25_07.png)
Deleted the DS_Store camera as we don't need it.
![Screen Shot 2022-04-06 at 09.32.55.png](https://static.wixstatic.com/media/27f617_91bc21638a38427e875f806a0f411f37~mv2.png/v1/fill/w_699,h_513,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-06%20at%2009_32_55.png)
We need to have a lens for our footage and another for our ref images.
![Screen Shot 2022-04-06 at 09.50.09.png](https://static.wixstatic.com/media/27f617_7e1ecfae86764b4ba9513e622be0510b~mv2.png/v1/fill/w_699,h_519,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-06%20at%2009_50_09.png)
Reference lens set up.
![Screen Shot 2022-04-06 at 09.50.13.png](https://static.wixstatic.com/media/27f617_b8ca9833ad914bb9a19e0414256dbd40~mv2.png/v1/fill/w_699,h_527,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-06%20at%2009_50_13.png)
Footage lens set up.
![Screen Shot 2022-04-06 at 09.37.25.png](https://static.wixstatic.com/media/27f617_f413aa469ce547988f86877716663a0f~mv2.png/v1/fill/w_700,h_484,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-06%20at%2009_37_25.png)
Assigning lenses to the cameras.
![Screen Shot 2022-04-06 at 11.19.10.png](https://static.wixstatic.com/media/27f617_742cc696ecea4ed0a183203d98316e0d~mv2.png/v1/fill/w_698,h_496,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-06%20at%2011_19_10.png)
I made more than 40 tracking points and shared them across the ref images and the main footage.
Week 09: Lens Distortion and Grids
This week, we learnt how to use a grid to get a more precise lens distortion setup in 3D Equaliser. First, we were given a grid shot that was filmed specifically for measuring lens distortion. Then, we brought the grid shot into 3DE as a reference camera and calculated the lens distortion based on it.
![Screen Shot 2022-04-20 at 09.30.08.png](https://static.wixstatic.com/media/27f617_1b0de1b393bd460ebe50407c55f48330~mv2.png/v1/fill/w_714,h_526,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-20%20at%2009_30_08.png)
Lens attributes set up.
![Screen Shot 2022-04-20 at 09.45.21.png](https://static.wixstatic.com/media/27f617_f700550bce3545ae9dd990f292294d8d~mv2.png/v1/fill/w_713,h_452,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-20%20at%2009_45_21.png)
About 30 tracking points added.
![Screen Shot 2022-04-20 at 09.49.10.png](https://static.wixstatic.com/media/27f617_fe1c6fcea5114b13a5737d281e25d23c~mv2.png/v1/fill/w_714,h_534,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-20%20at%2009_49_10.png)
Made a camera constraint since the shot is not a free move but a nodal one.
![Screen Shot 2022-04-20 at 09.49.19.png](https://static.wixstatic.com/media/27f617_7951c6d70d064866b429a5f281188f56~mv2.png/v1/fill/w_713,h_484,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-20%20at%2009_49_19.png)
Used a camera constraint.
![Screen Shot 2022-04-20 at 09.57.14.png](https://static.wixstatic.com/media/27f617_48d8919e090f46268acdf4bfe6256f05~mv2.png/v1/fill/w_714,h_569,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-20%20at%2009_57_14.png)
Chose 3DE Radial as the lens distortion model.
![Screen Shot 2022-04-20 at 09.58.22.png](https://static.wixstatic.com/media/27f617_c5a8fcbd3c6b4fd9a49740cb0dbcf706~mv2.png/v1/fill/w_716,h_528,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-20%20at%2009_58_22.png)
Brought in the grid shot as a reference camera.
![Screen Shot 2022-04-20 at 09.59.00.png](https://static.wixstatic.com/media/27f617_ef1e2e070ebe443fbf2192b6bf34565a~mv2.png/v1/fill/w_710,h_528,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-20%20at%2009_59_00.png)
Grid shot selected.
![Screen Shot 2022-04-20 at 09.59.57.png](https://static.wixstatic.com/media/27f617_6b423124d05b47a9ae502125f3714aa0~mv2.png/v1/fill/w_711,h_511,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-20%20at%2009_59_57.png)
Changed the viewer to Distortion Grid Controls.
![Screen Shot 2022-04-20 at 10.01.05.png](https://static.wixstatic.com/media/27f617_caa73ce17402447494dd3c8ada202d2e~mv2.png/v1/fill/w_704,h_496,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-20%20at%2010_01_05.png)
Aligned some points with the checkerboard. Pressed Snap to have 3DE align the other points for us.
![Screen Shot 2022-04-20 at 10.02.37.png](https://static.wixstatic.com/media/27f617_19b5edbaed134fc4b6305a3e62a7417e~mv2.png/v1/fill/w_704,h_521,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-20%20at%2010_02_37.png)
Expanded the points to cover the whole grid.
![Screen Shot 2022-04-20 at 10.02.49.png](https://static.wixstatic.com/media/27f617_ff4ec38bb7324821bc94b1ee53a37971~mv2.png/v1/fill/w_704,h_478,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-20%20at%2010_02_49.png)
Finally calculated the lens distortion based on the grid we brought in.
![Screen Shot 2022-04-20 at 10.03.40.png](https://static.wixstatic.com/media/27f617_85df9792d7d848ec960e03af5ca463dd~mv2.png/v1/fill/w_705,h_507,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-20%20at%2010_03_40.png)
Calc lens distortion window.
Week 10: Nuke Maya Pipeline
I looked at how to take tracked footage from 3DE to Maya this week. In Maya, I learned how to prep the software to bring in my assets, set up the size of the image sequence correctly and add a 3D object to the scene. Before that, I applied lens distortion data to the image sequence in Nuke in order to undistort my footage for Maya.
![Screen Shot 2022-04-27 at 10.19.48.png](https://static.wixstatic.com/media/27f617_fd9d2ac3a2624184b0ab6295739ef371~mv2.png/v1/fill/w_710,h_486,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-27%20at%2010_19_48.png)
In 3DE, I tracked the footage. Next, I needed to export the MEL script and lens distortion files. Note: make sure to set the start frame to 1001 when exporting.
![Screen Shot 2022-04-27 at 10.26.39.png](https://static.wixstatic.com/media/27f617_e9987f2c84ee47468002883fc406d768~mv2.png/v1/fill/w_709,h_507,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-27%20at%2010_26_39.png)
Undistorted the sequence as above.
![Screen Shot 2022-04-27 at 10.29.07.png](https://static.wixstatic.com/media/27f617_46f5b561123b4a3abd5d71127ee50a22~mv2.png/v1/fill/w_710,h_468,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-27%20at%2010_29_07.png)
Named the undistorted sequence appropriately.
![Screen Shot 2022-04-27 at 10.48.18.png](https://static.wixstatic.com/media/27f617_42d7b939d3674d57978c06e17b37ec5a~mv2.png/v1/fill/w_710,h_486,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-27%20at%2010_48_18.png)
Tick "Use Image Sequence". Change the sequence to the undistorted footage.
![Screen Shot 2022-04-27 at 10.50.58.png](https://static.wixstatic.com/media/27f617_5feba021577446bc8e5c93426329def5~mv2.png/v1/fill/w_708,h_492,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-27%20at%2010_50_58.png)
I needed to scale up the image plane as well as the film back size using a simple bit of Python code, as shown above.
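For reference, a sketch of the kind of snippet I mean. The node names are placeholders for whatever the 3DE MEL export actually created, and the factor matches the 1.1 overscan used in Nuke:

```python
# Sketch (Maya Script Editor, Python tab): scale the film back and the image
# plane by the same overscan factor used in Nuke.
import maya.cmds as cmds

overscan = 1.1
cam_shape = 'trackedCameraShape1'   # placeholder: the camera shape created by the 3DE MEL script
img_plane = 'imagePlaneShape1'      # placeholder: the image plane attached to that camera

for attr in ('horizontalFilmAperture', 'verticalFilmAperture'):
    cmds.setAttr(cam_shape + '.' + attr,
                 cmds.getAttr(cam_shape + '.' + attr) * overscan)

for attr in ('sizeX', 'sizeY'):
    cmds.setAttr(img_plane + '.' + attr,
                 cmds.getAttr(img_plane + '.' + attr) * overscan)
```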
![Screen Shot 2022-04-27 at 10.53.01.png](https://static.wixstatic.com/media/27f617_c94176948e104d22a9e0f2e3c5119e2e~mv2.png/v1/fill/w_707,h_502,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-27%20at%2010_53_01.png)
Film back size set up.
![Screen Shot 2022-04-27 at 11.11.45.png](https://static.wixstatic.com/media/27f617_dae50db4719b4e4ba780255541ee2c26~mv2.png/v1/fill/w_710,h_479,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-27%20at%2011_11_45.png)
Here I wanted to add a sphere to the scene. To catch the sphere's shadows, I made a plane and assigned an aiShadowMatte shader to it.
![Screen Shot 2022-04-27 at 11.17.03.png](https://static.wixstatic.com/media/27f617_5983e4bb2f264758a0f0778a3aa92b6f~mv2.png/v1/fill/w_710,h_499,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-27%20at%2011_17_03.png)
Added an Area Light and turned off Normalize.
![Screen Shot 2022-04-27 at 11.19.04.png](https://static.wixstatic.com/media/27f617_ea56c9b7785d40e4818350bfd1b54434~mv2.png/v1/fill/w_710,h_493,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-27%20at%2011_19_04.png)
Set up the image size in the Render Settings window. The resolution of my footage needed to be multiplied by 1.1.
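For example (a sketch; the original plate resolution here is an assumption):

```python
# Sketch: set the Maya render resolution to the overscanned plate size.
import maya.cmds as cmds

overscan = 1.1
width, height = 1920, 1080   # assumption: original plate resolution
cmds.setAttr('defaultResolution.width',  int(round(width * overscan)))
cmds.setAttr('defaultResolution.height', int(round(height * overscan)))
```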
![Screen Shot 2022-04-27 at 11.19.39.png](https://static.wixstatic.com/media/27f617_fb30749757b94269b750af9f7255800b~mv2.png/v1/fill/w_710,h_514,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-27%20at%2011_19_39.png)
I could also find the correct image size (after scaling up) by going back to Nuke and clicking on the Overscan node.
![Screen Shot 2022-04-27 at 11.22.36.png](https://static.wixstatic.com/media/27f617_fb0a386058e04d5fb176d4142d8204f2~mv2.png/v1/fill/w_712,h_514,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-27%20at%2011_22_36.png)
I disabled the image plane when rendering the sequence as I wanted to comp the sequence in Nuke.
![Screen Shot 2022-04-27 at 11.25.22.png](https://static.wixstatic.com/media/27f617_06228936066e470c88a652d315d4f24a~mv2.png/v1/fill/w_710,h_504,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-27%20at%2011_25_22.png)
Render Settings set up.
![Screen Shot 2022-04-27 at 11.29.03.png](https://static.wixstatic.com/media/27f617_040bdd174f87419eafe7f53ea1eb29ff~mv2.png/v1/fill/w_711,h_530,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screen%20Shot%202022-04-27%20at%2011_29_03.png)
Rendered out beauty (RGBA), specular and shadow passes to be able to composite them in Nuke.
Comping CG back in Nuke (2 methods)
![Screenshot (1089).png](https://static.wixstatic.com/media/27f617_bc0d0827ad574f25a72864d77397677f~mv2.png/v1/fill/w_715,h_474,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(1089).png)
Using Shuffle node
![Screenshot (1090).png](https://static.wixstatic.com/media/27f617_7724de00dbfd42758e7ea0866996f539~mv2.png/v1/fill/w_714,h_474,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/Screenshot%20(1090).png)
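A sketch of the Shuffle-based method in Python, just to show the idea. The layer names ('specular' etc.) and file paths are assumptions based on my passes, not fixed names:

```python
# Sketch: pull the specular pass out of the multi-channel render, plus it over
# the beauty, then merge the result over the undistorted plate.
import nuke

cg    = nuke.nodes.Read(file='render/beauty.####.exr')       # hypothetical path, multi-channel EXR
plate = nuke.nodes.Read(file='plate/undistorted.####.exr')   # hypothetical path

spec = nuke.nodes.Shuffle(**{'in': 'specular'})   # 'in' is a Python keyword, hence the dict;
spec.setInput(0, cg)                              # assumes a layer called 'specular' exists

add_spec = nuke.nodes.Merge2(operation='plus')
add_spec.setInput(0, cg)          # B: beauty
add_spec.setInput(1, spec)        # A: specular

comp = nuke.nodes.Merge2(operation='over')
comp.setInput(0, plate)           # B: plate
comp.setInput(1, add_spec)        # A: CG with specular added back
```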
Assignment 02
Above is the original footage I shot for the second assignment.
I tracked the scene in 3DE using various techniques, and then exported and brought the tracking data into Nuke.
All the markers are cleaned up, and the scene is ready for green screen removal and for adding 3D assets in Maya.
![AS_MECH_CHAR_PER_ANIM_04.jpg](https://static.wixstatic.com/media/27f617_d5a807efc9534746b6cde2ba689665c6~mv2.jpg/v1/fill/w_716,h_403,al_c,q_80,usm_0.66_1.00_0.01,enc_avif,quality_auto/AS_MECH_CHAR_PER_ANIM_04.jpg)
Above is my mech character, which I am going to put in the scene in place of the blue cube.
Note: The whole process of modelling, texturing, rigging, and so on was done by me.
I removed the green screen and used different motion capture data for my mech characters in the scene. I used multiple lights and 3D assets, plus a bit of Python code to animate my main spotlight.
Here I have added smoke to the scene and, most importantly, colour-graded the lights and the reflections on the characters. I also added lens dirt and used red tones all over the scene to emulate the feeling of being in a nightclub. However, there are still a few problems that I will keep working on. For instance, the table needs to be roto'ed so it sits in front of the 3D background.
Next, I played around with light groups and AOVs. I also used a bit of Python code to make flashing lights in the scene. Different render passes were used to add more life and interest to the sequence. For instance, I defocused the background using the Z depth channel, and I changed the colour and intensity of some of the lights in the scene.
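The flashing lights boil down to keyframing a light's intensity on and off every few frames. Something along these lines (a sketch with placeholder names and values, not my exact script):

```python
# Sketch: key a spotlight's intensity on/off to fake a club strobe.
import maya.cmds as cmds

light_shape = 'spotLightShape1'      # placeholder: the main spotlight's shape node
start, end, period = 1001, 1100, 6   # flash every 6 frames (arbitrary values)

for frame in range(start, end + 1, period):
    cmds.setKeyframe(light_shape, attribute='intensity', time=frame, value=0.0)
    cmds.setKeyframe(light_shape, attribute='intensity', time=frame + period // 2, value=800.0)
```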
Note: My Maya sequence was rendered with relatively low sampling rates because my computer could not handle higher sampling quality. As a result, the rendered sequence looks noisy. I will try to render my Maya file on another machine as soon as possible and put it here.
I used a better-quality Arnold-rendered sequence here. I also made a few adjustments and changes, such as refining the roto edges and redoing the green screen removal.
Final result
Video with sound.
FINAL DEVELOPMENT WITH SOUND.
NOTE: While I tried very hard to be as creative as possible and am happy with the result, I believe there is still room for improvement. The render quality still needs work, and I will keep refining a few details of my scene during the summer.
Screen recording from inside Maya and Nuke
VFX Breakdown
Video with sound.