Nuke Practical
Topic: 3D Nodes and Workflows
Study in:
This study covers essential 3D nodes and workflows within Foundry Nuke, focused on creating and
integrating 3D elements into a 2D compositing workflow. Key concepts include 3D geometry, camera
tracking, lighting, and texture mapping, which allow for the blending of 3D assets with live-action
footage or other 2D elements.
Explored Tools:
3D Camera Tracker: This tool analyzes live-action footage to generate a 3D camera that
matches the movement and perspective of the scene, allowing 3D elements to be accurately
integrated.
Geometry Nodes (Card, Sphere, Cube, Axis): Basic 3D geometry nodes like Card, Sphere, and
Cube are used to create simple 3D shapes, while the Axis node allows positioning and
orientation adjustments within the 3D space.
Lighting Nodes (PointLight, Spotlight, DirectionalLight): These nodes simulate different light
sources within the 3D environment, allowing for realistic shading and shadows on 3D
elements.
ScanlineRender Node: Converts 3D scenes into 2D renders. This node is essential for creating
final composites, integrating lighting, textures, and camera views to produce a realistic shot.
Project3D Node: Projects 2D images or textures onto 3D geometry, useful for adding
textures to objects or creating matte paintings that match the scene’s perspective.
Shader and Material Nodes (Phong, Lambert): Used to define the surface appearance of 3D
objects. These shaders can control properties such as reflectivity, glossiness, and
transparency.
Depth and Z-Depth Nodes: These nodes work with depth data to create depth-based effects
like fog, focus pulls, and atmospheric effects for added realism.
PointCloud Generator: Creates a point cloud based on camera data or other 3D data,
providing a visual reference for matching or placing objects in complex scenes.
Concepts Covered:
1. Camera Tracking and Match-Moving: Using the 3D Camera Tracker to analyze footage and
match the camera’s movement and perspective in Nuke’s 3D space for seamless 3D
integration.
2. 3D Geometry Creation and Manipulation: Working with Geometry nodes (Card, Sphere,
Cube, Axis) to create, position, and animate simple 3D objects within the scene.
3. Lighting and Shading: Utilizing light nodes and shaders (Phong, Lambert) to create realistic
lighting and surface properties, matching the look and feel of live-action footage.
4. Depth-Based Effects: Leveraging depth data for advanced compositing effects, such as fog, z-depth-based focus, or atmospheric perspective, adding depth to a scene.
5. Point Clouds for Placement and Reference: Using point clouds generated from footage to accurately place 3D objects within the 3D space, useful for complex compositing tasks.
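As a concrete illustration of how these 3D nodes fit together, here is a minimal Python sketch (Nuke's scripting language) that builds a card, camera, scene, and ScanlineRender and wires them up. Node class names such as Card2 and Camera2 vary slightly between Nuke versions, so treat this as a template rather than a drop-in script:

    import nuke

    # Create the basic 3D scene elements (class names may differ by version).
    card = nuke.nodes.Card2()
    camera = nuke.nodes.Camera2()
    scene = nuke.nodes.Scene()
    render = nuke.nodes.ScanlineRender()

    # Push the card one unit back in Z so the camera can see it.
    card['translate'].setValue([0, 0, -1])

    # Wire the graph: the Scene node gathers geometry; ScanlineRender takes
    # the scene on input 1 and the camera on input 2 (input 0 is the bg).
    scene.setInput(0, card)
    render.setInput(1, scene)
    render.setInput(2, camera)

The same pattern extends to Sphere, Cube, Axis, and the light nodes, which all feed the Scene node before rendering.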
Topic: Advanced Match-Moving
Study in:
This study delves into advanced match-moving techniques within Foundry Nuke, focusing on
achieving highly accurate camera tracking and object tracking to seamlessly integrate CGI with live-
action footage. Key concepts include refining camera tracking, manual and automatic tracking
techniques, working with lens distortion, and optimizing track data for complex scenes.
Explored Tools:
3D Camera Tracker Node: Core to match-moving, this node analyzes footage to generate 3D
tracking points and a virtual camera. Advanced settings allow for refining camera movement
and addressing complex shots.
Lens Distortion Node: Corrects lens distortion or applies known distortion to CGI elements.
Essential for ensuring that tracked elements match the original footage’s lens characteristics.
Manual Tracking and Point Tracker Node: Enables precise manual tracking of key points in
footage for shots where automatic tracking struggles, allowing for fine control over track
points.
Solver and Refine Features in Camera Tracker: Used to refine the accuracy of a camera track,
these features help to adjust track points, solve multiple cameras in complex shots, or
troubleshoot problematic tracks.
Survey Data and Locators: Integrates real-world measurements or survey data (when
available) to improve tracking accuracy and aid in aligning the CGI scene with real-world
geometry.
Planar Tracker Node: Useful for tracking flat surfaces in scenes with minimal perspective
shift. Ideal for background elements or simpler tracking tasks within a shot.
Depth Map and Z-Depth Nodes for Tracking Depth: Incorporates depth information to
create depth-based tracking solutions, enhancing precision in 3D space alignment.
Exporting 3D Track Data: Exports track data to other software or Nuke scripts for continuity
across shots or sharing tracking information across a project.
Concepts Covered:
1. Accurate Camera Tracking for Complex Shots: Using the Camera Tracker’s advanced settings
to achieve precise camera movement for complex or fast-moving scenes, addressing
challenges like motion blur and partial occlusions.
2. Lens Distortion Matching: Correcting and reapplying lens distortion to ensure that CGI
elements match the live-action footage precisely, especially for wide-angle or anamorphic
lenses.
3. Manual Tracking Techniques: Employing the Point Tracker and Manual Tracking for shots
where automated tracking fails, such as scenes with minimal texture or high motion blur.
4. Refining Track Data: Using Solver and Refine tools to improve tracking accuracy, solve
complex shots, and minimize drift, ensuring consistent integration of CGI elements
throughout the shot.
5. Planar Tracking for Static Elements: Applying the Planar Tracker for elements with minimal perspective changes, such as background surfaces, for stable tracking in simpler parts of the scene.
6. Exporting and Sharing Track Data: Exporting tracking data for use in other Nuke scripts or external software, ensuring consistency in camera movement and object placement across multiple shots.
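Concept 2's undistort-solve-redistort pattern can also be scripted. The sketch below is hedged: the LensDistortion class name differs across releases, CameraTracker requires NukeX, and the plate path is hypothetical; it shows the node order rather than exact knob settings:

    import nuke

    plate = nuke.nodes.Read(file='plate.####.exr')  # hypothetical plate path

    # Undistort the plate before solving so the tracker sees straight lines.
    # Class name is LensDistortion2 in recent releases, LensDistortion in older ones.
    undistort = nuke.createNode('LensDistortion2', inpanel=False)
    undistort.setInput(0, plate)
    # Set this node's mode to Undistort in its properties panel.

    # Solve the camera on the undistorted plate (requires NukeX).
    tracker = nuke.createNode('CameraTracker', inpanel=False)
    tracker.setInput(0, undistort)

    # After CGI is comped over the undistorted plate, a second LensDistortion
    # node set to Redistort reapplies the original lens character.
    redistort = nuke.createNode('LensDistortion2', inpanel=False)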
Topic: Shooting Live-Action Footage for VFX
Study in:
This study focuses on the essential techniques for shooting live-action footage that will be
composited with CGI elements in post-production. Key concepts include planning for compositing,
setting up shots to match CGI requirements, capturing data for tracking and lighting, and managing
on-set elements to facilitate seamless integration of VFX.
Explored Tools:
Camera Setup and Specifications: Selecting the right camera settings (resolution, frame rate, shutter speed) to match the quality and characteristics needed for VFX compositing, ensuring clarity and detail.
Lens Distortion and Lens Data Capture: Capturing lens data and managing lens distortion to
match CGI elements with live-action footage accurately. This includes understanding lens
type (anamorphic, wide-angle) and capturing metadata.
Green Screen/Blue Screen Setup: Setting up chroma key screens with proper lighting and
positioning, ensuring minimal spill and shadow for effective keying during compositing.
Tracking Markers Placement: Adding tracking markers to surfaces and actors as needed for
match-moving and motion tracking in post-production. Includes understanding ideal marker
placement and visibility.
HDRI and Lighting References: Capturing High Dynamic Range Images (HDRI) and lighting
reference spheres (grey, chrome) on set to accurately replicate on-location lighting for CGI
elements.
Depth and Distance Measurement: Using measurement tools (such as laser or tape
measures) to record the distance between the camera and key elements in the scene. Useful
for recreating accurate 3D environments and aligning CGI elements.
Environment and Texture Capture: Photographing or scanning the environment and key
textures on set to create realistic textures and backgrounds in CGI.
Shadow and Reflection Control: Using shadow casters, reflective surfaces, or blackouts to
control how shadows and reflections interact on set, making it easier to integrate CGI
elements.
Concepts Covered:
1. Camera Settings for VFX Integration: Selecting appropriate resolution, frame rate, and
shutter speed for high-quality footage that supports detailed compositing and matches CGI
render requirements.
2. Lens Distortion Management: Understanding lens types and capturing lens data for accurate
lens distortion matching in post, ensuring CGI elements align with the real footage.
3. Optimal Green/Blue Screen Setup: Setting up chroma key screens with proper lighting,
positioning, and minimal spill to create a clean key during the compositing process.
4. Marker Placement for Tracking: Strategically placing tracking markers for effective 3D
camera tracking and object tracking in post-production, including considerations for visibility
and alignment.
5. Capturing HDRI and Lighting References: Gathering HDRI and lighting sphere references to
recreate accurate lighting, reflections, and shadow behavior for CGI in post-production.
6. Recording Measurements for Depth Matching: Measuring and recording distances and
angles between the camera and scene elements, essential for building realistic 3D
environments in post.
7. Environment and Texture Data Gathering: Collecting texture and environmental data (e.g.,
photos, scans) to replicate scene details in CGI, allowing for seamless integration of
elements.
8. Managing Shadows and Reflections on Set: Using tools like reflectors, flags, and diffusers to
control how shadows and reflections appear in the footage, simplifying CGI integration.
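The lens and distance measurements described above feed directly into simple lens geometry when the camera is rebuilt in post. A small helper (plain Python, no Nuke dependency; the example numbers are illustrative) converts focal length and sensor width into a horizontal field of view:

    import math

    def horizontal_fov(focal_mm, sensor_width_mm):
        """Horizontal field of view in degrees for a given focal length."""
        return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_mm)))

    # Example: a 35 mm lens on a Super 35 sensor (~24.9 mm wide) gives ~39 degrees.
    print(horizontal_fov(35.0, 24.9))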
Topic: Advanced Multipass Compositing
Study in:
This study explores advanced techniques in multipass compositing, where various render passes are
used to create a high level of control over the final image. Key concepts include combining and
adjusting different passes (e.g., diffuse, specular, reflection) to achieve a photorealistic look, color
grading, and fine-tuning each layer to enhance the final composite's realism.
Explored Tools:
Diffuse Pass: Represents the base color of an object without any lighting or reflections. Used
as the foundation for compositing.
Specular Pass: Adds highlights and reflections from light sources, essential for defining
surface properties and enhancing the material's look.
Reflection Pass: Captures only the reflections on the object’s surfaces. This pass allows fine
control over reflective details for realistic integration into the scene.
Ambient Occlusion (AO) Pass: Adds contact shadows in crevices and where objects meet,
providing depth and a sense of grounding.
Z-Depth Pass: Contains depth information to create depth-based effects like fog, depth of
field, and atmospheric perspective.
Normal Pass: Provides surface orientation data, allowing for re-lighting and adjusting
shadows in compositing by manipulating surface normals.
Shadow Pass: Isolates shadows cast by objects, which can be adjusted independently for
better integration with live-action or CGI environments.
Emission Pass: Isolates self-illuminated elements, like lights or glowing objects, allowing for
control over their brightness and color during compositing.
Light Passes (Key, Fill, Rim): Separates lighting layers into different passes for key, fill, and rim
lighting, providing ultimate control over the lighting in the composite.
Concepts Covered:
1. Layering and Blending Passes: Using layering techniques (add, multiply, screen) to blend
different passes together, achieving realistic lighting and shadow interplay.
2. Fine-Tuning Reflections and Highlights: Adjusting the Reflection and Specular passes to
control glossiness, reflectivity, and highlight intensity, matching the look of real-world
materials.
3. Ambient Occlusion for Depth: Using the Ambient Occlusion pass to add subtle shadows and
depth where objects meet, grounding them in the scene and enhancing realism.
4. Z-Depth-Based Effects: Leveraging the Z-Depth pass to apply depth-of-field, fog, and
atmospheric effects, creating a sense of distance and space in the scene.
5. Re-Lighting with Normal Pass: Using the Normal pass to add or adjust lighting in post-
production without re-rendering, giving greater flexibility to fine-tune the lighting.
6. Shadow Adjustments: Isolating and adjusting shadows using the Shadow pass, allowing for
control over shadow color, softness, and opacity, especially when matching with live-action
footage.
7. Emissive and Glow Effects: Enhancing glow and light from self-illuminated objects using the
Emission pass, allowing precise control over brightness and color for VFX-heavy scenes.
8. Creative Color Grading and Finishing: Applying color grading to balance all passes, creating a
cohesive look and ensuring the composite blends naturally with live-action footage.
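The layering logic in concept 1 can be prototyped in a few lines of Nuke Python. The sketch below assumes the passes live as layers named diffuse, specular, reflection, and AO inside one multi-channel EXR; those layer names and the file path are pipeline-specific assumptions:

    import nuke

    read = nuke.nodes.Read(file='render.####.exr')  # hypothetical multi-channel EXR

    def shuffle_layer(src, layer):
        # Extract one render pass layer into rgba (classic Shuffle node).
        s = nuke.nodes.Shuffle(inputs=[src])
        s['in'].setValue(layer)
        return s

    diffuse = shuffle_layer(read, 'diffuse')
    specular = shuffle_layer(read, 'specular')
    reflection = shuffle_layer(read, 'reflection')
    ao = shuffle_layer(read, 'AO')

    # Additive passes combine with 'plus'; AO multiplies the result.
    beauty = nuke.nodes.Merge2(inputs=[diffuse, specular], operation='plus')
    beauty = nuke.nodes.Merge2(inputs=[beauty, reflection], operation='plus')
    beauty = nuke.nodes.Merge2(inputs=[beauty, ao], operation='multiply')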
Topic: Advanced Keying for Hair and Fine Detail
Study in:
This study focuses on advanced keying techniques in Foundry Nuke, with a particular emphasis on
achieving clean keys for complex elements such as hair, fur, and semi-transparent objects. Key
concepts include precise control of edges, managing spill suppression, working with multiple keyers,
and refining mattes to handle fine details without losing texture or introducing noise.
Explored Tools:
Primatte and Keylight Nodes: Core keying tools in Nuke used for pulling basic keys from green screen or blue screen footage. Advanced settings allow for fine-tuning edge quality, color spill, and transparency.
Ultimatte Keyer: A more advanced keying option, often used for handling difficult edges and
color corrections. Ultimatte can be helpful in balancing between foreground and background
color tones.
Despilling Nodes: Tools like EdgeBlur, DespillMadness, and custom despill setups are used to
remove unwanted color spill from green or blue screen backgrounds, essential for preserving
natural colors in fine details like hair.
Edge Extend and Edge Matte: Techniques and tools for extending or refining the edges of the
matte, used to address hair edges and prevent harsh cut-offs around delicate details.
Erode and Dilate Nodes: Helps adjust the thickness of the alpha matte, allowing control over
how much background is visible around the subject’s edges, particularly useful for hair and
thin objects.
Unpremult and Premult Workflow: Allows for manipulation of alpha channels and RGB
channels separately, enabling control over color corrections and edge refinements without
impacting the matte's transparency.
Additive Keying for Hair and Transparency: Uses multiple keys for different parts of the
subject, especially useful for subjects with hair or transparent fabrics. This technique allows
compositors to combine several keys to preserve both soft edges and fine detail.
Luminance Key and Difference Key: Used in conjunction with other keyers to extract fine
details based on luminance differences, helpful in isolating delicate areas like hair or semi-
transparent objects.
Concepts Covered:
1. Refining Edges and Softness: Using Primatte and Keylight to pull a clean initial key, followed
by fine-tuning edge softness, essential for natural-looking transitions around hair and
detailed edges.
2. Color Spill Suppression and Despill Techniques: Applying despill nodes and custom despill
techniques to remove color contamination from the background, especially around light-
colored or translucent hair.
3. Edge Matte Creation for Delicate Details: Using Edge Extend or custom edge mattes to
enhance the matte’s boundary, providing better control over hair edges and avoiding harsh
lines.
4. Multipass Keying for Complex Elements: Combining multiple keys (additive keying) to
achieve a balanced composite, where hair and semi-transparent objects retain their natural
appearance without losing detail.
5. Erosion and Dilation for Matte Adjustments: Controlling the matte's size and softness by
adjusting the Erode and Dilate settings, allowing subtle adjustments around intricate details
like hair.
6. Unpremult/Premult Workflow for Fine Control: Separating alpha and RGB channels to make
precise adjustments, such as color correction or edge blur, without impacting transparency
or introducing artifacts.
7. Luminance-Based Keying for Additional Detail: Using Luminance Key or Difference Key in
combination with other keying methods to isolate hair and soft edges based on brightness,
adding flexibility in handling complex details.
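The despill idea in concept 2 can be prototyped with a single Expression node. This sketch applies the classic green-screen average limiter (g = min(g, (r + b) / 2)); it is one common despill formula, not the only one, and channels left blank pass through unchanged:

    import nuke

    # Limit green to the average of red and blue, suppressing spill while
    # leaving uncontaminated pixels (where g is already lower) untouched.
    despill = nuke.nodes.Expression()
    despill['expr1'].setValue('min(g, (r + b) / 2)')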
Topic: Projection-Based Rotoscoping
Study in:
This study explores the use of advanced projection techniques for rotoscoping (roto) in Foundry
Nuke, a method that leverages 3D projections to create accurate and efficient rotoscopes,
particularly for complex shapes and motion. Key concepts include using camera projections, 3D
geometry, and tracking data to create stable and precise mattes, reducing the need for extensive
frame-by-frame adjustments.
Explored Tools:
Project3D Node: Projects images or textures onto 3D geometry within Nuke, allowing rotoscoping to follow complex shapes with perspective changes, even across multiple frames.
Card and Geometry Nodes: Uses basic 3D geometry (e.g., cards, spheres) to match the scene
elements and act as the base for projections, providing structure and alignment for roto
shapes.
RotoPaint Node with Projection: Combining RotoPaint with projections to create shapes
that follow object contours and camera movement, reducing roto effort for complex or fast-
moving scenes.
Depth Map Utilization: Uses depth data from Z-depth passes or generated depth maps to
align roto elements with the correct depth in the scene, ensuring accurate placement of
projections.
FrameHold and Stabilization Techniques: Freezes or stabilizes key frames in the footage to
simplify projection setup, especially useful for scenes with significant camera or subject
movement.
Refinement with Matte Adjustments: After projecting, matte adjustments with edge
refinement, feathering, and blur nodes are applied to enhance accuracy and blend roto
edges with the scene seamlessly.
Lighting Adjustments in 3D Roto Projections: Adjusting projected roto shapes to account for
lighting changes in the live-action footage, helping the roto blend naturally across light and
shadow changes.
Concepts Covered:
1. Camera Tracking and Alignment: Using the 3D Camera Tracker to capture the scene’s camera
movement, aligning projections and roto shapes with the original footage for accurate
positioning.
2. Projection on 3D Geometry: Utilizing 3D geometry and the Project3D node to project roto
shapes onto objects, particularly useful for complex objects with movement, perspective
shifts, or curves.
3. Efficient Roto with Stabilized Projections: Applying stabilization techniques to simplify the
roto setup, allowing for stable projections across frames and reducing manual adjustments.
4. Depth Matching for Roto Accuracy: Using depth information to place roto shapes accurately
within 3D space, ensuring that roto elements match the real-world positioning and depth of
scene elements.
5. FrameHold and Key Frame Projection Techniques: Freezing specific frames and creating
projections on these key frames for efficient roto across shots, minimizing frame-by-frame
adjustments.
6. Edge Refinement for Smooth Transitions: Enhancing roto edges with matte adjustments like
feathering and blur to ensure that projected shapes blend naturally with surrounding
elements.
7. Adapting to Lighting Variations: Adjusting projected roto shapes to follow changes in lighting
and shadows within the footage, allowing the roto to appear natural even in scenes with
dynamic lighting.
8. Combining Projection Roto with Manual Refinement: Blending projection-based roto with
manual adjustments to address fine details and challenging elements, ensuring a high level
of detail in the final matte.
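A hedged sketch of the frozen-frame projection setup from concept 5 follows. It bakes the solved camera's transform at a hold frame into a duplicate projection camera; node and knob names (FrameHold's first_frame, Project3D's input order, the Camera class) are the commonly documented ones and should be verified in your Nuke version:

    import nuke

    HOLD_FRAME = 1001  # hypothetical frame to project from

    plate = nuke.toNode('Read1')          # assumed plate Read node
    tracked_cam = nuke.toNode('Camera1')  # assumed solved camera

    # Freeze the plate on the hold frame to use as the projection source.
    hold = nuke.nodes.FrameHold(inputs=[plate])
    hold['first_frame'].setValue(HOLD_FRAME)  # 'firstFrame' in newer FrameHolds

    # Duplicate the camera and bake its transform at the hold frame.
    proj_cam = nuke.nodes.Camera2()  # class name varies by Nuke version
    for knob in ('translate', 'rotate'):
        proj_cam[knob].setValue(
            [tracked_cam[knob].getValueAt(HOLD_FRAME, i) for i in range(3)])
    proj_cam['focal'].setValue(tracked_cam['focal'].getValueAt(HOLD_FRAME))

    # Project the frozen frame through the baked camera onto geometry.
    project = nuke.createNode('Project3D2', inpanel=False)  # 'Project3D' in older versions
    project.setInput(0, hold)      # image input
    project.setInput(1, proj_cam)  # camera input
    card = nuke.nodes.Card2()
    card.setInput(0, project)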
Topic: Camera Matching Without Camera Data
Study in:
This guide addresses methods for camera matching in Foundry Nuke when camera details, such as
focal length and grid references, are unavailable. These techniques utilize Nuke's powerful
compositing and 3D tracking tools to approximate camera motion and achieve convincing alignment
of CGI with live-action footage.
Explored Tools:
3D Camera Tracker: Core to matching shots, this tool estimates camera movement in scenes.
Use reference points and parallax cues to solve scenes without specific camera data.
Transform Nodes (Scale, Rotate, Translate): Adjusts positioning, scaling, and rotation of
layers to visually align CGI with background elements, essential for manual camera matching.
Grade, ColorCorrect, and HueCorrect Nodes: Useful for visually integrating CGI with live-
action footage through color adjustments, ensuring smooth blending.
Roto and RotoPaint Nodes: Create and refine masks, helpful for isolating elements and
maintaining perspective in manual camera matching.
Blur and Defocus Nodes: Add realistic depth effects, helping to recreate depth of field for
layers, especially in scenes with subtle focus variations.
Concepts Covered:
1. Reference Points and Parallax Cues: Analyze scene elements that shift as the camera moves.
Use these cues in Nuke's 3D Camera Tracker to help estimate depth and position in the
absence of precise data.
2. Estimating Focal Lengths: Experiment with common focal lengths (e.g., 35mm, 50mm)
within the 3D Camera Tracker until the scene appears visually correct.
3. Manual Placement and Scaling: Use Transform nodes to manually adjust camera position,
scale, and rotation, aligning CGI with background elements closely by eye.
4. 3D Point Clouds for Depth Cues: Align key features in your scene using point clouds from the
3D Camera Tracker as a basic spatial layout for guiding camera placement.
5. Fine-Tuning with Match-Move Nodes: Adjust Match-Move settings to control camera
transformations incrementally, refining the alignment of CGI with live-action.
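For concept 2, a known object size in frame turns focal-length guessing into simple similar-triangles arithmetic. The helper below is plain Python, and every number in the example is an illustrative assumption:

    def estimate_focal(object_px, image_width_px, sensor_width_mm,
                       object_size_m, distance_m):
        # Size of the object's image on the sensor, in millimetres.
        object_on_sensor_mm = (float(object_px) / image_width_px) * sensor_width_mm
        # Similar triangles: focal / image size = distance / object size.
        return object_on_sensor_mm * distance_m / object_size_m

    # Example: a 2 m doorway spanning 600 px of a 3840 px frame on a 36 mm
    # sensor, shot from 8 m away, suggests roughly a 22.5 mm lens.
    print(estimate_focal(600, 3840, 36.0, 2.0, 8.0))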
Topic: Advanced Camera Solving
Study in:
This study focuses on mastering advanced camera solving techniques within Foundry Nuke. The aim
is to achieve accurate and complex 3D tracking for seamless integration of CGI with live-action
footage, even in challenging environments.
Explored Tools:
Lens Distortion Node: Use this to correct or match the distortion in footage, ensuring that 3D elements match the live-action environment accurately. Start by undistorting the footage, then solve the camera, and redistort the final composite for a realistic blend.
3D Camera Tracker - High-Detail Mode: For detailed scenes, use high-detail mode to capture
finer tracking points, particularly helpful for scenes with complex textures or surfaces. This
will ensure a more accurate solve by capturing subtle movements and depth cues.
Survey Footage Workflow: Use multiple shots or reference images of the same scene to
triangulate camera positions. By aligning point clouds from different angles, you can achieve
a more precise 3D camera solve.
Manual Feature Tracking and Point Cloud Refinement: Use manual feature tracking on
specific elements that remain steady in the scene to lock critical points, then refine the point
cloud to ensure consistency and minimize drifting.
Parallax and Z-Depth Calibration: Incorporate Z-depth techniques to capture subtle parallax
effects, especially when tracking across complex 3D surfaces or objects in foreground and
background layers.
Concepts Covered:
1. Multi-Point Camera Solving: Use Nuke’s ability to handle multiple trackers and align them
into a cohesive point cloud. This is particularly useful for scenes with complicated camera
movement, such as rotations and tilts.
2. Depth Layering with Key Frames: For scenes where depth is crucial, animate key frames on
different depth layers to achieve a realistic 3D space. This helps in shots where focal distance
changes mid-shot, creating accurate parallax between layers.
3. Solving for Large Parallax Shifts: Use tracking markers or identifiable high-contrast points to
solve for large parallax shifts. This approach is particularly useful for aerial or drone footage,
where camera paths are dynamic.
4. Refining Solve Quality: Adjust settings such as Focal Length, Lens Distortion, and Tracking
Accuracy to minimize reprojection errors and refine the camera solve. For difficult solves,
consider breaking the scene into sections and solving in parts.
5. Integrating Environmental Effects: After camera solving, integrate environmental effects like
fog or motion blur with the 3D elements. Use depth maps to control these effects based on
distance from the camera.
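The depth-driven fog in concept 5 can be prototyped with an Expression node that remaps the depth channel into a matte. The near/far distances below are placeholders, and the sketch assumes depth.Z stores distance from camera (some renderers store 1/Z instead):

    import nuke

    NEAR, FAR = 5.0, 50.0  # hypothetical fog start/end distances

    # Build a fog matte in the alpha: 0 at NEAR, ramping to 1 at FAR and beyond.
    fog = nuke.nodes.Expression()
    fog['expr3'].setValue('clamp((depth.Z - %s) / (%s - %s))' % (NEAR, FAR, NEAR))

The resulting alpha can then drive a Grade or a Merge with a fog colour, attenuating elements by their distance from the camera.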
Topic: Creative Techniques for Seamless Compositing
Study in:
This guide explores creative methods for achieving seamless compositing within Foundry Nuke,
focusing on techniques that blend CGI and live-action footage smoothly. By leveraging innovative
approaches, you can enhance the realism and visual consistency of your composited scenes.
Explored Tools:
Edge Blending and Light Wraps: Light wraps allow CGI elements to inherit the subtle light bleed from the background, creating a natural blend. This technique is especially effective for scenes with strong backlighting or high-contrast edges (a manual light-wrap sketch appears at the end of this section).
Multi-Pass Compositing and AOVs (Arbitrary Output Variables): Use multi-pass rendering to
gain control over individual aspects like diffuse, specular, shadow, and ambient occlusion.
Combining these in layers gives more control to match CGI with live-action lighting and
shading.
Atmospheric Integration: Use fog, haze, or depth-of-field effects to help integrate CGI
elements. Applying these effects can replicate environmental depth cues, essential for
making objects feel like they are part of the same space.
Relighting Techniques with Normals and Position Passes: Use normals and position passes
to re-light objects in the composite. This allows you to adjust lighting post-render, making it
easier to match the live-action lighting conditions dynamically.
Motion Blur Matching: Use motion blur settings on CGI elements that match the motion of
the camera and live-action footage. Fine-tune blur amounts to align CGI movement with the
real-world motion, enhancing the illusion of integration.
Concepts Covered:
1. Depth and Layering: Layer elements with varying depth to enhance parallax and make
composites more dynamic. Use Z-depth to apply effects like fog or color grading selectively
based on distance.
2. Color and Tone Matching: Use Grade and ColorCorrect nodes creatively to match the overall
color tone of CGI to the live-action environment. Subtle color adjustments are crucial to
maintaining consistency in light and shadow.
3. Using Noise and Grain: Add slight film grain or noise to CGI elements to match the texture of
live-action footage. This technique helps prevent the CGI from looking too clean or digital.
4. Blending Techniques for Shadows and Reflections: For shadows, use soft shadow edges and
graded opacities to match ambient lighting. For reflections, integrate subtle reflections on
surfaces by compositing elements like reflective floors or windows to ground CGI in the
scene.
5. Texture Matching and Ambient Occlusion: Match textures on CGI elements with live-action
surfaces for consistency. Use ambient occlusion passes to darken creases, corners, or where
objects intersect, making CGI objects appear more realistically embedded in the
environment.
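The light wrap mentioned under Edge Blending above can be built by hand from basic nodes, as this sketch shows: blur the background so its light can bleed, confine it to the foreground's soft edge via Merge's mask input, and screen it over the comp. The node names FG/BG and the blur sizes are assumptions and typical starting points, not fixed values:

    import nuke

    fg = nuke.toNode('FG')  # assumed premultiplied foreground with alpha
    bg = nuke.toNode('BG')  # assumed background plate

    # Base composite.
    over = nuke.nodes.Merge2(inputs=[bg, fg], operation='over')

    # Soften the background so its light can bleed around the subject.
    soft_bg = nuke.nodes.Blur(inputs=[bg])
    soft_bg['size'].setValue(40)

    # Blur the foreground alpha to create a soft edge zone for the wrap.
    edge = nuke.nodes.Blur(inputs=[fg])
    edge['channels'].setValue('alpha')
    edge['size'].setValue(15)

    # Screen the softened background over the comp, limited by the blurred
    # alpha via Merge's mask input (input 2), so light only wraps the edges.
    wrap = nuke.nodes.Merge2(inputs=[over, soft_bg, edge], operation='screen')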
Topic: Animation Curves and Keyframe Handling
Study in:
This guide focuses on mastering animation curves and keyframe handling in Foundry Nuke. By
controlling animation curves and keyframes effectively, you can create smooth transitions, realistic
motion, and nuanced adjustments in your animations.
Explored Tools:
Curve Editor: Nuke's Curve Editor is essential for viewing and modifying animation curves. Use it to adjust the speed, timing, and easing of animated properties like position, rotation, and opacity.
Bezier and Linear Interpolation: Experiment with different interpolation types to control
how your animation progresses between keyframes. Bezier curves allow for smoother, more
natural movement, while linear interpolation provides a constant speed, which is useful for
mechanical motion.
Keyframe Reduction and Smoothing: For smoother animations, use keyframe reduction to
simplify overly complex animations. This can help eliminate jittery movements and make
adjustments more manageable.
Custom Easing with Handle Control: Customize the easing of curves by adjusting Bezier
handles. This technique allows for refined control over acceleration and deceleration, helping
to create realistic motion.
Looping and Offset Animation: For repeated actions, use looping techniques in the curve
editor. Offset the start and end values for animations like rotating wheels or blinking lights,
which benefit from repetitive motion.
Concepts Covered:
1. Curve Adjustments for Realistic Motion: Use curves to simulate real-world physics, such as
acceleration and deceleration, by fine-tuning the curve shape between keyframes. Adjusting
the slope and tension of curves gives you nuanced control over the timing.
2. Timing and Spacing Control: Experiment with spacing between keyframes to control timing.
Larger gaps create faster motion, while closer keyframes create slower transitions. This is
crucial for achieving believable motion.
3. Overshoot and Settle Effects: For animations that should “settle” into position, such as a
bouncing object, add slight overshoots and dampened oscillations to the curve. This
technique adds realism by mimicking physical inertia.
4. Non-Linear Motion with Ease-In and Ease-Out: Use ease-in and ease-out to make motion
start and end smoothly. This is particularly effective for animations that need a natural-
looking start or stop, like character movements or object interactions.
5. Managing Complex Animations with Expressions: For more intricate animations, consider
using expressions to control keyframe values. This is useful for creating procedural
animations, such as pulsing lights or rhythmic movements.
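Concept 5's expression-driven animation pairs nicely with scripted keyframes. The sketch below keys a driver knob and then loops its 24-frame cycle on another node by sampling the curve with a remapped frame; the node name AnimDriver and the frame range are illustrative:

    import nuke

    # Driver node holding the keyed animation (frames 1-25, illustrative).
    driver = nuke.nodes.Transform(name='AnimDriver')
    rot = driver['rotate']
    rot.setAnimated()
    rot.setValueAt(0.0, 1)
    rot.setValueAt(360.0, 25)

    # A second node loops the driver's cycle indefinitely by sampling its
    # curve with the current frame remapped into the keyed range.
    looper = nuke.nodes.Transform()
    looper['rotate'].setExpression('AnimDriver.rotate(((frame - 1) % 24) + 1)')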
Topic: Deep Compositing
Study in:
This guide focuses on using deep compositing nodes in Foundry Nuke to create more accurate and
versatile composites. Deep compositing is especially useful for scenes with overlapping layers, dense
effects like fog or smoke, and precise depth-based adjustments.
Explored Tools:
DeepRead and DeepWrite: Import and export deep image data, preserving per-pixel depth information throughout your workflow. Use DeepRead to bring in deep renders from 3D software and DeepWrite to render deep images out of Nuke.
DeepRecolor: Applies the color from a flat (2D) image back onto a deep image's samples. This node is helpful for grading or color correcting an element with standard 2D tools and then restoring its deep data for depth-aware merging.
DeepTransform: Adjust the position, scale, and rotation of deep images, allowing for depth-
aware transformations. This is especially useful for adjusting placement without losing depth
data.
DeepHoldout: Create holdouts by cutting out areas based on depth. This is ideal for isolating
or removing objects at specific depth levels without affecting foreground elements.
Concepts Covered:
1. Depth-Based Masking and Holdouts: Use DeepHoldout and DeepRecolor to mask or adjust
areas at specific depth levels, avoiding the need for manual rotoscoping. This technique is
useful for integrating effects like fog or smoke layers with live-action footage.
2. Depth-Driven Color and Lighting Adjustments: Grade an element's flat color and use DeepRecolor to push the result back into its deep samples; depth channels can then drive environmental effects, for example making distant elements slightly bluer or darker.
3. Optimizing Render Passes with DeepToImage: Use DeepToImage to flatten deep data once it is no longer needed in later stages of the pipeline. This approach saves computational resources while preserving the depth-based effects already applied (a scripted example follows this list).
4. Depth-Based Defocus and Blur Effects: By using depth data, you can apply depth-of-field effects more precisely. Use a depth-aware defocus (for example, ZDefocus on the flattened image, or a dedicated deep defocus plug-in) to add natural depth blurring, enhancing the 3D look of the composite.
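A small sketch of a deep combine-and-flatten workflow (touching concepts 1 and 3 above) follows; the file paths are placeholders, and the DeepToImage class name can differ slightly between releases:

    import nuke

    # Read two deep renders (hypothetical paths).
    fog = nuke.nodes.DeepRead(file='fog.####.exr')
    char = nuke.nodes.DeepRead(file='character.####.exr')

    # Combine the deep samples so the fog correctly interleaves with the
    # character at every depth, with no rotoscoping required.
    combined = nuke.nodes.DeepMerge(inputs=[fog, char])
    combined['operation'].setValue('combine')

    # Flatten to a regular 2D image once depth data is no longer needed.
    flat = nuke.createNode('DeepToImage', inpanel=False)
    flat.setInput(0, combined)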
Topic: 3D Relighting and Retexturing
Study in:
This guide explores techniques for 3D relighting and retexturing in Nuke, allowing you to adjust
lighting and textures in compositing without re-rendering in 3D software. This workflow is especially
useful for refining CGI elements and achieving cohesive integration with live-action footage.
Explored Tools:
Normal and Position Passes: Use normal and position passes to inform relighting. The normal pass provides information about surface orientation, while the position pass offers data about spatial location, both essential for accurate light calculations.
Relight Node: Nuke’s Relight node enables you to add lights based on normal and position
passes, allowing for flexible lighting changes in post-production. Experiment with different
light types (e.g., point, directional) and intensities to match the scene’s natural lighting.
Ambient Occlusion (AO) Passes: Use AO passes to add realistic shadows in crevices and
where surfaces meet. AO contributes to depth perception and creates a more grounded look
for CGI elements when relighting.
Specular and Reflection Passes: Specular and reflection passes allow you to fine-tune
highlights and reflections based on the new lighting setup. Adjusting these passes can add
realism to surfaces that need dynamic lighting.
UV and STMap Passes: UV passes map the texture precisely onto the 3D surface, while
STMap is used to remap textures. Using these in combination allows you to retexture objects
accurately within Nuke, adding or changing textures in post without rendering new 3D
models.
Expression and ColorCorrect Nodes for Texture Adjustment: Use these nodes to modify
existing textures on the fly. The Expression node can add procedural changes, while
ColorCorrect can adjust hue, saturation, and contrast based on the look you want to achieve.
Displacement and Bump Maps: If the original scene includes displacement or bump maps,
use similar maps in retexturing workflows to achieve detailed surface textures. Adjusting
these can give depth to otherwise flat textures, enhancing realism.
Concepts Covered:
1. Dynamic Lighting Adjustments: Adjust lights dynamically using the Relight node, enabling
fine-tuning for shadow angles, light fall-off, and color temperature without re-rendering. This
technique is useful for achieving consistent lighting across shots.
2. Procedural Texturing and Mapping: Apply procedural textures using UV and STMap passes, which allow you to map complex patterns or gradients onto surfaces accurately.
3. Simulating Depth with AO and Shadow Passes: Use AO and shadow passes to create depth in areas with heavy shadowing, giving a more realistic feel when objects overlap or interact closely.
4. Surface Material Adjustments: Adjust material properties like roughness and metallic attributes using ColorCorrect and Grade nodes, adding more control over texture appearance.
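At its core, normal-pass relighting is a Lambert dot product between the surface normal and a light direction, which can be reproduced with an Expression node. The sketch assumes world-space normals shuffled into rgb in the -1..1 range; the light direction is an arbitrary example:

    import nuke
    import math

    # Hypothetical light direction, normalised before use.
    lx, ly, lz = 0.5, 0.8, 0.3
    length = math.sqrt(lx * lx + ly * ly + lz * lz)
    lx, ly, lz = lx / length, ly / length, lz / length

    # Lambert shading: diffuse = max(0, dot(N, L)), computed per pixel.
    relight = nuke.nodes.Expression()
    expr = 'max(0, r * %.4f + g * %.4f + b * %.4f)' % (lx, ly, lz)
    for knob in ('expr0', 'expr1', 'expr2'):
        relight[knob].setValue(expr)

The result is a grayscale light pass that can be graded to the light's colour and multiplied over the diffuse albedo.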
Topic: VR and 360-Degree Compositing
Study in:
This guide focuses on advanced VR compositing techniques to create immersive, interactive 360-
degree experiences. VR compositing involves handling panoramic footage, stitching, reorienting, and
integrating CG elements in a way that maintains spatial coherence when viewed in VR.
Explored Tools:
Spherical Transform and LatLong Nodes: Use SphericalTransform to convert footage between different projections (e.g., lat-long to cube map). This enables you to work more comfortably with panoramic footage while ensuring that the perspective is accurate for VR.
VR Viewer in Nuke: Utilize the VR Viewer in Nuke to preview your composite in 360 degrees.
This tool lets you assess stitching quality, lighting, and spatial continuity within the VR space.
Advanced Keying and Roto for 360-Degree Composites: Keying and roto work can be
challenging in VR because elements are visible from every angle. Refine your keying
techniques to ensure that transitions are smooth and that there’s no break in the spherical
continuity.
Concepts Covered:
1. 360-Degree Stitching and Reprojection: Practice stitching multiple camera feeds together to
create seamless panoramic footage. Ensure that reprojected elements align perfectly across
seam lines for a smooth viewer experience.
2. Depth and Parallax Management: VR compositing often involves handling depth information
accurately. Use depth passes and parallax adjustments to position objects within the VR
space to avoid unnatural shifts as viewers change angles.
3. Seamless Integration of CGI and Live Footage: Use reorientation nodes to match lighting and
shadows in CGI elements with the live-action VR footage. This is key for ensuring that CGI
additions feel like a natural part of the 360-degree environment.
4. Light and Shadow Adjustments in 360 Degrees: Position lights carefully to avoid unnatural
shadows or lighting inconsistencies in VR. Practice creating lights that mimic natural sources,
taking care to ensure that light spreads naturally across the entire sphere.
5. Advanced VR Transitions: Experiment with transitions suited for VR, such as spatial dissolves
and fades that maintain immersion. This is especially useful for scene changes in VR
storytelling.
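For the keying difficulty noted above, one common trick is to reproject the region of interest into a rectilinear view, key or roto it there, and then convert back. The sketch below only creates the conversion pair; the projection knob names vary between Nuke releases, so they are left to be set in the properties panel:

    import nuke

    src = nuke.toNode('Read1')  # assumed lat-long VR plate

    # Convert the lat-long plate to a rectilinear view of the work area.
    to_rect = nuke.nodes.SphericalTransform(inputs=[src])
    # In its properties: Input Projection = Lat Long Map, Output = Rectilinear.

    # ... keying / roto happens here on the undistorted rectilinear view ...

    # Convert the corrected region back into lat-long space.
    to_latlong = nuke.nodes.SphericalTransform(inputs=[to_rect])
    # Reverse the projections: Rectilinear in, Lat Long Map out.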
Topic: Panoramic Matte Creation with Camera Projection
Study in:
This study covers essential techniques and workflows for creating panoramic mattes and utilizing camera projection within Foundry Nuke. Key concepts include panoramic scene creation, camera projection mapping, and integrating 3D elements to enhance realism. This approach is widely used to establish large-scale environments in visual effects, blending matte paintings with live-action footage or CGI for a seamless, immersive look.
Explored Tools:
3D Camera Tracker: Essential for creating accurate camera movements and projections in panoramic scenes, matching the movement and perspective of live-action footage.
Project3D Node: Used to project images or matte paintings onto 3D geometry, allowing dynamic perspectives and seamless transitions.
Supporting Tools: RotoPaint for detailing, Merge for layering, and Blur/Defocus for depth of field.
Concepts Covered:
1. Panoramic Matte Creation: Techniques for stitching and aligning multiple images or paintings to create a seamless 360-degree environment.
2. Layering and Depth Integration: Combining multiple depth layers to create a sense of
space and realism, adjusting scales and transforms for consistency.
3. Color Matching and Grading: Using grading tools to harmonize colors and light across
different panoramic sections, making composites visually coherent.
Topic: On-Set Compositing
Study in:
This study focuses on the essential techniques and workflows for compositing on set, which is crucial
for real-time feedback and ensuring accurate integration of CGI with live-action footage. On-set compositing helps VFX artists preview and troubleshoot potential compositing issues during production, enhancing the quality and efficiency of post-production work. Key concepts include
camera tracking, color matching, and real-time keying on set to validate shots and ensure they align
with the final VFX vision.
Explored Tools:
On-Set Camera Tracking: Real-time camera tracking to capture precise camera movement data, allowing seamless CGI integration during post-production.
Keying Tools: Real-time keying tools (e.g., Ultimatte, Primatte) used on set to preview CGI elements against blue/green screen footage, ensuring accurate foreground and background separation.
Color Grading Tools: Tools for on-set color matching to ensure that lighting and color match the scene requirements, reducing the need for extensive adjustments in post.
On-Set Monitors and Compositing Software: Portable compositing systems that enable immediate preview of composited shots, such as using Nuke with live feeds for real-time feedback.
Concepts Covered:
1. Real-Time Tracking and Keying: Using real-time tracking and keying tools on set to
ensure the camera movement and keying are correctly captured, aiding seamless CGI
integration.
2. Color Matching On Set: Applying grading techniques on set to harmonize lighting and
colors, ensuring consistency with the final vision and reducing post-production
adjustments.
3. Camera and Lighting Data Collection: Gathering accurate camera settings, lighting setups, and environmental information for use in post-production, ensuring visual coherence.
Topic: VFX Editorial
Study in:
This study focuses on the essential techniques and workflows in VFX editorial, a critical process for
organizing, managing, and reviewing VFX shots within the context of the overall film or project. VFX
editorial involves handling shot versions, ensuring continuity, managing shot status, and
collaborating closely with the editing team and VFX supervisors. Key concepts include shot tracking,
version control, frame accuracy, and communication between departments to maintain visual and
narrative consistency.
Explored Tools:
Timeline and Editing Software: Tools like Avid Media Composer, Adobe Premiere Pro, or specialized
VFX editorial software to manage shot sequences and edit timelines with frame accuracy.
Shot Tracking Software: Applications like Shotgun or FTrack for tracking VFX shots, versions, statuses,
and feedback, providing a centralized view of the project’s progress.
Frame.io or Similar Review Platforms: For collaborative reviews, allowing teams to comment on and
approve shots, ensuring all notes and changes are easily accessible.
Versioning Tools: Systems to manage multiple versions of shots, ensuring that the correct iterations
are used in the final sequence.
Concepts Covered:
1. Timeline Management and Continuity: Arranging shots within the timeline, ensuring
frame-accurate placement, and maintaining continuity across VFX-heavy sequences.
2. Shot Tracking and Status Management: Using tracking software to monitor the status
of VFX shots, from initial plate delivery to final approval, facilitating smooth project
progression.
3. Version Control: Managing different versions of shots, ensuring that the latest
approved versions are integrated while keeping previous iterations organized and
accessible.
4. Collaboration with Editorial and VFX Teams: Working closely with editors, VFX
supervisors, and artists to align the visual and narrative aspects of the shots with the
film’s story.
5. Quality Control and Technical Checks: Ensuring shots meet quality standards,
checking for frame accuracy, color consistency, and technical specifications before
final delivery.
Topic: Industrial Compositing Techniques
Study in:
This study focuses on advanced compositing techniques widely used in large-scale VFX production
environments. Industrial compositing methods emphasize efficiency, scalability, and maintaining
high-quality output across hundreds or thousands of shots. Key concepts include automated
workflows, optimized node setups, shot templating, and consistent color pipelines to meet the
demands of feature films, television series, and commercials.
Explored Tools:
Template-Based Node Structures: Pre-built node setups for common compositing tasks, allowing for
faster and more consistent results across multiple shots.
Automated Scripting and Pipeline Integration: Using Python scripting and pipeline integration tools
(e.g., Shotgun or proprietary pipeline tools) to automate repetitive tasks and streamline shot
updates.
Color Management Systems: Implementing consistent color spaces across shots using tools like OCIO
(OpenColorIO) to ensure color accuracy throughout the compositing pipeline.
High-Dynamic-Range (HDR) and Multi-Channel EXR Handling: Working with complex EXR files
containing multiple channels and HDR data, essential for high-quality compositing in industrial
environments.
Concepts Covered:
1. Template-Based Workflow: Utilizing standardized templates for recurring tasks (e.g.,
keying, roto, depth-of-field) to maintain consistency and improve efficiency.
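A bare-bones version of this templating idea might look like the following; the template path, internal node names, and file patterns are all hypothetical pipeline conventions:

    import nuke

    TEMPLATE = '/pipeline/templates/keying_template.nk'  # hypothetical path

    def build_shot(plate_path, script_path, output_path):
        # Start clean, paste the studio template, and point it at one shot.
        nuke.scriptClear()
        nuke.nodePaste(TEMPLATE)
        # Assumed node names inside the template.
        nuke.toNode('PlateRead')['file'].setValue(plate_path)
        nuke.toNode('FinalWrite')['file'].setValue(output_path)
        nuke.scriptSaveAs(script_path)

    build_shot('/shots/sh010/plate.####.exr',
               '/shots/sh010/sh010_comp_v001.nk',
               '/shots/sh010/comp.####.exr')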
Topic: Teamwork in Compositing
Study in:
This study focuses on the collaborative aspects of compositing within a VFX team. Working effectively
as part of a compositing team requires strong communication, version control, adherence to shared
workflows, and the ability to integrate feedback smoothly. Teamwork is essential for large projects
where multiple artists work on interconnected shots, ensuring consistency and alignment with the
project’s visual goals.
Explored Tools:
Version Control and Shot Tracking: Tools like Shotgun, FTrack, or internal tracking systems for
managing shot assignments, tracking versions, and monitoring feedback.
Shared Templates and Node Graph Standards: Using standardized templates and structured node
setups to maintain consistency across shots handled by different artists.
Collaboration Platforms: Tools like Frame.io, Slack, or internal messaging systems to facilitate
feedback, quick updates, and ongoing communication.
Quality Control Checklists: Standardized checklists for quality control that ensure every shot meets
project standards before passing to the next stage.
Concepts Covered:
1. Cross-Shot Consistency: Ensuring color, lighting, and overall look are consistent across shots worked on by different artists, using reference frames and communication to maintain cohesion.
Topic: The Future of Compositing Tools
Study in:
This study focuses on emerging trends and advancements in compositing tools, which are reshaping
workflows and introducing innovative techniques to visual effects. Staying updated with new
software, tools, and features enables artists to work more efficiently and produce higher-quality
results. Key areas include AI-powered automation, real-time compositing, enhanced 3D capabilities,
and improved color grading tools, each of which has implications for the future of compositing.
Explored Tools:
AI-Based Tools: AI-powered nodes and features in tools like Foundry Nuke, Adobe After Effects, and
other platforms that automate tasks such as rotoscoping, tracking, and keying.
Real-Time Compositing Software: New compositing tools that support real-time rendering (e.g.,
Unreal Engine’s compositing features) for immediate feedback, ideal for virtual production.
Enhanced 3D and Depth Capabilities: Tools that integrate 3D compositing more seamlessly, like
Blender’s compositor, Nuke’s deep compositing, and new depth-based workflows.
Advanced Color Grading and HDR Support: Features supporting HDR and improved color spaces in
compositing tools for better color accuracy and dynamic range.
Concepts Covered:
1. AI and Machine Learning: Exploring AI-driven tools that automate repetitive tasks,
improving efficiency and reducing manual labor for tasks like rotoscoping and object
removal.
2. HDR and Color Space Management: Leveraging enhanced color grading tools and HDR capabilities for better visual fidelity and color accuracy across various display formats.
Topic: Building a Compositing Demo Reel
Study in:
This study focuses on creating a compelling compositing demo reel that showcases a range of skills,
techniques, and expertise in visual effects. A well-curated demo reel highlights the artist’s strengths,
style, and versatility, making it an essential tool for landing roles in the VFX industry. Key elements
include selecting impactful shots, organizing them cohesively, and emphasizing high-quality, complex
compositing work that best represents the artist’s abilities.
Explored Tools:
Editing Software: Tools like Adobe Premiere Pro or DaVinci Resolve for compiling and editing shots
into a polished reel.
VFX Breakdown Creation: Creating breakdowns using Nuke or After Effects to showcase the steps
involved in complex shots.
Color Grading and Finishing Tools: Applying final color adjustments and polishing shots to ensure
they look consistent and professional.
Title and Text Design Tools: Incorporating introductory slides, text overlays, and contact information
using After Effects or similar tools.
Concepts Covered:
1. Shot Selection and Sequencing: Choosing the best shots that showcase different
compositing skills, ensuring each shot has a unique impact and flows logically within
the reel.
2. Highlighting Key Skills: Focusing on shots that demonstrate advanced skills like 3D compositing, keying, relighting, and effects integration to make a strong impression.
3. Polished Editing and Transitions: Using smooth transitions and professional editing to create a cohesive narrative, ensuring the reel maintains viewer engagement.
4. Personal Branding and Contact Information: Including brief titles, name overlays, and contact information for easy reference by potential employers or clients.