Prior to digital tracking, in both electronic and optical effects, the camera was locked off for most effects shots. Without motion control of the capturing cameras, it was simply impossible to align shots in post-production. While hand tracking has been attempted, the eye is so well adjusted to noticing movement and float that sub-pixel accuracy is often required to sell a shot and fool an audience.
Based on differences in color and brightness, the computer tries to track certain points in the video footage by comparing each frame with the previous frame to determine the new position of the selected point. In a typical example, the tracking marker sticks to a small piece of paper on the street even though the camera is moving. This means the computer needs a group of pixels that remains mostly the same over the whole video. If this group of pixels is altered too much, by a sudden bright light source for instance, the computer can no longer track it.
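To make this frame-to-frame comparison concrete, here is a minimal sketch that follows one point with OpenCV's pyramidal Lucas-Kanade tracker. This is one way such a tracker can work, not necessarily what any particular package does; the clip name, starting point and window settings are placeholders.

import cv2
import numpy as np

# Hypothetical input clip and starting point; replace with your own footage.
cap = cv2.VideoCapture("street_plate.mov")
ok, prev_frame = cap.read()
prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)

# The 2D point we want to follow (e.g. the piece of paper on the street).
point = np.array([[[420.0, 310.0]]], dtype=np.float32)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Compare this frame with the previous one and estimate where the
    # pixel pattern around the point has moved to (sub-pixel accuracy).
    new_point, status, err = cv2.calcOpticalFlowPyrLK(
        prev_gray, gray, point, None, winSize=(21, 21), maxLevel=3)

    if status[0][0] == 0:
        # The pattern changed too much (e.g. a sudden bright light)
        # and the tracker lost the point.
        print("Track lost")
        break

    print("Tracked point:", new_point.ravel())
    prev_gray, point = gray, new_point

cap.release()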
2D TRACKING
Compositing programs like After Effects, Nuke, Blackmagic Fusion and Shake have two-dimensional motion tracking capabilities. This feature translates images in two-dimensional space and can add effects such as motion blur in an attempt to eliminate relative motion between two features or two moving images. This technique is sufficient to create believability when the two images do not include major changes in camera perspective.
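As a rough illustration (not taken from any particular compositing package), the sketch below uses the per-frame positions of one tracked feature to compute the 2D offsets that keep an inserted element locked to the plate; the track values are made up for the example.

import numpy as np

# Hypothetical per-frame positions of one tracked feature in the plate,
# e.g. exported from a 2D tracker (frame -> (x, y) in pixels).
track = np.array([
    [420.0, 310.0],
    [422.5, 309.1],
    [425.2, 308.0],
])

# Position of the same feature on the frame where the element was lined up.
reference = track[0]

# Per-frame 2D offsets to apply to the inserted element so it stays
# locked to the feature; with no perspective change this is enough.
offsets = track - reference
for frame_index, (dx, dy) in enumerate(offsets):
    print(f"frame {frame_index}: translate element by ({dx:+.2f}, {dy:+.2f}) px")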
 3D TRACKING
Three Dimensional tracking /match moving tools make it possible to fetch three dimensional
information from two dimensional photography. There are many Applications capable of 3D
match moving like Autodesk Match mover, PF-Track, Boujou, SynthEyes, 3D Equalizer , Nuke
Camera Tracker, Voodoo Etc.
These Applications allow user to extract camera movement and other relative motion from
arbitrary footage. The tracking information can be transferred to other 3D software like Maya,
3Ds Max, Modo, Cinema 4d to animate virtual cameras and CGI objects.
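Under the hood, camera extraction is built on multi-view geometry. The heavily simplified sketch below recovers the relative camera rotation and translation direction between two frames from matched track positions, assuming the focal length is known; real matchmove solvers refine this over many frames and also solve for the lens.

import cv2
import numpy as np

def relative_camera_motion(pts_a, pts_b, focal_px, center):
    """Recover the rotation and translation direction of the camera
    between two frames from matched 2D track positions.

    pts_a, pts_b : (N, 2) float arrays of the same features in frame A/B.
    focal_px     : focal length in pixels (assumed known or estimated).
    center       : (cx, cy) principal point, usually the image center.
    """
    K = np.array([[focal_px, 0, center[0]],
                  [0, focal_px, center[1]],
                  [0, 0, 1]], dtype=np.float64)

    # Essential matrix from the 2D correspondences, with RANSAC to
    # reject badly tracked points.
    E, inliers = cv2.findEssentialMat(pts_a, pts_b, K,
                                      method=cv2.RANSAC, threshold=1.0)

    # Decompose into camera rotation R and translation direction t.
    _, R, t, _ = cv2.recoverPose(E, pts_a, pts_b, K, mask=inliers)
    return R, t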
A typical matchmove shot is evaluated in stages (a sketch of how such a breakdown might be recorded follows this list):
Source/plate [any film sequence or live-action shoot]
Evaluation of footage [camera movement, tracking markers, plate formats, placement of objects or characters]
Additional information [camera reports, survey data, set measurements]
Define the camera [techniques and matchmoving software]
3D [3D set, proxy objects for tests]
Testing [basic test for alignment / checker test]
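One way to keep this evaluation organized is a simple per-shot record; the field names below are purely illustrative and not from any particular pipeline.

from dataclasses import dataclass, field

@dataclass
class ShotBreakdown:
    # Source/plate: the live-action sequence to be tracked.
    plate_path: str
    # Evaluation of footage.
    camera_move: str            # e.g. "dolly", "pan", "handheld"
    has_tracking_markers: bool
    plate_format: str           # e.g. "2048x1556 EXR"
    # Additional information gathered on set.
    camera_report: dict = field(default_factory=dict)   # focal length, stock, etc.
    survey_data: dict = field(default_factory=dict)     # measured set points
    # Testing notes: alignment / checker tests performed.
    tests: list = field(default_factory=list)

shot = ShotBreakdown(
    plate_path="plates/sh010_plate.%04d.exr",
    camera_move="dolly",
    has_tracking_markers=True,
    plate_format="2048x1556 EXR",
)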
Auto Tracking
The software analyses the footage and creates the best tracking points it can for us; these are also called soft tracks.
Manual Tracking
We create the tracking points by hand; these are considered the best tracking points and are also called hard tracks.
Track colors indicate quality: green means good tracks, while red means poor or failed tracks.
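Trackers usually derive such a rating from the reprojection error of each point after the solve; the thresholds in this sketch are invented for illustration and vary between packages.

def track_quality(reprojection_error_px):
    """Classify a solved track from its average reprojection error in
    pixels. Thresholds are illustrative, not from any specific tracker."""
    if reprojection_error_px < 0.5:
        return "good"      # usually drawn green
    if reprojection_error_px < 2.0:
        return "poor"      # usually drawn red; review it
    return "failed"        # delete or re-track

for name, err in [("track_01", 0.3), ("track_02", 1.4), ("track_03", 5.0)]:
    print(name, track_quality(err))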
First, the software auto-tracks for us, or we track points manually.
Second, we solve the camera, which means converting the 2D points into 3D points and creating a 3D world.
Third, we orient the scene: we define our 3D world, which direction is the ground and which is up, and set up the coordinate system (x, y and z axes).
Fourth, we place a test object to check the quality of the track.
Fifth, we export the scene to 3D software, where we fit the set to the shot or put in proxy geometry for testing; we apply a checker texture, render it out and do the final test in the compositing software by putting the checker layer in overlay mode over the master plate.
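The heart of step two is triangulation: once two camera poses are known, each 2D track can be turned into a 3D point. A simplified sketch (the function and its inputs are illustrative, not any specific package's API):

import cv2
import numpy as np

def triangulate_track(K, R1, t1, R2, t2, pt1, pt2):
    """Turn one 2D track seen from two solved camera positions into a
    3D point. K is the camera matrix, (R, t) the pose of each camera,
    pt1/pt2 the 2D track positions in pixels."""
    P1 = K @ np.hstack([R1, np.asarray(t1).reshape(3, 1)])  # 3x4 projection, camera 1
    P2 = K @ np.hstack([R2, np.asarray(t2).reshape(3, 1)])  # 3x4 projection, camera 2
    X = cv2.triangulatePoints(P1, P2,
                              np.asarray(pt1, dtype=np.float64).reshape(2, 1),
                              np.asarray(pt2, dtype=np.float64).reshape(2, 1))
    return (X[:3] / X[3]).ravel()                            # homogeneous -> 3D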
Dolly Camera
The dolly gets its name from filmmaking, where a camera mounted on a wheeled tripod is moved towards or away from the scene.
Panning Camera
In cinematography, panning refers to the horizontal rotation of a still or video camera. Panning a camera results in a motion similar to someone shaking their head "no". When the rotation is around the lens's nodal point it is also called a nodal pan shot.
Tilting Camera
Tilting is a cinematographic technique in which the camera is stationary and rotates in a vertical plane (the tilting plane). Tilting the camera produces a motion similar to someone nodding their head "yes".
Tracking Camera
A tracking camera is one that is moved in the horizontal and vertical plane only.
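These moves matter to the solver because a nodal pan or tilt produces no parallax, while a dolly or tracking move does. As a rough illustration, they can be written as simple camera rotations and translations (my own sketch, with assumed axis conventions):

import numpy as np

def pan(yaw_deg):
    """Pan: rotation about the vertical (y) axis, like shaking the head 'no'."""
    a = np.radians(yaw_deg)
    return np.array([[np.cos(a), 0, np.sin(a)],
                     [0, 1, 0],
                     [-np.sin(a), 0, np.cos(a)]])

def tilt(pitch_deg):
    """Tilt: rotation about the horizontal (x) axis, like nodding 'yes'."""
    a = np.radians(pitch_deg)
    return np.array([[1, 0, 0],
                     [0, np.cos(a), -np.sin(a)],
                     [0, np.sin(a), np.cos(a)]])

def dolly(distance):
    """Dolly: translation of the camera along its viewing (z) axis."""
    return np.array([0.0, 0.0, distance])

def track_move(dx, dy):
    """Tracking move: translation in the horizontal/vertical plane only."""
    return np.array([dx, dy, 0.0])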
Matchmoving software's primary concern is the camera. After all, it has to take an image sequence and provide a 3D camera that accurately copies the camera that filmed the sequence. The software must know not only where the camera was positioned and which way it was facing, but also what focal length and format were used.
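In practice, the solve has to deliver exactly this information for every frame; the record below is one illustrative way to hold it (field names are invented).

from dataclasses import dataclass

@dataclass
class SolvedCameraFrame:
    frame: int
    position: tuple         # (x, y, z) of the camera in the 3D world
    rotation: tuple         # (rx, ry, rz) Euler angles: which way it faces
    focal_length_mm: float  # lens focal length for this frame
    filmback_mm: tuple      # sensor/film format, e.g. (36.0, 24.0)

cam = SolvedCameraFrame(frame=1001,
                        position=(0.0, 1.7, 5.0),
                        rotation=(0.0, -12.0, 0.0),
                        focal_length_mm=35.0,
                        filmback_mm=(36.0, 24.0))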
The focal length is the distance between the optical center of the lens and the film or sensor. Focal length depends upon the camera's lens.
A shorter focal length gives a more pronounced perspective and a wide angle of view, which means more of the scene is visible in the image.
A longer focal length gives a flatter look and a narrow angle of view, so less of the scene is visible.
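The link between focal length and angle of view can be made exact: for a sensor of width w and focal length f, the horizontal angle of view is 2 * atan(w / (2 * f)). A quick check:

import math

def horizontal_fov_deg(sensor_width_mm, focal_length_mm):
    """Horizontal angle of view for a given sensor width and focal length."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Full-frame sensor (36 mm wide): shorter focal length -> wider view.
for f in (18, 35, 50, 100, 200):
    print(f"{f:>3} mm lens -> {horizontal_fov_deg(36.0, f):5.1f} degrees")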
Focal length also appears to determine the perspective of an image, but strictly speaking perspective only changes with one's location relative to the subject. A wide-angle lens exaggerates or stretches perspective, whereas a telephoto lens compresses or flattens it.
With a longer lens, less of the background is visible and background objects appear to be closer. In the classic comparison photographs, the distance between the camera and the subject is changed so that the subject always appears the same size. Note the difference in the balance between the portrait subject and the building in the background: the wider the lens, the greater the apparent distance to the building.
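Simple geometry confirms this: the apparent size of an object is roughly proportional to focal length divided by distance. In the illustrative calculation below, the distances are invented so that the subject stays the same size while the lens changes:

# Keep the subject the same size while switching lenses by moving back,
# and watch how large the background building appears.
# apparent size on the sensor ~ focal_length * object_size / distance
building_height = 30.0   # metres (illustrative)
subject_height = 1.8     # metres

for focal_mm, subject_dist in [(24, 1.5), (85, 5.3125)]:
    # Subject distance is chosen so focal / subject_dist is constant,
    # i.e. the subject stays the same size in frame.
    building_dist = subject_dist + 50.0   # building 50 m behind the subject
    subject_size = focal_mm * subject_height / subject_dist
    building_size = focal_mm * building_height / building_dist
    print(f"{focal_mm} mm: subject {subject_size:.1f}, building {building_size:.1f} (arbitrary units)")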
Wide-angle lenses (short focal length) capture more because they have a wider picture angle, while telephoto lenses (long focal length) have a narrower picture angle. On a full-frame camera, typical focal lengths run from ultra-wide (under about 24 mm) and wide angle (24-35 mm) through "normal" (around 50 mm) to telephoto (85 mm and longer).
In geometric optics and cathode ray tube (CRT) displays, distortion is a deviation from rectilinear projection, the projection in which straight lines in the scene remain straight in the image. Lens distortion is a form of optical aberration. Radial distortion can usually be classified as one of two types.
When straight lines are curved inwards in the shape of a barrel, the aberration is called barrel distortion. Commonly seen on wide-angle and fisheye lenses, barrel distortion happens because the field of view of the lens is much wider than the size of the image sensor and hence needs to be "squeezed" to fit. As a result, straight lines are visibly curved inwards, especially towards the extreme edges of the frame.
Pincushion distortion is the exact opposite of barrel distortion: straight lines are curved outwards from the center. This type of distortion is commonly seen on telephoto lenses, and it occurs because image magnification increases towards the edges of the frame, away from the optical axis. This time, the field of view is smaller than the size of the image sensor and thus needs to be "stretched" to fit. As a result, straight lines appear to be pulled outwards towards the corners of the frame.
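Both types are usually described by the same radial polynomial model, in which a point at radius r from the optical center is scaled by (1 + k1*r^2 + k2*r^4); with the common sign convention, negative k1 compresses points towards the center (barrel-like) and positive k1 stretches them outwards (pincushion-like). A small sketch with invented coefficients:

import numpy as np

def apply_radial_distortion(points, k1, k2=0.0):
    """Apply a simple radial distortion model to normalised image points
    (origin at the optical center)."""
    points = np.asarray(points, dtype=np.float64)
    r2 = np.sum(points ** 2, axis=1, keepdims=True)
    factor = 1.0 + k1 * r2 + k2 * r2 ** 2
    return points * factor

# A straight vertical line near the edge of the (normalised) frame...
line = np.column_stack([np.full(5, 0.9), np.linspace(-0.6, 0.6, 5)])
print(apply_radial_distortion(line, k1=-0.2))   # compressed towards the center (barrel-like)
print(apply_radial_distortion(line, k1=+0.2))   # stretched away from the center (pincushion-like)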
Lens distortion can have a serious effect on the quality of tracking and, if left uncorrected, can make it impossible to get a usable camera track from your footage. Distortion makes objects appear to change shape depending on where they are in the frame: a cube in the center of the frame will look different from a cube at the edge of the frame, because the distortion increases as you move further from the center. This can make things very difficult for 3D tracking, because it cannot work out the 3D structure of non-rigid objects. 3D tracking software therefore has tools that let you estimate how much distortion there is in your shot and correct for it. These tools unwarp the image until features that you know should be straight line up with calibration lines that are straight. This task is made much easier if you shoot a lens grid with your camera at the time of your shoot.
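One common way to measure and remove the distortion is to photograph a checkerboard-style lens grid and fit distortion coefficients to it, for example with OpenCV's calibration tools; in this condensed sketch the grid size and file paths are placeholders.

import cv2
import numpy as np
import glob

# Lens-grid (checkerboard) images shot with the same camera and lens
# as the plate; paths are placeholders.
grid_images = glob.glob("lens_grid/*.jpg")
pattern = (9, 6)                       # inner corners of the checkerboard

# Ideal 3D positions of the corners on a flat grid (z = 0).
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in grid_images:
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Fit camera matrix K and distortion coefficients (k1, k2, p1, p2, k3).
ret, K, dist, _, _ = cv2.calibrateCamera(obj_points, img_points,
                                         gray.shape[::-1], None, None)

# Undistort a plate frame before tracking it.
frame = cv2.imread("plates/sh010_frame1001.jpg")
undistorted = cv2.undistort(frame, K, dist)
cv2.imwrite("plates/sh010_frame1001_undist.jpg", undistorted)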
Survey data is basically a set of measured properties of the scene, i.e. measurements of the set. It helps us establish the overall scale of the scene, meaning its real-world size or scale value.
After taking survey data on set, we can use it in the 3D tracking software and later export it to 3D packages such as 3ds Max and Maya. When you use the survey data in your 3D animation package, the set already fits in place; there is no need to create camera rigs or adjust the camera position. But make sure your survey data is accurate.
On big-budget movies, survey data is captured by professional surveyors who use sophisticated survey equipment to accurately plot various points on the set in 3D space.
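A single accurately measured distance is often enough to set the scale. A minimal sketch, with invented marker positions and an invented tape measurement:

import numpy as np

def scene_scale(solved_a, solved_b, measured_distance):
    """Scale factor that makes the solved (unitless) scene match a
    real-world measurement between two surveyed points."""
    solved_distance = np.linalg.norm(np.subtract(solved_a, solved_b))
    return measured_distance / solved_distance

# Two tracked markers on set, as solved by the matchmove software...
marker_left = (0.12, 0.00, 1.43)
marker_right = (1.07, 0.02, 1.40)
# ...and the tape-measured distance between them on set (metres).
scale = scene_scale(marker_left, marker_right, measured_distance=2.40)
print(f"multiply the solved scene by {scale:.3f} to work in metres")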
Hope you learn something from this. Wish you all the best.
REGARDS
DIPJOY ROUTH
