
Chapter 5

Fundamental concepts in
video

1
Outline
 Types of Video Signals
 Types of Video Display
 Analog Video
 Digital Video
 Different TV Standards

2
Introduction
 This chapter introduces the principal notions needed to understand video.
 Digital video compression will be explored later.

3
Introduction
 Since video is created from a variety of sources, we begin
with the signals themselves.

4
Video is the technology of electronically capturing, recording, processing, storing,
transmitting, and reconstructing a sequence of still images representing scenes in
motion.

5
Fundamental concepts in Video
 Video is a series of images. When this series of images is displayed on screen at a fast speed (e.g., 30 images per second), we perceive motion.
 It projects single images at a fast rate producing the illusion of
continuous motion.
 These single images are called frames.
 The rate at which the frames are projected is generally between
24 and 30 frames per second (fps).
 Each screenful of video is made up of thousands of pixels.
 A pixel is the smallest unit of an image. A pixel can display only one color at a time.
 Your television has 720 columns of pixels (counted left to right) and 486 rows of pixels (counted top to bottom).
 That gives a total of 349,920 pixels (720 x 486) for a single frame; the sketch below reproduces this arithmetic.
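The following is a minimal sketch, not part of the original slides, that reproduces this pixel count and, assuming 24 bits per pixel at 30 fps, estimates the raw data rate of an uncompressed frame sequence.

```python
# Illustrative only: frame dimensions from the slide, bit depth and frame
# rate are assumptions for the sake of the calculation.

WIDTH = 720          # pixels per scan line (left to right)
HEIGHT = 486         # scan lines (top to bottom)
FPS = 30             # assumed frames per second
BITS_PER_PIXEL = 24  # assumed 8 bits each for R, G, and B

pixels_per_frame = WIDTH * HEIGHT                     # 349,920 pixels
bits_per_second = pixels_per_frame * BITS_PER_PIXEL * FPS

print(f"{pixels_per_frame:,} pixels per frame")            # 349,920
print(f"{bits_per_second / 1e6:.1f} Mbit/s uncompressed")  # ~251.9 Mbit/s
```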

6
Basic Concepts (Video Representation)
◦ The human eye views video
 immanent properties of the eye determine essential conditions related
to video systems.
◦ Video signal representation consists of 3 aspects:
 Visual Representation
 objective is to offer the viewer a sense of presence in the scene and of
participation in the events portrayed.
 Transmission
 Video signals are transmitted to the receiver through a single television
channel
 Digitalization
 analog to digital conversion, sampling of gray(color) level, quantization.

7
Aspect Ratio
Aspect ratio describes the dimensions of video screens and video picture
elements.

All popular video formats are rectilinear, and so can be described by a ratio
between width and height.

The screen aspect ratio of a traditional television screen is 4:3. High definition
televisions use an aspect ratio of 16:9.
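As a small illustration, added here rather than taken from the slides, the height implied by a given width and aspect ratio can be computed directly:

```python
# Hypothetical helper: width and ratio values below are just examples.

def height_for(width, ratio_w, ratio_h):
    """Return the height that keeps width:height equal to ratio_w:ratio_h."""
    return width * ratio_h / ratio_w

print(height_for(640, 4, 3))    # 480.0  -> 640x480 is a 4:3 picture
print(height_for(1920, 16, 9))  # 1080.0 -> 1920x1080 is a 16:9 picture
```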

8
Chrominance
Chrominance (chroma for short) is the signal used in video systems to convey the color information of the picture, separately from the accompanying luma signal.

Chrominance is usually represented as two color-difference components: U = B' - Y' (blue - luma) and V = R' - Y' (red - luma). Each of these difference components may have scale factors and offsets applied to them, as specified by the applicable video standard.

Luma represents the brightness in an image.
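A minimal sketch of this color-difference idea follows, assuming the widely used Rec. 601 luma weights (0.299, 0.587, 0.114); real video standards apply further scale factors and offsets to U and V, which are omitted here.

```python
# Illustrative conversion from gamma-corrected RGB to Y'UV-style components.

def rgb_to_yuv(r, g, b):
    """r, g, b are gamma-corrected (primed) values in the range 0..1."""
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luma: brightness of the pixel
    u = b - y                              # blue color difference (B' - Y')
    v = r - y                              # red color difference  (R' - Y')
    return y, u, v

print(rgb_to_yuv(1.0, 1.0, 1.0))  # pure white: y = 1.0, u = 0.0, v = 0.0
```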

9
Types of Video Signals
 Video signals can be organized in three different ways: component
video, composite video, and S-video.
 Component video
 In popular use, it refers to a type of analog video information that is
transmitted or stored as three separate signals for the red, green, and
blue image planes. Each color channel is sent as a separate video
signal.
 This kind of system uses three wires (and connectors) to connect the camera or other devices to a TV or monitor.
 Most computer systems use Component Video, with separate signals for R, G, and B.
 For any color separation scheme, Component Video gives the best color
reproduction since there is no “crosstalk” between the three channels.
 This is not the case for S-Video or Composite Video, discussed next. Component
video, however, requires more bandwidth and good synchronization of the three
components.

10
Component video

11
Composite Video — 1 Signal
 Composite video: color (“chrominance”) and intensity
(“luminance”) signals are mixed into a single carrier
wave.
 This type of signal is used by broadcast color TVs; it is downward compatible with black-and-white TV.
 When connecting to TVs, Composite Video uses only one wire
and video color signals are mixed, not sent separately. The
audio and sync signals are additions to this one signal.
 Since color and intensity are wrapped into the same signal, some interference between the luminance and chrominance signals is inevitable.

12
S-Video (separate video) — 2 Signals
 S-video, as a compromise, uses two wires: one for luminance and another for a composite chrominance signal.
 As a result, there is less crosstalk between the color
information and the crucial gray-scale information.
 The reason for placing luminance into its own part of the signal
is that black-and-white information is most crucial for visual
perception.
 In fact, humans are able to differentiate spatial resolution in grayscale
images with a much higher acuity than for the color part of color
images.
 As a result, we can send less accurate color information than must be
sent for intensity information — we can only see fairly large blobs of
color, so it makes sense to send less color detail.

13
Types of Video
 There are two types of video:
 Analog video is represented as a continuous (time-
varying) signal.
 Digital video is represented as a sequence of digital
images.

14
Analog Video
 Most TV is still sent and received as an analog signal.
 An analog signal f(t) samples a time-varying image. So-
called “progressive” scanning traces through a complete
picture (a frame) row-wise for each time interval.
 A high resolution computer monitor typically uses a time
interval of 1/72 second.
 In TV, and in some monitors and multimedia standards as
well, another system, called “interlaced” scanning is used:
 The odd-numbered lines are traced first, and then the even-
numbered lines are traced. This results in “odd” and “even”
fields —two fields make up one frame.
 In fact, the odd lines (starting from 1) end up at the middle of a line at
the end of the odd field, and the even scan starts at a half-way point.

15
Analog Video

 Figure 5.1 shows the scheme used. First the solid (odd) lines are traced, P to Q,
then R to S, etc., ending at T; then the even field starts at U and ends at V.
 The jump from Q to R, etc. in Figure 5.1 is called the horizontal retrace, during
which the electronic beam in the CRT is blanked. The jump from T to U or V to
P is called the vertical retrace.
 The scan lines are not horizontal because a small voltage is applied, moving the electron beam down over time.
16
Analog Video
 Interlacing was invented because, when standards were being defined, it was difficult to transmit the amount of information in a full frame quickly enough to avoid flicker; doubling the number of fields presented to the eye reduces the perceived flicker.
 Because of interlacing, the odd and even lines are
displaced in time from each other —generally not
noticeable except when very fast action is taking place on
screen, when blurring may occur.
 Since it is sometimes necessary to change the frame rate,
resize, or even produce stills from an interlaced source
video, various schemes are used to “de-interlace” it.

17
Analog Video
a) The simplest de-interlacing method consists of discarding one field and duplicating the scan lines of the other field (see the sketch after this list). The information in one field is lost completely using this simple technique.
b) Other more complicated methods that retain information
from both fields are also possible.
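Below is a minimal sketch of method (a), assuming a frame is represented simply as a list of scan lines; it is illustrative only, not a production de-interlacer.

```python
# Keep only the odd field (lines 0, 2, 4, ...) and duplicate each of its scan
# lines to fill in where the even field used to be.

def deinterlace_discard(frame_lines):
    """Rebuild a full frame from the odd field only."""
    odd_field = frame_lines[0::2]          # keep every second line
    rebuilt = []
    for line in odd_field:
        rebuilt.append(line)               # original odd line
        rebuilt.append(line)               # duplicated to replace the even line
    return rebuilt[:len(frame_lines)]      # trim in case of an odd line count

frame = ["line0", "line1", "line2", "line3"]
print(deinterlace_discard(frame))          # ['line0', 'line0', 'line2', 'line2']
```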

18
Digital video
 The advantages of digital representation for video are many. For example:
 Video can be stored on digital devices or in memory,
ready to be processed (noise removal, cut and paste, etc.),
and integrated to various multimedia applications;
 Direct access is possible, which makes nonlinear video
editing achievable as a simple, rather than a complex, task;
 Repeated recording does not degrade image quality.
 Ease of encryption and better tolerance to channel noise

19
The disadvantages of digital video are:
 Analog-type distortions, as well as unique digital distortions related to sampling and quantizing, result in a variety of visible impairments.
 Wide bandwidth requirements for recording,
distribution and transmission necessitate sophisticated
bit-rate reduction and compression schemes to achieve
manageable bandwidths.
 Unlike analog signals, digital signals do not degrade gracefully; they are subject to a cliff effect.

20
Types of Video Display
 There are two ways of displaying video on screen:
 Interlaced scanning
 Progressive scanning

21
Interlaced Scanning
 Interlaced scanning writes every second line of the picture
during a scan, and writes the other half during the next
sweep.
 This way, only 25/30 pictures per second are needed.
 This idea of splitting the image into two parts became known as interlacing, and the split-up pictures as fields.
 Viewed graphically, a field is basically a picture with every second line left blank.

22
Interlaced Scanning
 During the first scan the upper field is
written on screen.
 The first, third, fifth, etc. lines are written, and after writing each line the electron beam moves back to the left before writing the next line.
 At this point the picture exhibits a "combing" effect; it looks as if you are watching it through a comb.
 When people refer to interlacing artifacts, or say that their picture is interlaced, this is what they commonly mean.
 Once all the odd lines have been written the
electron beam travels back to the upper left
of the screen and starts writing the even
lines.
 Because the phosphor keeps emitting light for a short while, and because the human visual system is too slow to separate the fields, what we see is a combination of both fields, in other words the original picture.

23
Progressive Scanning
 PC CRT displays are fundamentally different from TV
screens.
 The monitor writes a whole picture per scan.
 All the lines on the screen are updated in each pass, typically 60 times every second.
 This is known as progressive scanning.
 Today all PC screens write a picture like this.

24
Progressive Scanning

25
Comparison between Computer and TV Displays

 Scan lines: a computer scans 480 horizontal lines from top to bottom; a television scans 525 or 625 horizontal lines.
 Scanning method: a computer scans each line progressively; a television scans lines using the interlacing system.
 Refresh rate: a computer scans the full frame at typically 66.67 Hz or higher; a television scans at 25-30 Hz for a full frame.
 Color: a computer uses the RGB color model; a television uses a limited color palette and restricted luminance (lightness or darkness).

26
Chroma Subsampling
 Chroma subsampling is the practice of encoding images by using less resolution for chroma information than for luma information. It is used in many video encoding schemes, both analog and digital.
 Because of storage and transmission limitations, there is always a desire to reduce (or compress) the signal. Since the human visual system is much more sensitive to variations in brightness than in color, a video system can be optimized by devoting more bandwidth to the luma component (usually denoted Y') than to the color-difference components Cb and Cr.
 The signal is divided into a luma (Y') component and two color-difference components (chroma).
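As an illustrative sketch added here (not from the slides), 4:2:0-style subsampling keeps the luma plane at full resolution while reducing each chroma plane to one sample per 2x2 block of pixels, here by simple averaging.

```python
# Illustrative chroma subsampling; a chroma plane is a list of rows of samples.

def subsample_420(plane):
    """Return a half-size plane: one averaged sample per 2x2 block."""
    out = []
    for y in range(0, len(plane) - 1, 2):
        row = []
        for x in range(0, len(plane[y]) - 1, 2):
            block = (plane[y][x] + plane[y][x + 1] +
                     plane[y + 1][x] + plane[y + 1][x + 1])
            row.append(block / 4.0)        # one chroma sample per 2x2 block
        out.append(row)
    return out

cb = [[10, 12, 20, 22],
      [14, 16, 24, 26]]
print(subsample_420(cb))                   # [[13.0, 23.0]]
```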
27
Chroma Subsampling

28
CCIR Standards for Digital Video
 CCIR is the Consultative Committee for International Radio.
 One of the most important standards it has produced is CCIR-601, for component digital video.

 Table 5.3 shows some of the digital video specifications, all with an aspect ratio of 4:3. The CCIR 601 standard uses an interlaced scan, so each field has only half as much vertical resolution as the full frame.

29
CCIR Standards for Digital Video

30
CCIR Standards for Digital Video
 CIF stands for Common Intermediate Format specified by
the CCITT (International Telegraph and Telephone
Consultative Committee).
 The idea of CIF is to specify a format for lower bit rates.
 QCIF stands for “Quarter-CIF”.
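As a short illustration, assuming the commonly quoted CIF luma resolution of 352 x 288, QCIF halves both dimensions and therefore carries a quarter of the pixels:

```python
# CIF luma resolution assumed below; QCIF is half the size in each dimension.

CIF_WIDTH, CIF_HEIGHT = 352, 288
QCIF_WIDTH, QCIF_HEIGHT = CIF_WIDTH // 2, CIF_HEIGHT // 2   # 176 x 144

print(CIF_WIDTH * CIF_HEIGHT)    # 101,376 luma pixels per CIF frame
print(QCIF_WIDTH * QCIF_HEIGHT)  # 25,344  -> one quarter of CIF
```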

31
High definition TV (HDTV)
 refers to video having resolution substantially higher than
traditional television systems. HD has one or two million
pixels per frame.
 The first generation of HDTV was based on an analog
technology developed by Sony in Japan in the late 1970s.
 Modern plasma televisions use this.
 It consists of 720 to 1080 lines and a higher number of pixels per line (as many as 1920).
 Having a choice between progressive and interlaced scanning is one advantage of HDTV.

32
Video Broadcasting Standards / TV Standards
 There are three different video broadcasting standards:
PAL, NTSC, and SECAM

33
NTSC Video
 NTSC (National Television System Committee) TV standard is mostly used
in North America and Japan. It uses the familiar 4:3 aspect ratio (i.e., the
ratio of picture width to its height) and uses 525 scan lines per frame at
30 frames per second (fps).
 The problem is that NTSC is an analog system. In computer video, colors
and brightness are represented by numbers (digital). But with analog
television, everything is just voltages, and voltages are affected by wire
length, connectors, heat, cold, video tape, and so on.
a) NTSC follows the interlaced scanning system, and each frame is divided into
two fields, with 262.5 lines/field.
b) Thus the horizontal sweep frequency is 525 × 29.97 ≈ 15,734 lines/sec.
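The arithmetic in (b) can be reproduced directly; NTSC's exact frame rate of 30000/1001 (about 29.97 fps) is assumed in this small check.

```python
# Quick check of the horizontal sweep (line) frequency quoted above.

LINES_PER_FRAME = 525
FRAMES_PER_SECOND = 30000 / 1001        # ~29.97 fps

line_rate = LINES_PER_FRAME * FRAMES_PER_SECOND
print(f"{line_rate:.0f} lines/sec")     # ~15,734 lines/sec
```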

34
PAL Video
 PAL (Phase Alternating Line) is a TV standard widely
used in Western Europe, China, India, and many other
parts of the world.
 PAL uses 625 scan lines per frame, at 25 frames/second (40 ms/frame), with a 4:3 aspect ratio and interlaced fields.
 The broadcast TV signal uses composite video.

35
SECAM (Sequential Color with
Memory)
 SECAM uses the same bandwidth as PAL but transmits
the color information sequentially.
 SECAM is very similar to PAL.
 It specifies the same number of scan lines and frames per
second.
 It is the broadcast standard for France, Russia, and parts of
Africa and Eastern Europe.

36
Fig.: Television standards used in different countries

37
HDTV vs Existing Signals (NTSC, PAL, or SECAM)
 The HDTV signal is digital, resulting in crystal-clear, noise-free pictures and CD-quality sound.
 It has many viewer benefits, such as the choice between interlaced and progressive scanning.

38
Four Factors of Digital Video
 With digital video, four factors have to be kept in mind. These are:
 Frame rate
 Spatial Resolution
 Color Resolution
 Image Quality

39
Frame Rate
 The standard for displaying any type of non-film video is
30 frames per second (film is 24 frames per second).
 This means that the video is made up of 30 (or 24)
pictures or frames for every second of video.
 Additionally, these frames are split in half (odd lines and even lines) to form what are called fields.

40
Spatial Resolution
 The second factor is spatial resolution, or in other words, "How big is the picture?". Since PC and Macintosh computers generally have resolutions in excess of 640 by 480, most people assume that this resolution is the video standard.
 A standard analogue video signal displays a full, over scanned image
without the borders common to computer screens.
 The National Television System Committee (NTSC) standard, used in North America and in Japanese television, uses a 768 by 484 display.
 The Phase Alternating Line (PAL) standard for European television is slightly larger, at 768 by 576.
 Most countries endorse one or the other, but never both.
 Since the resolutions of analogue video and computers are different, conversion of analogue video to digital video must at times take this into account.
 This can often result in the down-sizing of the video and the loss of some resolution.

41
Color Resolution
 This third factor is a bit more complex.
 Color resolution refers to the number of colors displayed
on the screen at one time.
 Computers deal with color in an RGB (red-green-blue) format, while video uses a variety of formats.
 One of the most common video formats is called YUV.
 Although there is no direct correlation between RGB and YUV, they are similar in that they both have varying levels of color depth (maximum number of colours).

42
Image Quality
 The last, and most important, factor is video quality.
 The final objective is video that looks acceptable for your
application.
 For some this may be 1/4 screen, 15 frames per second
(fps), at 8 bits per pixel.
 Others require full-screen (768 by 484), full-frame-rate video at 24 bits per pixel (16.7 million colours).
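As a rough, illustrative calculation (the quarter-screen dimensions below are an assumption), these two quality targets imply very different raw, uncompressed data rates:

```python
# Illustrative only; no compression is assumed.

def raw_rate_mbits(width, height, fps, bits_per_pixel):
    """Uncompressed video data rate in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6

# Quarter screen (assumed 384 x 242), 15 fps, 8 bits per pixel
print(raw_rate_mbits(384, 242, 15, 8))    # ~11.2 Mbit/s
# Full screen (768 x 484), 30 fps, 24 bits per pixel
print(raw_rate_mbits(768, 484, 30, 24))   # ~267.6 Mbit/s
```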

43
