JOURNAL OF CRITICAL REVIEWS
ISSN- 2394-5125 VOL 7, ISSUE 3, 2020
REVIEW ON NIGHT VISION TECHNOLOGY IN
AUTOMOBILES
Vineet Kumar1, Abhishak Bhattacharya2
1,2Dept. of Mechanical Engineering, Sharda University, Greater Noida, Uttar Pradesh
Email Id- [email protected], [email protected]
Received: 06 December 2019 Revised and Accepted: 16 February 2020
ABSTRACT: Safety and quality of life are among the most common concerns in transportation and manufacturing.
Over time, machinery has evolved from simple aids to everyday life into increasingly powerful and complex systems.
In the automotive industry, the protection of people both inside and outside the vehicle is therefore of prime
importance, and researchers work continually to provide ever more sophisticated means of safety. After dark, the
risk of a fatal crash rises sharply even though traffic volumes fall. Nobody wants to face an accident while
traveling; human safety is paramount, and preventing accidents is therefore essential. Studies show that a large
share of road accidents worldwide are linked to poor visibility at night. Such accidents damage vehicles and
endanger lives. Several academics and R&D companies have suggested safety improvements in vehicle design. In
today's digital and smart world, a smart night vision device for cars needs timely design and implementation. This
paper discusses the working of conventional night vision technology used by manufacturers, taking the automobile
giant BMW as an example.
KEYWORDS: Accidents, Automobile, Design, Manufacturing, Night vision, R&D, Traffic.
I. INTRODUCTION
Today's streets have become a nightmare for road users, with uncontrolled vehicles running at illegal speeds, and
the problem grows worse at night, especially because of drunken drivers on the roads. The dangerous accidents that
occur at night are mainly due to poor visibility, which adds a further challenge: drivers find it hard to
anticipate the road ahead and react in time, and this affects not only drunken drivers but alert drivers as well.
Driving relies heavily on the driver's visual senses, since the correct reaction depends on visual information
processing. Even a small inaccuracy in analyzing objects on the road may lead to serious consequences. Statistics
show that in the U.S. alone, more than 20% of serious accidents take place between midnight and 6 a.m. For example,
dipped headlamps illuminate only about 56 meters, whereas the braking distance at 100 km/h is around 80 meters.
Hence the need for a night vision system, which uses headlights or infrared sensors/cameras to provide a clear view
of the approaching road, pedestrians, bends, poles and other vehicles, and then informs the driver of forthcoming
obstacles through acoustic, visual or other cues. These systems were first installed in high-end cars such as BMW
and Mercedes-Benz [1].
Electromagnetic Spectrum
It is necessary to learn something about light and the electromagnetic spectrum before using night vision systems.
The human eye can perceive only the rays that fall within the visible region of the electromagnetic spectrum; the
infrared and ultraviolet regions are invisible to it. Night vision technology, however, enables people to "see"
radiation falling in the infrared region of the spectrum. Night vision devices used in cars typically capture an
infrared image of distant obstacles on the road, since every object emits infrared (heat) rays even at night. [2]
Most high-end cars today come with a standard high-beam headlight unit as a basic 'night vision system'. Although
there is scope for improvement, its performance is barely acceptable, and it has an inherent limitation: high beams
cannot be used when there is oncoming traffic. A night vision system that genuinely improves visibility is
therefore needed. The short detection distance for dark objects under low-beam conditions defines the
detection-distance deficiency that a night vision system must overcome. At normal legal speeds, a driver should be
able to detect an obstacle, react and stop in time.
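To make the detection-distance argument concrete, the following minimal sketch compares a total stopping distance with the dipped-headlamp range quoted above. The reaction time and deceleration values are illustrative assumptions, not figures from this paper.

```python
# Minimal sketch of the detection-distance deficit described above.
# Reaction time (1.5 s) and deceleration (6 m/s^2) are illustrative assumptions.

speed_kmh = 100.0
low_beam_range_m = 56.0          # dipped-headlamp illumination quoted in the text

v = speed_kmh / 3.6              # speed in m/s
reaction_time_s = 1.5            # assumed driver reaction time
deceleration = 6.0               # assumed braking deceleration, m/s^2

reaction_dist = v * reaction_time_s
braking_dist = v ** 2 / (2 * deceleration)
stopping_dist = reaction_dist + braking_dist

print(f"Reaction distance : {reaction_dist:5.1f} m")
print(f"Braking distance  : {braking_dist:5.1f} m")
print(f"Total stopping    : {stopping_dist:5.1f} m")
print(f"Low-beam range    : {low_beam_range_m:5.1f} m")
print("Deficit:", round(stopping_dist - low_beam_range_m, 1), "m")
```

With these assumed values the stopping distance is roughly 106 m, well beyond the 56 m illuminated by dipped headlamps, which is the gap a night vision system is meant to close.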
II. LITERATURE SURVEY
Working
Night vision systems generally operate in one of two modes: near infrared (NIR) and far infrared (FIR). The
near-infrared mode requires active assistance from special bulbs mounted next to the headlights; these are aimed in
a straight beam but, unlike high beams, do not blind oncoming drivers. The NIR system illuminates the scene with
infrared light in the 800 to 900 nm wavelength band. The infrared light reflected by objects is captured by a
Charge-Coupled Device (CCD) and converted into a digital signal. An image processor turns the digital signal from
the CCD into a black-and-white picture that is projected on a head-up display on the windscreen. Far infrared
senses energy further up the infrared scale, which objects emit as heat. This far-IR vision is also known as
passive vision, because no special light source is needed. The special camera used by these devices, essentially an
array of pixel-sized IR detectors, produces a temperature pattern called a thermogram, which is refreshed about 30
times per second. The heat emitted by an animal or a pedestrian is much stronger than that of the surroundings. A
signal processor converts the thermogram data into an image for display on the monitor. Neither of these systems
has been shown to have a clear advantage, and not everyone believes that in-car night vision makes sense. The
greatest concern is that these systems draw the driver's gaze away from the road towards the display, which is not
a good idea, and drivers may simply increase speed, thinking they are less vulnerable. To avoid this problem, the
driver must be warned immediately of an approaching object; effective and fast algorithms are required to issue a
warning promptly whenever a pedestrian is detected.[3]
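As a rough illustration of the passive FIR step just described, the sketch below maps a thermogram (an array of per-pixel temperatures) to an 8-bit monochrome frame for the display. The array size and temperature limits are assumptions chosen only for the example.

```python
import numpy as np

def thermogram_to_frame(temps_c: np.ndarray,
                        t_min: float = -20.0,
                        t_max: float = 60.0) -> np.ndarray:
    """Map a thermogram (per-pixel temperatures in deg C) to an 8-bit
    monochrome frame: warmer objects such as pedestrians come out brighter."""
    scaled = (temps_c - t_min) / (t_max - t_min)      # normalize to 0..1
    scaled = np.clip(scaled, 0.0, 1.0)                # limit to the display range
    return (scaled * 255).astype(np.uint8)            # 8-bit gray levels

# Example: one simulated 240x320 thermogram; in a FIR system a frame like
# this would be refreshed roughly 30 times per second.
frame = thermogram_to_frame(np.random.uniform(5.0, 35.0, size=(240, 320)))
```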
Infrared sensors
Infrared sensors detect radiation in the infrared band of the spectrum. They are made of silicon, and when infrared
rays fall directly on the sensor it becomes excited, as represented in figure 1. The detected wavelengths range
from 700 nanometers up to 1 millimeter. These sensors can pick up both near and far infrared, and they produce
varying electrical signals for detected photons of different wavelengths. The electrical signals are amplified and
processed to generate graphic signals, which are finally shown on an output device.[4]
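For orientation, the following small helper relates a detected wavelength to the bands quoted in this paper (a 0.7 µm to 1 mm sensor range, roughly 0.8-0.9 µm for active NIR illumination, 8-15 µm for passive FIR). The function and its coarse band limits are purely illustrative.

```python
def classify_ir_band(wavelength_um: float) -> str:
    """Roughly classify a detected wavelength against the bands
    mentioned in this paper; the band boundaries are coarse."""
    if wavelength_um < 0.7:
        return "visible/UV (outside the IR sensor range)"
    if wavelength_um <= 1.4:
        return "near infrared (active NIR illumination at ~0.8-0.9 um)"
    if 8.0 <= wavelength_um <= 15.0:
        return "far infrared (passive thermal band)"
    return "other infrared (within the 0.7 um - 1 mm sensor range)"

print(classify_ir_band(0.85))   # near infrared
print(classify_ir_band(10.0))   # far infrared
```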
Night vision processing unit
It is the most significant component of the night vision system. The night vision processing unit receives the
signal generated by the infrared sensor and processes it into a digital video signal. The unit assesses the
conditions of the scene and amplifies the signal only where this is required to provide a better output. More
sophisticated control units are now available on the market, designed to perform a plurality of functions, such as
spotting high-intensity lights in the scene and processing them so that they do not cause a bright spot on the
display screen. More advanced night vision processing units (NVPUs) can work alongside the other driver assistance
and safety systems available in the automobile sector to provide a life-saving night vision system.[5]
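One of the functions mentioned above, suppressing intense light sources so they do not bloom on the display, might look roughly like the sketch below. The threshold and compression factor are illustrative assumptions rather than values used by any production NVPU.

```python
import numpy as np

def suppress_highlights(frame: np.ndarray,
                        threshold: int = 220,
                        compression: float = 0.4) -> np.ndarray:
    """Compress pixel values above `threshold` so that intense light sources
    (e.g. oncoming headlamps) do not wash out a bright spot on the display."""
    out = frame.astype(np.float32)
    hot = out > threshold
    out[hot] = threshold + (out[hot] - threshold) * compression
    return np.clip(out, 0, 255).astype(np.uint8)
```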
Working of night vision system in Automobile
In low light, as represented in figure 2, the infrared projectors of the night vision system project IR rays onto
the driving field. Photons emitted by the infrared LEDs towards the field are reflected back by the surroundings.
The night vision camera captures the reflected rays, which are detected by the IR sensors. The signal is then
converted into image signals, which are finally shown on the display unit. [6]
LCD Monitor
The night vision image contains no color information, so a monochromatic display is sufficient. An LCD monitor with
P22 green phosphor is commonly used, because the human eye is most sensitive to green, which lies near the middle
of the visible spectrum. Viewing green images also puts little strain on the human eye. The latest generation of
night vision displays uses a green-yellow phosphor (P43) LCD monitor, which gives the operator a much better
viewing experience.
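As a small illustration of such a display, a monochrome night vision frame can be rendered with a green tint before being shown. The channel weights below are illustrative and only loosely imitate a P22-style phosphor.

```python
import numpy as np

def apply_green_tint(mono: np.ndarray) -> np.ndarray:
    """Render an 8-bit monochrome frame as a green-tinted RGB image,
    loosely imitating a green-phosphor (P22-style) display."""
    rgb = np.zeros(mono.shape + (3,), dtype=np.uint8)
    rgb[..., 1] = mono                              # full intensity on the green channel
    rgb[..., 0] = (mono * 0.2).astype(np.uint8)     # a touch of red for a warmer hue
    return rgb
```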
Working in BMW cars
The Night Vision system used in BMW cars is based on a thermal camera that transforms thermal radiation into
electronic signals and then into pictures that can be seen by the human eye. The sensor first converts the thermal
radiation into electrical signals, which image processing software then turns into a visible picture on the control
display. The sensor elements change their resistance in relation to temperature: as the temperature increases, the
signal increases and the corresponding pixels become whiter. The sensor can produce a new image up to 60 times a
second.[7]
This leads to a cleaner and clearer picture (Fig 4) compared with the distorted image shown in Fig 3. Virtually
every solid or liquid body absorbs and emits heat radiation. The human eye, however, cannot perceive this heat
radiation, as it belongs to the longer-wave part of the infrared spectrum. From a physical perspective it consists
of electromagnetic waves with wavelengths of 8 μm to 15 μm. This long-wave infrared radiation is known as Far
Infrared (FIR). The benefit of using radiation in the Far Infrared range is its wider band of wavelengths compared
with Near Infrared systems, which work at about 0.7 μm to 1.4 μm.
These devices do not require any additional illumination of the scene. Far infrared (FIR) systems essentially
consist of an optical element, a thermal imaging camera, a control unit and a display unit.
Components used in night vision system
The thermal imaging camera consists of a thermal imaging sensor and a heated optical element. The thermal imaging
array contains a large number of sensor elements, one of which is allocated to each display pixel. Based on the
level of impinging heat radiation, each sensor element produces an electrical signal: the higher the temperature,
the brighter the corresponding pixel. The heat radiation is thus transformed into electrical signals on the
principle of a change in resistance.
The image can be refreshed up to 60 times per second. The camera must be calibrated approximately every 120 seconds
to ensure a consistently good image; each calibration can take up to about 0.5 seconds, during which the image
appears frozen on the display. The camera, together with a mirror, is mounted directly on the bumper mounting
bracket, directly behind the left ventilation grille. It is fitted with a sensor for detecting heat-emitting objects
(wavelengths from 8 μm to 15 μm) within the far-infrared range. The sensor resolution is 320 × 240 pixels, and the
maximum viewing angle is 36°. The calculations for the "Bend/Curve mode" feature are performed in the camera. The
camera operates at ambient temperatures from –103 °F to 180 °F (about –75 °C to +82 °C). The camera housing and the
imaging sensor are thermally insulated to protect the camera against heat. A washer jet is screwed to the camera
bracket and aimed directly at the front lens of the camera; it is connected to the headlight washer system and
therefore operates together with it. The inside of the camera housing is fitted with a heater to keep the optical
element from misting or freezing. The heater is activated if the rain/light sensor detects precipitation or if the
temperature falls below 32 °F (0 °C). A Night-Vision Device (NVD) in the classic sense is an image intensifier tube
in a rigid casing, typically used by military forces; recently, such vision technology has become widely available
to the public.[1], [8]–[10]
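To keep these figures in one place, the sketch below collects them in a small configuration object together with the heater rule stated above. The class and function names are hypothetical and only serve to summarize the text.

```python
from dataclasses import dataclass

@dataclass
class ThermalCameraConfig:
    # Figures quoted in the text above
    sensor_width_px: int = 320
    sensor_height_px: int = 240
    max_viewing_angle_deg: float = 36.0
    wavelength_band_um: tuple = (8.0, 15.0)
    refresh_rate_hz: int = 60
    calibration_interval_s: float = 120.0
    calibration_duration_s: float = 0.5     # image appears frozen during calibration

def heater_should_run(rain_detected: bool, ambient_temp_c: float) -> bool:
    """Heater rule from the text: run when the rain/light sensor reports
    precipitation, or when the temperature is at or below 0 deg C (32 deg F)."""
    return rain_detected or ambient_temp_c <= 0.0
```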
Night vision control unit
The control unit is placed behind the glove box in the front device holder. It upscales the camera image data from
320 × 240 pixels to 640 × 480 pixels, of which only a portion is shown on the control display: the "Full Screen"
function shows 640 × 240 pixels, while the split-screen function shows 400 × 240 pixels. The control unit also
transmits diagnosis, programming and coding data to the camera and operates the heater for the camera and the front
lens. Furthermore, it transforms the camera's image data into a CVBS signal and provides this signal to either the
navigation system or the video module, depending on the vehicle specification. A 12-pin plug connection is provided
in the camera housing [6], [11]–[15].
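A minimal sketch of the scaling and windowing behavior described above follows. Nearest-neighbor pixel doubling and a centered window are assumptions; the paper does not state how the real control unit interpolates or positions the displayed region.

```python
import numpy as np

def upscale_2x(frame: np.ndarray) -> np.ndarray:
    """Upscale a 320x240 camera frame to 640x480 by pixel doubling
    (nearest-neighbor; the real interpolation method is not specified)."""
    return np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)

def display_window(frame_640x480: np.ndarray, split_screen: bool) -> np.ndarray:
    """Cut out the portion shown on the control display:
    640x240 in full-screen mode, 400x240 in split-screen mode."""
    width = 400 if split_screen else 640
    y0 = (frame_640x480.shape[0] - 240) // 2
    x0 = (frame_640x480.shape[1] - width) // 2
    return frame_640x480[y0:y0 + 240, x0:x0 + width]
```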
III. RESULTS
Advantages
Night vision technology helps improve the driver's vision in darkness as well as in dusky conditions. At night it
becomes very difficult to drive and guide the vehicle because of the lack of adequate light in the surroundings.
Dusky conditions, especially in winter, make this even harder, as the driver is unable to judge or recognize an
obstacle coming towards the vehicle. On highways where traffic runs in both directions, the most challenging task
for the driver is to keep focused on the road in spite of the glare of the headlamps of oncoming traffic. The glare
forces the driver's pupils to contract, which can be distracting; night vision technology nullifies this effect.
Other road users who are sometimes not visible to the driver, such as cyclists, animals or small children, are
easily highlighted by this technology; they are traced by capturing the heat radiated from their bodies. This
provides safety not only to the driver but also to the other road users. Night vision technology gives a better
overall view of the driving conditions, road situation and landscape, helping the driver plan the next turn or
regulate speed according to the forthcoming road conditions. The technology also provides a zoom feature, with
which the driver can easily zoom in on any object that requires attention in order to avoid the obstacle safely.
Disadvantages
The aim of the Pedestrian Warning (PW) algorithm is to detect pedestrians and alert the driver accurately. From the
driver's point of view, the final product of a good system offers an early warning and possibly additional data,
such as the pedestrian's location or an icon overlaid on the night vision image. Although generic image processing
algorithms have long addressed similar goals, automotive applications pose several problems unique to image
processing: the whole picture changes constantly.
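As a very coarse illustration of the idea behind such warning logic, the sketch below thresholds a thermal frame for warm pixels and raises a flag when a sufficiently large connected region is found. Real PW algorithms use trained detectors; the threshold, minimum area and use of scipy.ndimage here are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def pedestrian_warning(thermal_frame: np.ndarray,
                       hot_level: int = 200,
                       min_area_px: int = 150) -> bool:
    """Very coarse warm-blob check: flag a warning when any connected
    region of hot pixels is large enough to plausibly be a pedestrian."""
    hot_mask = thermal_frame >= hot_level          # pixels clearly warmer than background
    labels, n_blobs = ndimage.label(hot_mask)      # connected-component labelling
    if n_blobs == 0:
        return False
    areas = ndimage.sum(hot_mask, labels, index=range(1, n_blobs + 1))
    return bool(np.max(areas) >= min_area_px)
```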
IV. CONCLUSION
It is now time to fit such hybrid safety systems to the latest cars and save many lives. All automobile giants
should turn their research and development work towards such revolutionary technology and make this world a safer
place to live in. Many ideas are still to come, and it is the responsibility of young engineers to think
innovatively and develop techniques such as the night vision sensors used in cars and other vehicles. These have
shown great success in the West and should be implemented promptly in eastern regions as well, so that disastrous
road accidents can be reduced.
V. REFERENCES
[1] R. Gade and T. B. Moeslund, “Thermal cameras and applications: A survey,” Mach. Vis. Appl., 2014.
[2] T. J. Gordon and M. Lidberg, “Automated driving and autonomous functions on road vehicles,” Veh.
Syst. Dyn., 2015.
[3] D. Forslund and J. Bjarkefur, “Night vision animal detection,” in IEEE Intelligent Vehicles Symposium,
Proceedings, 2014.
[4] R. Usamentiaga, P. Venegas, J. Guerediaga, L. Vega, J. Molleda, and F. G. Bulnes, “Infrared
thermography for temperature measurement and non-destructive testing,” Sensors (Switzerland). 2014.
[5] P. Khurana, R. Arora, and M. K. Khurana, “Implementation of electronic stability control and adaptive
front lighting system for automobiles,” in 2014 International Conference on Power, Control and
Embedded Systems, ICPCES 2014, 2014.
[6] W. Zhang, Q. M. J. Wu, G. Wang, and X. You, “Tracking and pairing vehicle headlight in night scenes,”
IEEE Trans. Intell. Transp. Syst., 2012.
[7] P. V. Adhav and P. S. A. Shaikh, “Adaptive Front Lighting System Using CCD,” IOSR J. Electron.
Commun. Eng., 2014.
[8] S. Vidas, P. Moghadam, and M. Bosse, “3D thermal mapping of building interiors using an RGB-D and
thermal camera,” in Proceedings - IEEE International Conference on Robotics and Automation, 2013.
[9] G. Baffou et al., “Thermal imaging of nanostructures by quantitative optical phase analysis,” ACS Nano,
2012.
[10] M. J. Wooster et al., Thermal Infrared Remote Sensing. 2013.
[11] S. J. Lee, J. Jo, H. G. Jung, K. R. Park, and J. Kim, “Real-time gaze estimator based on driver’s head
orientation for forward collision warning system,” IEEE Trans. Intell. Transp. Syst., 2011.
[12] Y. Luo, J. Remillard, and D. Hoetzer, “Pedestrian detection in near-infrared night vision system,” in
IEEE Intelligent Vehicles Symposium, Proceedings, 2010.
[13] D. Martín et al., “IVVI 2.0: An intelligent vehicle based on computational perception,” Expert Syst.
Appl., 2014.
[14] W. F. Abaya, J. Basa, M. Sy, A. C. Abad, and E. P. Dadios, “Low cost smart security camera with night
vision capability using Raspberry Pi and OpenCV,” in 2014 International Conference on Humanoid,
Nanotechnology, Information Technology, Communication and Control, Environment and Management,
HNICEM 2014 - 7th HNICEM 2014 Joint with 6th International Symposium on Computational
Intelligence and Intelligent Informatics, co-located with 10th ERDT Conference, 2014.
[15] A. Dasgupta, A. George, S. L. Happy, and A. Routray, “A vision-based system for monitoring the loss of
attention in automotive drivers,” IEEE Trans. Intell. Transp. Syst., 2013.