
2015 IEEE 29th International Conference on Advanced Information Networking and Applications

Implementation of an Eye Gaze Tracking System for the Disabled People

Junghoon Park
DMC R&D Center
Samsung Electronics
Suwon, Korea
[email protected]

Taeyoung Jung, Kangbin Yim
Dept. of Information Security Engineering
Soonchunhyang University
Asan, Korea
{jtyworld, yim}@sch.ac.kr

Abstract—The paper proposes a modified pupil center corneal reflection (PCCR) hardware method to improve system accuracy. The modified PCCR eye gaze tracking system, a new version of the PCCR eye gaze tracking system supplemented by the relation between the IR LED position and the distance from the eye gaze tracking system to the monitor screen, improves the tracking accuracy to within one degree. The system also includes a circuit that adjusts power adaptively between the minimum and maximum levels. It is confirmed that the system performs well both indoors and outdoors in spite of the reduced calculation. Besides, convenient mouse functions are proposed so that the eye gaze tracking functions can be used on a PC. A user group confirmed the performance, reporting a high level of satisfaction with the tracking function. Above all, the paper suggests an adaptive exposure control algorithm for the proposed system that is robust against light changes. The adaptive exposure control algorithm shows excellent performance over the existing system both indoors and outdoors, even when the calculation is reduced to one fifth.

Keywords—Accessibility; Eye gaze control; Eye gaze system; Real-time implementation

1550-445X/15 $31.00 © 2015 IEEE. DOI 10.1109/AINA.2015.286

I. INTRODUCTION

There are several ways of eye tracking, including an image processing method using a camera, a method in which the surface of a sensor grasps the movement of the eye, and a method of using water on the lens [1][2][3]. When the eyes are used as a means to access information by eye gaze tracking, there is some discrepancy on the usability surface: the eyes must be used not only for getting information but also for controlling devices or equipment. So the ability of the human eye to manipulate an object is sometimes limited by the viewing angle [4]. It follows that there are some difficulties in applying eye gaze tracking technology to immediate control [5].

There are advantages and disadvantages to eye gaze input devices. The advantages include the ability to collect the natural response information of the person; but since it is not customary to control operations with the human eye, a drawback is that a period of training and adaptation is required [6]. Because humans use their eyes a lot, getting used to some inconveniences may be simple. On the other hand, compared to other interfaces, given the possible reaction rate of remote tracking and the variety of information available per context and user, much training is needed to get used to the device, which can cause confusion between the pilot and the information available through the eye. Since input and output must be performed at the same time with the same eye, this can lead to fatigue [7]. Viewed from a long-term perspective, since it is possible to read through and collect a lot of information, eye gaze is likely to develop into a widespread next-generation interface.

The pupil center corneal reflection (PCCR) method became dominant for finding a human's diverse eye gaze directions, though research on eye tracking technology has been conducted for a very long time. The initial studies on eye tracking technology concerned the general human interface for operating equipment and devices, but the technology has since been used in many fields, for example for market research in a recent study analyzing customers' behaviors [8]. In particular, a real-time eye gaze tracking system is most important for many human computer interaction (HCI) applications, including stereoscopic synthesis, intent extraction, and behavior analysis. In order to make an eye gaze tracking system real time, the system must have an efficient pupil detection algorithm and ambience-independent image processing, as well as reduced complexity and a small size and number of circuit components [9][10].

This paper proposes a method for getting cleaner images than previous systems in order to reduce image processing overhead. Because it also helps reduce the number of image frames dropped during image processing, the proposed method can provide sufficient performance even on a low cost hardware system by reducing the transmission traffic. Much data must be communicated between the PC and the prototype; the USB 2.0 bandwidth generally available on a notebook PC is known to be about 6.7 Mbps when tested. The number of ALS (amyotrophic lateral sclerosis, also known as Lou Gehrig's disease) patients in Korea is estimated to be about 1,200, with over 2,000,000 physically disabled people in the whole world. ALS begins with irregular limb weakness, body-wide tremors and/or speech difficulty [11].

This paper introduces a missionary project called the low cost real-time eye gaze system, a hardware-based pointing device framework for ALS patients that is composed of a remote
type eye mouse hardware and firmware, a human interface device (HID) driver software for the operating system and an application software suite. This hardware-based pointing device framework can be implemented with low cost materials via a convenient-to-deploy assembling environment. The framework was designed based on the voice of customers (VOC) from ALS patients, including potential candidates, as well as analytical needs and desktop research results. The implemented system was proved to provide the excellent functionality of a conventional pointing device through a practical usability test. It is expected that this system can provide an IT experience even for ALS patients so that they can interact with the world, which was not achievable before.

Figure 1. Eye image captured via the proposed system
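As a sanity check on the bandwidth constraint mentioned in the introduction (the 6.7 Mbps USB figure is the authors' measurement; the frame parameters are those of Section II, and the 8-bit mono assumption is ours for illustration):

```python
# Raw bandwidth needed for the sensor stream described in Section II:
# 640x480 pixels, assumed 8 bits per pixel (mono), 60 frames per second.
width, height, bits_per_px, fps = 640, 480, 8, 60

raw_bps = width * height * bits_per_px * fps          # bits per second
measured_usb_bps = 6.7e6                              # ~6.7 Mbps measured by the authors

print(f"raw stream:      {raw_bps / 1e6:.1f} Mbps")   # 147.5 Mbps
print(f"measured USB:    {measured_usb_bps / 1e6:.1f} Mbps")
print(f"overload factor: {raw_bps / measured_usb_bps:.0f}x")  # ~22x
```

Under these assumptions an uncompressed stream needs roughly twenty times the measured bandwidth, which is why grabbing only the frames needed for the bright/dark pupil pair matters on low cost hardware.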

II. THE PCCR EYE GAZE TRACKING PROTOTYPE SYSTEM

The PCCR eye gaze prototype is shown in Fig. 4 and Fig. 5; it is a prototype model for several experimental settings to find the optimum images. The system is composed of various camera parts, and experiments were carried out by configuring the system to match the optimum distance, testing the reflected LED light on both sides. Eye-tracking systems use the difference vector (P-CR) between the pupil position (P) and the corneal reflection (CR) to determine the gaze vector [12]. In this system, setting up the eye camera and performing a good calibration routine is just as important as the design of the system. To obtain the point-of-gaze (POG) direction, this system uses infrared LED reflections on the surface of the cornea. In appropriate situations, one or more glint lights (dark pupil) and full reflection lights (bright pupil) are obtained from the retina by light illuminating near the optical axis of the eyes. An important performance requirement is to get clear images from the PCCR system, because fast and exact calculation of each image depends on stable, clear image grabbing, as in Fig. 1. The prototype system uses a CMOS digital imaging sensor and a PC for the image processing instead of a stand-alone FPGA. It processes 640×480 progressive scan frames at a rate of 60 frames per second. Any CPU-based implementation of a real-time image-processing algorithm has two major bottlenecks: data transfer bandwidth and sequential data processing rate. After a frame is grabbed and moved to memory, the CPU sequentially processes the pixels. Instead of an FPGA, we used a notebook PC, so the USB port bandwidth is the limit between the image capturing device and the PC.

A. The structure of the proposed PCCR System

To bring in the image from the camera, which supports the USB video class (UVC) standard compatible interface, we use a video input library using the bright eye and dark eye depending on the calculated region of interest (ROI). After two consecutive images are input, the saved images are converted to binary images. The difference between the two images is what yields the pupil, so only the difference between the two images is extracted. If the difference is a circular shape that can be a blob candidate, the pupil candidate can be extracted depending on the known information about the pupil. The IR reflection point is extracted from the dark eye. Then we must check whether the image is a bright eye or a dark eye: after separating the bright and dark eye average values, the algorithm checks which value is higher than the average of the image pixel sum. If the dark eye is selected, the threshold value is decided. The threshold value is as follows:

Threshold = (MaxBright × 9 + PupilBright) / 10        (1)

B. Image framing from an analog video signal

The composite video baseband signal (CVBS) was defined to transfer a 2D image through a single physical line or a single radio frequency (RF) channel in analog form. One second of video consists of 30 subsequent still image frames (in the case of the NTSC standard; 25 in the case of PAL). Each image frame is followed by a vertical synchronization (VSYNC) signal to separate the image frames from each other. A 2D still image is in turn composed of multiple 1D raster scanned horizontal line images. These line images are connected one by one to construct one train signal that is transferred as one continuous analog signal. As a delimiter between the line image signals, each line image is followed by a horizontal synchronization (HSYNC) signal. A video receiver composes moving images frame by frame from the train of line image signals by detecting the HSYNC and VSYNC signals. To keep the exposure of the sequential image frames balanced and stable, the exposure time should be synchronized with the timing of the image framing. Because the line images and the synchronizing signals are related with a natural image cut

(shot) that was captured earlier than the time at which the line images and synchronizing signals appear, the exposure time should be precisely optimized to achieve clear images. The lighting system of the eye mouse turns the infrared LEDs on and off alternately to minimize computation overhead. This means an exact synchronization method is required between the lighting system and the camera module. This is very important because, when out of synchronization, the amount of exposed light may vary with the ambient conditions due to lack of light. Equalizing the exposure is directly tied to this synchronization.
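The framing process described above can be illustrated with a toy model. The token-based stream is an assumption made here for illustration only; a real CVBS decoder detects sync pulses in the analog waveform rather than tagged tokens:

```python
# Toy model of composing frames from a raster stream, in the spirit of
# Section II-B: line payloads are delimited by HSYNC markers and frames
# by VSYNC markers. (Illustrative only; not the authors' front end.)
HSYNC, VSYNC = "HSYNC", "VSYNC"

def compose_frames(stream):
    """Group raster samples into frames: a frame is a list of lines."""
    frames, frame, line = [], [], []
    for token in stream:
        if token == HSYNC:          # end of one horizontal line image
            frame.append(line)
            line = []
        elif token == VSYNC:        # end of one still image frame
            if line:
                frame.append(line)
                line = []
            frames.append(frame)
            frame = []
        else:
            line.append(token)      # ordinary pixel sample
    return frames

# Two 2-line "frames" of 3 samples per line:
stream = [1, 2, 3, HSYNC, 4, 5, 6, HSYNC, VSYNC,
          7, 8, 9, HSYNC, 0, 1, 2, HSYNC, VSYNC]
frames = compose_frames(stream)
print(len(frames), frames[0])       # 2 [[1, 2, 3], [4, 5, 6]]
```

The same delimiter logic is what makes exposure synchronization possible: once VSYNC boundaries are known, the LED switching can be aligned to whole frames.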

C. Image acquisition and pupil tracking


As shown in Fig. 2, to produce the image from the camera that supports the USB video class (UVC) standard compatible interface, we use a video input library using the bright eye and dark eye depending on the region of interest (ROI) calculation. After two consecutive images are input and saved, they are converted to binary images. The difference between the two images is what yields the pupil, so only the difference between the two images is extracted. If the difference is a circular shape that can be a blob candidate, the pupil candidate can be extracted depending on the information known about the pupil. The IR reflection point is extracted from the dark eye. Then we must check whether the images are bright or dark.

Figure 2. The flow of the eye gaze tracking algorithm

If dark eyes are selected for finding glints, the threshold value should be chosen. The threshold value is calculated with Eq. (1), which predicts the glint brightness in order to extract the glint. When the dark eye ROI is extracted, the maximum brightness can be expected to be the glint brightness. Therefore, the threshold weights for maximum brightness and pupil brightness are set at 9:1, and a threshold slightly lower than the maximum brightness is applied to make glint extraction easy. The 9:1 weighting is chosen so that the threshold falls between the glint, which is the brightest region, and the bright pupil, whose brightness is next to the glint's.

Figure 3. The decision between bright and dark pupil
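Eq. (1) can be exercised in a small sketch. The hand-made ROI, and the use of the ROI mean as a stand-in for PupilBright, are illustrative assumptions; the paper does not specify how PupilBright is measured:

```python
def glint_threshold(max_bright, pupil_bright):
    """Eq. (1): weight the maximum brightness 9:1 against the pupil
    brightness so the threshold sits just below the glint level."""
    return (max_bright * 9 + pupil_bright) / 10

def extract_glint_pixels(roi):
    """Binarize a dark-eye ROI: keep only pixels at/above the threshold."""
    flat = [p for row in roi for p in row]
    max_bright = max(flat)
    pupil_bright = sum(flat) / len(flat)   # stand-in for pupil brightness
    t = glint_threshold(max_bright, pupil_bright)
    return [[1 if p >= t else 0 for p in row] for row in roi]

# Tiny synthetic ROI: dark background (20), pupil region (60), one glint (250).
roi = [[20, 20, 20],
       [20, 250, 60],
       [20, 60, 20]]
print(extract_glint_pixels(roi))   # only the 250-valued glint pixel survives
```

Because the threshold is pulled 90% of the way toward the maximum, only near-glint pixels survive the binarization, which is exactly the "slightly lower than the maximum brightness" behavior described above.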
Fig. 3 shows the overall decision algorithm. First of all, the average pixel value of the image brought from the UVC camera is calculated. Since the candidate stream contains only the two consecutive bright pupil and glint images, a value greater than the pixel average is determined to be the bright pupil, while a smaller value is deemed the dark pupil. When the image is considered the dark pupil, the glint is searched within it. Since glints are small and the detection failure rate therefore goes up, the glints are found under the assumption that the bright pixels lie parallel to the X axis and that two points exist. As the pupil is surely around the glint, pupil blob candidates are searched around the glint. The pupil is found with the same image processing used to find the glint, followed by the center point using the ellipsoidal method.

III. IMPLEMENTED SYSTEM

Fig. 4 shows the control board that controls the central LED and the side LEDs. The MCU is an ATmega328P from Atmel, and the LED driver seen on the board is the circuit that precisely controls the LEDs.

Fig. 5(a) is the LED module that acquires the glint from both sides of the eye gaze tracking system, as in Fig. 5(b). The glint is acquired through switching by the control board explained before. Fig. 5(c) shows the center LED, composed of many LED modules, used to obtain the bright pupil. Its brightness can be adjusted to the surrounding brightness, from slightly bright to fully bright.
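The bright/dark classification and frame-difference step described for Fig. 3 in Section II-C can be sketched in plain Python. The synthetic 2×3 "frames" and the difference threshold are illustrative values, not from the paper:

```python
def mean_pixel(img):
    """Average pixel value of a grayscale image given as nested lists."""
    flat = [p for row in img for p in row]
    return sum(flat) / len(flat)

def classify(img, pair_average):
    """A frame brighter than the average of the consecutive pair is the
    bright-pupil image; the darker one is the dark-pupil image (cf. Fig. 3)."""
    return "bright" if mean_pixel(img) > pair_average else "dark"

def pupil_candidate(bright_img, dark_img, min_diff=40):
    """Difference of the consecutive pair: the pupil lights up only in
    the bright-pupil frame, so large differences mark pupil pixels."""
    return [[1 if b - d >= min_diff else 0 for b, d in zip(brow, drow)]
            for brow, drow in zip(bright_img, dark_img)]

bright = [[10, 200, 10],
          [10, 210, 10]]
dark   = [[10,  30, 10],
          [10,  35, 10]]
avg = (mean_pixel(bright) + mean_pixel(dark)) / 2
print(classify(bright, avg), classify(dark, avg))  # bright dark
print(pupil_candidate(bright, dark))               # [[0, 1, 0], [0, 1, 0]]
```

The surviving difference pixels are what would then be screened as circular blob candidates before the ellipsoidal center fit.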

Figure 4. Controller board of the implemented eye tracking system

Figure 5. LED rings for interleaved exposures

IV. TRACKING RESULTS

To acquire an image from each person, we set up the proposed system with a 27-inch monitor at 1680×1050 resolution and a 90 cm distance between the human eye and the proposed eye gaze tracking system. The test process includes looking at the monitor for a few seconds as shown in Fig. ??, recording/calculating the true value with the filtering method, and displaying the results. After that, the test checks the eye gaze tracking point of the proposed system, and the distance between the gaze point and the calculated point is measured as shown in Fig. 6. The test results on four people are shown in Fig. 7.

Fig. 7 shows the result of a skilled individual who has used the system three times or more while gazing at the PC screen. Fig. 7(a) is the output of the eye gaze tracking system performing 12-point calibration, while Fig. 7(b) uses a median filter to reach an accuracy of about one or two degrees. The eyes must stay fixated for 300 ms for the data to be deemed valid; otherwise, the incoming measurement value is ignored. The inaccuracy occurred at the corners of the monitor, and it was especially difficult to achieve accuracy at the top left corner because it was far compared with the rest. It is considered that this resulted from an error as big as the diameter of the circle gazed at during the calibration process, since the circle became smaller in the process of guiding during calibration. It is expected that the error can be reduced to one degree or smaller if the gazer is guided with a bigger circle during calibration. When the distance between the monitor and the gazer is 65 cm, one degree corresponds to about 1.1 cm of resolution on the monitor.

The results show that the median filter brought even more accurate results than the mean filter, though it took more time. Nevertheless, the system had already reduced the calculation amount and secured enough computational headroom, so using the proposed modified median filter posed no problem.

Figure 6. The relation between gaze point and calculated point

V. CONCLUSION

The paper suggests a solution for disabled people who can use only their eyes. Eye gaze tracking fails when bright indoor light or direct sunlight outdoors prevents the system from acquiring the bright pupil and dark pupil, the two clear consecutive images intended. Thus, the paper suggests an adaptive exposure controller to address the issue. The PCCR eye gaze tracking system works with the reflection of the IR LED on the eye, and the brightness of the IR LED plays a critical role in finding the bright pupil and dark pupil. However, the surrounding brightness cannot be matched by unlimited IR LED brightness, since very bright light may harm the human eye. The previous system, without the adaptive exposure controller, grabbed two consecutive images regardless of appropriate exposure; even though they were not the intended bright pupil and dark pupil images, it repeated unnecessary image calculation by continuing to grab images. With the proposed adaptive exposure controller, knowing the exposure timing for the desired images makes it possible to grab the images taken at the desired moment without the need to process all 30 images per second. The proposed system realizes a remote eye gaze tracking system with a form factor optimal for the disabled, built on the general PCCR method using various image processing techniques. Gaze

accuracy, the performance yardstick of the eye gaze tracking system, is set at about one degree, and 12-point calibration is used to reflect the gazer's eye error.

Figure 7. Measurement value and tracking result of the proposed system on the PC screen (person 1): (a) measurement value, (b) tracking result of the proposed system

REFERENCES

[1] L. Young and D. Sheena, "Methods and designs: survey of eye movements recording methods", Behavioral Research Methods and Instrumentation, pp. 397-429, 1975.

[2] R. J. K. Jacob, "The use of eye movements in human-computer interaction techniques: what you look at is what you get", ACM Transactions on Information Systems, vol. 9, pp. 152-169, 1991.

[3] A. T. Duchowski, "Eye Tracking Methodology: Theory and Practice", Springer, London, 2002.

[4] G. T. Buswell, "How People Look at Pictures: A Study of the Psychology of Perception in Art", The University of Chicago Press, Chicago, 1935.

[5] T. E. Hutchinson, K. P. White Jr., W. N. Martin, K. C. Reichert, and L. A. Frey, "Human-computer interaction using eye-gaze input", IEEE Transactions on Systems, Man, and Cybernetics, vol. 19, no. 6, 1989.

[6] A. Yilmaz, O. Javed, and M. Shah, "Object tracking: a survey", ACM Computing Surveys, vol. 38, no. 4, p. 45, 2006.

[7] V. Henn, "Pathophysiology of rapid eye movements in the horizontal, vertical and torsional directions", Biological Cybernetics, vol. 60, pp. 411-420, 1989.

[8] W. Lemahieu and B. Wyns, "Low cost eye tracking for human-machine interfacing", Journal of Eye Tracking, Visual Cognition and Emotion, vol. 1, no. 1, pp. 1-12, 2010.

[9] A. Amir, L. Zimet, A. Sangiovanni-Vincentelli, and S. Kao, "An embedded system for an eye-detection sensor", Computer Vision and Image Understanding, vol. 98, no. 1, pp. 104-123, 2004.

[10] A. L. Yarbus, "Eye Movements and Vision", Plenum Press, New York, 1967.

[11] ALS Worldwide, http://www.alsworldwide.org/mission.html

[12] E. D. Guestrin and M. Eizenman, "General theory of remote gaze estimation using the pupil center and corneal reflections", IEEE Transactions on Biomedical Engineering, vol. 53, no. 6, pp. 1124-1133, 2006.
