20ISE652 - Virtual Reality

The document outlines the key components and learning outcomes of a course on virtual reality. The course will explain the fundamentals, hardware, software, applications and principles of virtual reality. Students will learn about creating virtual environments and evaluating suitable applications. The document lists two textbooks for the course and provides module outlines on topics like the definition of virtual reality, commercial VR technology, and the five classic components of a VR system including input devices, VR engine, and output devices.

Uploaded by

Sravan Kumar

20ISE652 - VIRTUAL REALITY

COURSE OUTCOME

At the end of the Course, the Student will be able to:


• Explain the fundamentals of virtual reality systems.
• Summarize the hardware and software of virtual reality.
• Analyze the applications of virtual reality.
• Illustrate the technology, its underlying principles, its potential and its limits.
• Describe the criteria for defining useful applications.
• Explain the process of creating virtual environments.
BOOKS

TEXT BOOKS:

1. Samuel Greengard, Steven Jay Cohen, "Virtual Reality", Gilden Media, First Edition, 2019.

2. Gregory C. Burdea & Philippe Coiffet, "Virtual Reality Technology", Second Edition, John Wiley & Sons, 2006.

REFERENCE BOOKS:
Module 1: Contents

Introduction: The three I's of virtual reality, commercial VR technology and the five classic components of a VR system. Input Devices: Three-dimensional position trackers, navigation and manipulation, interfaces and gesture interfaces.
Definition of Virtual Reality

• Virtual Reality – A hypothetical 3D visual world created by a computer; the user wears special goggles, headsets, fiber-optic gloves, etc., and can enter and move about in this world and interact with its objects as if inside it.

• Virtual reality (VR) refers to a computer-generated simulation in which a


person can interact within an artificial three-dimensional environment using
electronic devices, such as special goggles, with a screen or gloves fitted with
sensors.
• VR applications immerse the user in a computer-generated
environment that simulates reality through the use of interactive
devices, which send and receive information and are worn as goggles,
headsets, gloves, or body suits.
Types of Virtual Reality
Non-immersive Virtual Reality

Non-immersive virtual reality is a type of virtual reality in which you interact with a virtual environment, usually through a computer, where you can control some characters or activities within the experience, but the virtual environment does not interact directly with you.

A good example of non-immersive virtual reality is a computer game like Dota 2. You can control aspects of your character, and they will have an effect on the virtual environment of the game. Technically you are interacting with a virtual environment, but not directly; your character in the game does that.
Semi-Immersive Virtual Reality

Semi-immersive virtual reality sits between non-immersive and fully immersive virtual reality. Using a computer screen or VR glasses, you can move around in a virtual environment, but other than your visual experience you will have no physical sensations to enhance the experience.

A virtual tour can be a good example of semi-immersive virtual technology.
Augmented Reality

Augmented reality is a type of virtual reality that lets the user see the real world, usually through a phone screen, and make virtual changes to it on the screen. A good example that will help you better understand augmented reality is the mobile game Pokémon Go.
The Three I's of Virtual Reality
1. Immersion:

Immersion is what makes VR feel real to the audience. Giving the user a real-time perception of being physically present in a virtual environment, and the ability to interact with it without disruption, translates directly into a more convincing experience. It is like putting on a pair of swimming goggles and jumping into the deep end: your world changes instantly.

2. Interaction

In terms of functionality, VR is responsive to the user's input: gestures, verbal commands, head-movement tracking, etc. A customer's interactions in the virtual world can be tracked and used as a tool to understand the customer's needs and influence the decision process.

3. Imagination:

VR is the newest medium for telling a story and experiencing it, which opens an infinite number of possibilities for marketing. The capacity of the user's mind makes it possible to observe non-existent things and create the illusion of them being real. A virtual experience can be designed to unfold a story, step by step.
COMMERCIAL VR TECHNOLOGY
• DataGlove: The first company to sell VR products was VPL Inc., headed by Jaron
Lanier. This company produced the first sensing glove, called the DataGlove (Figure
1.6a).
• The standard interface components were (and still are today) the keyboard and the mouse. Compared to these, the VPL DataGlove represented a quantum improvement in the natural way one could interact with computers. Its fiber-optic sensors allowed computers to measure finger and thumb bending, so that interaction was possible through gestures.
• PowerGlove: After the appearance of the VPL DataGlove, the game company Nintendo introduced the much cheaper PowerGlove, shown in Figure 1.6b. It used ultrasonic sensors to measure wrist position relative to the PC screen and finger bending. The downfall of the PowerGlove was the lack of games that used it, such that by 1993 its production had stopped.
EyePhones: The first commercial head-mounted displays (HMDs), called EyePhones, were introduced by VPL in the late 1980s. These HMDs used LCD displays to produce a stereo image, but at extremely low resolution (360 x 240 pixels), such that virtual scenes appeared blurred. Another drawback was their high price.

RB2 Model 2 integrates the EyePhone HMD interface, the VPL DataGlove Model 2 electronic unit, a spatial tracking unit for the HMD, a design and control workstation (processors), as well as graphics adapters and an optional 3D sound system. The next step in integration was to shrink each of these components and put them on boards in a single desk-side cabinet.
• Vision: In early 1991 a company in the United Kingdom, Division Ltd., introduced the first integrated commercial VR workstation. Called 'Vision', it had multiple parallel processors, stereo display on an HMD, 3D sound, hand tracking, and gesture-recognition sensors, but it came at a high price ($70,000).

• The architecture also had an input/output (I/O) card and was scalable, allowing additional
I/O processors to be added.

• WorldToolKit (WTK): In 1992, the small U.S. company Sense8 Co. developed the first version of its WorldToolKit (WTK), a library of C functions written specifically for VR applications.


Virtual Reality Toolkit (VRT3): Another popular toolkit of the


1990s was the Virtual Reality Toolkit (VRT3), developed in the
United Kingdom by Dimension International. VRT3 was
designed to run on multiple computing platforms. Also, VRT3
used graphical programming through menus and icons. This
made programming easier to learn, but less rich in functionality.

At the same time, a steady increase in HMD resolution during the 1990s meant much sharper images without the unwanted jagged pixelation of earlier models. (Figure: Image resolution evolution in the 1990s.)

Growth of the virtual reality industry since 1993


According to one recent study, the VR market is expected to reach $56.25 billion by 2025.
3. The five classic components of a VR system:
• Input Devices:
 The input devices are the means by which the user interacts with the virtual world.
 They send signals to the system about the action of the user, so as to provide appropriate reactions
back to the user through the output devices in real time.
 They can be classified into tracking devices, point-input devices, bio-controllers and voice devices.
 Tracking devices, sometimes referred to as position sensors, are used to track the position of the user; they include electromagnetic, ultrasonic, optical, mechanical and gyroscopic sensors, data gloves, and neural or bio (muscular) controllers.
 Point-input devices include the normal mouse with extended functions and capability for 3D.
 Voice communication is a common way of interaction among humans, so it feels natural to incorporate it into a VR system; voice recognition or processing software can be used to interpret spoken commands.
• VR Engine:

 The VR engine could be a standard PC with more processing power and a powerful
graphics accelerator or distributed computer systems interconnected through high speed
communication network.
 The computer also handles the interaction with users and serves as an interface with the
I/O devices.
 In VR systems, the VR engine or computer system has to be selected according to the requirements of the application.
 Processing power, graphics display, image generation and I/O handling are among the most important factors and most time-consuming tasks in a VR system.
 The VR engine is required to recalculate the virtual environment in real time, on the order of 30 times per second or more.
• The choice of the VR engine depends on the application field, user, I/O devices,
level of immersion and the graphic output required, since it is responsible for
calculating and generating graphical models, object rendering, lighting,
mapping, texturing, simulation and display in real-time.


• A major factor to consider when selecting the VR engine is the processing power of the computer: how much sensory output (graphics, sound, haptics, etc.) it can render within a given time frame.
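The "render within a given time frame" requirement can be made concrete with simple arithmetic. A minimal sketch (30 Hz is the commonly cited minimum update rate; 60 and 90 Hz are illustrative display targets):

```python
# Frame-time budget for a VR engine at a given recomputation rate.
def frame_budget_ms(update_rate_hz: float) -> float:
    """Milliseconds available to simulate and render one frame."""
    return 1000.0 / update_rate_hz

# ~30 Hz is the usual minimum for smooth VR; modern HMDs often target 90 Hz.
for hz in (30, 60, 90):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.1f} ms per frame")
```

At 90 Hz the engine has barely 11 ms per frame for simulation, rendering, and I/O combined, which is why processing power dominates VR engine selection.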
• Output Devices:

 Output devices receive feedback from the VR engine and present it to the user through the appropriate channel to stimulate the senses.

 The classifications of output devices based on the senses are: graphics (visual), audio (aural), haptic (contact or force), smell and taste. Of these, the first three are frequently used in VR systems, while smell and taste are still uncommon.

 Two possible common options for the graphics are the stereo display monitor, and the HMD
which provides a higher level of immersion.

 Audio or sound is an important channel in VR; 3D sound can be used to produce different sounds from different locations, making the VR application more realistic.
• Virtual Reality System Software and Tools: Virtual reality system software is a collection of
software for designing, developing and maintaining virtual environments and the database where
the information is stored. The tools can be classified into modeling tools and development tools.

• VR Modeling Tools: There are many modeling tools available for VR designing, the most
common ones are, 3ds Max, Maya and Creator. Engineering specific applications might use
software like CATIA, Pro/E, Solidworks, UG, etc.

• VR Development Tools: VR is a complex and integrative technology that borrows from many other technologies, such as real-time 3D computer graphics, tracking technology, sound processing, and haptic technology; therefore software development flexibility and real-time interaction are needed. Starting the development of a VR system from basic code is time-consuming, so dedicated development toolkits are widely used.
• Careful consideration is needed in choosing VR development tools
due to the difference in flexibility provided by different software
packages as related to model input available, interface
compatibility, file format, animation ease, collision detection,
supported I/O devices and support community available to the
users.
TOP 5 VR SOFTWARE DEVELOPMENT TOOLS
• Unity

• Amazon Sumerian

• Google VR for everyone

• Unreal Engine 4 (UE4)

• CRYENGINE
• Software tools in virtual reality

• 1. Unity
• Unity is famous for game development, however, it helps you to build VR solutions for many other sectors too. E.g.,
you can create VR solutions for automotive, transportation, manufacturing, media & entertainment, engineering,
construction, etc. with Unity.

• 2. Amazon Sumerian

• Amazon Sumerian is the VR engine from AWS, and you don’t need 3D graphics or VR programming skills to use it.
Sumerian works with all popular VR platforms like Oculus Go, Oculus Rift, HTC Vive, HTC Vive Pro, Google
Daydream, and Lenovo Mirage, moreover, it works with Android and iOS mobile devices too.

• 3. Google VR for everyone

• Google, the technology giant offers a wide range of VR development tools, and you can use them to create
immersive VR experience for your stakeholders. You can access these tools on the Google VR developer portal.
• 4. Unreal Engine 4 (UE4)
• Unreal Engine 4 (UE4) offers a powerful set of VR development tools. With UE4, you can build VR apps that will work on a variety of VR platforms, e.g., Oculus, Sony, Samsung Gear VR, Android, iOS, Google VR, etc.
• 5. CRYENGINE
• Well-known to 3D game developers, CRYENGINE is a robust choice for a VR software development tool. You can build virtual reality apps with it that will work with popular VR platforms like Oculus Rift, PlayStation 4, Xbox One, etc.
• 6. Blender
• Blender is an open-source 3D creation suite, and it's free. At the time of writing, Blender 2.80 is its latest release. The Blender Foundation, an independent organization for public benefit, governs the development of Blender.
• 7. 3ds Max
• 3ds Max is a popular 3D modeling and rendering software from Autodesk, and
you can use it for design visualization, creation of video games, etc. 
• 8. SketchUp Studio
• SketchUp Studio is a powerful 3D modeling tool focused on the construction
industry and architecture, and you can use it for virtual reality app development.
It’s useful for use cases like architecture, commercial interior design, landscape
architecture, residential construction, 3D printing, and urban planning.
• 9. Maya
• Maya is yet another VR software development tool from Autodesk. With Maya, you can create 3D animations, motion graphics, and VFX.
• 10. Oculus Medium
• Oculus, the well-known provider of VR platforms like Oculus Rift S, Oculus
Quest, and Oculus Go also offers powerful VR development software, named 
Medium. It’s a comprehensive tool, which allows you to create 3D assets. 
• Applications of Virtual Reality:

•VR has found vast applications in many fields due to its characteristics and the benefits it provides in solving complex real-world problems. Some of the application areas include:

•Architecture

•Arts

•Business

•Design and Planning

•Education and Training

•Entertainment

•Manufacturing

•Medical and Scientific Visualization.


Input Devices:
3D position trackers,
Navigation and manipulation,
Interfaces and gesture interfaces

This topic describes the VR interfaces used in tracking, VR navigation, and gesture input. Output devices for visual, auditory, and haptic feedback to the user are the focus of the next module.
Input Devices
• One of the three I's defining virtual reality stands for
interactivity.
• In order to allow human-computer interaction it is necessary
to use special interfaces designed to input a user's commands
into the computer and to provide feedback from the simulation
to the user
• Today's VR interfaces are varied in functionality and purpose. For example, body motion is measured with 3D position trackers or sensing suits, hand gestures are digitized by sensing gloves, visual feedback is sent to stereo HMDs and large-volume displays, and virtual sound is computed by dedicated 3D sound hardware.
• Some of these input/output (I/O) devices are commercially
available, some are still prototypes in a field which has
become a very active research area.
• The aim of researchers is to allow faster and more natural
ways of interaction with the computer and thus overcome the
communication bottleneck presented by the keyboard and the
computer mouse.
THREE-DIMENSIONAL POSITION TRACKERS
• Definition for tracker: The special-purpose hardware used in VR to measure the real-time
change in a 3D object position and orientation is called a tracker.

• Many computer application domains, such as navigation, missile tracking, robotics, biomechanics, architecture, computer-aided design (CAD), education, and VR, require knowledge of the real-time position and orientation of moving objects within some frame of reference.
A moving object in 3D space has six degrees of freedom: three for translations and three for rotations. If a Cartesian coordinate system is attached to the moving object (as illustrated in Fig.), then its translations are along the X, Y, and Z axes. Object rotations about these axes are called yaw, pitch, and roll, respectively. Together these define a dataset of six numbers that the tracker needs to measure in real time.
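The six-number dataset can be sketched as a small structure, and applying one of the rotation angles shows how a rotation about a single axis transforms a point (a Python sketch; the axis naming follows the text above, and all values are illustrative):

```python
import math
from dataclasses import dataclass

@dataclass
class Pose6DOF:
    """The six numbers a 3D tracker reports: translations along
    X, Y, Z (metres) and rotations (radians) about those axes."""
    x: float
    y: float
    z: float
    yaw: float
    pitch: float
    roll: float

def rot_z(angle: float):
    """3x3 rotation matrix for a rotation about the Z axis."""
    c, s = math.cos(angle), math.sin(angle)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def apply(R, v):
    """Rotate vector v by matrix R."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

pose = Pose6DOF(x=0.1, y=0.0, z=1.5, yaw=0.0, pitch=0.0, roll=math.pi / 2)
# A 90-degree rotation about Z carries the X axis onto the Y axis.
print(apply(rot_z(pose.roll), [1.0, 0.0, 0.0]))
```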
• Virtual reality applications typically measure the motion of the
user's head, limbs or hands, for the purpose of view control,
locomotion, and object manipulation.

• In the case of the head-mounted display illustrated in Figure, the


tracker receiver is placed on the user's head, so that when the
posture of the head changes, so does the position of the receiver.

• The user's head motion is sampled by an electronic unit and sent


to a host computer. Then the computer uses the tracker data to
calculate a new viewing direction of the virtual scene and to
render an updated image.
• Tracker Performance Parameters

• All 3D trackers, regardless of the technology they use, share a number of very
important performance parameters, such as accuracy, jitter, drift, and latency. These
are illustrated in Figure.
• Tracker Accuracy - represents the difference between the object's actual 3D position and
that reported by tracker measurements.
• Jitter - Jitter is the change in tracker output when the tracked object is stationary. A tracker with high jitter reports noisy, fluctuating positions even though the object is not moving.
• Drift – Drift is the steady increase in tracker error with time. As time passes, the tracker inaccuracy grows, which makes its data useless; drift therefore needs to be reset periodically against a drift-free reference measurement.
• Latency - Latency is the delay between action and reaction. In the case of the 3D tracker, latency is the time between the change in object position/orientation and the time the sensor detects this change. Having low latency is crucial when using a head-mounted display for VR.
• Tracker update rate - It represents the number of measurements (datasets) that
the tracker reports every second.
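These parameters can be illustrated with synthetic tracker readings (all numbers below are made up for the example; a real evaluation would use logged sensor data):

```python
import statistics

# Synthetic readings (mm) from a tracker watching a STATIONARY
# object whose true position is x = 100.0 mm.
readings = [100.2, 99.9, 100.4, 100.1, 99.8, 100.3]
true_x = 100.0

# Accuracy error: difference between the actual position and the
# mean reported position.
accuracy_error = abs(statistics.mean(readings) - true_x)

# Jitter: spread of the reports around their own mean, even though
# the object is not moving.
jitter = statistics.pstdev(readings)

# Update rate: datasets reported per second, from the timestamps.
timestamps = [0.00, 0.01, 0.02, 0.03, 0.04, 0.05]  # seconds
update_rate = (len(timestamps) - 1) / (timestamps[-1] - timestamps[0])

print(accuracy_error, jitter, update_rate)  # update_rate ~100 Hz
```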
There are various types of trackers, classified by working principle:

• Mechanical Trackers

• Magnetic Trackers

• Ultrasonic Trackers

• Optical Trackers

• Hybrid Inertial Trackers


• 1.Mechanical Trackers
• Mechanical tracking operates by attaching linkages to the object
you wish to track. Those linkages have sensors at each of the joints
that report the angle between the linkages.
• Often this is done by placing a variable resistor (potentiometer) at the
joint and reading the voltage there. As the angle of the linkage changes, the
amount of resistance in the potentiometer changes and a corresponding change
in voltage (that you can measure) occurs.
• The voltage can then be used to determine the angle between linkages. This information, in combination with the angles between all other linkages in the system, can be used to compute the location and pose of the tracked object.
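The voltage-to-angle-to-pose pipeline above can be sketched in a few lines (the voltage range, angle range and link lengths are illustrative assumptions, not values from any particular device):

```python
import math

def voltage_to_angle(v: float, v_min: float = 0.0, v_max: float = 5.0,
                     angle_range: float = math.pi) -> float:
    """Map a potentiometer voltage linearly onto a joint angle (radians)."""
    return (v - v_min) / (v_max - v_min) * angle_range

def end_position(theta1: float, theta2: float,
                 l1: float = 0.3, l2: float = 0.25):
    """Planar forward kinematics of a two-link mechanical tracker arm."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

t1 = voltage_to_angle(2.5)   # mid-scale voltage -> 90 degrees
t2 = voltage_to_angle(0.0)   # 0 V -> 0 degrees
print(end_position(t1, t2))  # arm pointing straight "up" in the plane
```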
• 2. Magnetic Trackers
• Magnetic trackers use a fixed transmitter that generates low-frequency magnetic fields from three orthogonal coils. A small receiver, also containing three orthogonal coils, measures the fields it picks up; from these measurements the tracker computes the receiver's position and orientation relative to the transmitter.
• Magnetic trackers need no line of sight, but they are sensitive to metallic objects and other sources of magnetic distortion in the room.
• 3. Ultrasonic Trackers
• Ultrasonic trackers use sound waves above the range of human hearing. A transmitter emits ultrasonic pulses and a set of microphones measures their time of flight; since the speed of sound is known, the resulting distances can be triangulated into the position of the tracked object.
• Ultrasonic trackers require a clear line of sight between the emitter and the microphones.
4. Optical Trackers
• Optical tracking is a means of determining in real time the position of an object by tracking the positions of either active or passive infrared markers attached to the object.
• The position of the point of reflection is determined using a camera
system.
• Optical tracking is a 3D localization technology based
on monitoring a defined measurement space using two or
more cameras.
• Each camera is equipped with an infrared (IR) pass filter in front of
the lens, and a ring of IR LEDs around the lens to periodically
illuminate the measurement space with IR light.
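A minimal sketch of how two cameras localize a marker, using the classic parallel-camera triangulation formula Z = f·b/d (the focal length, baseline, and pixel coordinates below are hypothetical):

```python
# Depth of an IR marker seen by two parallel cameras (stereo triangulation).
def marker_depth(focal_px: float, baseline_m: float,
                 u_left_px: float, u_right_px: float) -> float:
    """Z = f * b / disparity, the standard two-camera triangulation formula."""
    disparity = u_left_px - u_right_px
    if disparity <= 0:
        raise ValueError("marker must appear further left in the left image")
    return focal_px * baseline_m / disparity

# Hypothetical rig: 800 px focal length, 20 cm baseline; the marker is
# seen at x = 420 px in the left image and x = 380 px in the right.
z = marker_depth(800.0, 0.20, 420.0, 380.0)
print(f"marker is {z:.2f} m from the cameras")
```

The larger the disparity between the two images, the closer the marker; real optical trackers repeat this for every marker and camera pair, many times per second.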
5. Hybrid Inertial Trackers
Inertial trackers measure angular rate with gyroscopes and acceleration with accelerometers; integrating these signals yields orientation and position at a very high update rate, but the integration accumulates drift.
Hybrid inertial trackers therefore fuse the inertial data with a second, drift-free technology (ultrasonic, optical or magnetic) that periodically corrects the accumulated error.
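One common fusion scheme is a complementary filter, shown here as an illustrative sketch rather than any particular product's algorithm: the gyroscope's fast updates dominate, while a drift-free reference (e.g. an optical or accelerometer fix) continually pulls the estimate back:

```python
# Complementary filter: fuse a fast-but-drifting gyroscope with a
# slow-but-absolute reference measurement of the same angle.
def fuse(angle_prev: float, gyro_rate: float,
         reference_angle: float, dt: float, alpha: float = 0.98) -> float:
    gyro_estimate = angle_prev + gyro_rate * dt  # integrate angular rate
    return alpha * gyro_estimate + (1 - alpha) * reference_angle

angle = 0.0
# The gyro wrongly reports a small constant rate (drift bias);
# the true angle, as seen by the reference, stays at 0.
for _ in range(1000):
    angle = fuse(angle, gyro_rate=0.01, reference_angle=0.0, dt=0.01)

print(f"after 10 s the fused estimate is {angle:.4f} rad")
```

Pure integration of the same biased gyro would have drifted to 0.1 rad; the reference holds the fused estimate close to zero.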
Interfaces and gesture interfaces
• Trackballs and mice have the advantage of simplicity, compactness, and quiet operation. By their nature they limit the freedom of motion of the user's hand to a small area close to the desk.
• Therefore the natural motion of the user's hand is sacrificed, and interaction with the virtual world is less natural.
• In order to allow large gesture-based interactions with the VR simulation, the I/O devices must preserve the hand's freedom of motion.
• It is also desirable to allow additional degrees of
freedom by sensing individual finger motions. The
human fingers have degrees of freedom associated
with flexion-extension and lateral abduction-
adduction, as illustrated in Figure
• Additionally, the thumb has an anteposition-
retroposition motion, which brings it in opposition to
the palm.
• Definition: Gesture interfaces are devices that measure the real-time position of the user's fingers (and sometimes the wrist) in order to allow gesture-based interaction. (Figure: Terminology of hand and finger motions.)
1. The Pinch Glove: Most sensing gloves share several drawbacks: they need user-specific calibration, and they are complex and expensive. Each person has a different hand size, with women generally having smaller hands than men; as a consequence, the glove-embedded sensors will overlap different finger locations for different users. To reduce inaccuracies, most sensing gloves need to be calibrated to the particular user wearing them: users place their hands in predetermined gestures, the sensor output is measured, and these raw values are then processed using glove-specific algorithms. The Pinch Glove sidesteps calibration entirely by sensing only contacts between conductive pads on the fingertips.
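The per-user calibration described above can be sketched as a simple linear mapping built from two known calibration gestures (raw sensor values are hypothetical; real gloves use glove-specific algorithms):

```python
# Per-user glove calibration: map raw bend-sensor readings to joint
# angles using two calibration poses (flat hand = 0 deg, fist = 90 deg).
def make_calibration(raw_flat: float, raw_fist: float,
                     angle_fist_deg: float = 90.0):
    """Return a function converting one sensor's raw reading to degrees."""
    span = raw_fist - raw_flat
    return lambda raw: (raw - raw_flat) / span * angle_fist_deg

# Hypothetical raw values recorded for one user's index-finger sensor.
to_deg = make_calibration(raw_flat=112.0, raw_fist=712.0)
print(to_deg(412.0))  # half-way between the two poses -> 45.0 degrees
```

Because `raw_flat` and `raw_fist` differ from user to user, the mapping must be rebuilt whenever a new user puts on the glove, which is exactly the inconvenience the Pinch Glove avoids.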
• Advantages of the Pinch Glove: The Pinch Glove has numerous advantages in terms of its simplicity, the lack of a need for calibration, and the possibility of using both hands for gesture interaction.

• Limitations of Pinch glove: The glove cannot measure intermediary finger configurations
(finger joint angles).
2. The 5DT Data Glove: Sensing gloves need to measure the finger joint angles. The approach of the 5DT Data Glove 5W, illustrated in Figure, is to have one fiber-optic sensor per finger and a tilt sensor to measure wrist orientation. Additional sensors are available in the 5DT Data Glove 16 option.
The advantage of fiber-optic sensors is their compactness and lightness, and users feel very comfortable wearing the glove. The optical fibers are joined to an optoelectronic connector on the back of the hand. One end of each fiber loop is connected to an LED, while light returning from the other end is converted into an electrical signal; finger bending attenuates the light, and the attenuation is mapped to a joint angle.
3. The Didjiglove: Another sensing glove is the Didjiglove, which uses 10 capacitive bend sensors to measure the position of the user's fingers. The capacitive sensors measure the bending angle electrically. The glove has an A/D converter, a multiplexer, a processor, and an RS232 line for communication with the host computer.
The glove's minimal latency (10 msec) and low cost make the Didjiglove useful for VR interactions as well.
4. The CyberGlove: A more complex (and more expensive) sensing glove, which uses linear bend sensors, is the CyberGlove. This glove was invented by Jim Kramer. The fingertip material is removed for better ventilation and to allow normal activities such as typing and writing; as a result the glove is light and easy to wear. There are between 18 and 22 sensors in the glove, used to measure finger flexing (two or three per finger), abduction (one per finger), plus thumb anteposition, palm arch, and wrist yaw and pitch.
The glove allows a maximum of 150 datasets to be sent every second.

Tutorial videos

https://www.youtube.com/watch?v=NJwFG0EoS7E

https://www.youtube.com/watch?v=2C2_kbjtjRU

https://www.youtube.com/watch?v=OK2y4Z5IkZ0
