VR Summary
1. Mixed Reality (MR)
Definition:
o MR blends physical and virtual elements and spans the reality-virtuality
(RV) continuum.
o This continuum includes:
Augmented Reality (AR): Virtual objects overlaid on the real world.
Augmented Virtuality (AV): Real-world data integrated into virtual
environments.
Virtual Reality (VR): Fully immersive virtual environments where
real-world stimuli are minimized.
2. Classification Dimensions
Immersion:
o A technical, objective measure of how much of the user's sensory input
is supplied by the VR system (visuals, audio, haptics).
o Key factors:
High-resolution displays.
Wide field of view (FOV).
Spatialized audio.
Presence:
o The psychological state of "being there."
o Two main types:
Place Illusion (PI): Feeling of being in the virtual environment.
Plausibility Illusion (Psi): Belief that the environment reacts in a
realistic and predictable way.
4. Historical Milestones in VR
Pre-Modern Era:
o 1838: Sir Charles Wheatstone introduces stereoscopic displays.
o 1929: Edwin Link designs the Link Trainer, a flight simulator.
Modern Foundations:
o 1960s:
Ivan Sutherland’s "Sword of Damocles" (first HMD).
Morton Heilig’s "Sensorama" (early multimodal simulator).
o 1990s: Introduction of CAVE (Cave Automatic Virtual Environment).
o 2010s: Consumer-level VR with devices like Oculus Rift.
5. VR Technologies
Display Systems:
o HMDs (Head-Mounted Displays):
Provide stereoscopic visuals and are equipped with tracking
technologies.
Examples: Oculus Rift, HTC Vive.
o Projection-Based VR:
Large-scale immersive systems like CAVEs.
Input and Tracking:
o 6DOF (Degrees of Freedom) tracking for position and orientation.
o Tools include optical tracking, inertial sensors, and hybrid systems.
6. Challenges in VR
1. Latency:
o Delays between user actions and system response break immersion.
o Threshold: Latency should be below 20ms for comfortable usage.
2. Tracking Accuracy:
o Imprecise tracking disrupts user experience, especially in tasks requiring fine
control.
3. Cybersickness:
o Symptoms include nausea and dizziness due to sensory conflicts or lag.
o Solutions:
Lowering latency.
Minimizing rapid motion or large FOV changes.
4. Accessibility:
o Cost and hardware requirements remain barriers for widespread adoption.
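The 20 ms comfort threshold above can be put in context with the per-frame render budget at common display refresh rates; a minimal sketch (the function name is illustrative, not from the source):

```python
def frame_budget_ms(refresh_hz):
    """Time available to render one frame at a given refresh rate.
    Total motion-to-photon latency also includes tracking, transport,
    and display scan-out, so rendering gets only part of the ~20 ms
    comfort threshold."""
    return 1000.0 / refresh_hz

budget_90 = frame_budget_ms(90)    # ~11.1 ms at a common 90 Hz refresh
budget_120 = frame_budget_ms(120)  # ~8.3 ms at 120 Hz
```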
7. Applications of VR
2. Graphics Pipeline
o Shader Programs:
Vertex Shaders: Process each vertex (position, color, texture).
Fragment Shaders: Handle pixel-level details like lighting and
shadows.
3. Geometric Primitives
Triangles:
o Most commonly used due to simplicity and computational efficiency.
o Defined by:
Vertices (points in 3D space).
Edges (lines connecting vertices).
Normals (perpendicular vectors for lighting calculations).
Meshes:
o Collections of interconnected triangles forming complex 3D models.
4. Transformations
Types:
o Translation: Moves objects in the 3D space.
o Rotation: Spins objects around an axis.
o Scaling: Changes object size along specific axes.
Matrix Representation:
o Transformations use matrix multiplication for efficient computation.
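The three transformation types above can be expressed as 4x4 homogeneous matrices and composed by matrix multiplication; a minimal NumPy sketch (function names are illustrative):

```python
import numpy as np

def translation(tx, ty, tz):
    """4x4 homogeneous translation matrix."""
    m = np.eye(4)
    m[:3, 3] = [tx, ty, tz]
    return m

def scaling(sx, sy, sz):
    """4x4 scaling matrix along the principal axes."""
    return np.diag([sx, sy, sz, 1.0])

def rotation_z(theta):
    """4x4 rotation about the z-axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    m = np.eye(4)
    m[0, 0], m[0, 1] = c, -s
    m[1, 0], m[1, 1] = s, c
    return m

# Compose: scale, then rotate, then translate (applied right-to-left).
model = translation(1, 0, 0) @ rotation_z(np.pi / 2) @ scaling(2, 2, 2)
point = np.array([1.0, 0.0, 0.0, 1.0])   # homogeneous point
transformed = model @ point               # -> approx (1, 2, 0, 1)
```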
5. Shading Models
Flat Shading:
o Applies a single color per polygon.
o Pros: Fast computation.
o Cons: Unrealistic for curved surfaces.
Gouraud Shading:
o Interpolates colors across vertices.
o Pros: Smooth transitions.
o Cons: Inaccurate for highlights.
Phong Shading:
o Interpolates normals per pixel for precise lighting.
o Pros: High realism, handles specular highlights well.
o Cons: Higher computational cost than Gouraud shading.
6. Lighting Models
Ambient Lighting:
o Provides uniform light across the scene.
Diffuse Lighting:
o Depends on surface orientation relative to the light source.
Specular Lighting:
o Creates shiny reflections, adding realism.
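The three terms combine in the classic ambient + diffuse + specular (Phong reflection) model; a hedged scalar-intensity sketch (the coefficients are illustrative defaults):

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def phong_lighting(normal, light_dir, view_dir,
                   ambient=0.1, diffuse_k=0.7, specular_k=0.2, shininess=32):
    """Scalar intensity from ambient + diffuse + specular terms.
    All direction vectors point away from the surface point."""
    n = normalize(normal)
    l = normalize(light_dir)
    v = normalize(view_dir)
    # Diffuse: Lambert's cosine law, clamped at zero for back-facing light.
    diff = max(np.dot(n, l), 0.0)
    # Specular: reflect the light direction about the normal.
    r = 2.0 * np.dot(n, l) * n - l
    spec = max(np.dot(r, v), 0.0) ** shininess if diff > 0 else 0.0
    return ambient + diffuse_k * diff + specular_k * spec
```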
7. Textures
Purpose:
o Adds surface details like patterns or color variations.
UV Mapping:
o A process for mapping 2D images (textures) onto 3D models.
Bump Mapping:
o Simulates surface irregularities without modifying the geometry.
Frustum Culling:
o Ensures only objects within the camera’s view are rendered.
Backface Culling:
o Ignores surfaces facing away from the camera.
Level of Detail (LOD):
o Dynamically adjusts model complexity based on distance.
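Backface culling and LOD selection can each be sketched in a few lines (the LOD distance thresholds are hypothetical values):

```python
import numpy as np

def is_backfacing(face_normal, camera_pos, face_point):
    """A face can be skipped when its normal points away from the camera."""
    view_vec = camera_pos - face_point
    return np.dot(face_normal, view_vec) <= 0.0

def pick_lod(distance, thresholds=(10.0, 30.0)):
    """Hypothetical LOD schedule: 0 = full detail, higher = coarser mesh."""
    for level, limit in enumerate(thresholds):
        if distance < limit:
            return level
    return len(thresholds)
```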
Scientific Visualization:
o Use Case: Visualizing complex physical or biological systems (e.g., molecular
structures, fluid dynamics).
o Benefits: Provides intuitive understanding of data, accelerates scientific
discovery.
Information Visualization:
o Focuses on abstract data (e.g., financial trends, network connections).
o Tools like heat maps, scatter plots, and parallel coordinates help users extract
insights.
Therapeutic Applications:
o Phobia Treatment:
Virtual Reality Exposure Therapy (VRET) gradually exposes patients
to their fears.
o Pain Management:
VR distracts patients during painful procedures, such as burn
treatments.
o PTSD Rehabilitation:
Simulates scenarios for stress inoculation and coping mechanisms.
Training:
o Surgeons and medical students use VR to practice procedures in a risk-free
environment.
Military:
o Combat scenarios for training soldiers without physical danger.
o Examples include flight simulators and virtual shooting ranges.
Aviation:
o VR flight simulators replicate aircraft controls and navigation.
Safety Training:
o Realistic simulations of emergency scenarios (e.g., fire drills, earthquake
responses).
5. Education
Interactive Learning:
o Immersive environments for subjects like STEM education or history.
Virtual Field Trips:
o Provides students access to inaccessible locations like outer space or ancient
ruins.
6. Cultural Heritage
Historical Reconstructions:
o VR recreates lost or damaged artifacts and environments.
o Allows users to explore ancient civilizations interactively.
Gaming:
o AAA VR titles (e.g., Skyrim VR) push the boundaries of interactivity and
storytelling.
o Indie games (e.g., Beat Saber) focus on creativity and engagement.
Interactive Art:
o VR installations enable users to experience art in dynamic ways, often
involving user input to modify the artwork.
8. Industrial Design
Prototyping:
o Enables designers to visualize and refine products before manufacturing.
o Reduces costs and accelerates development cycles.
Haptic Integration:
o Allows evaluation of ergonomics and usability of controls or interfaces.
9. Tourism
Virtual Tourism:
o Explores remote locations through photorealistic reconstructions.
o Examples: Google Earth VR, museum tours.
Projection Displays:
o Small Scale: Powerwalls and HoloBenches for detailed visualizations.
o Large Scale:
CAVE (Cave Automatic Virtual Environment):
Multi-screen immersive environments that surround the user.
Use synchronized projectors and stereoscopic glasses.
Domes: Provide a 360° field of view; common in planetariums.
o Challenges:
Calibration across screens.
Perspective distortion for multiple viewers.
Head-Mounted Displays (HMDs):
o Wired HMDs:
Connected to PCs or consoles for high-fidelity rendering.
Examples: HTC Vive, Valve Index.
Pros: High resolution, wide field of view.
Cons: Mobility constraints due to cables.
o Mobile HMDs:
Standalone devices like Oculus Quest.
Provide 6DOF tracking and real-time processing.
o Mixed Reality HMDs:
Optical See-Through: Combines digital overlays with the real world
using transparent screens (e.g., Microsoft HoloLens).
Video See-Through: Captures the real world and overlays it with
virtual content.
Autostereoscopic Displays:
o Deliver 3D visuals without the need for glasses.
o Technologies include lenticular lenses and light field displays.
Purpose:
o Provide tactile feedback, simulating textures, resistance, or force.
Types:
o Tactile Feedback: Simulates surface textures.
o Force Feedback:
Used in gloves or exoskeletons for resistance simulation.
Applications: Medical training, object manipulation.
Vestibular Displays:
o Simulate motion and balance changes to enhance immersion.
o Examples: Motion platforms for driving and flight simulators.
Olfactory Displays:
o Deliver scents to enrich sensory experiences.
o Applications: Tourism, food marketing, therapeutic environments.
Taste Displays:
o Simulate taste sensations using electrical stimulation.
o Example: Projects like "Project Nourished" for food experiences in VR.
Key Characteristics:
o Static Accuracy: Precision for stationary measurements.
o Dynamic Accuracy: Accuracy during motion.
o Latency: Delay between user movement and system response.
Types of Tracking:
1. Mechanical:
Uses rigid linkages for tracking.
Pros: High precision.
Cons: Limited range of motion.
2. Magnetic:
Tracks positions using magnetic fields.
Pros: No line-of-sight requirement.
Cons: Interference from metal objects.
3. Optical:
Cameras track markers or LEDs.
Examples: Lighthouse tracking (HTC Vive), SLAM (Simultaneous
Localization and Mapping).
Pros: High accuracy.
Cons: Requires clear line of sight.
4. Inertial:
Relies on accelerometers and gyroscopes.
Pros: Standalone, works in any environment.
Cons: Prone to drift and cumulative errors.
5. Acoustic:
Tracks sound wave propagation.
Cons: Susceptible to noise interference and latency.
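The drift problem of inertial tracking follows directly from double integration: a constant accelerometer bias grows quadratically in position. A small sketch (sample rate and bias values are illustrative):

```python
def integrate_position(accel_samples, dt):
    """Naive dead reckoning: double-integrate acceleration to position."""
    velocity, position = 0.0, 0.0
    for a in accel_samples:
        velocity += a * dt
        position += velocity * dt
    return position

# A stationary headset with a small accelerometer bias of 0.01 m/s^2:
dt = 0.01                      # 100 Hz sample rate
samples = [0.01] * 1000        # 10 seconds of biased "zero" motion
drift = integrate_position(samples, dt)   # ~0.5 m of position error
```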
2. Input Devices
Standard Controllers:
o Include buttons, triggers, and joysticks.
o Examples: SteamVR controllers, Oculus Touch.
Advanced Controllers:
o Include features like pressure sensitivity and capacitive touch (e.g., Valve
Index Knuckles).
Data Gloves:
o Capture hand and finger movements.
o Types:
Strain Gauge Gloves: Measure resistance changes as fingers bend.
Optical Gloves: Use fiber optics for precise tracking.
o Applications: Object manipulation, sign language recognition.
Treadmills:
o Provide a continuous walking surface (e.g., Infinadeck).
Slidemills:
o Users slide their feet on low-friction surfaces (e.g., Virtuix Omni).
Spherical Devices:
o Users move inside a sphere to simulate walking (e.g., VirtuSphere).
Comprehensive Summary of VR-06: Stereoscopy and
Perception
1. Stereoscopic Vision
Concept:
o Combines images from two perspectives (left and right eyes) to create a 3D
effect.
Key Elements:
o Parallax:
Positive Parallax: Objects appear behind the screen.
Negative Parallax: Objects appear in front of the screen.
o Binocular Disparity:
The difference in views between the two eyes, essential for depth
perception.
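Binocular disparity maps to depth through the standard pinhole stereo relation Z = f * B / d; a minimal sketch (the parameter values in the usage line are illustrative):

```python
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Standard pinhole stereo relation: Z = f * B / d.
    Larger disparity between the two eye images means a closer object."""
    if disparity_px <= 0:
        raise ValueError("point at or beyond infinity")
    return focal_length_px * baseline_m / disparity_px

# e.g. 1000 px focal length, 64 mm eye separation, 32 px disparity:
depth = depth_from_disparity(1000, 0.064, 32)   # -> 2.0 m
```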
2. Depth Perception
Monocular Cues:
o Texture gradients, shadowing, motion parallax.
Binocular Cues:
o Disparity and convergence.
Challenges:
o VR often distorts perceived distances; users tend to underestimate
how far away virtual objects are.
3. Cybersickness
Causes:
o Conflicts between visual and vestibular systems.
o Latency or inconsistent motion.
Mitigation:
o Reduce field of view during motion.
o Use teleportation instead of continuous movement.
4. Accommodation-Vergence Conflict
Problem:
o In VR, the focal distance (accommodation) and convergence (eye alignment)
are mismatched.
Effects:
o Causes visual fatigue or discomfort.
Solutions:
o Use varifocal displays or light field displays.
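The conflict can be quantified: the vergence angle varies with the fixated object's distance while the HMD's focal plane stays fixed. A sketch (the 2 m focal distance and 63 mm IPD are illustrative values):

```python
import math

def vergence_angle_deg(ipd_m, distance_m):
    """Vergence angle for eyes separated by ipd_m fixating a point at
    distance_m: theta = 2 * atan(ipd / (2 * d))."""
    return math.degrees(2 * math.atan(ipd_m / (2 * distance_m)))

# Accommodation stays locked to the (assumed) 2 m focal plane, but
# vergence changes with the virtual content's depth:
near = vergence_angle_deg(0.063, 0.5)   # object at 0.5 m, ~7.2 degrees
far = vergence_angle_deg(0.063, 2.0)    # matches focal plane, ~1.8 degrees
```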
Abstract Interaction:
o Indirect manipulation techniques, such as using controllers or menus.
Natural Interaction:
o Direct interaction mimicking real-world actions (e.g., hand gestures, voice
commands).
2. Interaction Techniques
Selection:
o Ray-Casting:
Projects a virtual ray from the user’s controller or gaze to select
objects.
Pros: Effective for distant objects.
Cons: Difficult to select small or crowded objects.
o Virtual Hand:
Simulates a hand interacting with objects.
Pros: Intuitive for close objects.
Cons: Ineffective for distant selections.
o Gaze-Based Selection:
Uses eye or head tracking to select objects by looking at them.
Challenges: "Midas Touch" problem (unintentional selections).
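Ray-casting selection reduces to ray-primitive intersection tests; a minimal ray-sphere test against an object's bounding sphere (a common simplification, not prescribed by the source):

```python
import numpy as np

def ray_hits_sphere(origin, direction, center, radius):
    """Does the controller ray hit a spherical bounding volume?
    Solves |o + t*d - c|^2 = r^2 for t >= 0."""
    d = direction / np.linalg.norm(direction)
    oc = origin - center
    b = np.dot(oc, d)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return False                       # ray misses the sphere
    sqrt_disc = np.sqrt(disc)
    # Hit only if at least one intersection lies in front of the origin.
    return (-b - sqrt_disc) >= 0 or (-b + sqrt_disc) >= 0
```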
Manipulation:
o Techniques like HOMER combine ray-casting for selection and virtual hand
for manipulation.
o Scaling and rotation allow users to adjust object dimensions and orientation.
Gestures:
o Recognized through devices like Leap Motion.
o Types:
Static Gestures: Pose-based interactions.
Dynamic Gestures: Movement-based commands.
o Challenges: User fatigue during prolonged use.
3. Navigation
Wayfinding Aids:
o Breadcrumbs, minimaps, landmarks to guide users.
Techniques:
o Fly-Through:
Continuous movement in 3D space.
Cons: High potential for motion sickness.
o Teleportation:
Instantly move to a target location.
Reduces cybersickness, but may break immersion.
o Redirected Walking:
Adjusts the virtual path subtly to fit the real-world space.
o Scaling Navigation:
Resizes the user or environment to traverse large spaces.
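Redirected walking is typically implemented by applying gains to the user's tracked motion; a minimal rotation-gain sketch (the 1.2 gain is an illustrative value, not a standard threshold):

```python
import math

def redirect_rotation(user_yaw_delta_rad, gain=1.2):
    """Apply a rotation gain: the virtual camera turns `gain` times the
    user's physical head rotation. Gains modestly above or below 1.0
    are hard to notice and gradually steer the user's real-world path."""
    return user_yaw_delta_rad * gain

# A user who physically turns 90 degrees sees a 108-degree virtual turn,
# leaving 18 degrees of real-world redirection.
virtual_turn = redirect_rotation(math.radians(90))
```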
4. Comfort Modes
Purpose:
o Translates data into visual representations to enhance comprehension and
decision-making.
Types of Visualization:
o Scientific Visualization:
Focuses on physical phenomena like weather patterns or biological
processes.
Emphasizes accuracy and detail.
o Information Visualization:
Abstract data such as financial trends or social networks.
Tools: Heat maps, scatter plots.
2. Visualization Challenges
3. Immersive Analytics
Definition:
o Combines virtual or augmented environments with analytics tools.
o Example: A 3D heat map visualized in a VR environment.
Techniques:
o Linked Views:
Synchronizes multiple visualizations to provide context.
Example: A 2D chart linked with a 3D representation.
o Globe Visualization:
Projects geospatial data onto a 3D sphere for intuitive exploration.
Applications:
o Logistics optimization, disaster response planning, collaborative data analysis.
2. Challenges
Latency:
o Delays between input and output disrupt synchronization and immersion.
Bandwidth:
o High data transmission for audio, video, and tracking.
Scalability:
o Managing large numbers of users without performance degradation.
3. Design Approaches
Database Topologies:
o Centralized: A single server manages data.
o Distributed: Data is shared across multiple servers.
Connection Topologies:
o Peer-to-peer: Direct connections among users.
o Hybrid: Combines centralized and peer-to-peer approaches.
4. Example Systems
SIMNET:
o Early military simulator using distributed architecture.
MASSIVE:
o Introduced the concept of auras for efficient interaction zones.
2. Measuring Presence
3. Evaluation Guidelines
What is HOMER?
HOMER stands for Hand-centered Object Manipulation Extending Ray-casting. It is a
hybrid interaction technique designed to enable intuitive selection and manipulation of
objects in a virtual environment, especially when objects are beyond the user’s immediate
reach.
Core Principles
Workflow
1. Object Selection:
o A ray is emitted from the user’s input device (e.g., controller or hand).
o The ray intersects with an object, and the system highlights it to confirm
selection.
2. Object Attachment:
o The selected object is virtually "grabbed" by a virtual hand that appears at the
ray’s endpoint.
3. Object Manipulation:
o Users can move the object by repositioning the controller or hand.
o Rotations are achieved by orienting the input device.
o Scaling may involve gestures or specific controller inputs.
4. Object Release:
o The object is "dropped" by terminating the interaction, typically via a button
press or gesture.
Advantages of HOMER
1. Natural Interaction:
o Combines the familiarity of ray-casting (for selection) and direct manipulation
(via virtual hand).
o Supports a wide range of object sizes and distances.
2. Scalable:
o Works seamlessly in small or large virtual environments.
o No physical reach constraints due to the extension capability.
3. Precision:
o Offers high precision for both distant and nearby objects.
o The dynamic scaling ensures the user can fine-tune object placement.
Challenges
1. Occlusion:
o Objects can be obscured if multiple items are along the ray path.
o Requires efficient selection mechanisms like filtering or prioritization.
2. Complexity:
o Initial learning curve for users unfamiliar with ray-casting or virtual hand
manipulation.
3. Physical Fatigue:
o Extended use of ray-casting or virtual hand gestures can tire users, especially
in prolonged VR sessions.
Applications
Gaming:
o Picking up and manipulating objects in adventure or puzzle games.
Industrial Design:
o Assembling virtual prototypes or adjusting components in a 3D workspace.
Education and Training:
o Teaching physics concepts or performing tasks like surgical training with
distant elements.
Architecture and Interior Design:
o Placing and resizing furniture or modifying structural elements.
HOMER exemplifies the balance between natural and abstract interaction techniques,
making it a versatile tool in virtual environments. It enables users to interact intuitively with
both near and far objects, enhancing usability and immersion in VR applications.
3. Sensory Fidelity:
o Realistic replication of sensory inputs (e.g., high-resolution visuals,
spatial audio).
4. Latency:
o Lower latency enhances the sense of presence by reducing the
disconnect between user actions and system response.
3. Use of VR in Psychology
Therapeutic Applications:
o Phobia Treatment: Virtual Reality Exposure Therapy (VRET) for
desensitization (e.g., heights, flying).
o Rehabilitation: Stroke patients use VR to improve motor skills.
Applications:
o Used for precise tracking in VR for headsets, controllers, and
full-body motion capture.
Limitations:
o Requires well-lit environments and clear line of sight.
How It Works:
o Combines ray-casting for selection and virtual hand techniques for
manipulation.
o Once the object is selected, a virtual hand is used to manipulate it,
even at a distance.
Advantages:
o Intuitive for both close and distant object interactions.
Challenges:
o Struggles with small or very distant objects due to limited precision.
Teleporting:
o Moves users instantly to a location; reduces motion sickness
compared to continuous movement.
Redirected Walking:
o Subtly alters the virtual path to make users believe they are walking
straight in a constrained space.
Evaluation Tools
NASA-TLX:
o Measures workload through six subscales (e.g., mental demand,
physical demand).
System Usability Scale (SUS):
o Rates usability based on user feedback across ten questions.
Presence Questionnaires (PQ):
o Evaluates the user’s sense of presence during the VR experience.
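SUS scoring is mechanical enough to show in a few lines (this follows the standard scoring rule; the function name is illustrative):

```python
def sus_score(responses):
    """System Usability Scale: ten items rated 1-5.
    Odd-numbered items (positive wording): score = response - 1.
    Even-numbered items (negative wording): score = 5 - response.
    The sum is multiplied by 2.5, giving a 0-100 scale."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses in the range 1-5")
    total = 0
    for i, r in enumerate(responses):
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5
```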
Linked Views:
o Synchronize multiple visualizations for real-time comparative
analysis.