VR Patterns
The Monster Fun Book of Patterns for Developing Virtual Reality Applications
With examples using Unity software
Shaun Bangay
ISBN: 9781729142400
Contents
2.4.3 Example . . . . . . . . . . . . . . . . . 40
2.5 Quaternions: Representing Orientation . . . 43
2.5.1 Description . . . . . . . . . . . . . . . 43
2.5.2 Pattern . . . . . . . . . . . . . . . . . . 45
2.5.3 Example . . . . . . . . . . . . . . . . . 45
2.6 Where is everybody? Transformations, cameras and coordinate systems . . . 47
2.6.1 Description . . . . . . . . . . . . . . . 47
2.6.2 Pattern . . . . . . . . . . . . . . . . . . 51
2.6.3 Example . . . . . . . . . . . . . . . . . 51
2.7 The Game Loop: Frame by Frame . . . . . . 59
2.7.1 Description . . . . . . . . . . . . . . . 59
2.7.2 Pattern . . . . . . . . . . . . . . . . . . 59
2.7.3 Example . . . . . . . . . . . . . . . . . 60
2.8 Having children . . . . . . . . . . . . . . . . . 63
2.8.1 Description . . . . . . . . . . . . . . . 63
2.8.2 Pattern . . . . . . . . . . . . . . . . . . 64
2.8.3 Example . . . . . . . . . . . . . . . . . 66
2.9 Who’s the boss: Managing Managers . . . . 69
2.9.1 Description . . . . . . . . . . . . . . . 69
2.9.2 Pattern . . . . . . . . . . . . . . . . . . 69
2.9.3 Example . . . . . . . . . . . . . . . . . 71
2.10 Exercises . . . . . . . . . . . . . . . . . . . . . 75
3 Algorithmic Twiddles 79
3.1 All The Programming Skills You Will Ever Need . . . 79
3.2 Variable Assignment . . . . . . . . . . . . . . 80
3.2.1 Description . . . . . . . . . . . . . . . 80
3.2.2 Pattern . . . . . . . . . . . . . . . . . . 81
3.2.3 Example . . . . . . . . . . . . . . . . . 82
3.3 Conditionals . . . . . . . . . . . . . . . . . . . 90
3.3.1 Description . . . . . . . . . . . . . . . 90
3.3.2 Pattern . . . . . . . . . . . . . . . . . . 91
3.3.3 Example . . . . . . . . . . . . . . . . . 93
3.4 Iteration . . . . . . . . . . . . . . . . . . . . . 95
3.4.1 Description . . . . . . . . . . . . . . . 95
3.4.2 Pattern . . . . . . . . . . . . . . . . . . 96
3.4.3 Example . . . . . . . . . . . . . . . . . 97
3.5 Abstraction . . . . . . . . . . . . . . . . . . . . 104
3.5.1 Description . . . . . . . . . . . . . . . 104
3.5.2 Pattern . . . . . . . . . . . . . . . . . . 105
3.5.3 Example . . . . . . . . . . . . . . . . . 106
3.6 Collections . . . . . . . . . . . . . . . . . . . . 115
3.6.1 Description . . . . . . . . . . . . . . . 115
3.6.2 Pattern . . . . . . . . . . . . . . . . . . 120
3.6.3 Example . . . . . . . . . . . . . . . . . 123
3.7 Fading with distance: Visibility, Scope and Lifetime . . . 126
3.7.1 Description . . . . . . . . . . . . . . . 126
3.7.2 Pattern . . . . . . . . . . . . . . . . . . 128
3.7.3 Example . . . . . . . . . . . . . . . . . 129
3.8 Recursion or I’ve seen this somewhere before 131
3.8.1 Description . . . . . . . . . . . . . . . 131
3.8.2 Pattern . . . . . . . . . . . . . . . . . . 134
3.8.3 Example . . . . . . . . . . . . . . . . . 135
12 Resources 365
13 Bibliography 369
1.1.1 Description
A virtual reality application incorporates most of the
following components:
1.1.2 Pattern
The core of the virtual reality application is the scene
component. This provides:
Figure 1.1.1: Typical structure of virtual reality applications.

Some activities in a virtual world are not associated with
any particular object but rather require consideration of
information from multiple sources at once, or at a global
level. These are often handled by managers. Managers
may also register with the scene graph (to obtain some
processing time) but often have an invisible representation.
Managers are responsible for tasks such as collision
detection, physics simulation, or network synchronization.
1.2 Manipulation
1.2.1 Description
One view of virtual reality is that it is a user interface (or
user experience) paradigm. If that is the case then the
key role of a virtual reality experience is to provide a way
to allow users to manipulate computational constructs
in order to achieve particular goals. For example, a vir-
tual reality experience is just an interface to allow a game
to be played more effectively, to allow data to be visu-
alized and interacted with directly, or to manage some
underlying software process or physical system mediated
through the virtual reality overlay.
The ways of interacting with virtual worlds have not
yet reached a stage of being completely standardized.
There are some benefits in standardization; users can
transfer skills to new applications and become immediately
productive. However, one of the benefits of the various forms
of virtual reality, and of the range of hardware available to
support them, is that novel forms of
manipulation are still being developed. A more valu-
able skill at present is to be able to conceive of, design,
develop and evaluate forms of interaction with, and ma-
nipulation of, virtual worlds that make most effective
use of the technology in the specific context of the virtual
reality application that you are developing.
The patterns described represent some of the ways
in which manipulation strategies of virtual worlds have
been designed.¹

¹ Some sources of guidelines for designing virtual reality
interfaces: Oculus: https://ocul.us/2R9YR7V

1.2.2 Pattern

Typical manipulation operations in a virtual world involve
selecting items, or invoking actions on items that may
have been selected. A range of patterns exist to perform
such manipulation.
Manipulation pattern: Action at a distance. The con-
troller is able to make things happen at distances beyond
1.3 Locomotion
1.3.1 Description
Movement around a virtual world takes on different
forms, often constrained by the space available in the
physical setting and on the mobility of the participant
encumbered with the virtual reality equipment being
used. Locomotion is also associated with many of the
hazards of using virtual reality applications, including
motion sickness, and opportunities for physical injury.
As with manipulation of virtual environments, the
patterns below represent some of the known design
strategies for achieving effective, safe, and comfortable
motion.²

² Some sources of guidelines for designing virtual reality
locomotion: https://ocul.us/2Ap3xBB

1.3.2 Pattern

Humans can sense forces. Unbalanced forces manifest
2.1.1 Description
A virtual world consists of a collection of virtual objects.
All these objects co-exist in a common space; the scene.
Objects also have relationships to one another. For
example some objects may be attached to (or parts of)
other objects. As such, when the parent object moves, the
attached child objects also change location.
All objects have properties. All objects typically have
information about their location (including position,
orientation and size). Most objects have a visual repres-
entation of some kind to define their shape. Some objects
may have additional components defining extra prop-
erties; such as how they respond to the physics of the
virtual world, or specific behaviors unique to objects of
that type.
The scene graph provides a way of representing all
these details in a consistent and systematic fashion. As
a side effect it provides a way of addressing every single
piece of information stored in the graph i.e. it allows you
to easily name each piece of information and to modify
this information by referring to it by this name.
A graph consists of a number of nodes connected by
edges. Graphically this is drawn as a number of circles
(the nodes), connected by lines (the edges). A common
type of graph is a tree where every node (apart from the
root node) is connected by edges to one parent node, and
to a number of child nodes. Most scene graphs have a
tree-like structure although occasional edges may exist
between two child nodes.
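The naming property described above can be sketched in a few lines of engine-neutral Python. This is an illustrative model, not any particular engine's API; the `Node` class and the `Avatar.Transform.Position.y` path mirror the addressing scheme discussed in the text.

```python
class Node:
    def __init__(self, name, **properties):
        self.name = name
        self.properties = dict(properties)   # leaf values, e.g. x, y, z
        self.children = {}                   # child nodes keyed by name

    def add(self, child):
        self.children[child.name] = child
        return child

    def lookup(self, path):
        """Resolve a dotted path such as 'Avatar.Transform.Position.y'."""
        node = self
        parts = path.split(".")
        for part in parts[:-1]:
            node = node.children[part]
        return node.properties[parts[-1]]

# Build a tiny scene: root -> Avatar -> Transform -> Position.
scene = Node("Scene")
avatar = scene.add(Node("Avatar"))
transform = avatar.add(Node("Transform"))
transform.add(Node("Position", x=0.0, y=1.8, z=0.0))

print(scene.lookup("Avatar.Transform.Position.y"))  # prints 1.8
```

Every piece of information in the graph is reachable by a unique dotted name, which is exactly the side effect the scene graph provides.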
2.1.2 Pattern
A scene graph consists of a tree-like graph where:
patterns for virtual reality 23
• The root node represents the scene. The root node
usually does not have a label.

• The child nodes of the root node are the objects in the
scene.

• Some scene graphs may have sub-component (a component
has several child components) or sub-property (a property
is a collection of several child properties) nodes.

Figure 2.1.2: Some of the different categories of
relationship between nodes in a scene graph.

In practice different VR engines may have specific
conventions for using a particular label. For example,
retrieving a node A after following an object-component
edge may require the name to be represented as
GetComponent (A). A complete path naming a single
property might read Avatar.Transform.Position.y.
2.2.1 Description
The word “object” has multiple meanings when it comes
to developing virtual reality systems. The different inter-
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class PropertyModifier : MonoBehaviour {

  // Use this for initialization
  void Start () {
    GameObject cubeObject = GameObject.Find ("Cube");
    GameObject sphereObject = GameObject.Find ("Sphere");
    GameObject cylinderObject = GameObject.Find ("Cylinder");

    // Move the cube 2 units to the left.
    cubeObject.transform.position = new Vector3 (-2.0f, 0.0f, 0.0f);
    // Move the sphere 3 units upwards.
    sphereObject.transform.position = new Vector3 (0.0f, 3.0f, 0.0f);
    // Set the cylinder material colour to red.
    cylinderObject.GetComponent<Renderer>().materials[0].color = new Color (1.0f, 0.0f, 0.0f);
    // Copy the shape of the cylinder to the shape of the cube.
    cubeObject.GetComponent<MeshFilter>().mesh = cylinderObject.GetComponent<MeshFilter>().mesh;
    // Rotate the cube object (which now looks like a cylinder).
    cubeObject.transform.rotation = Quaternion.AngleAxis (45.0f, new Vector3 (1.0f, 1.0f, 1.0f));
  }

  // Update is called once per frame
  void Update () {
  }
}
(Figure: the Inspector pane after the values of properties
have been changed.)
2.2.2 Pattern
class SceneObject
{
  Attributes:
    List of components

  Functions:
    function setup ()
    {
    }

    function update (deltaTime)
    {
      for each component in components
      {
        component.update (deltaTime)
      }
    }

    function addComponent (component)
    {
      components.add (component)
      component.setup ()
    }
}
2.2.3 Example
Example 2. Applying the pattern to Unity software
(Figure: the cube object's Inspector pane with a Change
Object script component and its Surface Colour property.)
2.3.1 Description
There are several steps involved in creating new objects.
New objects are usually an instance of a particular object
template that has been previously prepared. Internal
storage managed by the programming environment must
be allocated for the instance. The new object must be
registered with the scene graph so that it becomes visible
in rendered images, and so that events generated in the
scene can trigger functions in the object. The way the
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class ChangeObject : MonoBehaviour {

  public Color surfaceColour = new Color (0, 1, 0);

  // Use this for initialization
  void Start () {
    this.GetComponent<MeshRenderer> ().material.color = surfaceColour;
  }

  // Update is called once per frame
  void Update () {
    this.GetComponent<MeshRenderer> ().material.color =
      new Color
        (Mathf.Abs (Mathf.Sin (0.37f * Time.time)),
         Mathf.Abs (Mathf.Sin (0.71f * Time.time)),
         Mathf.Abs (Mathf.Sin (0.52f * Time.time)));
  }
}
2.3.2 Pattern
Figure 2.3.1: The steps involved in creating a new object
and adding it to the virtual world: (1) create object;
(2) register with scene graph; (3) place under desired
parent in the scene; (4) place in world (position).

Assume that a template exists for the new object, called
EntityTemplate. We may also have a reference to another
object parentEntity that is the parent object for our newly
created entity. The process of adding a new object to the
virtual world makes use of the pattern:

// Allocate storage for a new object.
myEntity = new EntityTemplate
// Register object with the scene graph.
Register (myEntity)
// Optional: make new object a child of
// parentEntity. Set the parent property
// of the new object.
myEntity.parent = parentEntity
// Set properties specific to the newly
// created object. Replace ♦ with
// appropriate values.
myEntity.position = Vector (♦, ♦, ♦)
myEntity.orientation =
  Quaternion (♦, Vector (♦, ♦, ♦))
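As a rough engine-neutral sketch in Python (the `Entity` and `SceneGraph` classes are illustrative stand-ins for the engine's own types), the four steps become:

```python
class Entity:
    def __init__(self, name):
        self.name = name
        self.parent = None
        self.position = (0.0, 0.0, 0.0)

class SceneGraph:
    def __init__(self):
        self.entities = []

    def register(self, entity):
        self.entities.append(entity)

scene = SceneGraph()

# 1. Allocate storage for a new object (from a template/class).
my_entity = Entity("Tree")
# 2. Register the object with the scene graph so it is rendered
#    and receives events.
scene.register(my_entity)
# 3. Optional: make the new object a child of a parent entity.
parent_entity = Entity("Terrain")
scene.register(parent_entity)
my_entity.parent = parent_entity
# 4. Set properties specific to the newly created object.
my_entity.position = (1.0, 0.0, 3.0)
```

The same four steps recur whatever the engine; only the names of the allocation and registration calls change.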
2.3.3 Example
Example 3. Applying the pattern to Unity software
2.4.1 Description
Vectors get used interchangeably in two different ways
during virtual reality application development:
2.4.2 Pattern
The following are some of the most common patterns in
which a vector is used:
rotations are applied, and this may depend on choices
made by the developers of the VR engine that you are
using.

Figure 2.4.2: A vector of 3 elements is variously used to
represent position, orientation and scale, as well as other
quantities relevant to virtual reality systems: 1. Position
(x, y, z); 2. Orientation with Euler angles (x, y, z);
3. Scale (x, y, z).

• Scale: the three element vector represents the amount
of stretch (or shrinkage) of the object relative to each
of the x, y and z axes. As a rule of thumb you should
scale by the same amount in each direction (uniform
scaling). If you do use non-uniform scaling then ideally
apply this only to leaf nodes (the last child object in the
parent-child sequences) even if you have to create an
extra child node to do so. Otherwise you may be
surprised by the effects achieved when children inherit
this distortion from their parent objects.
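The distortion can be seen numerically. In this engine-neutral Python sketch (2D for brevity), a child rotated 45 degrees sits under a parent scaled by 2 along x only; the child's axes come out no longer unit length or perpendicular, which is the shearing effect described above.

```python
import math

def to_parent(point, scale, angle_deg):
    """Transform a child-local point by a rotation, then the parent's scale."""
    a = math.radians(angle_deg)
    x = point[0] * math.cos(a) - point[1] * math.sin(a)
    y = point[0] * math.sin(a) + point[1] * math.cos(a)
    return (scale[0] * x, scale[1] * y)

# Child rotated 45 degrees under a parent scaled 2x along x only.
ex = to_parent((1.0, 0.0), (2.0, 1.0), 45.0)   # child's x axis in parent space
ey = to_parent((0.0, 1.0), (2.0, 1.0), 45.0)   # child's y axis in parent space

# The child's axes are no longer perpendicular (dot product not 0)
# or unit length, so the child's shape is sheared.
dot = ex[0] * ey[0] + ex[1] * ey[1]
length = math.hypot(ex[0], ex[1])
```

With uniform scaling the dot product would stay zero and both axes would stretch equally, which is why uniform scaling is the safe default.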
2.4.3 Example
Example 4. Applying the pattern to Unity software
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class ApplyVectors : MonoBehaviour {

  // Use this for initialization
  void Start () {
    this.gameObject.transform.position = new Vector3 (1, 2, 3);
    this.gameObject.transform.localScale = new Vector3 (0.5f, 2.0f, 0.1f);
  }

  // Update is called once per frame
  void Update () {
  }
}
2.5.1 Description
There are several ways of representing the orientation of
objects in a virtual environment.
A common approach is based on Euler angles: the
amounts of rotation about the x, y and z axes respectively.
The approach is dependent on the order in which the 3
rotations are applied. Since there are 6 valid permutations,
achieving consistent results across different platforms
using this approach is difficult. Euler angle representations
also suffer from gimbal lock, where incremental rotations
may end up in configurations where axes line up, making
it non-trivial to continue rotating in particular directions.
As a result, Euler angle representations for manipulating
orientation are best avoided.
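The order dependence is easy to demonstrate with plain rotation matrices. This Python sketch is engine-neutral; conventions (axis directions, handedness) differ between engines, so treat the signs here as one possible choice.

```python
import math

def rot_x(deg):
    a = math.radians(deg)
    return [[1, 0, 0],
            [0, math.cos(a), -math.sin(a)],
            [0, math.sin(a), math.cos(a)]]

def rot_y(deg):
    a = math.radians(deg)
    return [[math.cos(a), 0, math.sin(a)],
            [0, 1, 0],
            [-math.sin(a), 0, math.cos(a)]]

def apply(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

p = (0.0, 0.0, 1.0)
xy = apply(rot_y(90), apply(rot_x(90), p))  # rotate about x first, then y
yx = apply(rot_x(90), apply(rot_y(90), p))  # rotate about y first, then x
# xy ends up near (0, -1, 0); yx ends up near (1, 0, 0).
```

The same two 90 degree rotations send the point to entirely different places depending on order, which is why an Euler-angle triple alone underspecifies an orientation.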
In all cases any angles and directions are relative to
the rest position of the object, and the origin and axis
directions in the current coordinate frame. It is worth
keeping this point in mind when working with parent-
child hierarchies as this coordinate frame may be subject
to the movements of the parent object.
An alternative approach is to represent changes of
orientation as a single rotation by a given angle about
a particular direction in space. This is associated with a
quaternion representation of orientation.
2.5.2 Pattern
Rotation:
object.transform.orientation =
object.transform.orientation *
Quaternion (angle, Vector (x,y,z))
2π radians = 360 degrees
π radians = 180 degrees
360 degrees = 2π radians
180 degrees = π radians
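These conversions, and the construction of an orientation from an angle and an axis, can be sketched in engine-neutral Python. The common (w, x, y, z) quaternion component layout is assumed here; engines differ on component order and on whether angles are given in degrees or radians.

```python
import math

def deg_to_rad(degrees):
    return degrees * math.pi / 180.0

def angle_axis(angle_deg, axis):
    """Quaternion (w, x, y, z) for a rotation of angle_deg about axis."""
    half = deg_to_rad(angle_deg) / 2.0
    norm = math.sqrt(sum(c * c for c in axis))
    x, y, z = (c / norm for c in axis)       # normalise the axis
    s = math.sin(half)
    return (math.cos(half), s * x, s * y, s * z)

q = angle_axis(90.0, (0.0, 1.0, 0.0))        # 90 degrees about the y axis
```

Note the half angle in the construction: a unit quaternion encodes cos(θ/2) and sin(θ/2) times the axis, which is why it must be built through a helper rather than by writing the angle in directly.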
2.5.3 Example
Example 5. Applying the pattern to Unity software
(Figure: a scene with a tree-like object at its default
position and orientation.)
void Start () {
  this.transform.rotation = Quaternion.AngleAxis (37.0f, new Vector3 (0.4f, 0.5f, -0.3f));
}
2.6.1 Description
The position and orientation values assigned to indi-
vidual objects in the scene graph all get used during the
rendering process to produce a graphical representation
of the scene as seen from the perspective of the camera
in the scene, and suitably projected so that the resulting
image conforms to the perspective and resolution re-
quirements of the display device. The process of working
out where each element is positioned relative to all other
objects is known as the transformation pipeline.
To understand the transformation pipeline it is ne-
cessary to be aware that the process of moving a rigid
2.6.2 Pattern
The transformation pipeline is a standardized process
within the rendering subsystem of the virtual reality en-
gine. Control over parameters of the process is achieved
by modifying the properties of the transformations:
2.6.3 Example
Example 6. Applying the pattern to Unity software
(Figure: the objects in the scene, transformed into a
common world coordinate system. The object representing
the world coordinate frame is visible, as are the
transformed frames for each of the objects inhabiting
this world.)
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class DriveableObject : MonoBehaviour {

  public InputSystem inputSystem;

  public float speed = 1.0f;
  public float turnspeed = 0.1f;

  // Use this for initialization
  void Start () {
    inputSystem.registerForInput (InputSystem.ControlTypes.AxisAction,
      InputSystem.ActionHandlers.MoveForward, forwardEventHandler);
    inputSystem.registerForInput (InputSystem.ControlTypes.AxisAction,
      InputSystem.ActionHandlers.TurnSideway, turnEventHandler);
  }

  // Update is called once per frame
  void Update () {
  }

  void forwardEventHandler (float v)
  {
    this.transform.position = this.transform.position
      + speed * v * this.transform.forward * Time.deltaTime;
  }

  void turnEventHandler (float v)
  {
    this.transform.rotation = this.transform.rotation
      * Quaternion.AngleAxis (turnspeed * v * Time.deltaTime, this.transform.up);
  }
}
2.7.1 Description
Patterns for designing and developing game loops can be
found in other sources [Nystrom, 2014]. We assume the
game loop is implemented in the VR engine and that our
applications just need to make use of it.
2.7.2 Pattern
The typical game loop, from our point of view, has the
structure:
handle input
update all objects
render scene graph

Figure 2.7.1: The game loop is standard across virtual
reality engines although the application developer may
only have to contribute particular portions when they
include their own objects in the scene. The engine
initializes the world and then repeats forever: handle
input, update objects, render world. Each object supplies
its own setup (initialize myself) and update (update
myself) steps.

It is not safe to assume:

• That objects are updated in a particular order.

• That an update of objects occurs at the same rate as
input is processed or individual frames of output are
generated.

• That the rate at which updates occur is constant, or
even consistent.

As a consequence, our typical objects include the following
pattern defining essential functionality:

VR Object
{
  function initialize
  {
    // one-off setup for this object
  }

  function update (deltaTime)
  {
    // per-frame behaviour for this object
  }
}
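A stripped-down version of this loop can be written in a few lines of Python. The `Ball` object and its gravity update are purely illustrative; a real engine also handles input and rendering inside the loop, as the comments indicate.

```python
class Ball:
    """Illustrative scene object following the engine's setup/update contract."""
    def __init__(self):
        self.height = 0.0
        self.velocity = 0.0

    def initialize(self):
        self.height = 10.0

    def update(self, delta_time):
        self.velocity -= 9.8 * delta_time    # gravity
        self.height += self.velocity * delta_time

def run_game_loop(objects, frames, delta_time):
    # Initialise world.
    for obj in objects:
        obj.initialize()
    # Repeat (here a fixed number of frames rather than forever).
    for _ in range(frames):
        # handle input ... (omitted)
        for obj in objects:                  # update all objects
            obj.update(delta_time)
        # render scene graph ... (omitted)

ball = Ball()
run_game_loop([ball], frames=10, delta_time=0.1)
```

Note that because each update scales by delta_time, the object's behaviour stays roughly the same even if the frame rate (and hence delta_time) varies, which is exactly why no assumption about a constant update rate is needed.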
2.7.3 Example
Example 8. Applying the pattern to Unity software
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class TimerTracker : MonoBehaviour {

  // The starting value for the timer. This will be set in the Unity software editor.
  public float startingTimeValue = 120.0f;

  // The current time value in seconds that the timer displays.
  private float timeLeft;

  // Use this for initialization
  void Start () {
    // initialize the timer value from the starting value provided.
    timeLeft = startingTimeValue;
    showTime ();
  }

  // Update is called once per frame
  void Update () {
    // decrease timer value according to the time that has passed
    // since the last Update.
    timeLeft = timeLeft - Time.deltaTime;
    showTime ();
  }

  // Convert the timeLeft into hours, minutes and seconds
  // and update the text display to show this.
  private void showTime ()
  {
    int hours = ((int) timeLeft) / 3600;
    int minutes = (((int) timeLeft) % 3600) / 60;
    int seconds = ((int) timeLeft) % 60;
    this.GetComponent <TextMesh>().text = hours.ToString ("D2") + ":"
      + minutes.ToString ("D2") + ":" + seconds.ToString ("D2");
  }
}
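The hours/minutes/seconds arithmetic used by showTime can be checked in isolation. A Python sketch of the same conversion:

```python
def format_time(time_left):
    """Convert a time in seconds into an hh:mm:ss display string."""
    hours = int(time_left) // 3600
    minutes = (int(time_left) % 3600) // 60
    seconds = int(time_left) % 60
    return "%02d:%02d:%02d" % (hours, minutes, seconds)

print(format_time(7265.4))   # prints 02:01:05
```

Truncating to an integer first means fractional seconds are simply dropped from the display, matching the casts in the C# version.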
(Figure: the scene graph structure and the TimerTracker
component, with the public Starting Time Value property
visible in the Inspector pane.)
2.8.1 Description
The scene graph represents the variety of relationships
between the information in the virtual world. For any ob-
ject in the virtual world, just being attached to the scene
graph in any way ensures that the object is visible in the
rendered view of the scene (assuming that it should be
visible and is not behind you, or behind another object).
Objects have components and components have prop-
erties. However most scene graphs are visualized by
showing the parent-child relationship that exists between
objects in the scene graph.
This parent-child relationship is specifically intended
to create a transformation hierarchy in the scene. Any
child objects behave as if they were effectively anchored
to their parent object. They adopt the position, orienta-
tion and scale of the parent object before applying their
own position, orientation and scale relative to that. Be-
hind the scenes, the transformation for the child object is
combined with the transformation for the parent object to
determine the child's position, orientation and scale.
Hierarchies are used frequently in a number of com-
mon scenarios:
Figure: a scene graph representation of a human figure
hierarchy, with all parts (head, neck, shoulders, arms,
torso, hips, legs and feet) children of a common parent.

• A complex object, such as a human figure, makes all of
its parts children of a common parent. The complete
object can then be manipulated directly by just operating
on the parent object. The parent object need not even
have a visible shape of its own and can just be a
placeholder object used to organize things.
// Attach a child to a parent object.
childObject.parent = parentObject
// or, if the parent property stores transformations:
childObject.parent = parentObject.transform
// Detach a child from its parent.
childObject.parent = null
// Retrieve a particular child by name.
childObject = object.findChildObject
  (childObjectName)
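The way a child's transformation combines with its parent's can be sketched in Python. This version handles translation only, to keep the combination step visible; real engines also combine rotation and scale.

```python
class SceneNode:
    def __init__(self, local_position=(0.0, 0.0, 0.0)):
        self.local_position = local_position
        self.parent = None

    def world_position(self):
        """Combine this node's transform with every ancestor's in turn."""
        x, y, z = self.local_position
        if self.parent is not None:
            px, py, pz = self.parent.world_position()
            return (px + x, py + y, pz + z)
        return (x, y, z)

body = SceneNode((5.0, 0.0, 2.0))
head = SceneNode((0.0, 1.7, 0.0))
head.parent = body                 # the head now moves with the body

body.local_position = (6.0, 0.0, 2.0)
print(head.world_position())       # prints (6.0, 1.7, 2.0)
```

Moving the parent changed the head's world position without touching the head's own local values, which is exactly the anchoring behaviour described above.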
2.8.3 Example
Example 9. Applying the pattern to Unity software
(First add the Horse tag, and then add the newly created
tag to the horse object.) Now scatter a set of the horse
objects throughout the scene as shown in Figure 2.8.3.
Figure 2.8.3: the scene with a scattering of horse objects,
each with the Horse tag (see top of the Inspector pane).
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class RotateObject : MonoBehaviour {

  // Use this for initialization
  void Start () {
  }

  // Update is called once per frame
  void Update () {
    this.transform.Rotate (0.0f, 1.0f, 0.0f);

    GameObject horse = GameObject.FindWithTag ("Horse");
    if (horse != null)
    {
      horse.tag = "Untagged";
      horse.gameObject.transform.parent = this.gameObject.transform;
    }
  }
}
2.9.1 Description
Objects in a virtual environment can independently
sense, reason and act on the virtual world around them.
This provides a responsive and reactive environment
without a central control process that has to understand
and enforce all possible combinations of interactions,
allowing possibilities for emergent effects. Occasionally,
however, there are scenarios where an element of over-
sight might be necessary. For example, a collection of
similar objects would benefit from a global manager for
a coordination role, since such a task is not obviously
associated with any single element in the collection.
Consider the example of a traffic simulation consisting
of a number of taxis. A user calls for a taxi, and the
closest one is dispatched to pick them up. There are
potential ways in which the taxis can negotiate amongst
themselves to see who is closest. However contention
could arise if two are at exactly the same distance. This
process would also require all taxis to communicate with
one another.
2.9.2 Pattern
The virtual reality manager pattern involves creating a
single instance of a manager class and adding this to the
scene graph. This is typically included as a component of
an invisible object.
The manager class would usually keep track of the
entities it manages:
class Manager
{
  private:
    List managedEntities

    function setup ()
    {
      // create and add entities
    }

  public:
    function undertakeAction ()
    {
      ...
    }

    function register (entity)
    {
      managedEntities.add (entity)
    }
}
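The taxi dispatch scenario can be sketched in engine-neutral Python. The `Taxi` and `TaxiManager` names are illustrative; note how the manager, with its global view, resolves the closest-taxi decision (and any ties) deterministically in one place rather than through negotiation between taxis.

```python
class Taxi:
    def __init__(self, name, position):
        self.name = name
        self.position = position

class TaxiManager:
    """Single point of oversight: knows every taxi, picks the closest."""
    def __init__(self):
        self.managed = []

    def register(self, taxi):
        self.managed.append(taxi)

    def dispatch(self, pickup):
        def squared_distance(taxi):
            dx = taxi.position[0] - pickup[0]
            dy = taxi.position[1] - pickup[1]
            return dx * dx + dy * dy
        # min() breaks ties by registration order, so contention
        # between equally distant taxis cannot arise.
        return min(self.managed, key=squared_distance)

manager = TaxiManager()
manager.register(Taxi("A", (0.0, 0.0)))
manager.register(Taxi("B", (4.0, 3.0)))
manager.register(Taxi("C", (10.0, 0.0)))

print(manager.dispatch((5.0, 2.0)).name)   # prints B
```

The manager object itself would sit on an invisible scene object, exactly as the pattern describes.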
2.9.3 Example
3. Test the scene with the single traffic light, and make
sure it works.
Then drag the traffic light into the Project’s Prefabs
folder to turn it into a traffic light template that we can
make multiple instances of.
Lay out a representation of the intersection, making 4
instances of the traffic light prefab and placing them at
the appropriate locations. The resulting scene should
resemble that shown in Figure 2.9.3.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class TrafficLightOperations : MonoBehaviour {

  public Material redMaterial;
  public Material greenMaterial;
  public Material yellowMaterial;
  public Material offMaterial;

  public GameObject redLight;
  public GameObject greenLight;
  public GameObject yellowLight;

  // Use this for initialization
  void Start () {
    setColour ("red");
  }

  // Update is called once per frame
  void Update () {
  }

  public void setColour (string colour)
  {
    // provided separately.
  }
}
(Figure 2.9.3: the intersection scene with four traffic
light instances, illustrating the manager pattern.)
2.10 Exercises
...
public class TrafficManagement : MonoBehaviour {

  public GameObject lightA;
  public GameObject lightB;
  public GameObject lightC;
  public GameObject lightD;

  // Use this for initialization
  void Start () {
    StartCoroutine (cycleLights ());
  }

  IEnumerator cycleLights () {
    while (true)
    {
      lightA.GetComponent <TrafficLightOperations> ().setColour ("green");
      lightC.GetComponent <TrafficLightOperations> ().setColour ("green");
      lightB.GetComponent <TrafficLightOperations> ().setColour ("red");
      lightD.GetComponent <TrafficLightOperations> ().setColour ("red");
      yield return new WaitForSeconds (1);
      lightA.GetComponent <TrafficLightOperations> ().setColour ("yellow");
      lightC.GetComponent <TrafficLightOperations> ().setColour ("yellow");
      lightB.GetComponent <TrafficLightOperations> ().setColour ("red");
      lightD.GetComponent <TrafficLightOperations> ().setColour ("red");
      yield return new WaitForSeconds (1);
      lightA.GetComponent <TrafficLightOperations> ().setColour ("red");
      lightC.GetComponent <TrafficLightOperations> ().setColour ("red");
      lightB.GetComponent <TrafficLightOperations> ().setColour ("red");
      lightD.GetComponent <TrafficLightOperations> ().setColour ("red");
      yield return new WaitForSeconds (1);
      // continued ...
    // continuation ...
    lightA.GetComponent <TrafficLightOperations> ().setColour ("red");
    lightC.GetComponent <TrafficLightOperations> ().setColour ("red");
    lightB.GetComponent <TrafficLightOperations> ().setColour ("green");
    lightD.GetComponent <TrafficLightOperations> ().setColour ("green");
    yield return new WaitForSeconds (1);
    lightA.GetComponent <TrafficLightOperations> ().setColour ("red");
    lightC.GetComponent <TrafficLightOperations> ().setColour ("red");
    lightB.GetComponent <TrafficLightOperations> ().setColour ("yellow");
    lightD.GetComponent <TrafficLightOperations> ().setColour ("yellow");
    yield return new WaitForSeconds (1);
    lightA.GetComponent <TrafficLightOperations> ().setColour ("red");
    lightC.GetComponent <TrafficLightOperations> ().setColour ("red");
    lightB.GetComponent <TrafficLightOperations> ().setColour ("red");
    lightD.GetComponent <TrafficLightOperations> ().setColour ("red");
    yield return new WaitForSeconds (1);
  }
}
}
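The coroutine above repeats the same four GetComponent calls for every phase of the light cycle. A small helper would condense each phase to a single call. This is a sketch, not the book's code; it assumes the lightA to lightD fields and the TrafficLightOperations component from the listing above:

```csharp
// Hypothetical helper: set the colour of all four lights in one call.
// Assumes the lightA..lightD GameObject fields from the listing above,
// each carrying a TrafficLightOperations component with setColour.
void setAll (string a, string b, string c, string d)
{
  lightA.GetComponent <TrafficLightOperations> ().setColour (a);
  lightB.GetComponent <TrafficLightOperations> ().setColour (b);
  lightC.GetComponent <TrafficLightOperations> ().setColour (c);
  lightD.GetComponent <TrafficLightOperations> ().setColour (d);
}
```

Each phase of the cycle then becomes one line, for example `setAll ("green", "red", "green", "red");` followed by the `yield return new WaitForSeconds (1);` delay.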
78 shaun bangay
…(virtual) reality. There are programming constructs that are commonly used and which are worth being familiar with. These include:

[Figure: an overview diagram of the core constructs — assignment (X = A), 2. Conditional (if condition, with true and false branches), repetition (repeat n times) and 4. Abstraction (Y(Z) defined once over a body, then invoked as Y(A), Y(D), Y(E)).]
3.2.1 Description
Your programs store the information representing vari-
ous properties in the scene, as well as any additional
information used to support any custom operations re-
quired for your application.
Information is stored in variables. You can visualize
this as a box into which you can place information. This
box is stored in a portion of the memory of the computer
that is allocated for your use. The length of time that this
box is allocated to you is defined by scope rules.
Each particular box has its own name, so it can be
uniquely identified. Visualize the name written on the
top of the box.
The box stores information. There are different types
of information: numbers, text, colours, vectors, geometric
shapes, sounds and images, to name a few. All of these are
converted to information when they are assigned to the
variable (stored in the box). Visualize the mushy gray
goop being poured through a sieve of a particular type
that miraculously reconstructs the original information
you assigned to (stored in) the variable.

[Figure 3.2.1: A suggested mental model for understanding the representation of information in the form of variables.]
3.2.2 Pattern
Purely as examples, the types of variables you are likely
to encounter include: int, float, bool, Vector3 and Color.

Providing a new value to the variable is done with an
assignment:

    a = b    (or a ← b)

The value currently stored in b is copied into the box
named a, replacing whatever a held before.
3.2.3 Example
Example 12. Applying the pattern to Unity software
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class VariableDemo : MonoBehaviour {

  class Ray
  {
    Vector3 origin;
    Vector3 direction;

    public Ray (Vector3 o, Vector3 d)
    {
      origin = o;
      direction = d;
    }
    public override string ToString ()
    {
      return "[" + origin + "->" + direction + "]";
    }
  }

  // Use this for initialization
  void Start () {
    int varThatIsInt = 7;
    float timeCounter = 37.4f;
    bool onNotOff = true;
    print ("Integer variable value: " + varThatIsInt);
    print ("TimeCounter value: " + timeCounter);
    print ("Boolean variable's value: " + onNotOff);

    Vector3 myDirection = new Vector3 (1.0f, 1.5f, -0.3f);
    Color red = new Color (1, 0, 0);
    print ("Direction Y: " + myDirection.y);
    print ("Colour bits: " + red);

    Ray pointerDirection = new Ray (new Vector3 (0,0,0),
        new Vector3 (1,0,0));
    print ("Ray value: " + pointerDirection);
  }

  // Update is called once per frame
  void Update () {
  }
}
x = r sin(θ )
y = r cos(θ )
3.3 Conditionals
3.3.1 Description
Some actions in a virtual reality application are condi-
tional, and are only appropriate if a particular condition
is true. For example, an object may move downwards but
only if the resulting position is not below ground level.
The height of the object needs to be compared to the
height of the ground to determine whether the condition
for movement has been satisfied.
Conditions are expressions that evaluate to either true
or false. Most computing languages support a Boolean
type which allows variables of this type to only assume
one of these two values (true or false). Various operators
exist that evaluate an expression and return a Boolean
value, including:
• && and & (similarly with || and |). The symbol && is
the Boolean operator to check that both arguments are
true at the same time. The symbol & operates on the
binary representation of numeric values. In particular
circumstances they can appear to behave in the same
way when tested with simple data (particularly 1s and
0s) so the effect of the difference may only become
apparent much later.
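The difference can be made concrete with a short C# sketch. The method and array names here are illustrative, not from the book:

```csharp
// && short-circuits: when the left operand is false, the right
// operand is never evaluated. & always evaluates both sides.
bool SafeCheck (int[] data, int i)
{
  // Safe: the bounds test guards the array access, because &&
  // skips data[i] whenever i is out of range.
  return (i < data.Length) && (data[i] > 0);
}

bool UnsafeCheck (int[] data, int i)
{
  // With a single &, data[i] is evaluated even when the bounds
  // test fails, throwing IndexOutOfRangeException for bad i.
  return (i < data.Length) & (data[i] > 0);
}
```

For in-range indices the two methods return identical results, which is exactly why the difference may only become apparent much later.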
…break statement. This allows individual responses to a number
of different values associated with an expression. The
typical form of the switch statement is:

switch (expression)

case value1:
{
  program statements to run if
  the expression == value1
}
break

case value2:
{
}
break

case value3:
{
}
break

...

default:
{
}
break

[Figure 3.3.2: The switch statement extends the power of the conditional by designating different blocks of code corresponding to each designated value of a variable: each value routes to its own block, with default handling any other value.]
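A concrete (hypothetical) instance of this pattern in C#, stepping a traffic light through its colours:

```csharp
// Sketch: map the current light colour to the next one in the
// cycle. The colour strings match the earlier traffic light example.
string NextColour (string current)
{
  switch (current)
  {
    case "green":
      return "yellow";
    case "yellow":
      return "red";
    case "red":
      return "green";
    default:
      return "red";  // fall back to a safe state for unknown input
  }
}
```

Each case ends in a return here, which plays the role of the break in the general pattern: control never falls through from one case into the next.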
3.3.3 Example
Example 14. Applying the pattern to Unity software
[Screenshot: the Inspector pane for the object, showing a Rigidbody component (Mass 1, Drag 0, Angular Drag 0.05, Use Gravity enabled, with Is Kinematic, Interpolate, Collision Detection and Constraints at their defaults) and a Virtual Barrier Script component referencing the VirtualBarrier script. The Project pane shows the folders _Scenes, Materials, Prefabs, Resources, Scripts and Shaders.]
3.4 Iteration
3.4.1 Description
The value in writing programs is that it allows us to
automate repetitive tasks and to sit back and relax while
the computer does all our work for us.¹ There are a
number of programming constructs that repeat a block
of instructions a given number of times, or until some
condition is satisfied. We mainly use a pattern for a fixed
number of repetitions in our virtual reality applications.

Iterating in a virtual reality application can stop the
application from responding to user interaction until the
iteration is complete, and so it is advisable to confine long
loops to start-up phases of the application. We may also
use iteration patterns for repeating a process for each
element in a list.

¹ There are some other benefits as well which ideally become clear as we progress through this book.
3.4.2 Pattern
This is a pattern for a loop that repeats exactly n times.
We use the variable i as a loop counter, and this takes on
the values 0, 1, 2, ... n − 1 in each successive iteration of
the body of the loop. The convention of starting to count
from 0 is peculiar to computer programming but does
support other idiosyncrasies such as referring to the first
element in a list as element 0.

[Figure 3.4.1: The for loop supports iteration that occurs a fixed number of times, including a version that repeats exactly n times. The flowchart sets i = 0, tests i < n, runs the body of the loop while the test is true, and performs i++ after each pass.]

for (i = 0; i < n; i++)
{
  // Body of the loop.
  // Any statements here are repeated
  // exactly n times.
}

Iterating over multidimensional structures such as grids
requires a loop to iterate over every row and, for each row,
an additional loop to iterate over every column. Each
iteration of the row loop requires a complete repetition
of the entire column loop. This is achieved by placing
the column loop in the body of the row loop, a process
described as nesting.

[Figure 3.4.2: The nested for loop supports iterating over complex structures since the entire inner loop is repeated on every iteration of the outer loop: an outer loop over the rows of a grid contains an inner loop over the elements in each row.]
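A minimal C# sketch of the nested pattern, filling a small grid. The array name and contents are illustrative:

```csharp
// Fill a rows x columns grid by nesting the column loop inside
// the row loop. The body of the inner loop runs rows * columns
// times in total.
int rows = 3;
int columns = 4;
float[,] height = new float[rows, columns];
for (int row = 0; row < rows; row++)
{
  for (int column = 0; column < columns; column++)
  {
    // Give each cell a distinct value derived from its position.
    height[row, column] = row * columns + column;
  }
}
```

The inner counter restarts at 0 on every pass of the outer loop, which is what makes the column loop repeat in full for each row.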
3.4.3 Example
Example 15. Applying the pattern to Unity software
x = r sin(θ )
y = r cos(θ )
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class SpiralPlacement : MonoBehaviour {

  public GameObject objectTemplate;

  public int totalNumberOfObjects = 100;
  public int objectsPerRotation = 20;
  public float initialRadius = 5.0f;
  public float radiusIncreasePerRotation = 3.0f;

  // Use this for initialization
  void Start () {
    float anglePerStep = 2 * Mathf.PI / objectsPerRotation;
    float radius = initialRadius;
    float radiusChangePerStep = radiusIncreasePerRotation /
        objectsPerRotation;

    for (int i = 0; i < totalNumberOfObjects; i++)
    {
      float x = radius * Mathf.Sin (i * anglePerStep);
      float y = radius * Mathf.Cos (i * anglePerStep);

      GameObject instance = Instantiate (objectTemplate,
          new Vector3 (x, 0, y), Quaternion.identity);
      instance.transform.SetParent (this.transform);

      radius = radius + radiusChangePerStep;
    }
  }
}
void Start () {
  float anglePerStep = 2 * Mathf.PI / objectsPerRotation;
  float radiusChangePerStep = radiusIncreasePerRotation /
      objectsPerRotation;

  int i;
  float radius;
  float angle;
  for (i = 0, radius = initialRadius, angle = 0;
       i < totalNumberOfObjects;
       i++, radius += radiusChangePerStep, angle += anglePerStep)
  {
    float x = radius * Mathf.Sin (angle);
    float y = radius * Mathf.Cos (angle);

    GameObject instance = Instantiate (objectTemplate,
        new Vector3 (x, 0, y), Quaternion.identity);
    instance.transform.SetParent (this.transform);
  }
}
[Screenshots: the scene Hierarchy filled with RockClone instances generated by the SpiralPlacement script, alongside a Rotation Script component. The Inspector shows the SpiralPlacement component with Object Template set to Rock, Total Number Of Objects 144, Distance Apart 5, Initial Radius 5 and Radius Increase Per Rotation 1. The Project pane shows the folders _Scenes, Materials, Prefabs, Resources, Scripts and Shaders.]
3.5 Abstraction
3.5.1 Description
3.5.2 Pattern
The pattern for a function is called with:

FunctionName (5, "hello", new Vector (1,2,3))

…assigned to parameter2, and the string "hello world" is
assigned to parameter3. The type of each parameter must
agree with the value that is passed to it.

[Figure 3.5.1: The arguments provided to a function when it is invoked are copied to the parameters in the corresponding positions.]

We can also pass variables as parameters to a function.
In this case, the value in the variable is copied into
(assigned to) the parameter variable. Changes to the
parameter variable in the function do not affect the value
in the variable used when the function is invoked. This
reasoning is a bit harder to understand when passing
objects as parameters; a reference to the same object is
copied, and so properties of the object that are changed
are then visible when viewing the object via the original
reference.
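The distinction between value and reference parameters can be made concrete with a short C# sketch. The Marker class and method names are hypothetical, introduced only for illustration:

```csharp
// A hypothetical class: instances are passed by reference-copy.
class Marker { public int value; }

// number receives a copy of the caller's int; marker receives a
// copy of the caller's reference, so both refer to one object.
void ChangeThings (int number, Marker marker)
{
  number = 99;        // changes only the local copy
  marker.value = 99;  // changes the shared object: visible to caller
}
```

After calling `ChangeThings (n, m)` with `n == 1` and `m.value == 1`, the caller's `n` is still 1, while `m.value` has become 99: the reference was copied, not the object itself.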
3.5.3 Example
Example 16. Applying the pattern to Unity software
void Update () {
  listOfNeighbours = findNeighbours ();
  targetDirection = calculateDirection (listOfNeighbours);
  moveDirection = checkBounds (targetDirection);
  moveOneStep (moveDirection);
}

// Return the list of neighbours within a particular
// distance of the current object.
NeighbourList findNeighbours ()
{
}

// Use the boids direction calculation based on the
// given neighbours' positions.
Vector3 calculateDirection (NeighbourList neighbours)
{
}

// Ensure that a movement in the given direction does
// not take the object beyond the bounds of the
// movement area.
Vector3 checkBounds (Vector3 direction)
{
}

// Take a step in the given direction at a particular
// speed.
void moveOneStep (Vector3 direction)
{
}
sky layer. I'll check this out by removing the sky layer
temporarily and seeing if the result changes. Having
confirmed that theory, we need to find only boids. The
OverlapSphere function supports selecting only objects
on a particular layer. Create a new layer (under the
Inspector pane) and set the boid prefab to use this.
My version uses layer 9. We can use the version of
OverlapSphere that takes a layer mask as a parameter. This
update seems to work as desired, so we update the
comments for the findNeighbours function to indicate
the layer restriction that it uses.
3.6 Collections
3.6.1 Description
The variables encountered so far are good for storing a
single value. When we assign a new value to them, the
old value is discarded and overwritten with the new.
There are occasions when we need to keep track of a
collection of related items. A popular example is keeping
track of the position of treasure items that a user needs to
collect while exploring a virtual world.
There are variable types that can store collections
of objects. The simplest of these is the array which is
supported by most programming languages. Other col-
lection types include lists, queues, vectors and sets. The
differences between these relate to the ease with which
elements can be added and removed, and the order in
which elements are stored and retrieved. Most collections
do support common conventions for declaring the collection.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class BoidMovement : MonoBehaviour {

  // the radius of a neighbourhood.
  public float neighbourRadius = 0.5f;

  // size of boundary area.
  public float boardSize = 5.0f;

  // an upper bound on the speed of boid movement
  public float speed = 1.0f;

  // internal variables needed for kinematic simulation.
  private Vector3 velocity;

  void Start () {
    velocity = new Vector3 (0,0,0);
  }

  // Update is called once per frame
  void Update () {
    Collider [] listOfNeighbours = findNeighbours (neighbourRadius);
    Vector3 targetDirection = calculateDirection (listOfNeighbours,
        0.45f, 0.4f, 0.2f);
    Vector3 moveDirection = checkBounds (targetDirection, boardSize);
    moveOneStep (moveDirection, speed);
  }

  // Return the list of neighbours within a particular
  // distance of the current object. Only objects on layer 9
  // are selected.
  Collider [] findNeighbours (float radius)
  {
    Collider [] hitColliders = Physics.OverlapSphere (
        this.transform.position, radius, 1 << 9);
    return hitColliders;
  }
3.6.2 Pattern
A collection would be defined by declaring a variable of
one of the collection types. The pattern would be:

CollectionType collectionName =
    new CollectionType [CollectionSize]

[Figure: a collection of 6 elements. The variable name refers to the entire collection; the types involved are the type of an element and the type of a collection of that type of element.]

An individual element is accessed as collectionName[i].
This refers to the ith element of the collection. Collections
are usually numbered from 0 up to CollectionSize − 1.
Thus the first element is collectionName[0].

Rather than searching for the same element on every frame
update, find the element once and store the result:

    // ... inside a loop over the collection, when the
    // required element is found:
    foundElement = element
    elementExists = true
    break
}
// Code below checks that the element was found
// before doing anything with it
if (elementExists)
{
}
else
{
}

Appending an element to a partially filled collection keeps
a count of the number of elements stored so far:

int numberOfElements = 0
...
collectionName[numberOfElements] = element
numberOfElements = numberOfElements + 1
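A minimal C# instance of the declaration and access patterns. The treasure names are illustrative; inside a MonoBehaviour, print would send each line to the Console:

```csharp
// Declare a fixed-size collection (an array), fill its elements
// by index, then scan the whole collection with a for loop.
string[] treasureNames = new string[3];
treasureNames[0] = "coin";
treasureNames[1] = "gem";
treasureNames[2] = "crown";

for (int i = 0; i < treasureNames.Length; i++)
{
  // Element indices run from 0 up to Length - 1.
  print ("Treasure " + i + ": " + treasureNames[i]);
}
```

The Length property plays the role of CollectionSize in the pattern, so the loop bound stays correct if the declared size changes.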
3.6.3 Example
Example 17. Applying the pattern to Unity software
3.7.1 Description
It is possible to declare variables with the same names as
variables that have been used elsewhere in the program,
but in different blocks. While the version in the closest
encompassing block is the one that is used in such cases,
it is then worth being aware of the rules around access to
variables within the program.
Visibility and scope refer to the parts of the program
that are able to access a particular variable. Usually
…version may be in scope, both versions are still alive.
When the new variable comes to the end of its scope and
lifetime, the old variable returns to scope with its original
value intact, as it was alive the entire time.

[Figure 3.7.1: Scope and lifetime may appear to be similar, but scope identifies if a variable can be accessed (or which version is accessed) at a particular point in the code, and lifetime indicates whether the variable retains its value at that point.]

A variable's lifetime usually comes to an end when
execution reaches the end of the block in which it was
declared. Special modifiers exist in some languages (static)
to preserve the variable (extend its lifetime) until the next
time that particular block of code is run.

The lifetime of global variables is the entire program's
execution time.
Objects that are created with the new operator have a
lifetime that extends from their creation time until they
are no longer referenced.
3.7.2 Pattern
Various examples of scope are illustrated in the pattern
below:
int globalVariable
class MyClass
{
private:
int classLimitedVariable
function classLimitedFunction ()
{
}
public:
int globalVariableInClass
function globalFunctionInClass ()
{
}
function generalFunction (int parameter)
{
int localToGeneralFunction
classInstance = new MyClass
}
}
MyClass classInstance // file or global scope
classInstance.globalVariableInClass
classInstance.globalFunctionInClass ()
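In C#, the most common scope interaction is a local variable shadowing a class field. This sketch, with hypothetical names, shows which declaration each use of the name reaches:

```csharp
class ScopeDemo
{
  private int counter = 1;   // field: class scope, lives with the object

  public int Show ()
  {
    int counter = 2;         // local: shadows the field inside Show
    // The bare name counter now refers to the local variable (2);
    // this.counter still reaches the field (1).
    return counter + this.counter;
  }
}
```

When Show returns, the local's lifetime ends, while the field keeps its value for as long as the ScopeDemo instance is alive; calling `new ScopeDemo ().Show ()` would combine both versions of the name.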
3.7.3 Example
Example 18. Applying the pattern to Unity software
3.8.1 Description
The solution to some problems can involve solving smaller
versions of the same problem. For example, given the
remarkable self-similarity in structures such as trees, a
tree can be generated as a trunk with a number of smaller
trees attached to its end.
3.8.2 Pattern
The pattern for recursion involves defining a function
that represents a solution to the problem. This function:

• contains a base case: this is the stopping condition.
In the base case we perform non-recursive actions;
actions that do not lead to invoking the function itself.

[Figure 3.8.1: A tree is one structure that contains self-similarity, encouraging recursive strategies for creating and manipulating such structures.]

…
{
  // recursive path
  do something else simple
  recursiveFunction (scaleParameter - 1,
      otherParameterValues1)
  // Can recurse multiple times if the
  // problem requires e.g. if we have
  // to draw two smaller trees.
  recursiveFunction (scaleParameter - 1,
      otherParameterValues2)
}
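A minimal C# instance of the pattern, separate from the tree generator below: counting the leaves a recursive tree would produce. The method name is illustrative:

```csharp
// Base case: at level 0 there is a single leaf.
// Recursive path: each branch carries branchFactor smaller trees,
// so the function calls itself on a smaller version of the problem.
int CountLeaves (int levels, int branchFactor)
{
  if (levels == 0)
  {
    return 1;                 // base case: stop recursing
  }
  int total = 0;
  for (int i = 0; i < branchFactor; i++)
  {
    total += CountLeaves (levels - 1, branchFactor);
  }
  return total;
}
```

Each call reduces levels by one, guaranteeing that the base case is eventually reached; the result equals branchFactor raised to the power levels.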
3.8.3 Example
Example 19. Applying the pattern to Unity software
...
public class TreeGenerator : MonoBehaviour {

  [Tooltip ("For example, use a unit scale sphere at origin.")]
  public GameObject leafPrefab;
  [Tooltip ("For example, use a unit cylinder from origin to y=1.")]
  public GameObject branchPrefab;

  [Tooltip ("The number of levels of recursion (number of generations of branches before we get to a leaf)")]
  public int levels = 5;
  [Tooltip ("How many branches branch out of a branch.")]
  public int branchFactor = 4;
  [Tooltip ("The angle between a branch and the next (sub)branch.")]
  public float branchAngle = 45.0f;
  [Tooltip ("The initial scale (length) of the trunk.")]
  public float trunkLength = 3.0f;
  [Tooltip ("The length of a subbranch relative to the length of its parent.")]
  public float trunkDecayFactor = 0.5f;

  void Start () {
    generateTree (levels, transform.position,
        transform.rotation, trunkLength);
  }
  // continued ...
  // continuation ...
  void generateTree (int levels, Vector3 position,
      Quaternion direction, float trunkLength)
  {
    if (levels == 0)
    {
      GameObject leaves = Instantiate (leafPrefab, position,
          direction);
    }
    else
    {
      GameObject branch = Instantiate (branchPrefab, position,
          direction);
      branch.transform.localScale = new Vector3 (1, trunkLength, 1);
      Vector3 endOfBranchPosition = branch.transform.position +
          trunkLength * branch.transform.up;

      for (int i = 0; i < branchFactor; i++)
      {
        Quaternion newBranchDirection = direction *
            Quaternion.AngleAxis (i * 360 / branchFactor,
            Vector3.up);
        newBranchDirection = newBranchDirection *
            Quaternion.AngleAxis (branchAngle, Vector3.right);
        generateTree (levels - 1, endOfBranchPosition,
            newBranchDirection, trunkLength * trunkDecayFactor);
      }
    }
  }
}
4.1 Debugging
4.2 Printing
4.2.1 Description
4.2.2 Pattern
The print statement is used for debugging with this
particular pattern:
4.2.3 Example
Example 20. Applying the pattern to Unity software
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// Move the object up and down between the bounds defined.
public class MovingUpAndDown : MonoBehaviour {

  public float upperBound = 3;
  public float lowerBound = -2;

  private bool directionUp = true;

  // Use this for initialization
  void Start () {
  }

  // Update is called once per frame
  void Update () {
    if (this.transform.position.y > upperBound)
    {
      directionUp = false;
      this.transform.position += new Vector3 (0, -0.1f, 0);
    }
    if (this.transform.position.y < upperBound)
    {
      directionUp = true;
      this.transform.position += new Vector3 (0, 0.1f, 0);
    }
  }
}
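Note that the second test in the listing compares against upperBound where lowerBound was presumably intended, so the object hovers around the upper bound instead of descending. A hedged sketch of the printing pattern applied here: labelling each printed value makes the per-frame output traceable and would expose that behaviour quickly.

```csharp
// Sketch: a labelled print at the top of Update shows, frame by
// frame, that position.y never drops towards lowerBound.
void Update () {
  print ("Update: y = " + this.transform.position.y +
      ", directionUp = " + directionUp);
  // ... the rest of Update as in the listing above ...
}
```

Once the mistaken bound is confirmed from the output, the print line can be removed again.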
4.3.1 Description
How to use a debugger: Identify the line of code or function
that you believe contains the error. If you follow the
philosophy of testing your code regularly, then this is
probably a piece of the program code that you've added
or changed recently. Place a break point here. Once the
program stops at this break point, think. What should
the values of the properties be at this stage in the program's
life? How will they change with each line? Then
test your theories. If the results don't match your expectations,
why not? Your theory may be wrong, in which
case you may need to think about how you designed
your solution. Or the program may be wrong, and you've
identified the line that is misbehaving. If the mistake
is obvious, fix it. If you don't understand why it is not
working, rubber duck it.¹ ²

[Figure 4.3.1: When the debugging seems impossible, explain your problem to another person or a rubber duck.]

¹ Rubber ducking involves explaining the problem to another person, which is remarkably effective in identifying the issue even before the other person responds.
² If you don't have another person handy, any inanimate object will do. Even the rubber duck from your bath time rituals.

Virtual reality development environments offer another
form of live debugging with the ability to watch
properties. Most environments include some form of
property inspector which shows live values of the selected
object's properties.
4.3.2 Example
Example 21. Applying the pattern to Unity software
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// Move the object up and down between the bounds defined.
public class MovingUpAndDown : MonoBehaviour {

  public float upperBound = 3;
  public float lowerBound = -2;

  private bool directionUp = true;

  // Use this for initialization
  void Start () {
  }

  // Update is called once per frame
  void Update () {
    if (this.transform.position.y > upperBound)
    {
      directionUp = false;
      this.transform.position += new Vector3 (0, -0.1f, 0);
    }
    if (this.transform.position.y < lowerBound)
    {
      directionUp = true;
      this.transform.position += new Vector3 (0, 0.1f, 0);
    }
  }
}
4.4.1 Description
Sometimes some bugs in your code are incredibly per-
sistent and you get to the stage where you start to sus-
pect the compiler is not interpreting your code correctly,
the engine has an error in it, or the universe has determ-
ined that you shall never be a programmer. In practice,
this is rarely the case. However this might be the point
at which you want to get someone else to have a look
at your problem, or where you might want to file a bug
report with your engine manufacturer.
Don’t just bundle up your entire project and email it
off to someone else. Apart from leaking potentially sens-
itive information about the project that you’re working
on, nobody wants to try to pick up a complex semi-
working project and figure out what is going on with it.
There are conventions to reporting bugs, whether you're
taking advantage of the person sitting next to you or relying
on help from further afield.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// Move the object up and down between the bounds defined.
public class MovingUpAndDown : MonoBehaviour {

  public float upperBound = 3;
  public float lowerBound = -2;

  private Vector3 directionIncrement = new Vector3 (0, 0.1f, 0);

  // Use this for initialization
  void Start () {
  }

  // Update is called once per frame
  void Update () {
    if (this.transform.position.y > upperBound)
    {
      directionIncrement = new Vector3 (0, -0.1f, 0);
    }
    if (this.transform.position.y < lowerBound)
    {
      directionIncrement = new Vector3 (0, 0.1f, 0);
    }
    this.transform.position += directionIncrement;
  }
}
Even then, remove any sections of code that are not relevant
to recreating the issue. Even better, start from scratch
and recreate the handful of lines needed to recreate the
problem.

There is a good chance that you will spot the problem
while doing this. The chance is even greater if you
recreate the project from scratch rather than copying in
material from the problem case. Recreating prevents invisible
and accidental typos (such as O rather than 0)
from being simply perpetuated.

You will also meet with a more sympathetic attitude
from those volunteering to help you. It is clear that you
are putting in the effort to help yourself and not relying
on others to do your homework for you.⁴

⁴ "I am trying to do Y, in order to produce a program to do X. Here is a short code fragment that I'm using but it produces A instead of B. What can I do to fix it?" Which do you think would receive the most help? Interestingly, asking for help from your tutors can produce a similar outcome.
4.4.2 Pattern
The pattern for producing the smallest non-working
example, and a meaningful bug report:
4.5.1 Description
You found a bug in your code and fixed it. Well done!
Three months later you had to modify a portion of a particular
function, and now your program is misbehaving
again. Now it is a lot harder to debug because you’ve
written more scripts in the meantime, and you’ve long
since forgotten about how you built some of the older
functions and what input they might have assumed.
The process of testing and quality assurance is hope-
fully something you’re starting to do implicitly while
you write your programs. You’ve realised that writing a
complete program from top to bottom in a single session
and getting it to run first time is an impossible and un-
desirable dream. You now write only one function, or a
portion of a function, before compiling and running your
application and testing that specific code.
This is still a reasonably ad-hoc process. You might
add in a few lines of code to test your function with
some common parameter values but will delete these
lines once you’re satisfied that everything is behaving
correctly. We need to make this process more rigorous,
and we need to have some way of revalidating a function
if we have to modify it later.
The testing processes described in this pattern focus
on testing individual functions that you develop. There
are further testing processes required to ensure that your
application works, once all of these functions are integ-
rated into the application. For virtual reality systems,
there are further testing stages that assess the user’s re-
sponse to the environment that is created to ensure that it
is effective and usable.
4.5.2 Pattern
Function testing: consider a function that takes a number
of parameters:
function myFunction (Type parameter1,
Type parameter2, ...)
{
4.5.3 Example
Example 22. Applying the pattern to Unity software
void testSeaCheckClass ()
{
  SeaCheck seaCheck;

  seaCheck = new SeaCheck (-5, 5);
  if (seaCheck.inSea (0.0f) == true)
  {
    print ("SeaCheck, test A successful");
  }
  else
  {
    print ("SeaCheck, test A failed");
  }
  if (seaCheck.inSea (-10.0f) == false)
  {
    print ("SeaCheck, test B successful");
  }
  else
  {
    print ("SeaCheck, test B failed");
  }
  if (seaCheck.inSea (10.0f) == false)
  {
    print ("SeaCheck, test C successful");
  }
  else
  {
    print ("SeaCheck, test C failed");
  }

  seaCheck = new SeaCheck (0, 1);
  if (seaCheck.inSea (0.5f) == true)
  {
    print ("SeaCheck, test D successful");
  }
  else
  {
    print ("SeaCheck, test D failed");
  }
}
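The tests above assume a SeaCheck class along these lines. This is a hypothetical reconstruction for illustration; the book's own definition may differ, but any definition must satisfy the four tests:

```csharp
// Sketch: a sea spanning the interval (lower, upper) along one axis.
public class SeaCheck
{
  private float lowerEdge;
  private float upperEdge;

  public SeaCheck (float lower, float upper)
  {
    lowerEdge = lower;
    upperEdge = upper;
  }

  // True when the position lies within the sea bounds.
  public bool inSea (float position)
  {
    return (position > lowerEdge) && (position < upperEdge);
  }
}
```

Against this sketch all four tests pass: 0 lies inside (-5, 5), while -10 and 10 fall outside it, and 0.5 lies inside (0, 1). Tests B and C are the valuable ones, probing the boundaries rather than the easy middle case.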
5.1.1 Description
In other programming environments the event handler
pattern is usually integrated with other patterns that
register and invoke it. Since most of this complexity
has been integrated into the virtual reality engine, we
confine ourselves to just the aspect related to responding
to events that occur in the virtual world.
As background, consider how objects in a virtual
world interact with one another. There are a few obvious
options:
• Each object knows about every other object and calls
specific functions associated with components on the
other object when some interaction is required. This
requires a large number of potential functions that
must be supported, every object must be aware of
every other object, and the application designer must
have considered all likely scenarios right at the start.
5.1.2 Pattern
Typically an event handler pattern takes the form:
5.1.3 Example
Example 23. Applying the pattern to Unity software
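In Unity software the engine performs the registration and invocation: a correctly named method on a component is called automatically when the matching event occurs. A minimal sketch of one such handler (the message text is illustrative):

```csharp
using UnityEngine;

public class CollisionResponder : MonoBehaviour {

  // Invoked by the engine when another collider enters this
  // object's trigger volume. Requires a collider marked Is Trigger
  // on this object, and a Rigidbody on at least one participant.
  void OnTriggerEnter (Collider other)
  {
    print ("Entered by: " + other.gameObject.name);
  }
}
```

Neither object needs to know about the other in advance; the handler discovers its partner through the Collider parameter when the event fires.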
[Screenshot: the Inspector pane for the object, showing a collider (Edit Collider, Radius), the mesh's Lighting, Materials and Dynamic Occluded settings, a Rigidbody (Mass 1, Drag, Angular Drag, Use Gravity, Is Kinematic, Interpolate, Collision Detection, Constraints), and the Default-Material using the Standard shader. The Project pane shows the folders _Scenes, Materials, Prefabs, Resources, Scripts and Shaders.]
5.2.1 Description
Activities in virtual reality applications are sensitive to
time. Usage of time related information falls into the
following categories:
5.2.2 Pattern
Retrieving the current time involves access to the system
clock, usually involving just a single call to a system
function:
function run ()
{
sleep (t);
// do whatever is required after t seconds.
...
5.2.3 Example
Example 24. Applying the pattern to Unity software
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class DelayedResponse : MonoBehaviour {

  public float delayTime = 5.0f;

  private bool coRoutineRunning = false;

  // Use this for initialization
  void Start () {
  }

  IEnumerator DelayAndChange (float delayPeriod)
  {
    if (!coRoutineRunning)
    {
      coRoutineRunning = true;
      yield return new WaitForSeconds (delayPeriod);
      Color oldColour = this.GetComponent <MeshRenderer> ().
          materials[0].color;
      this.GetComponent <MeshRenderer> ().materials[0].color =
          new Color (1, 0, 1);
      yield return new WaitForSeconds (0.5f);
      this.GetComponent <MeshRenderer> ().materials[0].color =
          oldColour;
      coRoutineRunning = false;
    }
  }

  // Update is called once per frame
  void Update () {
    if (Input.GetAxis ("Fire1") > 0)
    {
      StartCoroutine (DelayAndChange (delayTime));
    }
  }
}
5.3.1 Description
Ideally the virtual reality engine should manage the
input that the user provides through the various virtual
reality controller devices that may be attached. This input
system could provide facilities such as:
5.3.2 Pattern
The input abstraction layer maps the individual controls
on a range of different input devices into actions suppor-
ted by the application. So to start off, we need a list of
the different types of controls available:
actionHandlers = [ SelectPressed,
                   ControllerMoved,
                   ActivatePressed,
                   MenuMoved ]

controlMapping =
{
  SelectPressed : Button :
      [ MouseLeftClicked,
        ControllerTriggerPressed ],
  ControllerMoved : 3DLocation :
      [ ControllerTracking,
        KeyboardMovementEmulator ],
  ActivatePressed : Button :
      [ MouseRightClicked,
        ControllerGripPressed,
        ControllerTouchpadClicked ],
  MenuMoved : 2DRange :
      [ MouseTracking,
        ControllerTouchpadMove ]
}
5.3.3 Example
Example 25. Applying the pattern to Unity software
void pollUnityInput ()
{
  float h = Input.GetAxis ("Horizontal");
  registerInput (DeviceSources.UnityInputHorizontal, h);
  float v = Input.GetAxis ("Vertical");
  registerInput (DeviceSources.UnityInputVertical, v);
  float mx = Input.GetAxis ("Mouse X");
  registerInput (DeviceSources.UnityInputMouseX, mx);
  float my = Input.GetAxis ("Mouse Y");
  registerInput (DeviceSources.UnityInputMouseY, my);
}
5.4.1 Description
The controllers and the head mounted display in a vir-
tual world are tracked so that their position and orient-
ation properties are regularly updated. Gaze or control-
ler directed locomotion can use the direction indicated
by one of these tracked objects to support locomotion
through the virtual world. This form of locomotion is
not recommended for use in any non-trivial virtual reality
application due to the sudden and physically implausible
accelerations experienced by the user. Rather, consider
locomotion patterns such as teleportation (section 5.6 on
page 198).
5.4.2 Pattern
Given a tracked object in the virtual world identified by
the variable directionObject, and the object representing
the user's avatar (containing the camera) in the variable
userObject, the pattern for locomotion is:

Figure 5.4.1: Simple kinematic motion is achieved by adding a direction vector to the current position. The time since the last position update is used to adjust the distance moved in a single step.

function update (deltaTime)
{
  if (controller.triggerPressed ())
  {
    userObject.position = userObject.position +
      speed * directionObject.forward * deltaTime
  }
  if (controller.triggerPressed () and
      speed < speedMax)
  {
5.4.3 Example
Example 26. Applying the pattern to Unity software
[Figure 5.4.4: Steam VR® input binding configuration.]

Code such as that below now reads the controller
triggers.

float v1 = Input.GetAxis ("LeftTrigger");
float v2 = Input.GetAxis ("RightTrigger");
print ("Triggers: " + v1 + " " + v2);
5.5.1 Description
Selecting objects within reach is achieved by adding a
collision event handler to either the controller or the
object. Selecting objects that are out of reach is achieved
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Valve.VR;

public class MoveInDirectionOfTrackedObject : MonoBehaviour {

  // Suggest setting this to gesture: GrabPinch
  public SteamVR_Action_Boolean trigger;

  public GameObject steeringObject;

  private float speed = 3.0f;

  // Update is called once per frame
  void Update () {
    float v1 = Input.GetAxis ("LeftTrigger") + (trigger.GetState (SteamVR_Input_Sources.LeftHand) ? 1 : 0);
    float v2 = Input.GetAxis ("RightTrigger") + (trigger.GetState (SteamVR_Input_Sources.RightHand) ? 1 : 0);
    if ((v1 > 0) || (v2 > 0)) {
      this.transform.position = this.transform.position + speed * steeringObject.transform.forward * Time.deltaTime;
    }
  }
}
[Figure: Inspector settings for the MoveInDirectionOfTrackedObject script on the controller, with the Steering Object field set to the Camera (eye) object.]
ection (whereas the line passes through the point in both
the forward and reverse directions). A ray is thus defined
by two key properties:

Figure 5.5.1: A ray is defined by its starting point (origin) and its direction provided as a vector.

• The origin of the ray, O. This is the point it is emitted
from.

• The direction of the ray, D. This is a vector (usually
unit length) indicating the direction that the ray follows.
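These two properties define every point along the ray through the parametric form P(t) = O + tD for t ≥ 0. As a short engine-neutral sketch (the tuple-based vector type is an illustration, not an engine API):

```python
def ray_point(origin, direction, t):
    """Point at parameter t along the ray P(t) = O + t*D, with t >= 0."""
    return tuple(o + t * d for o, d in zip(origin, direction))

# A ray starting at the origin and pointing along +z: at t = 2.5 it has
# travelled 2.5 units along z.
p = ray_point((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 2.5)
```

When D has unit length, the parameter t is also the distance travelled along the ray, which is why raycasting operations can report a hit distance directly.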
5.5.2 Pattern
This pattern calls a selectObject function whenever the
ray from the controller touches an object (alternatively, a
selectObject event could be generated). The selection process
is started by pressing the trigger on the controller.

Figure 5.5.2: The raycasting operation is used for selection of objects in the virtual world, with the position and direction of a controller as origin and direction for the ray. The operation can return: the object intersected by the ray, the specific point of intersection, and the distance along the ray to this intersection.

function update (deltaTime)
{
  rayLimit = limit
  if (controller.triggerPressed ())
  {
    (hitObject, distance) =
      RayCast (Ray (controller.position,
        controller.forward), rayLimit)
    if (hitObject != null)
    {
      rayLimit = distance
      selectObject (hitObject)
    }
    show laser beam (from controller.position
      in direction controller.forward for
      distance rayLimit)
  }
}
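To make the RayCast step concrete, here is a hedged sketch of what such an operation does internally, using spheres as the only object type (real engines test against arbitrary colliders; the scene format and names here are illustrative only). It returns the nearest object hit within the ray limit, together with the distance along the ray:

```python
import math

def ray_sphere(origin, direction, centre, radius):
    """Distance along a unit-direction ray to a sphere, or None if missed."""
    oc = [o - c for o, c in zip(origin, centre)]
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere entirely.
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t >= 0 else None  # ignore hits behind the origin.

def ray_cast(origin, direction, spheres, ray_limit):
    """Nearest (object, distance) within ray_limit, else (None, ray_limit)."""
    hit, best = None, ray_limit
    for name, centre, radius in spheres:
        t = ray_sphere(origin, direction, centre, radius)
        if t is not None and t < best:
            hit, best = name, t
    return hit, best

# Two spheres straight ahead: the nearer one is selected at distance 4.
scene = [("near", (0, 0, 5), 1.0), ("far", (0, 0, 20), 1.0)]
result = ray_cast((0, 0, 0), (0, 0, 1), scene, 100.0)
```

The shrinking ray limit mirrors the pattern above: once an object is hit, only closer objects can replace it, and the remaining distance is what the laser beam visualization uses.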
5.5.3 Example
Example 27. Applying the pattern to Unity software
5.6 Teleportation
5.6.1 Description
Teleporting is a preferred method of locomotion. This
is achieved by utilizing some previous patterns: use a
raycast based selection to nominate a destination point
(section 5.5 on page 194), check that the target point is
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Valve.VR;

public class FireRayFromDevice : MonoBehaviour {
  // Suggest setting this to gesture: GrabPinch
  public SteamVR_Action_Boolean trigger;
  public GameObject controller;
  public GameObject targetPrefab;
  public GameObject avatar;

  private GameObject targetMarker;
  private Vector3 lastRayPoint;

  void Start () {
    targetMarker = Instantiate (targetPrefab);
    targetMarker.SetActive (false);
  }

  void Update () {
    // distance to search along ray.
    float distance = 100.0f;

    float v1 = Input.GetAxis ("LeftTrigger") + (trigger.GetState (SteamVR_Input_Sources.LeftHand) ? 1 : 0);
    float v2 = Input.GetAxis ("RightTrigger") + (trigger.GetState (SteamVR_Input_Sources.RightHand) ? 1 : 0);
    if ((v1 > 0) || (v2 > 0)) {
      RaycastHit hit;
      if (Physics.Raycast (new Ray (controller.transform.position, controller.transform.forward), out hit, distance)) {
        lastRayPoint = hit.point;
        targetMarker.transform.position = lastRayPoint;
        targetMarker.SetActive (true);
      }
    }
  }
}
5.6.2 Pattern
The pattern for teleportation is:

selectedObject = hitObject
selectedPosition = hitPosition
place teleportation target on selectedObject
...
if (selectedObject != null and
    controller.teleportButtonPressed ())
{
5.6.3 Example
Example 28. Applying the pattern to Unity software
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Valve.VR;

public class TeleportWithRay : MonoBehaviour {
  // Suggest setting this to gesture: GrabPinch
  public SteamVR_Action_Boolean trigger;
  // Suggest setting this to gesture: GrabGrip
  public SteamVR_Action_Boolean grip;

  public GameObject controller;
  public GameObject targetPrefab;
  public GameObject avatar;

  private GameObject targetMarker;
  private Vector3 lastRayPoint;

  void Start () {
    targetMarker = Instantiate (targetPrefab);
    targetMarker.SetActive (false);
  }

  void Update () {
    // distance to search along ray.
    float distance = 100.0f;
    float v1 = Input.GetAxis ("LeftTrigger") + (trigger.GetState (SteamVR_Input_Sources.LeftHand) ? 1 : 0);
    float v2 = Input.GetAxis ("RightTrigger") + (trigger.GetState (SteamVR_Input_Sources.RightHand) ? 1 : 0);
    if ((v1 > 0) || (v2 > 0)) {
      RaycastHit hit;
      if (Physics.Raycast (new Ray (controller.transform.position, controller.transform.forward), out hit, distance)) {
        lastRayPoint = hit.point;
        targetMarker.transform.position = lastRayPoint;
        targetMarker.SetActive (true);
      }
    }
    ...
    ...
    float v3 = Input.GetAxis ("LeftGrip") + (grip.GetState (SteamVR_Input_Sources.LeftHand) ? 1 : 0);
    float v4 = Input.GetAxis ("RightGrip") + (grip.GetState (SteamVR_Input_Sources.RightHand) ? 1 : 0);
    if ((v3 > 0) || (v4 > 0)) {
      avatar.transform.position = lastRayPoint + new Vector3 (0, 1, 0);
    }
  }
}
6
A Working Virtual World
6.1.1 Description
6.1.2 Pattern
Start with a state transition diagram corresponding to
the behaviour of an object. This is transformed into script
elements as follows.
We define variables to represent the state of the object.
The only variables needed are those that are relevant
to the state transitions covered by the diagram. In the
interests of readability we use an enumerated type so
that the program code corresponds directly with the state
transition diagram.
oldState = state
state = newState
// The code below is optional.
sendEvent
(stateChanged (oldState, newState),
self)
switch (state):
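The state-change bookkeeping in the pattern can be sketched in engine-neutral Python (the state names and the recorded transition list are illustrative placeholders standing in for whatever the object actually does on a stateChanged event):

```python
from enum import Enum, auto

class State(Enum):
    """Enumerated states matching a state transition diagram."""
    IDLE = auto()
    MOVING = auto()
    DONE = auto()

class StatefulObject:
    """Keeps a current state and records each change as an event."""
    def __init__(self):
        self.state = State.IDLE
        self.transitions = []  # record of (oldState, newState) events.

    def set_state(self, new_state):
        if new_state == self.state:
            return  # nothing changed: no stateChanged event.
        old_state, self.state = self.state, new_state
        self.transitions.append((old_state, new_state))

obj = StatefulObject()
obj.set_state(State.MOVING)
obj.set_state(State.MOVING)  # duplicate transition is ignored.
obj.set_state(State.DONE)
```

Guarding against re-entering the same state keeps the optional stateChanged event meaningful: listeners only hear about genuine transitions.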
6.1.3 Example
Example 29. Applying the pattern to Unity software
• A ground plane.
• Marker objects for key positions of the lawyer:
chair, lectern, evidence table, and jury box.
• A lawyer. Attach the LawyerBehaviour script to
this object. Make sure the lawyer has a collider
component defined. Add a number of Audio Source
components to the lawyer, and attach suitable audio
clips corresponding to the judge giving instructions,
the lawyer’s argument, the presentation to the jury
and the judge’s verdict.
• A UI/Text object to hold the instructions that are
shown on the screen to indicate the purposes of the
lawyer’s current state.
• A semi-transparent cube of approximately the pro-
portions of the lawyer object, to designate the point
that the lawyer is moving towards, and to provide
the object to collide with. This should be a prefab.
Add a “TargetPoint” tag to this. Add a Rigidbody,
and switch gravity off, so that it is able to register
collision events. Enable the trigger checkbox in the
collider component.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;

// As a lawyer, the user must complete activities:
// - move to seat. Until judge ready.
// - move to lectern, present. Until complete.
// - move to evidence table, present evidence.
// - move to jury, address jury. Until complete.
// - move to seat. Until verdict delivered.
// - task complete.
public class LawyerBehaviour : MonoBehaviour {
  // States of the lawyer.
  enum LawyerState { NoStateSet, Starting, InSeatGettingReady, ReadyToPresent, PresentingAtLectern, ReadyForEvidence, Evidence, MoveToJury, AddressJury, JuryAddressOver, WaitingVerdict, AllDone };
  // Current state in the state transition graph.
  private LawyerState state = LawyerState.NoStateSet;
  // An object used to visually show the next point that has to be visited.
  private GameObject targetPoint;
  // External values required.
  // A text object to hold instructions being provided to the user.
  public Text instructionText;
  // A prefab object to represent points of interest.
  public GameObject targetPointMarkerPrefab;
  // Positions of key elements to be visited.
  public GameObject chairPosition;
  public GameObject lecternPosition;
  public GameObject evidenceTablePosition;
  public GameObject juryPosition;
  // The movie texture played when presenting evidence.
  public GameObject screenMovie;
  // The sound clips for the various states.
  public AudioSource judgeInstructions;
  public AudioSource legalArgument;
  public AudioSource juryPresentation;
  public AudioSource judgeVerdict;
  private float speedWhenAutonomous = 1.0f;
  // continuation...
  else if (state == LawyerState.Evidence) {
    targetPoint.SetActive (false);
    instructionText.text = "Show evidence.";
    StartCoroutine (showMovie (screenMovie, LawyerState.MoveToJury));
  }
  else if (state == LawyerState.MoveToJury) {
    targetPoint.transform.position = juryPosition.transform.position;
    instructionText.text = "Address the jury.";
    targetPoint.SetActive (true);
  }
  else if (state == LawyerState.AddressJury) {
    targetPoint.SetActive (false);
    instructionText.text = "Summarize your case.";
    StartCoroutine (giveInstructions (juryPresentation, LawyerState.JuryAddressOver));
  }
  else if (state == LawyerState.JuryAddressOver) {
    targetPoint.transform.position = chairPosition.transform.position;
    instructionText.text = "Return to your chair.";
    targetPoint.SetActive (true);
  }
  else if (state == LawyerState.WaitingVerdict) {
    targetPoint.SetActive (false);
    instructionText.text = "Judge presents findings.";
    StartCoroutine (giveInstructions (judgeVerdict, LawyerState.AllDone));
  }
  else if (state == LawyerState.AllDone) {
    targetPoint.SetActive (false);
    instructionText.text = "The trial is now over.";
  }
}
  void Start () {
    targetPoint = Instantiate (targetPointMarkerPrefab);
    targetPoint.SetActive (false);

    setState (LawyerState.Starting);
  }

  void Update () {
    // handle any updates specific to the current state.
    switch (state)
    {
      // do this for all states.
      default:
      {
        Vector3 dir = targetPoint.transform.position - transform.position;
        if (dir.magnitude >= 1)
        {
          dir = dir / dir.magnitude;
        }
        transform.position += Time.deltaTime * speedWhenAutonomous * dir;
        transform.forward = dir;
      }
      break;
    }
  }
}
[Figure: scene hierarchy and Inspector settings for the courtroom example, with the position markers (LecternPosition, JuryPosition, and so on), the ScreenMovie object, and the four Audio Source components on the LawyerBody (judge instructions, legal argument, jury presentation, judge verdict) assigned to the LawyerBehaviour script.]
6.2.1 Description
Virtual worlds are more exciting when the objects are
active and dynamic than when they are passive or purely
static. The state machine pattern is employed to provide
active objects, or agents, in the virtual world that under-
take predefined actions, but also respond to things that
happen in the world as well.
This pattern is also useful for agents that may be user
controlled in a multi-user context, but that need fall-back
behaviour for situations where there are not sufficient
users to control them. This way the virtual reality applic-
ation can still provide a reasonable experience to smaller
numbers of users.
The pattern for an autonomous agent assumes that we
desire certain facilities in our agent:
6.2.2 Pattern
switch (state)
switch (state)
if ev.other == boundaryB
{
}
break;
default: // do nothing.
switch (state)
// no action required.
break;
Vector v = B - position;
position = position + normalized (v) *
min (v.length, deltaTime)
// Alternative check for reaching B
if (v.length < distanceThreshold)
{
6.2.3 Example
Example 30. Applying the pattern to Unity software
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class SteerObject : MonoBehaviour {

  public float speed = 0.1f;
  public float turnspeed = 1.0f;

  // Update is called once per frame
  void Update () {
    float h = Input.GetAxis ("Horizontal");
    float v = Input.GetAxis ("Vertical");

    this.transform.position += speed * Time.deltaTime * v * this.transform.forward;
    this.transform.rotation *= Quaternion.AngleAxis (turnspeed * Time.deltaTime * h, this.transform.up);
  }
}
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class AngelGenerate : MonoBehaviour {

  public int numberOfAngels = 5;

  public float radius = 10.0f;

  public GameObject angelTemplate;

  // Use this for initialization
  void Start () {
    for (int i = 0; i < numberOfAngels; i++) {
      Instantiate (angelTemplate, new Vector3 (Random.Range (-radius, radius), 0, Random.Range (-radius, radius)), Quaternion.identity);
    }
  }
}
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class AngelBehaviour : MonoBehaviour {

  enum AngelStates { NotInit, Follow, Freeze, Retreat, Bounce, Attack };

  // Current state in the state transition graph.
  private AngelStates state = AngelStates.NotInit;

  // speed of movement while following.
  private float speedFollow = 0.5f;

  // proximity threshold.
  private float proximityLimit = 3.0f;

  // time limit on being frozen.
  private float freezeTimeLimit = 2.0f;

  // time limit on bouncing before attack.
  private float bounceTimeLimit = 5.0f;

  // the player that is being chased.
  private GameObject player;

  // coroutine used for the timer, so we can abort it if required.
  IEnumerator timerCoroutine;

  // This function updates the angel simulation according to the state involved.
  void setState (AngelStates newState)
  {
    if (state == newState)
    {
      return; // nothing changed.
    }

    AngelStates oldState = state;
    state = newState;

    handleStateChangedEvent (oldState, newState);
  }
  void Start () {
    player = GameObject.Find ("Avatar");
    setState (AngelStates.Follow);
  }

  // Update is called once per frame
  void Update () {
    // generate any events required.
    // Proximity.
    Vector3 dir = player.transform.position - transform.position;
    if (dir.magnitude < proximityLimit) {
      handleProximityEvent ();
    } else if (dir.magnitude > 1.5f * proximityLimit) {
      // actually have to move a significant distance beyond proximity
      // limit, to avoid change being too sensitive to small movements.
      handleUnproximityEvent ();
    }

    // Visibility.
    if (isVisible (40.0f)) {
      handleLookedAtEvent ();
    } else {
      handleUnlookedAtEvent ();
    }

    // handle any updates specific to the current state.
    switch (state)
    {
      case AngelStates.Follow:
      {
        dir = player.transform.position - transform.position;
        dir.Normalize ();
        transform.position += speedFollow * Time.deltaTime * dir;
        transform.forward = dir;
      }
      break;
      case AngelStates.Freeze:
      // continued ...
      // continuation ...
      case AngelStates.Freeze:
        break;
      case AngelStates.Retreat:
      {
        dir = -(player.transform.position - transform.position);
        dir.Normalize ();
        transform.position += speedFollow * Time.deltaTime * dir;
        transform.forward = dir;
      }
      break;
      case AngelStates.Bounce:
      {
        transform.position = new Vector3 (transform.position.x, Mathf.Abs (Mathf.Sin (Time.time)), transform.position.z);
      }
      break;
    }
  }
}
6.3.1 Description
The autonomous behaviours covered in the previous
patterns have allowed complex behaviours which interact
with, and respond to, other elements in the scene. This
all contributes to a rich virtual environment, but can
appear repetitive and predictable after a while. While not
a replacement for artificial intelligence, a bit of randomness
can create the illusion of more complex behaviour.
Randomness is also a key ingredient in managing emergent
systems and improving their robustness by ensuring
that they can recover from failure. For example, an agent
that directly follows the user may get trapped by some
configurations of obstacles, but a bit of randomness in its
movement may give it a chance to free itself.
Sources of randomness in software applications typically
come from a random number generator. This is
just a function that produces a long sequence of seemingly
random numbers.
6.3.2 Pattern
Random numbers are retrieved from the random num-
ber generator provided by the programming language
and libraries used by the VR engine. Typical functions
include:
r = randomFloat ()
r = randomRange (l, u)
r = l + (random () % (u - l))
rx = 2 * randomFloat () - 1
ry = 2 * randomFloat () - 1
position = position + speed *
Vector (rx, ry) * deltaTime
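The wander step above can be sketched in Python with a seeded generator so that the behaviour is reproducible (the 2D tuple vector and the speed value are illustrative, not an engine API):

```python
import random

def wander_step(position, speed, delta_time, rng):
    """One 2D random-walk step: offset by a random direction in [-1, 1]^2."""
    rx = 2.0 * rng.random() - 1.0  # uniform in [-1, 1)
    ry = 2.0 * rng.random() - 1.0
    return (position[0] + speed * rx * delta_time,
            position[1] + speed * ry * delta_time)

# Seeding makes a "random" behaviour repeatable, which is invaluable
# when debugging an agent.
rng = random.Random(42)
pos = (0.0, 0.0)
for _ in range(100):
    pos = wander_step(pos, speed=1.0, delta_time=0.1, rng=rng)
```

Each step moves at most speed × deltaTime along each axis, so the agent drifts rather than teleporting, and the drift accumulates into the jittery wandering that breaks up predictable motion.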
Figure 6.3.1: A given probability distribution is achieved by decomposing the range [0, 1] of a typical random floating point value.

Simple motion might keep only one way point at a time
and move directly towards that. The agent then changes
direction sharply when it reaches that point and starts
moving to the next. Several way points are used to define
a spline path that passes through each way point and
provide smoother motion.
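A spline through way points can be sketched with the Catmull–Rom form, which passes through each control point. This particular choice of spline is an assumption added here for illustration; the text only requires a curve through the way points:

```python
def catmull_rom(p0, p1, p2, p3, t):
    """Catmull-Rom point between p1 (at t=0) and p2 (at t=1).

    The neighbouring way points p0 and p3 shape the tangents, so
    successive segments join smoothly.
    """
    return tuple(
        0.5 * ((2 * b) + (-a + c) * t
               + (2 * a - 5 * b + 4 * c - d) * t * t
               + (-a + 3 * b - 3 * c + d) * t ** 3)
        for a, b, c, d in zip(p0, p1, p2, p3))

# Four 2D way points; the curve segment runs between the middle two.
way = [(0.0, 0.0), (1.0, 0.0), (2.0, 1.0), (3.0, 1.0)]
start = catmull_rom(*way, 0.0)  # lies exactly on the second way point
end = catmull_rom(*way, 1.0)    # lies exactly on the third way point
```

Evaluating the segment at small increments of t gives the smooth path for the agent to follow, replacing the sharp turn at each way point.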
The actions in the state machine can also be modified
to include some random elements. For example, when
a particular event X is triggered we can move to state A
30% of the time, to state B 25% of the time, and to state
C the remaining 45% of the time. The code would look
something like:
switch (state)
...
case stateS:
r = randomFloat ()
if r <= 0.3 // 30%
{
changeState (StateA)
}
else if r <= 0.55 // 25% beyond the 30%
{
changeState (StateB)
}
else // remaining 45%
{
changeState (StateC)
...
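The cumulative-threshold test used above can be isolated and checked directly. In this Python sketch, the state names are the placeholders from the text, and the function takes the random value as an argument so the decision boundaries are easy to verify:

```python
def choose_state(r):
    """Map a uniform random value r in [0, 1) to a weighted state choice:
    StateA 30% of the time, StateB 25%, and StateC the remaining 45%.
    """
    if r <= 0.3:        # first 30% of the range
        return "StateA"
    elif r <= 0.55:     # 25% beyond the 30%
        return "StateB"
    return "StateC"     # remaining 45%
```

Each threshold is the running total of the probabilities so far (0.3, then 0.3 + 0.25 = 0.55), which is exactly the decomposition of [0, 1] shown in figure 6.3.1.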
6.3.3 Example
Example 31. Applying the pattern to Unity software
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// Place pegs in the arrangement of a bean machine/Galton board/quincunx.
public class QuincunxGenerator : MonoBehaviour {
  [Tooltip ("Position of the apex peg in the triangular formation")]
  public Vector3 topPegPosition = new Vector3 (0, 20, 0);
  [Tooltip ("Horizontal gap between pegs")]
  public float horizontalgap = 2.0f;
  [Tooltip ("Vertical gap between pegs")]
  public float verticalgap = 3.0f;
  [Tooltip ("Number of rows of pegs")]
  public int numberOfRows = 10;
  [Tooltip ("The template for one peg")]
  public GameObject pegTemplate;

  // Place pegs in a staggered triangular formation, from the top point,
  // with spacing defined by hgap horizontally, and vgap vertically.
  // X axis is horizontal, Y axis is vertical.
  void createQuincunx (Vector3 top, float hgap, float vgap, int numRows, GameObject peg)
  {
    int numElementsInCurrentRow = 1;
    float startingX = top.x;
    // lay down one row at a time.
    for (int i = 0; i < numRows; i++) {
      // lay down a row.
      for (int j = 0; j < numElementsInCurrentRow; j++) {
        GameObject pegPiece = Instantiate (peg, new Vector3 (startingX + j * hgap, top.y - i * vgap, 0.0f), Quaternion.identity);
        pegPiece.transform.SetParent (this.transform);
      }
      // shift left edge.
      startingX -= (hgap / 2.0f);
      numElementsInCurrentRow += 1;
    }
  }

  void Start () {
    createQuincunx (topPegPosition, horizontalgap, verticalgap, numberOfRows, pegTemplate);
  }
}
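Each object falling through the quincunx is deflected left or right at every row of pegs, so the slot it lands in follows a binomial distribution, with the central slots the most heavily populated. The seeded simulation below is an assumption added here to illustrate the statistics; it is not part of the Unity example:

```python
import random

def drop(rng, rows):
    """One object falling through the board: the number of rightward
    deflections (each with probability 0.5) selects the landing slot."""
    return sum(1 for _ in range(rows) if rng.random() < 0.5)

# Drop many objects through a 10-row board and tally the landing slots.
rng = random.Random(1)
counts = [0] * 11  # slots 0..10
for _ in range(10000):
    counts[drop(rng, 10)] += 1
```

With enough drops the tallies approximate the binomial coefficients, peaking at the centre: the board converts many independent coin flips into a visible bell-shaped pile.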
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// Drop an object at regular intervals.
public class RegularRelease : MonoBehaviour {
  [Tooltip ("Prefab of object for dropping")]
  public GameObject dropObjectTemplate;

  [Tooltip ("Time interval in seconds between dropping objects")]
  public float timeInterval = 2.0f;

  [Tooltip ("Position to drop the object from")]
  public Vector3 dropPosition = new Vector3 (0, 200, 0);

  // count time until next object ready for dropping.
  private float counter;

  // Use this for initialization
  void Start () {
    counter = 0.0f;
  }

  // Update is called once per frame
  void Update () {
    counter += Time.deltaTime;

    if (counter > timeInterval) {
      // drop object, potentially with random offset.
      Instantiate (dropObjectTemplate, new Vector3 (0 * Random.Range (-0.01f, 0.01f), 0, 0) + dropPosition, Quaternion.identity);
      counter = 0.0f;
    }
  }
}
[Figure: dropped pieces falling from the triangular peg region.]
7
Physical in Virtual
7.1.1 Description
7.1.2 Pattern
Some common configurations are suggested by each of
the following patterns:
7.1.3 Example
Example 32. Applying the pattern to Unity software
[Figure: BallClone objects generated in the scene, with the CollisionBehaviour script attached. A collision event handler can customize the behaviour in response to a collision.]

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class CollisionBehaviour : MonoBehaviour {

  void OnTriggerEnter (Collider other)
  {
    this.GetComponent <Rigidbody> ().velocity = new Vector3 (Random.Range (-2.3f, 2.3f), 3.1f, Random.Range (-2.3f, 2.3f));
  }

  void OnCollisionEnter (Collision collision)
  {
    this.GetComponent <Rigidbody> ().velocity = new Vector3 (Random.Range (-0.1f, 0.1f), 0, Random.Range (-0.1f, 0.1f));
    this.transform.localScale = 0.99f * this.transform.localScale;
  }
}
7.2.1 Description
7.2.2 Pattern
Assume we have three values: a position p, a velocity v,
and an acceleration a. The velocity represents the rate of
change of p over time, and the acceleration represents the
rate of change of v over time. The goal is to determine
future values of p and v, assuming that acceleration a(t)
is an input, potentially varying over time.
Given the discrete nature of the simulation, values are
updated at discrete intervals, typically once per frame.
A typical Euler step update uses the following pattern:
∆t = deltaTime
a = getCurrentAcceleration ()
v = v + a∆t
p = p + v∆t
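The Euler step can be checked against a case with a known answer: constant acceleration a from rest gives velocity aT after time T. A Python sketch of the update (the tuple vector type, step size, and gravity value are illustrative):

```python
def euler_step(p, v, a, dt):
    """One Euler update: velocity first, then position from the new
    velocity (the ordering used in the pattern above)."""
    v = tuple(vi + ai * dt for vi, ai in zip(v, a))
    p = tuple(pi + vi * dt for pi, vi in zip(p, v))
    return p, v

p, v = (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)
a = (0.0, -10.0, 0.0)    # constant downward acceleration
for _ in range(100):     # simulate 1 second in steps of 0.01
    p, v = euler_step(p, v, a, 0.01)
# After 1 second: v_y is close to -10, the exact value a*T.
```

Because v is updated before p, this is the semi-implicit (symplectic) variant of Euler integration, which tends to be more stable for game physics than updating the position with the old velocity.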
7.2.3 Example
Example 33. Applying the pattern to Unity software
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class CannonControl : MonoBehaviour {

  // limit of side to side movement.
  public float sideAngleRange = 30.0f;

  // limit of elevation.
  public float minElevation = 5.0f;
  public float maxElevation = 80.0f;

  // control sensitivity factor.
  public float sensitivity = 10.0f;

  private float elevation = 0.0f;
  private float side = 0.0f;

  public GameObject avianPrefab;

  void Update () {
    float v = Input.GetAxis ("Vertical");
    float h = Input.GetAxis ("Horizontal");

    side = Mathf.Max (Mathf.Min (side + h * sensitivity, sideAngleRange), -sideAngleRange);
    elevation = Mathf.Max (Mathf.Min (elevation + v * sensitivity, maxElevation), minElevation);
    this.transform.rotation = Quaternion.AngleAxis (side, new Vector3 (0, 1, 0)) * Quaternion.AngleAxis (elevation, new Vector3 (1, 0, 0));

    if (Input.GetAxis ("Fire1") > 0)
    {
      GameObject avian = Instantiate (avianPrefab, this.transform.position, this.transform.rotation);
      avian.GetComponent <AvianFlu> ().velocity = 100.0f * this.transform.up;
    }
  }
}
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class AvianFlu : MonoBehaviour {

  private Vector3 acceleration = new Vector3 (0, -10, 0);
  public Vector3 velocity = new Vector3 (0, 0, 0);

  // Update is called once per frame
  void Update () {
    velocity = velocity + Time.deltaTime * acceleration;
    this.transform.position = this.transform.position + Time.deltaTime * velocity;
  }
}
7.3.1 Description
In a scene containing a parent-child hierarchy (section 2.8
on page 63) such as a human armature, the kinematic
and dynamics processes typically affect the root nodes,
and pass on these effects through the parent-child links
to each of the child nodes in turn.
In virtual reality applications we directly control some
of the leaf nodes (those child nodes with no children of
their own) in these hierarchies, specifically the head and
the hands. To provide realistic behaviour for the avatar
it is desirable to be able to calculate the effect of moving
the hand on the intermediate and root nodes. Specifically,
if the hand moves what happens to the lower arm, elbow,
upper arm, shoulder and torso?
Inverse kinematics solves this problem.
The inverse kinematics problem is not one that has a
single unique answer. There may be many configurations
of shoulder and elbow position that all leave the hand
in the same place. Inverse kinematics solvers try to use
additional constraints to determine the most feasible
solution (e.g. the ones that don’t have the elbow bending
the wrong way).
There are also cases where there is no valid solution.
For example, most humans are unable to move their
hands more than an arm's length away from their bodies.
In those cases, the inverse kinematics solver returns the
best solution, such as placing the hand as close to the
target point as possible, but at the limit of the arm’s
extension.
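The simplest closed-form instance is a two-link arm in a plane: given upper and lower arm lengths and a hand target, the elbow angle follows from the law of cosines. The sketch below is an illustration added here, not the engine's solver; it also shows the out-of-reach case, where the target is clamped to the arm's full extension:

```python
import math

def two_link_ik(l1, l2, target):
    """Shoulder and elbow angles placing the hand of a planar two-link
    arm (segment lengths l1, l2) at target. Out-of-reach targets are
    clamped to the maximum reach, the 'best solution' described above."""
    x, y = target
    d = math.hypot(x, y)
    d = min(d, l1 + l2)  # best effort: clamp to full extension.
    # Law of cosines gives the elbow's interior angle (pi = straight arm).
    cos_elbow = (l1 * l1 + l2 * l2 - d * d) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder: direction to the target plus the offset from the bent elbow.
    cos_off = (l1 * l1 + d * d - l2 * l2) / (2 * l1 * d)
    shoulder = math.atan2(y, x) + math.acos(max(-1.0, min(1.0, cos_off)))
    return shoulder, elbow

# A reachable target bends the elbow; an unreachable one straightens it.
_, elbow_near = two_link_ik(1.0, 1.0, (1.0, 0.0))
_, elbow_far = two_link_ik(1.0, 1.0, (5.0, 0.0))
```

The two mirror-image elbow solutions (acos returning only one of them) are exactly the ambiguity mentioned above; a full solver breaks the tie with constraints such as the natural bending direction of the joint.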
7.3.2 Pattern
Accessing the inverse kinematics functionality in the
virtual reality engine involves:
7.3.3 Example
Example 34. Applying the pattern to Unity software
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class MoveTargets : MonoBehaviour {
  public GameObject LFootTarget;
  public GameObject RFootTarget;
  public GameObject LHandTarget;
  public GameObject RHandTarget;
  public GameObject HeadTarget;

  public float speed = 1.0f;
  public float stretch = 1.0f;

  void Update () {
    LFootTarget.transform.position = new Vector3 (0.5f, 0, stretch * Mathf.Sin (speed * Time.time));
    RFootTarget.transform.position = new Vector3 (-0.5f, 0, -stretch * Mathf.Sin (speed * Time.time));
    LHandTarget.transform.position = new Vector3 (1.5f, 1.0f + stretch * Mathf.Cos (speed * Time.time), stretch * Mathf.Sin (speed * Time.time));
    RHandTarget.transform.position = new Vector3 (-1.5f, 1.0f + stretch * Mathf.Sin (speed * Time.time), stretch * Mathf.Cos (speed * Time.time));
    HeadTarget.transform.position = new Vector3 (stretch * Mathf.Sin (speed * Time.time), 1.5f + 0.5f * stretch * Mathf.Sin (speed * Time.time), stretch * Mathf.Cos (speed * Time.time));
  }
}
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class IKSolver : MonoBehaviour {
  public GameObject LFootTarget;
  public GameObject RFootTarget;
  public GameObject LHandTarget;
  public GameObject RHandTarget;
  public GameObject HeadTarget;
  public bool enable = false;

  void trackPart (AvatarIKGoal part, GameObject target) {
    Animator a = GetComponent <Animator> ();
    a.SetIKPositionWeight (part, 1);
    a.SetIKRotationWeight (part, 1);
    a.SetIKPosition (part, target.transform.position);
    a.SetIKRotation (part, target.transform.rotation);
  }

  void untrackPart (AvatarIKGoal part) {
    Animator a = GetComponent <Animator> ();
    a.SetIKPositionWeight (part, 0);
    a.SetIKRotationWeight (part, 0);
  }

  void OnAnimatorIK () {
    if (enable) {
      trackPart (AvatarIKGoal.LeftFoot, LFootTarget);
      trackPart (AvatarIKGoal.RightFoot, RFootTarget);
      trackPart (AvatarIKGoal.LeftHand, LHandTarget);
      trackPart (AvatarIKGoal.RightHand, RHandTarget);
      GetComponent <Animator> ().SetLookAtWeight (1);
      GetComponent <Animator> ().SetLookAtPosition (HeadTarget.transform.position);
    } else {
      untrackPart (AvatarIKGoal.LeftFoot);
      untrackPart (AvatarIKGoal.RightFoot);
      untrackPart (AvatarIKGoal.LeftHand);
      untrackPart (AvatarIKGoal.RightHand);
      GetComponent <Animator> ().SetLookAtWeight (0);
    }
  }
}
8
All Together
8.1.1 Description
8.1.2 Pattern
Developing a multi-user virtual reality application requires the cooperation of the networking facilities in the virtual reality engine. These must be activated in a way specific to the particular engine being used, such as by adding a particular (invisible) object to the scene.

function update ()
{
  if (isLocalAvatar)
  {
    get controller input
    update position and orientation properties
  }
}

function update ()
...
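The second update function above is truncated. For avatars owned by other users, it plausibly applies the synchronized properties received over the network rather than reading the local controllers. A sketch of that branch, in the pattern style of this chapter (the property names are assumptions):

```
function update ()
{
  if (!isLocalAvatar)
  {
    read synchronized position and orientation properties
    apply these to this avatar's transform
  }
}
```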
8.1.3 Example
Example 35. Applying the pattern to Unity software
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class SteerCar : MonoBehaviour {
  void Update () {
    float h = Input.GetAxis ("Horizontal");
    float v = Input.GetAxis ("Vertical");

    float forceEffect = 100.1f;
    float angleRate = 10.0f;
    GetComponent <Rigidbody> ().AddForce (forceEffect * v * transform.forward);
    transform.localRotation *= Quaternion.AngleAxis (angleRate * h, transform.up);
  }
}
(Screenshots: the car object in the Unity software editor, showing its Rigidbody, Network Identity, Network Transform and Steer Car Script components, with one of the vehicle prefabs placed in the center; and two instances of the application, one running in a built application and the other through the Unity software editor.)
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Networking;

public class SteerCar : NetworkBehaviour {

  public float forceEffect = 10.0f;
  public float angleRate = 10.0f;

  void Update () {
    if (isLocalPlayer)
    {
      float h = Input.GetAxis ("Horizontal");
      float v = Input.GetAxis ("Vertical");

      GetComponent <Rigidbody> ().AddForce (forceEffect * v * transform.forward);
      transform.localRotation *= Quaternion.AngleAxis (angleRate * h, transform.up);
    }
  }
}
8.2.1 Description
Adding an object to a multi-user virtual world is more
complex than just adding it to a single user environment.
Following the conventions of some virtual reality engines, we instantiate objects in a single user context (section 2.3 on page 33), and spawn them in a multi-user context.
In the multi-user situation, creating shared objects is
accomplished by sending a request to the server which
then creates the object in its master representation of
the world. Imposters for that object are then created on
each of the clients, including the one which requested the
object creation, as the server shares updates with each
client.
8.2.2 Pattern
The typical pattern for spawning an object in a net-
worked virtual reality is:
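In outline (a sketch in the pseudocode style of this chapter, reconstructed from the example that follows):

```
on the client: detect the trigger for creating the object
on the client: call a command function that runs on the server
on the server: instantiate the object
on the server: spawn the object so that imposters are
  created on every client
```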
8.2.3 Example
Example 36. Applying the pattern to Unity software
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class LaunchBlock : MonoBehaviour {

  public GameObject blockPrefab;

  public float turnSpeed = 10.0f;

  public float launchForce = 5.0f;

  public float fireInterval = 0.5f;

  private float timeTillNextFire = 0.0f;

  // Update is called once per frame
  void Update () {
    float hori = Input.GetAxis ("Horizontal");
    float fire = Input.GetAxis ("Fire1");

    transform.rotation *= Quaternion.AngleAxis (hori * turnSpeed * Time.deltaTime, transform.up);
    if ((timeTillNextFire < 0.0f) && (fire > 0.5f))
    {
      GameObject block = Instantiate (blockPrefab, transform.position + transform.forward, Quaternion.identity);
      block.GetComponent <Rigidbody> ().AddForce (launchForce * (transform.forward + transform.up), ForceMode.Impulse);
      timeTillNextFire = fireInterval;
    }
    timeTillNextFire -= Time.deltaTime;
  }
}
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Networking;

public class LaunchBlock : NetworkBehaviour {
  public GameObject blockPrefab;
  public float turnSpeed = 10.0f;
  public float launchForce = 5.0f;
  public float fireInterval = 0.5f;
  private float timeTillNextFire = 0.0f;
  // Give each player their own colour.
  private Color blockColour;

  void Start () {
    blockColour = new Color (Random.Range (0.0f, 1.0f), Random.Range (0.0f, 1.0f), Random.Range (0.0f, 1.0f));
  }

  [Command]
  void CmdSpawn ()
  {
    GameObject block = Instantiate (blockPrefab, transform.position + transform.forward, Quaternion.identity);
    block.GetComponent <Rigidbody> ().AddForce (launchForce * (transform.forward + transform.up), ForceMode.Impulse);
    block.GetComponent <MeshRenderer> ().material.color = blockColour;
    NetworkServer.Spawn (block);
  }
  // continued ...

// continuation ...
  // Update is called once per frame
  void Update () {
    if (isLocalPlayer)
    {
      float hori = Input.GetAxis ("Horizontal");
      float fire = Input.GetAxis ("Fire1");

      transform.rotation *= Quaternion.AngleAxis (hori * turnSpeed * Time.deltaTime, transform.up);
      if ((timeTillNextFire < 0.0f) && (fire > 0.5f))
      {
        CmdSpawn ();
        timeTillNextFire = fireInterval;
      }
      timeTillNextFire -= Time.deltaTime;
    }
  }
}
8.3.1 Description
Changes to objects in a multi-user virtual world need to be immediately visible to all users on every client in the virtual world. These changes include updates to position and orientation, but also any other update to the object's properties that are pertinent to other users in the virtual world. In practice, changes are not immediate as they have to be passed on to the server and then relayed to all other clients.
The process of updating properties of shared objects now involves several steps:
8.3.2 Pattern
The pattern below shows the process of updating
property p of an object using a remote procedure call
paradigm.
Let us assume that the object supports an interface (an externally visible function or event handler) that other objects or users on any client can call to update property p. Rather than modifying the object's property directly, this interface relays the change to the server.
serverChangeProperty (newValueForP)
[ServerRPC]
{
  if (isServer)
  {
    syncP = newValueForP
  }
}

[SynchronizedVariable: eventHandler = clientChangeProperty]
Type syncP

clientChangeProperty (newValueForP)
{
  p = newValueForP
}
8.3.3 Example
Example 37. Applying the pattern to Unity software
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;

public class UpdateText : MonoBehaviour {

  // Use this for initialization
  void Start () {
    InputField inputField = GameObject.Find ("InputField").GetComponent <InputField> ();
    inputField.onEndEdit.AddListener (onTextAdded);
  }

  public void onTextAdded (string message)
  {
    GetComponent <TextMesh> ().text += "\n" + message;
  }
}
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Networking;

public class MakeMessageBoard : NetworkBehaviour {
  public GameObject messageBoardPrefab;

  void Start ()
  {
    if (isServer)
    {
      GameObject board = Instantiate (messageBoardPrefab);
      NetworkServer.Spawn (board);
    }
  }
}
if (isServer)
{
  NetworkServer.RegisterHandler (MessageBoardMessageType, MessageOnTextAdded);
}
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Networking;

public class UpdateText : NetworkBehaviour {
  void Start () {
    InputField inputField = GameObject.Find ("InputField").GetComponent <InputField> ();
    inputField.onEndEdit.AddListener (onTextAdded);

    if (isServer)
    {
      NetworkServer.RegisterHandler (MessageBoardMessageType, MessageOnTextAdded);
    }
  }

  [SyncVar (hook = "onSyncTextChanged")]
  private string messageChange;

  void onSyncTextChanged (string message)
  {
    GetComponent <TextMesh> ().text += "\n::" + message;
  }

  [Command]
  void CmdOnTextAdded (string message)
  {
    messageChange = message;
  }

  void MessageOnTextAdded (NetworkMessage m)
  {
    messageChange = m.ReadMessage <MessageBoardMessage> ().messageUpdate;
  }
  // continued ...
// continuation ...
  public void onTextAdded (string message)
  {
    MessageBoardMessage networkMessage = new MessageBoardMessage ();
    networkMessage.messageUpdate = message;
    NetworkClient.allClients[0].Send (MessageBoardMessageType, networkMessage);
  }

  private short MessageBoardMessageType = MsgType.Highest + 1;
  class MessageBoardMessage : MessageBase
  {
    public string messageUpdate;
  }
}
9 Dynamic Geometry
9.1.1 Description
9.1.2 Pattern
Objects are procedurally generated using a top-down
approach starting with a function whose task is to create
the complete object. This function would identify the
components of the object, and call further functions to
generate each of these in turn. This process is repeated
until the functions are simple enough to return indi-
vidual geometry elements (either primitives such as
cubes or spheres, or non-decomposable elements that are
manually modelled in a 3D-modelling package).
Each function would take a number of parameters.
These would typically include properties such as relat-
ive position, or size and shape. Parameters should be
chosen to be descriptive of the object being created (num-
ber of floors being more useful for creating meaningful
skyscrapers than height in meters).
It is very tempting to use (pseudo)random numbers
within a function to provide the variation in the output
content. This should be avoided. Ideally these functions
are deterministic so that they return the same output
for a given set of parameters. This ensures consistency
if content needs to be recreated or refined during the
course of a virtual reality experience. Seeding a random
number generator to provide deterministic random num-
bers is problematic if other parts of the application might
be sharing the same random number generator. A recom-
mended approach is to include one or more identifier
parameters. These are just extra numeric parameters that
act as unique identifiers for a particular instance of the
content returned. Random(ish) values can then be pro-
duced through use of different hashes of these identifiers,
through functions such as Perlin noise.
A typical component generator function would have
the form:
childIdentifier1 = hash1 (identifier parameter)
childIdentifier2 = hash2 (identifier parameter)
...
childContentParameter1 = f1 (content parameters)
childContentParameter2 = f2 (content parameters)
...
childComponentX = generateComponentB (childContentParameter1, childIdentifier1)
childComponentY = generateComponentB (childContentParameter2, childIdentifier2)
...
any additional procedural generation code that utilizes content and identifier parameters
...
return (childComponentX, childComponentY, ...)
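As a concrete sketch of the identifier idea (assuming Unity software; Mathf.PerlinNoise stands in for the hash functions, and the constants here are arbitrary choices):

```csharp
// Derive several independent random(ish) values from one identifier.
// The channel offset plays the role of hash1, hash2, ...: each channel
// samples a different region of the noise field, so the same identifier
// deterministically yields the same set of values every time.
float hashedParameter (float identifier, int channel)
{
    // Returns a value in [0, 1] for this (identifier, channel) pair.
    return Mathf.PerlinNoise (identifier * 0.731f + channel * 17.0f, 0.5f);
}
```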
9.1.3 Example
Example 38. Applying the pattern to Unity software
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class HouseBuilder : MonoBehaviour {
  public GameObject brickPrefab;
  public float brickSize = 0.5f;
  public float length = 5.0f;
  public float width = 3.0f;
  public float height = 2.0f;
  public int floors = 5;

  // continued ...

9.2.1 Description
Textures are frequently used to add detail to virtual worlds. Their most common incarnation is as a 2D image used to add colour patterns to surfaces but the concept of a texture is far more flexible. Even the 2D image version is regarded as a function that converts some coordinates into a value:

y = f (x1, x2, ..., xn)

or, equivalently, as a lookup table indexed by those coordinates:

y = f_lookup [x1, x2, ..., xn]
9.2.2 Pattern
The patterns covered in this section include a standard
template for defining and populating a texture lookup
table, and some common primitive functions used to
create particular categories of texture pattern.
The texture is represented by an array, a common collection structure in most languages which maps directly into a memory efficient structure easily used on virtual reality rendering hardware. For the sake of generality, an n-dimensional texture pattern is shown, although most image based uses of textures are limited to 2 dimensions.
// declare texture.
Texture texture[Size1, Size2, ..., SizeN]

// populate texture from a function.
for (x1 = 0; x1 < Size1; x1++)
{
  for (x2 = 0; x2 < Size2; x2++)
  {
    ...
    texture[x1, x2, ..., xN] = f (x1, x2, ..., xN)
  }
}

A striped pattern varying along dimension i is produced by a primitive function of the form:

return A * sin (f * xi + φ)

while a smoothly varying random pattern samples a noise function over several dimensions:

return A * noise (f * xi + φ, f * xj + φ, ..., f * xk + φ)
9.2.3 Example
Example 39. Applying the pattern to Unity software
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class TextureGenerate : MonoBehaviour {

  void On_TextureGenerate ()
  {
    int texWidth = 256;
    int texHeight = 256;

    Texture2D synthTexture = new Texture2D (texWidth, texHeight);

    for (int i = 0; i < texWidth; i++)
    {
      for (int j = 0; j < texHeight; j++)
      {
        float u = 1.0f * i / texWidth;
        float v = 1.0f * j / texHeight;
        Color col = stripes (u, v);
        // Color col = noise (u, v);
        // Color col = wood (u, v);
        synthTexture.SetPixel (i, j, col);
      }
    }

    synthTexture.Apply ();

    GetComponent <Renderer> ().materials[0].SetTexture ("_MainTex", synthTexture);
  }

  void Start () {
    On_TextureGenerate ();
  }
}
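The stripes, noise and wood functions are not shown in this listing. A plausible stripes, for illustration (the colour and the stripe frequency of 40 are arbitrary choices):

```csharp
// Alternating bands: the sine of one texture coordinate blends
// between dark and light according to the stripe frequency.
Color stripes (float u, float v)
{
    float s = 0.5f + 0.5f * Mathf.Sin (40.0f * u);
    return new Color (s, s, 0.2f);
}
```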
9.3 Shaping up
9.3.1 Description
In addition to generating structures out of primitive
shapes and other pre-prepared content (section 9.1 on
page 293) it is also possible to generate and manipulate
the shape and structure of the actual geometric mesh
used for individual objects. This provides all the benefits
of procedural generation with the added advantage of
9.3.2 Pattern
Mesh creation proceeds by first defining the list of vertices. Most engines prefer to receive these as an array of vertices. We may need to define which properties we want in a vertex. The structure below contains some of the properties commonly associated with a vertex:
class Vertex
{
// Common properties
Vector position
Vector normal
Vector textureCoordinates
// Less common properties
Vector tangent
Colour colour
float height
}
Figure 9.3.2: A grid of values is represented using a single index (and thus no need for 2D arrays or nested loops) by decomposing the index value into row and column numbers.

A common vertex configuration is a grid. We can create a 2D array of vertices for this and use a nested for loop, but since most virtual reality engines prefer the 1D array there are some strategies for achieving the same result. Assume we have a grid with NumberOfRows rows and NumberOfColumns columns.
numberOfVertices = NumberOfRows * NumberOfColumns
for (i = 0; i < numberOfVertices; i++)
{
  row = i / NumberOfColumns // using integer division
  column = i % NumberOfColumns
  vertices[i].position = Vector (row, column)
  vertices[i].normal = Vector (...)
  ...
}
class Triangle
{
  int v[3]
}

or, storing references to the vertices directly:

class Triangle
{
  Vertex v[3]
}

numberOfTriangles = numberOfVertices - 2
Triangles t [numberOfTriangles]
for (i = 0; i < numberOfTriangles; i++)
{
  if (i % 2 == 0)
  {
    t[i].v[0] = i // index of first vertex of triangle
    t[i].v[1] = i + 1 // index of second vertex of triangle
    t[i].v[2] = i + 2 // index of third vertex of triangle
  }
  else
  {
    t[i].v[0] = i + 2 // index of first vertex of triangle
    t[i].v[1] = i + 1 // index of second vertex of triangle
    t[i].v[2] = i // index of third vertex of triangle
  }
}

mesh.vertices = v
mesh.faces = t
mesh.reload ()
9.3.3 Example
Example 40. Applying the pattern to Unity software
void On_UpdateMesh ()
{
  int i;
  int j;

  int meshx = 250;
  int meshy = 250;

  Vector3 [] vertices = new Vector3 [(meshx + 1) * (meshy + 1)];
  for (i = 0; i <= meshx; i++)
  {
    for (j = 0; j <= meshy; j++)
    {
      float xp = 1.0f * (i - (meshx / 2.0f)) / meshx;
      float yp = 1.0f * (j - (meshy / 2.0f)) / meshy;
      vertices[indexOfVertex (i, j, meshx, meshy)] = new Vector3 (xp, getHeight (xp, yp), yp);
    }
  }

  Vector2 [] uv = new Vector2[(meshx + 1) * (meshy + 1)];
  for (i = 0; i <= meshx; i++)
  {
    for (j = 0; j <= meshy; j++)
    {
      float xp = 1.0f * (i - (meshx / 2.0f)) / meshx;
      float yp = 1.0f * (j - (meshy / 2.0f)) / meshy;
      float u = Mathf.PerlinNoise (patternFrequency * (xp + seed), patternFrequency * (yp + seed));
      float v = 0.3f * (vertices[indexOfVertex (i, j, meshx, meshy)].y - 0.2f);
      uv[indexOfVertex (i, j, meshx, meshy)] = new Vector2 (u, v);
    }
  }
  // continued ...
// continuation ...
  int [] triangles = new int [3 * 2 * meshx * meshy];
  for (i = 0; i < meshx; i++)
  {
    for (j = 0; j < meshy; j++)
    {
      triangles[indexOfTriangle (i, j, meshx, meshy) + 0] = indexOfVertex (i + 0, j + 0, meshx, meshy);
      triangles[indexOfTriangle (i, j, meshx, meshy) + 1] = indexOfVertex (i + 0, j + 1, meshx, meshy);
      triangles[indexOfTriangle (i, j, meshx, meshy) + 2] = indexOfVertex (i + 1, j + 1, meshx, meshy);

      triangles[indexOfTriangle (i, j, meshx, meshy) + 3] = indexOfVertex (i + 0, j + 0, meshx, meshy);
      triangles[indexOfTriangle (i, j, meshx, meshy) + 4] = indexOfVertex (i + 1, j + 1, meshx, meshy);
      triangles[indexOfTriangle (i, j, meshx, meshy) + 5] = indexOfVertex (i + 1, j + 0, meshx, meshy);
    }
  }

  Mesh mesh = GetComponent <MeshFilter> ().mesh;
  mesh.Clear (false);
  mesh.vertices = vertices;
  mesh.triangles = triangles;
  mesh.uv = uv;
  mesh.RecalculateNormals ();
}
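The helper functions indexOfVertex, indexOfTriangle and getHeight are not shown in this fragment. One consistent choice, sketched here as an assumption (any row-major decomposition works, provided the vertex and triangle loops agree on it):

```csharp
// Map grid coordinates to a 1D vertex index. The grid has
// (meshx + 1) rows of (meshy + 1) vertices each.
int indexOfVertex (int i, int j, int meshx, int meshy)
{
    return i * (meshy + 1) + j;
}

// Map a grid cell to its first slot in the triangle index array.
// Each cell contributes two triangles, i.e. six indices.
int indexOfTriangle (int i, int j, int meshx, int meshy)
{
    return 6 * (i * meshy + j);
}

// Example height field; Perlin noise is one plausible choice.
float getHeight (float xp, float yp)
{
    return 0.3f * Mathf.PerlinNoise (5.0f * xp, 5.0f * yp);
}
```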
10.1.1 Description
10.1.2 Pattern
Attempts to replicate lighting in computer generated
imagery range from simple models (such as no-lighting,
or constant colour), to approximations that take into ac-
count relative positioning of light and surface, all the
way through to lookup tables using data recorded from
scenes in physical reality. Partly due to tradition and partly for efficiency, common shaders in virtual reality engines tend to model three major lighting effects: ambient, diffuse and specular reflection.
Ambient reflection assumes a single brightness for a
surface, affected only by the colour of the surface and
a global level of illumination originating from nowhere
in particular. The surface colour is determined from a material property of the surface (assume this colour variable is named ka) and the illumination level (Ia) is a global property which may be a setting in the engine, or a property of a specific world properties node in the scene graph.
The ambient illumination pattern is present when this expression pattern is present:

pixel colour = ka Ia
Figure 10.1.1: Diffuse lighting combines the geometry of the surface (direction indicated by the normal vector, N) with the direction towards the light source, L, calculated by subtracting the surface position from the light's position.

Colour values are added for each of the diffuse light sources and for other aspects of the light model. The diffuse contribution from a light source of intensity IL takes the form:

pixel colour = kd IL max (N · L, 0)

The term N · L is equivalent to the cosine of the angle between N and L, provided both of the two vectors have unit length (have been normalized). The max expression is used to clamp intensity values to zero when the light is behind the surface since this would otherwise reflect negative amounts of light.
Specular illumination models shiny surfaces where a
reflected image of the light source is visible in the sur-
face. This reflected direction is just the vector from the
surface to the light source, L, reflected about the vector
perpendicular to the surface, N, and is represented with
the value R. Many shader environments have a function
(probably called “reflect”) to calculate R from N and L,
but it can also be generated from the expression:
R = 2 ( N · L) N − L
patterns for virtual reality 333
pixel colour = ks IL (R · V)^n

There are a range of specular reflectance models (this one is attributed to Phong [Phong, 1975]), all of which approximate the appearance of the specular highlights in slightly different ways.

Figure 10.1.2: The specular lighting model considers how much of the light reflected from a shiny surface along the reflected vector, R, is sent in the direction of the camera (viewer) along vector V.

Image textures can also be applied to the surface to provide the base material colour (ka, kd or ks). Texture access makes use of texture coordinates, attributes of the vertex which are passed into the vertex and pixel shaders. These texture coordinates are normally named
(u, v) or (s, t). They represent the coordinates in the 2D
image for which the colour of that point on the surface
must be retrieved. Texture coordinates are often created
through processes such as texture unwrapping in the 3D modelling package used to construct the geometric representation. They can also be procedurally generated based on other properties of the vertex or fragment.
Procedurally generated texture coordinates are useful
for adapting the way the texture is used dynamically to
properties of the scene, such as for producing shadows.
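Putting the three reflection terms described above together, the complete expression pattern for a single light source is (a sketch in the pattern style of this chapter; the diffuse and specular terms are summed over all light sources, and the specular term is clamped in the same way as the diffuse term):

```
pixel colour = ka Ia
  + kd IL max (N · L, 0)
  + ks IL (max (R · V, 0))^n
```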
The texture image itself is provided to the shader as
one of the uniform (“global”) parameters to the shader.
It may frequently (pun intended) be referred to as a
sampler. The name is appropriate in several ways: retrieving a value from the texture involves sampling. This may involve filters that aggregate regions of the texture if the sampled points are far apart in the texture space, or that perform interpolation between neighbouring values if the sampled points fall between individual texture elements.
10.1.3 Example
Example 41. Applying the pattern to Unity software
// fragment of a longer listing ...
  else
  {
    col = col + ks * 0.7;
  }
}
10.2.1 Description
Creating or modifying shaders provides opportunities to
incorporate custom visual effects into your virtual real-
ity applications. These may be as trivial as just toggling
a default setting that cannot be modified in any other
way (such as producing two sided materials in one well
known engine), to including particular lighting and ma-
terial effects that are specific to a significant class of the
objects in your application.
Bump mapping is a facility that is already supported
in some form in many engines. It does, however, provide
an excellent example of the interaction of lighting and
texturing in less conventional ways to achieve dramatic
effects. The form of bump mapping described here usu-
ally falls under the title of normal mapping.
Key points about the process involved:
10.2.2 Pattern
The pattern for a bump mapping shader is:
Sampler normalMapSampler

vertexShader (v)
{
  transform v.position
  transform v.normal
  transform v.tangent
  v.binormal = v.normal × v.tangent
}
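The pixel shader then perturbs the interpolated surface frame with the value sampled from the normal map. A sketch of that stage, in the pattern style of this chapter (the scaling from the colour range [0, 1] to the vector range [−1, 1] is the usual normal map convention):

```
pixelShader (v)
{
  normalSample = 2 * sample (normalMapSampler, v.textureCoordinates) - 1
  N = normalize (normalSample.x * v.tangent
    + normalSample.y * v.binormal
    + normalSample.z * v.normal)
  use N in the lighting expressions in place of the
    interpolated surface normal
}
```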
(Screenshots: the import settings for the normal map texture, with the colour channels of the image encoding the orientation of the surface of an object; and the imported UnlitNormalMapShader asset with its _MainTex 2D texture property.)
10.3.1 Description
10.3.2 Pattern
The process of linear interpolation involves two quantities, A and B. We need a mixture of these two, controlled by factor f such that:

(1 − f) A + f B
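For example, interpolating a quarter of the way from A = 2 to B = 10 uses f = 0.25:

```
(1 − 0.25) × 2 + 0.25 × 10 = 1.5 + 2.5 = 4
```

so f = 0 returns A exactly, f = 1 returns B, and intermediate values of f blend linearly between them.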
10.3.3 Example
Example 43. Applying the pattern to Unity software
(Screenshot: the object with the Patrol script attached, moving between two defined endpoints.)
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class Patrol : MonoBehaviour {

  public Transform startPoint;
  public Transform endPoint;

  void Update () {
    float time = Time.time;
    // Keep the interpolation factor within [0, 1] so the object
    // patrols back and forth rather than overshooting the endpoint.
    float t = Mathf.PingPong (time, 1.0f);

    transform.position = (1.0f - t) * startPoint.position + (t) * endPoint.position;
    transform.forward = -1.0f * startPoint.position + 1.0f * endPoint.position;
  }
}
The forward vector is derived from the derivative of the interpolation expression, d/dtime, but the astute reader will note some missing terms. Since these are purely for scaling and the forward vector is normalized (forced to unit length), we can leave these elements out.
Figure 10.3.2 shows a frame from the patrolling process.
(Figure 10.3.2: a frame from the patrolling process.)
11 Finishing Touches
11.1.1 Description
11.1.2 Pattern
A typical project structure is normally broken down
into a number of standardized containers or folders.
These should be named for what they are. Each of these
folders should then have subfolders so that the various
subcategories are clearly distinguished. As a suggestion,
no folder should have more than 10 elements in it unless
all elements are clearly just anonymous parts of a single
element (e.g. all 100 frames for an animation sequence
could probably live in a single subfolder, assuming they
needed to be stored as separate objects).
The collection of scenes is probably the most accessed
folder and relevant when first starting to work on an
existing project. Placing it as the first folder in the project
provides convenient access. There are tricks that can be
played by pre-pending particular symbols to the folder
name to force it to the top when sorted alphabetically.
A typical top level folder structure pattern would be:
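A sketch of such a structure (the folder names follow the examples later in this chapter; the underscore prefix forces the scenes folder to sort to the top alphabetically):

```
_Scenes
Materials
Models
Prefabs
Resources
Scripts
Shaders
```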
11.1.3 Example
Example 44. Applying the pattern to Unity software
(Screenshots: the inspector settings for the ShinyRed material asset; the collider and Move Forward script components of the car object; and the project assets in the form of externally produced car models, and a terrain object with a texture image added.)
12 Resources
Software
Credits