Lab Assignment UE22 2020
Benoit ZERR
FISE 2022 - S2 - 2020
CONTENTS

1 Sensor Actuator Loop - Problem Statement
1.1 Automatic System vs. Autonomous System
1.2 The Sensor Actuator Loop
1.3 Sensor Actuator Loop in Python
1.4 Dynamic modeling of Rob1A
1.4.1 Brief description of Rob1A design
1.4.2 Controlling Rob1A using sensors and actuators
4 Control
4.1 In place turn
4.1.1 Measuring odometers
4.1.2 Measuring heading with the compass
4.2 Performing in-place turn
4.2.1 In-place turn with odometers
4.2.2 In-place turn with compass
4.2.3 Python functions
4.3 Measuring distances with sonars
4.4 Performing a linear motion
4.4.1 Python functions
4.5 Design a controller for wall following
4.5.1 "Bang-bang" controller
4.5.2 Proportional (P) controller
4.5.3 Proportional-derivative (PD) controller
Appendices
Installing the simulator
Installing the simulator on your own computer
[Figure: block diagram of the autonomous system — the set-point is compared (+/−) with the filtered sensor measurements; the resulting command drives the actuators, which act on the environment sensed by the sensors.]
otherValFilt = doSomeFiltering(otherValRaw)
endCondition = checkIfLoopEndSensors(otherValFilt)
if endCondition:
    break  # leave the loop

# compute control error
controlError = setPoint - valFilt

# check if end of loop using control errors
# (e.g. stop the robot after travelling a given distance)
endCondition = checkIfLoopEndErrors(controlError)
if endCondition:
    break  # leave the loop

# define the new commands based on the error value
cmd = computeCommand(controlError)
# apply the commands to the actuators
applyCommand(cmd)

# wait for a clean end of the loop iteration
execTime = time.time() - t0  # measuring execution time
# compute how much to wait for the end of this iteration
deltaTime = loopIterationTime - execTime
# wait only if deltaTime is positive
if deltaTime > 0:
    time.sleep(deltaTime)
# if deltaTime < 0, your computation takes too much time
# you have either to simplify it or to increase loopIterationTime
As you can see in the code and the flowchart, there are two possible exits from the control loop: one using the measurements of another sensor, and the other using the control error between the set-point and the measurement. This can be very useful in certain situations, such as following a wall on one side while stopping when an obstacle arises in front of the vehicle. In this example, the exit from the loop can occur at the end of the wall (exit on control error) or when the front sensor detects an obstacle (exit on another sensor measurement).
At the end of the loop code, it is important to wait for the exact duration of an iteration of
the loop. In a control loop, measurements and actions should take place at perfectly repeated
times. Therefore, at the end of the loop code, we measure the computation time ("execTime")
and subtract it from the "loopIterationTime" in order to set the "deltaTime" duration until the
precise end of the iteration. A negative "deltaTime" means that the calculation takes too much
time. Two solutions can solve this problem: (1) simplify the calculation or (2) increase "loopIterationTime". Solution (1) should be preferred because solution (2) makes the system less responsive by increasing the time between two commands.
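The loop described above can be condensed into a short runnable sketch. The sensor and actuator functions are passed in as parameters because the real robot interface is not available here; all names are illustrative, not the course's reference code:

```python
import time

def control_loop(set_point, loop_iteration_time, acquire, filter_fn,
                 check_end_sensors, check_end_errors,
                 compute_command, apply_command):
    """One sensor-actuator loop with the two exits described above.
    All callables are stand-ins for the real sensor/actuator interface."""
    while True:
        t0 = time.time()
        val_filt = filter_fn(acquire())        # acquire and filter the sensor
        if check_end_sensors(val_filt):        # exit 1: another sensor condition
            break
        control_error = set_point - val_filt   # compute the control error
        if check_end_errors(control_error):    # exit 2: control-error condition
            break
        apply_command(compute_command(control_error))
        # wait for a clean end of the loop iteration
        delta_time = loop_iteration_time - (time.time() - t0)
        if delta_time > 0:                     # keep a constant loop period
            time.sleep(delta_time)
```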
As existing robots (NAO humanoids and DART 4WD) are too complex to perform their low level control in 8 hours, Rob1A has been specifically designed for this course. Rob1A has been designed to be easy to control. However, Rob1A will not perform real missions this year because its construction (mainly 3D printing) has not started yet. It was designed in Blender, but FreeCad, Solidworks or even Catia could have been used. When designing a robot, it is important to be able to export it to a dynamic modeling software. The dynamic modeling software must solve the differential equations of the robot's motion in real time and detect all possible collisions between the robot and its environment. Game engines, which are perfect for these tasks,
[Flowchart: the control loop — setPoint = someReferenceValue; loopIterTime = someDuration; while true: t0 = time.time(); valRaw = acquireSensors(); valFilt = filter(valRaw); test the exit conditions (yes/no); cmd = computeCommand(error); execTime = time.time() - t0; deltaTime = loopIterTime - execTime; if deltaTime > 0: time.sleep(deltaTime); end of control loop.]
Figure 1.3: Rob1A
are generally used. In this tutorial course, we will use V-REP for dynamic modeling, but we could have used WEBOTS, GAZEBO, EUREKA, MORSE, etc. The design of the robot is imported into V-REP in Collada format. Real-time simulation of the dynamics of the robot requires a lot of calculation to render the scene, solve the differential equations of the movement, check the collisions, simulate the sensors, etc. Hence, to determine the dynamic response of the robot in real time, the exact design is replaced by a simplified form defined with primitive objects such as cylinders, cubes, and so on. This simplified form is hidden so as not to disturb the display. Figure 1.3 shows the Rob1A design.
A communication program (in Lua) allows your Python program to control the movement of Rob1A and get the measurements of its sensors. The robot uses sensors to obtain its own state and acquire knowledge of its environment. Proprioceptive sensors measure values internal to the system, while exteroceptive sensors acquire information from the robot's environment. Rob1A has only two actuators: the motors of its two main wheels.
Rob1A is a very basic robot; it is equipped with:
The robot's reference point is the yellow square on top of the chassis.
Figure 2.1: The mark on a waypoint is maximum if the robot enters the green disk. The mark is null when the robot is outside of the red circle. Between the green disk and the red circle, the mark decreases.
2.2 Assessment
The assessment of the work is made throughout the lab in six steps:
1. Mark F (0 or 1) - Fill in the Moodle quiz to define your team. The team is generally a pair, but a team of 3 students can be allowed so as not to leave a student alone.
2. Task 1: mark Q1 (2 pts) - Qualify 1
3. Task 2: mark Q2 (2 pts) - Qualify 2
4. Task 3: mark L (8 pts) - Lite challenge
5. Task 4: mark A (6 pts) - Advanced challenge
6. Mark C (-2 to 2 pts) - Analysis of the Python code of the advanced challenge and oral exam.
The final mark is M = F*(Q1+Q2+L+A+C). As the advanced challenge is quite complex to achieve, you can decide to stop after the lite challenge. If you choose to stop after the lite challenge, your Python code will not be analyzed and you will not have to attend an oral session to explain your code.
https://moodle.ensta-bretagne.fr/mod/feedback/view.php?id=43432
It is extremely important to fill in this quiz before the 24th of February. If this is not done, your mark will be null.
https://moodle.ensta-bretagne.fr/course/view.php?id=1439#section-4
In the /scenes folder, a file called path_log.lua has to be modified with the path of the folder where the log file 1 will be stored.
1. find the direction to go using the 4 sonars; the right direction is the direction free of obstacles (walls)
2. rotate the robot to place it in the right direction
3. move linearly to the finish
4. stop the robot on the finish waypoint using the distance to the front wall measured by
the front sonar
To get the mark, your work will be automatically evaluated. You will have to upload 2 files on MOODLE, qualify1.py and control.py, at the following link:
https://moodle.ensta-bretagne.fr/mod/assign/view.php?id=43629
1. find the bias of the compass (if you use the compass for changing the orientation of the
robot)
2. define the filters for the sonars and the compass (if you use it)
3. find the direction to go using the 4 sonars; the right direction is the direction free of obstacles (walls)
4. rotate the robot to place it in the right direction
5. move linearly to the first waypoint
6. stop the robot on the first waypoint using the distance to the front wall measured by the front sonar
1 The log file is a text file indicating what the robot has done during the mission.
Figure 2.2: Scene for qualify1.
7. find the direction to turn, left or right, using the left and the right sonars
8. rotate the robot to place it in the direction you have found
9. move linearly to the finish
10. stop the robot on the finish waypoint using the distance to the front wall measured by
the front sonar
Your code will be in 3 Python files: qualify2.py, control.py and filt.py. To get the mark, your work will be automatically evaluated. You will have to upload the 3 Python files (qualify2.py, control.py and filt.py) on MOODLE at the following link:
https://moodle.ensta-bretagne.fr/mod/assign/view.php?id=43630
https://moodle.ensta-bretagne.fr/mod/assign/view.php?id=43648
Your mark will be obtained by executing your code on another track, lite4.ttt, which is unknown but similar to the 3 tracks you have used. The changes are random turn directions and random lengths of the linear segments.
After completing the lite challenge, you can decide to stop there. In that case, you will not have an oral exam to explain your code.
Figure 2.3: Scene for qualify2.
Figure 2.5: Scene for advanced challenge.
The non-linear part of the noise consists of "spikes" or "outliers". You will have to use both linear and non-linear filters, and you will have to explain your code during an oral session. As the shape of the track is defined by random parameters, you have 3 tracks for testing your control program: advanced1.ttt, advanced2.ttt and advanced3.ttt. To get the automatic part of the mark, you will have to upload 3 files on MOODLE: advance.py, control.py and filt.py
https://moodle.ensta-bretagne.fr/mod/assign/view.php?id=43649
Your mark will be obtained by executing your code on another track, advanced4.ttt, which is unknown but similar to the 3 tracks you have used.
3 PROGRAMMING ROB1A IN PYTHON
3.1 Creating and controlling the robot
To create a Rob1A robot in your Python programs, you first need to import the "rob1a_v02" module in your code:
Rob1A is a class simulating the robot. To use it, you just have to create an instance of this class, for example:
rb = rob1a.Rob1A()
Then you will access the functions of the robot through "rb.". For example, stopping the robot is done with the "stop()" function like this:
rb.stop()
rb.set_speed(60,-40)
rb.set_speed(0,0)
or :
rb.stop()
Note: At every new simulation, the battery level of the robot changes. As Rob1A is a low cost robot, there is no sensor to measure the battery level. Therefore, to move the robot, you cannot simply set the speed to a given value for a given time, as the covered distance will change with the battery level.
3.3 Odometers
The angular motion of the main wheels is measured with odometers. An odometer will make
200 ticks when the wheel completes a full revolution. The values of left and right odometers are
given by :
odoLeft,odoRight = rb.get_odometers()
Note: the measurements from the odometers are considered perfect; there is no need to filter them.
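Converting odometer ticks into a travelled distance only needs the tick count per revolution (200, given above) and the wheel diameter. A minimal sketch, assuming the wheel diameter is taken from the robot model (the 0.06 m in the usage comment is a made-up value):

```python
import math

TICKS_PER_REV = 200  # ticks per complete wheel revolution (given above)

def ticks_to_distance(ticks, wheel_diameter_m):
    """Linear distance travelled by a wheel for a given odometer tick count."""
    return ticks / TICKS_PER_REV * math.pi * wheel_diameter_m

# usage sketch (0.06 m is an assumed wheel diameter, check the robot model):
# odoLeft, odoRight = rb.get_odometers()
# distLeft = ticks_to_distance(odoLeft, 0.06)
```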
3.4 Sonars
To get the measurement of a given sonar, you have to use the rb.get_sonar(name) function; name can be "front", "back", "left" or "right". For example, for the front sonar, the command is:
distFront = rb.get_sonar("front")
You can also use rb.get_multiple_sonars(names) to acquire several sonars simultaneously. For example, if you need both front and left sonars, the command is:
names = ["front","left"]
distFront,distLeft = rb.get_multiple_sonars(names)
Note: A new sonar measurement is performed every 100 ms. If you acquire the sonar every 25 ms, you will get the same value 4 times.
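One practical consequence of this note: if your control loop runs faster than the sonar, only one iteration out of several carries a fresh sonar value. A small helper can tell which iterations should feed the filter (a sketch, assuming the loop and the sonar happen to be aligned; the 25 ms / 100 ms figures come from the note above):

```python
def is_fresh_sonar_iteration(k, loop_period_ms=25, sonar_period_ms=100):
    """True on the loop iterations (numbered k = 0, 1, 2, ...) where a new
    sonar value is expected, assuming loop and sonar are aligned."""
    return k % (sonar_period_ms // loop_period_ms) == 0
```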
import control
import rob1a_v02 as rob1a
and create an instance of the robot controller before using its functions:
The control can be tested using the test_move() function, which performs an in-place rotation to the left for a duration of 10 seconds:
The filtering functions are coded in the "filt.py" file. To use these functions, you need to import "filt" in your program:
import filt
then you can modify the parameters. For example, if you use an MA filter (§ 5.2.1), you can set its order to 4 with:
flt.set_ma_order(4)
When using multiple sonars, you need one filter per sonar. For example, if you need to use the front and left sonars, you will have to set both filters:
Taking again the left sonar MA filter example, if rawVal is the measured value, the filtered value, filtVal, is simply given by:
filtVal = fltLeft.ma_filter(rawVal)
Sonars do not always detect something. When the nearest obstacle in the sonar cone is more
than 1.5 meters away, the sonar gives zero distance. It is not a good idea to filter this data! The
best thing to do is to remove these null values. However, after some movement of the robot, a new non-zero distance may occur. The filter then still contains in its memory old values that may be very different from the new measurement, and it can do strange things. If this is the case,
you can reset the filter using a reset function. For the left sonar filter defined above, this is done
by:
fltLeft.ma_reset()
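The null-value handling and the reset described above can be combined in a small wrapper. Here, filter_fn and reset_fn stand for whichever filter you use (e.g. fltLeft.ma_filter and fltLeft.ma_reset); this is a sketch of the logic, not the course's code:

```python
def filter_sonar_value(raw, filter_fn, reset_fn, state):
    """Skip null (0.0) sonar readings, and reset the filter when a valid
    reading returns after one or more nulls, so the filter does not mix
    stale old values with the new measurement. `state` is a plain dict."""
    if raw == 0.0:
        state["was_null"] = True
        return None                 # nothing usable to filter
    if state.get("was_null"):
        reset_fn()                  # e.g. fltLeft.ma_reset()
        state["was_null"] = False
    return filter_fn(raw)           # e.g. fltLeft.ma_filter(raw)
```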
Useful information
When the robot stops for more than 5 seconds, the scoring restarts from zero (the points acquired are lost and the waypoints turn red again). When using the Python function time.sleep(duration), be careful not to use negative values for duration, as this stops the automatic evaluation program that computes your mark by executing the code submitted on MOODLE. Before starting the simulation, you can change the position and orientation of the robot to place it wherever you want on the track. Visual explanations can be found at the end of the installation video (https://moodle.ensta-bretagne.fr/mod/resource/view.php?id=40622).
4 CONTROL
In this section, we will describe the main functions to control the autonomous motion of the
robot.
The odometer is a sensor measuring the distance traveled by a vehicle. For this, Rob1A uses rotary incremental encoders. Placed on the shaft of the wheel, the encoder produces pulses as the wheel rotates. The detection and analysis of these pulses give a number of "ticks" proportional to the angular movement of the wheel. The sensor also indicates the direction of rotation of the wheel (clockwise or anti-clockwise). We use a sensor that delivers 200 ticks per complete wheel revolution. Pulse processing is considered perfect.
At the beginning of the simulation, the number of "ticks" is 0 for the left and right main wheels. When the robot moves, the number of "ticks" is updated according to the rotation of the wheels. The number of "ticks" increases as the wheel moves forward and decreases as the wheel moves backward.
The heading of the robot is an angle varying from 0 to 360 degrees. The angle is 0 degrees when the robot is oriented to the North. The orientations to the East, South and West are respectively 90, 180 and 270 degrees. The measurements of the compass are biased and corrupted by noise. Filtering the data will be useful to reduce the noise and to estimate the bias.
If the left wheel turns forward by a given number of "ticks" and, at the same time, the right wheel turns backward by the same number of "ticks", the robot will orient itself to the right without changing position (rotation in place).
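The number of ticks needed for a given in-place turn follows from the geometry: each wheel travels an arc of (angle × track/2). A sketch, where the wheel diameter and the track (distance between the two main wheels) are assumed values that must be read from the robot model, since they are not given in this document:

```python
import math

TICKS_PER_REV = 200  # ticks per complete wheel revolution (given above)

def turn_ticks(angle_deg, wheel_diameter_m, track_m):
    """Odometer ticks each wheel must travel (one forward, one backward)
    for an in-place turn of angle_deg degrees."""
    arc_m = math.radians(angle_deg) * track_m / 2.0   # arc length per wheel
    return arc_m / (math.pi * wheel_diameter_m) * TICKS_PER_REV

# control sketch (speeds and geometry are illustrative values):
# target = turn_ticks(90.0, 0.06, 0.15)
# rb.set_speed(40, -40)                   # left forward, right backward
# while abs(rb.get_odometers()[0]) < target:
#     pass                                # poll at the loop period in practice
# rb.stop()
```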
4.2.2 In-place turn with compass
The robot must rotate until the measured heading is close to the desired heading. The compass measurements are noisy and will have to be filtered before being used (except for the Qualify1 task).
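When comparing the measured heading to the desired one, beware of the 360° → 0° wrap of the compass. A small helper, one common way to compute the smallest signed heading error (a sketch, not imposed by the course):

```python
def heading_error(desired_deg, measured_deg):
    """Smallest signed angle (degrees, in [-180, 180)) from the measured
    heading to the desired one; handles the 360 -> 0 wrap of the compass."""
    return (desired_deg - measured_deg + 180.0) % 360.0 - 180.0

# usage sketch: rotate until the (filtered) heading error is small, e.g.
# while abs(heading_error(target_deg, rb.get_heading())) > 2.0: ...
```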
1. The left and right wheel motors are controlled by the rb.set_speed() function.
2. The values of the odometers are acquired with the rb.get_odometers() function.
3. The value of the heading is acquired with the rb.get_heading() function.
The linear motion is therefore quite simple to perform: you only need to set the same speed and the same direction on the two main wheels.
You have several ways to terminate the motion:
1. stop after a duration: this is very easy, but the distance travelled will depend on the battery level, so it's not a good idea! You can however use it for testing purposes,
2. stop after a distance measured by the odometers,
3. stop at a given distance to a wall using the front sonar.
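Ways 2 and 3 above can be combined into a single stop test, as the qualify tasks require (function and parameter names here are illustrative, not part of the course API):

```python
def should_stop(ticks_travelled, target_ticks, front_dist, min_front_dist):
    """Stop when the odometer target is reached (way 2) or when the front
    sonar sees a wall closer than min_front_dist (way 3). A null (0.0)
    sonar reading means "nothing detected" and must not trigger the stop."""
    if ticks_travelled >= target_ticks:
        return True
    return 0.0 < front_dist < min_front_dist
```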
1. The left and right wheel motors are controlled by the rb.set_speed() function.
2. The values of the odometers are acquired with the rb.get_odometers() function.
3. The value of the front sonar is acquired with the rb.get_sonar("front") function.
The simplest controller is the "bang-bang" controller. It applies a constant correction to the left and right speeds; only the sign of the correction changes. The pseudo code looks like this:
define setPoint
define nominalSpeed
while True:
measure distWall
controlError = setPoint - distWall
if controlError > 0
set_speed (nominalSpeed-deltaSpeed, nominalSpeed+deltaSpeed)
else
set_speed (nominalSpeed+deltaSpeed, nominalSpeed-deltaSpeed)
wait for end of loop iteration
This code does not end; another test must be added to stop it when an obstacle occurs or after a given covered distance. If "deltaSpeed" is too large, the risk is that the robot turns so much that the right sonar becomes unable to detect the wall. Wall following may work with a "bang-bang" controller, but it is very tricky to set up.
Instead of a constant correction, the proportional (P) controller applies a correction proportional to the control error. The pseudo code of the P controller is:
define setPoint
define nominalSpeed
while True:
measure distWall
controlError = setPoint - distWall
deltaSpeed = kp * controlError
set_speed (nominalSpeed+deltaSpeed, nominalSpeed-deltaSpeed)
wait for end of loop iteration
The key issue is how to find the value of kp. We will use an empirical trial and error method.
Note: kp must also take into account the change of units between the error (in meters) and the speed command (in %).
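The P correction, written as one runnable step (faithful to the pseudo code above; the value of kp, including its sign, which sets the steering direction, is found by trial and error):

```python
def p_command(set_point, dist_wall, nominal_speed, kp):
    """One step of the P wall follower: the speed correction is
    proportional to the control error. Returns the (left, right)
    speed pair to pass to set_speed()."""
    delta_speed = kp * (set_point - dist_wall)
    return (nominal_speed + delta_speed, nominal_speed - delta_speed)
```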
In wall following, the time derivative of the control error can be extremely helpful. When the robot is parallel to the wall, whatever the distance, the derivative of the error will be very small. When Rob1A gets closer to the wall, the derivative is negative, and when it gets farther, the derivative is positive. A correction that combines a proportion of the derivative of the error and a proportion of the error is a PD controller or proportional-derivative controller.
The pseudo code of the PD controller is :
define setPoint
define nominalSpeed
lastError = 0
derivOk = False
while True:
measure distWall
controlError = setPoint - distWall
if derivOk:
derivError = controlError-lastError
deltaSpeed = kp * controlError + kd * derivError
else:
deltaSpeed = kp * controlError
set_speed (nominalSpeed+deltaSpeed, nominalSpeed-deltaSpeed)
lastError = controlError
derivOk = True
wait for end of loop iteration
The key issue is how to find the values of kp and kd. We will use an empirical trial and error method. The derivOk boolean prevents using an undefined lastError on the first pass through the loop.
If Rob1A turns too much, the right sonar will stop detecting the wall. To avoid this problem, you can limit the turn rate of the robot by limiting deltaSpeed to a given threshold deltaSpeedMax. So, before setting the speed, you can add:
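The clamping step the text refers to can be sketched as follows (deltaSpeedMax is a threshold you choose yourself, by trial and error):

```python
def clamp_delta_speed(delta_speed, delta_speed_max):
    """Limit the correction to [-delta_speed_max, +delta_speed_max] so
    the robot cannot turn so fast that the sonar loses the wall."""
    return max(-delta_speed_max, min(delta_speed_max, delta_speed))
```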
5 FILTERING THE SENSORS' MEASUREMENTS
The sonar and compass measurements are limited by the accuracy of these sensors.
The measurements are obviously not perfect because the sensor has a given accuracy. The
accuracy is affected by two types of errors: a systematic error (called a bias) and random errors
(called noise). We use different approaches to deal with these errors.
The bias will be estimated over a large number of measurements and then used to correct them. The noise will be filtered in real time by a low-pass filter.
We consider x[n] the measurement and y[n] the filter output at discrete time nT with n integer,
T=1/Fs the sampling interval and Fs the sampling frequency. The generic formulation of finite
impulse response (FIR) filters is :
y[n] = \sum_{k=0}^{N} b_k x[n-k]   (5.1)
The moving average (MA) filter is a particular case of FIR filter with all b_k coefficients set to the same value b_k = 1/(N+1):

y[n] = \frac{1}{N+1} \sum_{k=0}^{N} x[n-k]   (5.2)
N is the order of the filter. For example, a second order MA filter will take the average of the last
3 measurements.
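A possible MA filter implementation matching this definition (a sketch; the course's filt.py may organize it differently — here the output is the average of the samples seen so far until the window of N+1 values is full):

```python
from collections import deque

class MAFilter:
    """Moving-average filter of order N: the output is the average of
    the last N+1 measurements."""
    def __init__(self, order):
        self.buf = deque(maxlen=order + 1)  # keeps only the last N+1 samples
    def ma_filter(self, raw):
        self.buf.append(raw)
        return sum(self.buf) / len(self.buf)
    def ma_reset(self):
        self.buf.clear()                    # forget old values
```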
Although very simple to implement, this filter will do the job in many robotic applications. The
only parameter to define is the order of the filter. This can be done empirically using a trial and
error method. It is also possible to precisely define the frequency response of a FIR filter:
H(\omega) = \sum_{k=0}^{N} b_k e^{-j\omega k}   (5.3)

with ω = 2πf. In the MA case, all b_k = 1/(N+1). Figure 5.1 shows the frequency response
for MA filters of order 2 and order 10. The order 10 is a better low-pass filter, but it requires more
Figure 5.1: Frequency response of MA filters at a sampling frequency of 10 Hz (100 ms sampling period). The cutoff frequency at half power attenuation is 1.55 Hz for order 2 and 0.4 Hz for order 10. Order 10 is therefore a better low-pass filter.
calculations. Another problem is that the MA filter introduces a delay between the unfiltered signal and the filtered output, which increases with the order of the filter.
Note: In some applications, the MA filter may not work. Butterworth, Chebyshev or other FIR filters can be used; for these filters, a synthesis tool is used to define the coefficients b_k (which are no longer all equal as in the MA case). In Python, Numpy offers tools for this synthesis.
While the MA filter uses only the measurements, the recursive filter uses both the last measurements and the last outputs of the filter. As the filter is recursive, its output depends on all past inputs; that is why it is called an infinite impulse response (IIR) filter. The generic formulation of IIR filters is as follows:
y[n] = \sum_{l=1}^{M} a_l y[n-l] + \sum_{k=0}^{N} b_k x[n-k]   (5.4)
To prevent the filter from changing the measure, its gain must be equal to one. To achieve this,
we need:
b_0 \in ]0, 1]   (5.6)

a_1 = 1 - b_0   (5.7)
The only parameter is b_0. It can be defined empirically using a trial and error method. As for the MA filter, we can be more rigorous and use the frequency response to control the level of filtering:

H(\omega) = \frac{b_0}{1 - a_1 e^{-j\omega}}   (5.8)
with ω = 2πf. Figure 5.2 shows the frequency response for b_0 equal to 0.6 and 0.22. The cutoff frequencies are similar to those of the MA filters of order 2 and order 10, respectively.
Figure 5.2: Frequency response of the recursive IIR filter at a sampling frequency of 10 Hz (100 ms sampling period). The cutoff frequency is 1.55 Hz for b_0 = 0.6 and 0.4 Hz for b_0 = 0.22. The lower b_0, the better the low-pass filtering.
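A possible first-order implementation of this recursive filter (a sketch; initializing the output with the first measurement is a design choice to avoid a startup transient, not something the text imposes):

```python
class IIRFilter:
    """First-order recursive filter y[n] = a1*y[n-1] + b0*x[n] with
    a1 = 1 - b0, so that the gain at zero frequency is one. The single
    parameter is b0 in ]0, 1]."""
    def __init__(self, b0):
        assert 0.0 < b0 <= 1.0
        self.b0 = b0
        self.y = None
    def iir_filter(self, raw):
        if self.y is None:
            self.y = raw            # start from the first measurement
        else:
            self.y = (1.0 - self.b0) * self.y + self.b0 * raw
        return self.y
    def iir_reset(self):
        self.y = None               # forget the filter's memory
```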
Both the lite and advanced challenges require a linear low-pass filter. Therefore, you will need to implement at least one of the two filters studied previously: MA or IIR.
The code of the filter must be placed in the "sonar_filter.py" file. The ma_filter() and iir_filter() functions in "sonar_filter.py" initially do nothing; the filtered value is just a copy of the measured value. You will have to modify one of these two functions to implement your filter.
Note: If you do not know which filter to use, you can implement both and choose afterwards. However, it is better to choose one and spend time improving its control parameters when doing the challenge.
5.3 Median filter - the anti-spike filter
The median filter is commonly used to remove spike noise. It can replace the MA or IIR filter, or it can be used before them. The median of a set of (2M+1) values is the value such that M values are greater than or equal to it, while the other M values are smaller than or equal to it. For example, consider this set of 5 measured distances containing a spike:
{0.87,0.91,0.89,2.51,0.92}
The median of this set is 0.91. It is obtained by ranking the values in ascending (or descending) order:
{0.87,0.89,0.91,0.92,2.51}
and taking the value in the middle. The median filter takes the median value of the last 2M+1 measurements. The median filter belongs to the family of rank order filters.
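A possible median filter implementation (a sketch; it outputs the median of the samples seen so far until the window of 2M+1 values is full):

```python
from collections import deque

class MedianFilter:
    """Rank-order filter: the output is the median of the last 2M+1
    samples, which rejects isolated spikes."""
    def __init__(self, m):
        self.buf = deque(maxlen=2 * m + 1)  # sliding window of samples
    def med_filter(self, raw):
        self.buf.append(raw)
        ordered = sorted(self.buf)          # rank the window values
        return ordered[len(ordered) // 2]   # take the middle one
```

On the document's example set, the spike 2.51 never reaches the output.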
https://moodle.ensta-bretagne.fr/course/view.php?id=1439#section-4
cd ue22sal/V-REP_PLAYER_V3_5_0_Linux
./vrep.sh
• when the simulator is started, you have to load the scene by clicking on the "Open scene ..." command in the "File" menu. Then click the green vertical arrow to move to the parent directory, then click on the "scenes" folder, and then double-click on the scene file "basic.ttt".
• The simulation is started by clicking on the "Start simulation" command in the "Start" menu.
• open a second terminal and use the cd command to go to your working directory
• go to the Python files directory by typing:
cd ue22sal/lab/test
python3.7 test.py
If you use another Linux distribution, the python3 command may differ. For example, on Ubuntu, the command is:
python3 test.py
• go back to the simulator: if all is OK, the robot performs a sequence of in-place turns and linear paths.
• On the simulator window, to give more space to the scene, you can close the two panels "Scene hierarchy" and "Model browser".
Instead of running your Python programs from a terminal, you can also use PyCharm or Spyder3.
Note : you will find videos on MOODLE explaining how to install the simulator and how to use
either Spyder3 or PyCharm.
For Windows computers, using a Linux virtual machine is not recommended, as the simulation can be too slow. The solution is to install the Windows version of the V-REP player and then add the Python files and the scene files taken from MOODLE.