@DrBartRienties
Professor of Learning Analytics
Learning analytics at the Open University and the UK
Online presentation 08 May 2020
eMadrid seminar on «Review and challenges in Learning Analytics»
Adeniji, B. (2019). A Bibliometric Study on Learning Analytics. Long Island University. Retrieved from https://digitalcommons.liu.edu/post_fultext_dis/16/
“In the UK the Open University (OU) is a world leader in the collection, intelligent analysis and use of large scale student analytics. It provides academic staff with systematic and high quality actionable analytics for student, academic and institutional benefit (Rienties, Nguyen, Holmes, Reedy, 2017). Rienties and Toetenel’s 2016 study (Rienties & Toetenel, 2016) identifies the importance of the linkage between LA outcomes, student satisfaction, retention and module learning design. These analytics are often provided through dashboards tailored for each of academics and students (Schwendimann et al., 2017).

The OU’s world-class Analytics4Action initiative (Rienties, Boroowa, Cross, Farrington-Flint et al., 2016) supports the university-wide approach to LA. In particular, the initiative provided valuable insights into the identification of students and modules where interventions would be beneficial, analysing over 90 large-scale modules over a two-year period… The deployment of LA establishes the need and opportunity for student and module interventions (Clow, 2012). The study concludes that the faster the feedback loop to students, the more effective the outcomes. This is often an iterative process allowing institutions to understand and address systematic issues.

Legal, ethical and moral considerations in the deployment of LA and interventions are key challenges to institutions. They include informed consent, transparency to students, the right to challenge the accuracy of data and resulting analyses and prior consent to intervention processes and their execution (Slade & Tait, 2019)”
Wakelam, E., Jefferies, A., Davey, N., & Sun, Y. (2020). The potential for student performance
prediction in small cohorts with minimal available attributes. British Journal of Educational
Technology, 51(2), 347-370. doi: 10.1111/bjet.12836
Nguyen, Q., Rienties, B., Toetenel, L., Ferguson, R., Whitelock, D. (2017). Examining the designs of computer-based assessment and its impact on student
engagement, satisfaction, and pass rates. Computers in Human Behavior. DOI: 10.1016/j.chb.2017.03.028.
69% of what students are doing in a week is determined by us, teachers!
[Figure: four learning design profiles (constructivist, assessment, productive, socio-constructivist) mapped week by week from Week 1 to Week 30 against VLE engagement, student satisfaction and student retention, across 150+ modules]
Rienties, B., Toetenel, L., (2016). The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151
modules. Computers in Human Behavior, 60 (2016), 333-341
Nguyen, Q., Rienties, B., Toetenel, L., Ferguson, R., Whitelock, D. (2017). Examining the designs of computer-based assessment and its impact on student
engagement, satisfaction, and pass rates. Computers in Human Behavior. DOI: 10.1016/j.chb.2017.03.028.
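The speaker notes for this slide (see Editor's Notes, #7) describe a cluster analysis of module learning designs that is then related to VLE engagement, satisfaction and retention. A minimal sketch of that style of analysis is below, assuming each module is summarised by the share of designed study time per activity type; the toy data, column names and choice of k-means with four clusters are illustrative assumptions, not the published OU method.

```python
# Hypothetical sketch: cluster modules by their learning design profile, then
# compare retention per cluster. Toy data; column names and k=4 are assumptions,
# not the published OU analysis.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# One row per module: share of designed study time per activity type, plus retention.
modules = pd.DataFrame({
    "assimilative":  [0.60, 0.55, 0.30, 0.28, 0.40, 0.42, 0.25, 0.22],
    "communication": [0.05, 0.07, 0.10, 0.12, 0.15, 0.13, 0.35, 0.38],
    "assessment":    [0.15, 0.18, 0.45, 0.40, 0.20, 0.22, 0.15, 0.17],
    "productive":    [0.20, 0.20, 0.15, 0.20, 0.25, 0.23, 0.25, 0.23],
    "retention":     [0.60, 0.62, 0.70, 0.69, 0.68, 0.66, 0.74, 0.76],
})

design_cols = ["assimilative", "communication", "assessment", "productive"]
X = StandardScaler().fit_transform(modules[design_cols])

# Four clusters, mirroring the four design types mentioned in the speaker notes.
modules["design_cluster"] = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# Average retention per learning design cluster.
print(modules.groupby("design_cluster")["retention"].mean())
```

In the studies cited above the analysis covers far more data (40 modules with over 19k students in the notes, 150+ modules on the slide), and the outcome measures include weekly VLE engagement and satisfaction as well as retention.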
[Figure: activity space: the set of VLE activities open to students from module start and VLE opening through to TMA-1, with outcomes Pass, Fail or No submit]
[Figure: VLE trail of a successful student through the activity space, from VLE opening to TMA-1]
[Figure: VLE trail of a student who did not submit TMA-1]
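These slides contrast the VLE trail of a successful student with that of a student who did not submit TMA-1. A minimal sketch of how such trails can be derived from raw VLE click logs is below; the table layout, column names and weekly aggregation are illustrative assumptions rather than the OU data model.

```python
# Hypothetical sketch: turn raw VLE click logs into weekly engagement trails per
# student, then compare students who submitted TMA-1 with those who did not.
# Table layout and column names are illustrative assumptions.
import pandas as pd

clicks = pd.DataFrame({
    "student": ["A", "A", "A", "B", "B"],
    "date": pd.to_datetime(["2020-02-03", "2020-02-10", "2020-02-17",
                            "2020-02-03", "2020-02-24"]),
    "clicks": [35, 28, 40, 12, 3],
})
tma1 = pd.DataFrame({"student": ["A", "B"], "submitted": [True, False]})

vle_opens = pd.Timestamp("2020-02-01")
clicks["week"] = (clicks["date"] - vle_opens).dt.days // 7 + 1

# One trail per student: total clicks in each study week from VLE opening onwards.
trails = clicks.pivot_table(index="student", columns="week",
                            values="clicks", aggfunc="sum", fill_value=0)

# Average weekly trail of submitters versus non-submitters.
print(trails.join(tma1.set_index("student")).groupby("submitted").mean())
```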
[Figure: probabilistic model built over the VLE trails of all students, from VLE start over time to TMA-1]
OU Analyse demo http://analyse.kmi.open.ac.uk
Herodotou, C., Rienties, B., Hlosta, M., Boroowa, A., Mangafa, C., Zdrahal, Z., (2020). Scalable implementation of predictive learning analytics at a distance learning university:
Insights from a longitudinal case study. Internet and Higher Education, 45, 100725.
• Amongst the factors shown to be critical to the scalable PLA implementation were: Faculty's engagement with OUA, teachers as “champions”, evidence generation and dissemination, digital literacy, and conceptions about teaching online.
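The speaker notes for the OU Analyse slides (Editor's Notes, #12) list the models behind the predictions: case-based reasoning with k-Nearest Neighbours over demographic and VLE data, Classification and Regression Trees, and Bayes networks, with the final verdict decided by voting. A minimal sketch of that kind of voting ensemble is below, using scikit-learn stand-ins (a Gaussian naive Bayes classifier in place of a full Bayes network) and synthetic features; it illustrates the idea and is not the OU Analyse implementation.

```python
# Hypothetical sketch of a voting ensemble in the spirit of the speaker notes:
# k-NN, a CART-style decision tree and naive Bayes, combined by majority vote.
# Features and data are synthetic; this is not the OU Analyse code.
import numpy as np
from sklearn.ensemble import VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
# Toy features, e.g. prior credits, age band, weekly VLE clicks, days since last login.
X = rng.normal(size=(200, 4))
# Toy label: 1 = student submits the next assignment.
y = (X[:, 2] - 0.5 * X[:, 3] + rng.normal(scale=0.5, size=200) > 0).astype(int)

ensemble = VotingClassifier(
    estimators=[
        ("knn", KNeighborsClassifier(n_neighbors=5)),   # case-based reasoning stand-in
        ("cart", DecisionTreeClassifier(max_depth=4)),  # CART
        ("nb", GaussianNB()),                           # naive Bayes stand-in
    ],
    voting="hard",  # final verdict decided by majority vote
)
ensemble.fit(X[:150], y[:150])
print("held-out accuracy:", ensemble.score(X[150:], y[150:]))
```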
Student Facing Analytics
STUDENT SUCCESS ANALYTICS
ORGANISATIONAL CAPABILITIES
Productionised output and MI
Strategic analysis
Modelling / AI
Data collection
Data storage and access
Technology architecture
Learning design and delivery
Student lifecycle management
Continuous improvement and innovation
Creation of actionable insight
Availability of data
Impact the student experience
Adapted from Barton and Court (2012) - https://hbr.org/2012/10/making-advanced-analytics-work-for-you
[Figure: the student journey from inquiry through First Module, 2nd Module and Nth Module to Qualification and lifelong learning, with learning design applied at each stage]
Rienties, B., Olney, T., Nichols, M., Herodotou, C. (2019). Effective usage of Learning Analytics: What do practitioners want and where should distance learning
institutions be going? Open Learning. DOI: 10.1080/02680513.2019.1690441
[Figure: an integrated learning analytics solution spanning the same journey (inquiry, First Module, 2nd Module, Nth Module, Qualification, lifelong learning, alumni), combining communication, integrated design, personalisation and evidence-based practice]
Rienties, B., Olney, T., Nichols, M., Herodotou, C. (2019). Effective usage of Learning Analytics: What do practitioners want and where should distance learning
institutions be going? Open Learning. DOI: 10.1080/02680513.2019.1690441
Foster, E. (2017). Nottingham Trent University. UCISA event: A-Z of learning analytics, 28/06/2017.
de Quincey, E., Briggs, C., Kyriacou, T., & Waller, R. (2019). Student Centred Design of a Learning Analytics System. In Proceedings of the 9th International Conference on Learning Analytics and Knowledge (LAK'19), Arizona.
Conclusions I
1. A lot of data is coming into (and out of) education
2. A lot of “semi-standardised” data is gathered within and across institutions
3. Great opportunities to harvest fine-grained and longitudinal data
Conclusions II
1. What about the ethics?
2. What can be standardised (and what not)?
3. Are we optimising the record player?
@DrBartRienties
Professor of Learning Analytics
Learning analytics at the Open University and the UK
The power of learning analytics to visualise evidence of learning
T: drBartRienties
E: bart.rienties@open.ac.uk
W: www.bartrienties.nl
W: https://www.organdonation.nhs.uk/
W: https://www.sportentransplantatie.nl/

Editor's Notes

  • #7 Cluster analysis of 40 modules (>19k students) indicate that module teams design four different types of modules: constructivist, assessment driven, balanced, or socio-constructivist. The LAK paper by Rienties and colleagues indicates that VLE engagement is higher in modules with socio-constructivist or balanced variety learning designs, and lower for constructivist designs. In terms of learning outcomes, students rate constructivist modules higher, and socio-constructivist modules lower. However, in terms of student retention (% of students passed) constructivist modules have lower retention, while socio-constructivist have higher. Thus, learning design strongly influences behaviour, experience and performance. (and we believe we are the first to have mapped this with such a large cohort).
  • #12 Case-based reasoning (reasoning from precedents, k-Nearest Neighbours), based on demographic data and on VLE activities; Classification and Regression Trees (CART); Bayes networks (naïve and full); final verdict decided by voting.
  • #16 ‘Creating better opportunities with and for learners, enabling more to achieve their goals’. Creating: includes our content, people, systems and support services. Better: means continuous improvement, always striving to be better than we were yesterday. Opportunities: different learning options to meet different needs. With and for learners: all learners from all backgrounds etc., which recognises their diversity and emphasises partnership. Enabling: making it easy to engage with us and our products. More: again, continuous improvement, striving to attract more students. To achieve their goals: which may be a new job or career, a qualification, an accreditation or learning for pleasure.