Assessment Futures: role for e-Assessment? Peter Hartley, University of Bradford [email_address] http://www.brad.ac.uk/educational-development/aboutus/team/Full_details_27414_en.php
This session Why bother? Why worry? Assessment trends and futures What can e-assessment contribute?
And the argument Assessment is a problem which will ‘get worse’. Assessment futures: trends re strategy and environment. Significant potential for ‘e’: some examples. And so the role for e-assessment must be part of a broader strategic initiative.
Assessment is a problem, #1 Ask a student of X: what makes a ‘First Class X’? How would they respond? Can they give you a coherent summary of the main programme outcomes (or just the algorithm that determines the marks)?
Assessment is a problem, #1 See useful summaries of major issues: the PASS Project Issues Paper. Please comment/feedback and use. http://www.pebblepad.co.uk/bradford/viewasset.aspx?oid=260486&type=file Recent article: Price, M., Carroll, J., O'Donovan, B. and Rust, C. (2011) ‘If I was going there I wouldn't start from here: a critical commentary on current assessment practices’, Assessment & Evaluation in Higher Education, Vol. 36, No. 4, pp. 479-492.
Assessment is a problem, #2 From PASS, I would highlight: Assessment ‘drives and channels’. What/why are we measuring: the ‘slowly learnt’ problem. Limitations of grading (e.g. marks are not numbers). Implications for course structures/regulations.
Problem #3: multi-purpose & multi-audience
Problem #4: The meaning of feedback Can we not ‘recapture’ the ‘original’ meaning of feedback: enabling self-correcting behaviour towards a known goal? This means rediscovering the ‘feedback loop’, whereby information must be ‘fed back’ so that it: relates to the goal; is received; is correctly interpreted; enables corrective action. Cf. the work of Royce Sadler in Higher Education, e.g. http://www.northumbria.ac.uk/sd/central/ar/academy/cetl_afl/earli2010/themes/rsadler/
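To make the loop concrete, here is a minimal sketch (not from the talk) of those four conditions as a single check; all names and data are hypothetical, chosen only to illustrate why feedback that fails any one condition does not ‘feed back’ at all.

```python
# Minimal sketch (not from the talk) of the four feedback-loop conditions.
# All names and data are hypothetical, for illustration only.

def feedback_closes_the_loop(feedback: dict, goal: str) -> bool:
    """Feedback only 'feeds back' if all four conditions hold."""
    return (
        feedback["relates_to"] == goal          # relates to the goal
        and feedback["received"]                # is received
        and feedback["interpreted_correctly"]   # is correctly interpreted
        and feedback["action_possible"]         # enables corrective action
    )

comment = {
    "relates_to": "argue from evidence",
    "received": True,
    "interpreted_correctly": True,
    "action_possible": False,  # e.g. arrives after the final mark is fixed
}
print(feedback_closes_the_loop(comment, "argue from evidence"))  # False
```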
Problem #5 More demanding ‘consumers’: e.g. the NUS Charter
Assessment futures: strategy and environment
Environment: TESTA project NTFS group project with 4 partners: ‘aims to improve the quality of student learning through addressing programme-level assessment.’ Starting from an audit of current practice on nine programmes: surveyed students using focus groups and the AEQ (Assessment Experience Questionnaire, Graham Gibbs et al.); also using a tool to identify programme-level ‘assessment environments’ (Gibbs).
Consistent practice? Characterising programme-level assessment environments that support learning, by Graham Gibbs and Harriet Dunbar-Goddet. Published in: Assessment & Evaluation in Higher Education, Vol. 34, No. 4, August 2009, pp. 481-489.
Data from TESTA
Assessment environment and impact Interim findings from TESTA: a variety of assessments can cause problems; issues over understanding assessment criteria, marker variation, and feedback variation across programmes; QA ‘myths and traditions’ can get in the way.
The need for strategy An example finding from Gibbs: ‘greater explicitness of goals and standards was not associated with students experiencing the goals and standards to be clearer’. And what did make a difference? Formative-only assessment. More oral feedback. Students ‘came to understand standards through many cycles of practice and feedback’.
Programme-based assessment: PASS NTFS group project over 3 years: development and investigation leading to pilots and implementation. Consortium led by Bradford, with 2 CETLs (ASKE and AfL), plus Exeter, Plymouth and Leeds Met.
What are we investigating? How to design an effective, efficient, inclusive and sustainable assessment strategy that delivers the key course/programme outcomes.
Also look out for:
And also … the new JISC Programme on Assessment and Feedback. See the Strand A projects.
Typical student concerns (based on PASS): perceptions of ‘the course’ are variable; assessment is experienced as ‘fragmented’; anxieties re the move to more integrated assessment (perceived risk in terms of performance); concerns about feedback and timing.
Searching for types
An example from PASS: Peninsula Medical School Includes: four assessment modules that run through the 5-year undergraduate medical programme and are not linked directly to specific areas of teaching; focus on high-quality learning (Mattick and Knight, 2007).
Further case studies being explored Brunel: new regulations which separate study and assessment blocks. Visited last week. Liverpool Hope: new regulations which ‘abandon modules’ in all undergraduate programmes; ‘Key Honours Assessment’. Visit now arranged.
Brunel: the regs 120 credits per year of study. Course/programme can include a mix of study, assessment and modular blocks. Option blocks must be modular. Blocks must be in multiples of 5 credits. Maximum assessment block is 40 credits.
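As an illustration only, the block rules above are concrete enough to express as a simple check. The following sketch is hypothetical (the function, data and field names are mine, not Brunel's); it assumes the 40-point maximum refers to credits.

```python
# Hypothetical sketch of the Brunel block rules above; only the rules come
# from the slide. Assumes the assessment-block maximum means 40 credits.

def valid_year(blocks: list) -> bool:
    """Check one year of study against the stated rules."""
    if sum(b["credits"] for b in blocks) != 120:  # 120 credits per year
        return False
    for b in blocks:
        if b["credits"] % 5 != 0:                 # multiples of 5 credits
            return False
        if b["type"] == "assessment" and b["credits"] > 40:
            return False                          # max assessment block: 40
        if b["type"] == "option" and not b.get("modular", False):
            return False                          # option blocks are modular
    return True

year = [
    {"type": "study", "credits": 60},
    {"type": "assessment", "credits": 40},
    {"type": "option", "credits": 20, "modular": True},
]
print(valid_year(year))  # True
```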
Examples from Brunel Biomedical Sciences: study and assessment blocks in all years; cut assessment load by two-thirds; generated more time for class contact; synoptic exam in all three years. Mathematics: conventional modules in final year only; improved understanding and ‘carry-over’ of ‘the basics’ into year 2.
Do you PASS?
 
What can e-assessment achieve?
1. Processes and systems
Examples from the Curriculum Design Prog. eBioLabs (Bristol) By combining interactive media with formative self-evaluation assessments, students learn the methods and techniques they will use in the lab without risking valuable time, equipment or materials. Because students first experiment online, there is a reduced chance of cognitive overload during the practical and they are more able to concentrate on the wider aims of the experiment, rather than blindly following the lab instructions. Because eBioLabs includes tools that automatically mark student assignments, and tools that allow academics to easily track student attendance and achievement, the marking and administrative burden associated with running practicals is very significantly reduced.
Curriculum Design example CASCADE project (Oxford) Online assessment submission. ‘Students can now submit assignments much more easily at any time from anywhere in the world. It is also possible to predict significant efficiencies in assignment handling time for the Registry staff who deal with student submissions for approximately 260 course assignments across 48 course cohorts a year: a saving of 30 minutes per assignment or more soon accumulates savings in the order of half a day per week. Other advantages of the new online system are the reduction in paper handling and photocopying, as well as better auditing and control. Reduction in paper storage is a further advantage, both in terms of less physical space being required and also in terms of less staff time being required to retrieve data from the archive.’
Curriculum Design example ESCAPE project (Hertfordshire) Effectiveness vs efficiency (watch the video). Next two slides from Mark Russell, Deputy Director of the Blended Learning Unit, University of Hertfordshire.
ESCAPE Themes … Good assessment for learning: … Engages students with assessment criteria … Supports personalised learning … Focuses on student development … Ensures feedback leads to improvement … Stimulates dialogue … Considers staff and student effort
Numerous legacy resources
2. Feedback and self-evaluation
Example 2.1: audio The ASEL project, led by Bradford with Kingston as partner: various uses of audio, including feedback, in different disciplines. See the intro by Will Stewart as part of the recent Leeds University Building Capacity project: Jorum resource; Direct weblink.
ASEL noted: technology now easy and accessible; positive student reactions; different tutor styles and approaches; serendipity (e.g. feedback stimulated podcasts). And a different form of communication?
ASEL main conclusions …  audio is a powerful tool, providing opportunities for personalising learning, promoting greater student engagement, and encouraging creativity. In introducing audio into their practice, lecturers were required to rethink their pedagogical approaches and learning design, adopting new and innovative ways to enable students to be more actively involved in the learning process.
ASEL main conclusions …  (audio) allowed lecturers to provide more personal and richer feedback to students, and increased the level of interaction and dialogue amongst students and between students and lecturers.  (Will Stewart and Chris Dearnley)
 
Example 2.2: audio and video Growing number of examples. ALT/Epigeum Awards 2010: see the ALT Open Access Repository. See the winning entry by Read and Brown from Southampton: Organic Chemistry. Use of tablets to show solutions and working. Focus on self-assessment.
Example 2.3: clickers are coming Student Response Systems at the moment? They work … they can change staff and student behaviour and performance. But they can be cumbersome and fiddly: setup time; need strong commitment and support (e.g. see experience at Exeter Business School).
Example 2.3: clickers are coming Student Response Systems in the future? They will radically change staff and student behaviour. They will be flexible and easy to use. They will be on the student’s own device! (e.g. experimentation at University of Bradford arising from our JISC Building Capacity project – work with TxtTools by John Fairhall et al.)
Example 2.4: adaptive systems PBL with consequences – you get immediate feedback on the consequences of your decisions, e.g. the G4 project at St George’s: http://www.generation4.co.uk/ (iEthics case online). Adaptive assessment, e.g. the work of Trevor Barker.
3. Integrating systems
Example 3.1: integrating systems: CAA ITS4SEA project at Bradford. 100-seater facility, now plus a 30-seat break-out room. Thin-client technology. QMP as University standard for summative assessment. Procedures agreed with Exam Office. Design of room (available as a cluster outside assessment times). Teaching potential.
The main  CAA room at Bradford
 
And the growth …
And recent changes … growth in ‘hybrid exams’: a mix of automatic marking (QMP) and open-ended response items (e.g. short answer questions). Short answers go to a spreadsheet and are marked by a human. Likely use of word-processing in future. Impact on teaching – new flexibility. Example of impact: ‘reduced my marking load for this module from 5 days to one day, whilst still enabling assessment of higher order cognitive skills.’
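A hedged sketch of what that ‘hybrid’ marking workflow could look like in code: auto-marked totals (e.g. exported from QMP) merged with human short-answer marks from a spreadsheet. The CSV column names and the function are hypothetical, for illustration only.

```python
# Hedged sketch of 'hybrid exam' marking: auto-marked totals (e.g. exported
# from QMP) merged with human short-answer marks from a spreadsheet/CSV.
# The column names and function are hypothetical, for illustration.

import csv

def combine_marks(auto_marks: dict, csv_path: str) -> dict:
    """Add human-marked short-answer scores to auto-marked totals."""
    totals = dict(auto_marks)  # student_id -> auto-marked score
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):  # expects: student_id, short_answer_mark
            sid = row["student_id"]
            totals[sid] = totals.get(sid, 0) + float(row["short_answer_mark"])
    return totals

# Usage (hypothetical files/data):
# final = combine_marks({"s001": 42.0, "s002": 38.5}, "short_answers.csv")
```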
Example 3.2: integrating applications Use of mobile technology e.g. CampusM at Bradford: http://www.campusm.com/ http://www.techrepublic.com/software/university-of-bradford-about-uob-10-mobile/2194295?tag=content;selector-1
And finally … back to the role E-assessment can play an important role re: assessment authenticity and diversity; feedback quantity and quality; profiling and ‘mindset’. BUT ONLY IF it is embedded in a meaningful course/programme strategy/environment.
And what is an ‘effective assessment strategy’? Will it explain to staff, students and external agencies: How the course/programme assesses the main learning outcomes? How assessment and teaching are linked? How assessment both supports ‘high-quality learning’ and develops it over the course?
And finally … the assessment/identity interface Students as ‘conscientious consumers’ (Higgins et al., 2002). But: personal identity as ‘mediator’, e.g. apprentice (‘feedback is a useful tool’) cf. victim (‘feedback is another burden’). So we need to change the mindsets of some students.
And finally finally … some other contacts PASS Project Manager: Ruth Whitfield r.whitfield@bradford.ac.uk ASEL Project Manager: Will Stewart w.stewart@bradford.ac.uk CAA (building on ITS4SEA) Project Manager: John Dermo [email_address]
And maybe students will never again see … ‘59% Excellent.’ This was the only tutor comment on a student assignment. How do you think the student reacted and felt?
And some other interesting stuff Challenging students to search Wikipedia creatively: C-Link http://www.conceptlinkage/clink/ Helping students review and evaluate their interview performance http://www.gowerpublishing.com/isbn/9781409411369 Helping research students to prepare for their viva: Interviewer Viva. Contact me for further info/demo.
Thank you for listening Peter Hartley [email_address]

