Rubric Design
Workshop
Overview of Session
What is a rubric?
• Definition: “a set of criteria specifying the characteristics
of an outcome and the levels of achievement in each
characteristic.”
• Benefits
- Provides consistency in evaluation and
performance
- Gathers rich data
- Mixed-method
- Allows for direct measure of learning
Why use rubrics?
• Provides both qualitative descriptions of student
learning and quantitative results
• Clearly communicates expectations to students
• Provides consistency in evaluation
• Simultaneously provides student feedback and
programmatic feedback
• Allows for timely and detailed feedback
• Promotes colleague collaboration
• Helps us refine practice
Types of Rubrics - Analytic
Analytic rubrics articulate levels of performance for each
criterion used to assess student learning.
Advantages
• Provide useful feedback on areas of strength and weakness.
• Criteria can be weighted to reflect the relative importance of
each dimension.
Disadvantages
• Takes more time to create and use than a holistic rubric.
• Unless each point for each criterion is well defined, raters may
not arrive at the same score.
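The weighting advantage above can be made concrete. This is a minimal sketch, not from the workshop: the criteria names, weights, and the 1–4 scale are illustrative assumptions.

```python
# Hypothetical analytic rubric: each criterion gets a rater score on a
# 1-4 scale and a weight reflecting its relative importance.
TOP_OF_SCALE = 4  # assumed 4-point scale

def weighted_score(scores, weights):
    """Return the weighted total as a fraction of the maximum possible."""
    if set(scores) != set(weights):
        raise ValueError("scores and weights must cover the same criteria")
    total = sum(scores[c] * weights[c] for c in scores)
    maximum = sum(TOP_OF_SCALE * w for w in weights.values())
    return total / maximum

# Example: "evidence" counts three times as much as "mechanics".
scores = {"organization": 3, "evidence": 4, "mechanics": 2}
weights = {"organization": 2, "evidence": 3, "mechanics": 1}
print(round(weighted_score(scores, weights), 2))  # -> 0.83
```

A holistic rubric, by contrast, yields one overall level with no per-criterion breakdown to weight, which is exactly the trade-off the next slides describe.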
Analytic Rubric Example
Types of Rubrics - Holistic
A holistic rubric consists of a single scale on which all
criteria included in the evaluation are considered
together.
Advantages
• Emphasis on what the learner is able to demonstrate, rather than
what s/he cannot do.
• Saves time by minimizing the number of decisions raters make.
• Can be applied consistently by trained raters, increasing reliability.
Disadvantages
• Does not provide specific feedback for improvement.
• When student work is at varying levels spanning the criteria points,
it can be difficult to select the single best description.
• Criteria cannot be weighted.
Holistic Rubric Example
Steps for Implementation
1. Identify the outcome ✔
2. Determine how you will collect the evidence ✔
3. Develop the rubric based on observable criteria
4. Train evaluators on rubric use
5. Test the rubric and revise if needed
6. Collect data
7. Analyze and report
Rubric Development – Pick your Scale
Levy, J.D. Campus Labs: Data Driven Innovation. Using rubrics in student affairs: A direct assessment of learning.
Rubric Development – Pick your Dimensions
Levy, J.D. Campus Labs: Data Driven Innovation. Using rubrics in student affairs: A direct assessment of learning.
Creating your Rubric
Levy, J.D. Campus Labs: Data Driven Innovation. Using rubrics in student affairs: A direct assessment of learning.
Writing Descriptors
University of Florida Institutional Assessment: Writing Effective Rubrics
1. Describe each level of mastery for each characteristic
2. Describe the best work you could expect
3. Describe an unacceptable product
4. Develop descriptions of intermediate-level products for
intermediate categories
5. Each description and each category should be mutually
exclusive
6. Be specific and clear; reduce subjectivity
Next Steps…
Levy, J.D. Campus Labs: Data Driven Innovation. Using rubrics in student affairs: A direct assessment of learning.
Training for Consistency
1. Inter-rater reliability: Between-rater consistency
Affected by:
• Initial starting point or approach to scale (assessment
tool)
• Interpretation of descriptions
• Domain / content knowledge
• Intra-rater consistency
2. Intra-rater reliability: Within-rater consistency
Affected by:
• Internal factors: mood, fatigue, attention
• External factors: order of evidence, time of day, other
situations
• Applies to both multiple-rater and single rater situations
Levy, J.D. Campus Labs: Data Driven Innovation. Using rubrics in student affairs: A direct assessment of learning.
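Inter-rater consistency can be checked with simple agreement statistics during norming. This is an illustrative sketch, not part of the workshop materials; the two raters' scores are hypothetical and a 1–4 scale is assumed.

```python
# Two trained raters score the same set of artifacts on a 1-4 rubric scale.
def percent_agreement(rater_a, rater_b):
    """Fraction of artifacts on which the two raters gave identical scores."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

def adjacent_agreement(rater_a, rater_b):
    """Fraction of artifacts scored within one level of each other."""
    close = sum(abs(a - b) <= 1 for a, b in zip(rater_a, rater_b))
    return close / len(rater_a)

a = [4, 3, 2, 4, 1, 3, 3, 2]  # hypothetical scores from rater A
b = [4, 3, 3, 4, 2, 3, 2, 2]  # hypothetical scores from rater B
print(percent_agreement(a, b))   # exact agreement -> 0.625
print(adjacent_agreement(a, b))  # within-one-level agreement -> 1.0
```

Low exact agreement with high adjacent agreement often signals the interpretation-of-descriptions problem named above: raters share the scale's shape but not its boundaries, which calibration training addresses.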
Testing your Rubric
• Use a meta-rubric to review your work
• Peer review – ask one of your peers to review the rubric
and provide feedback on content
• Test with students – use student work or observations to
test the rubric
• Revise as needed
• Test again
• Multiple raters – norm with other raters if appropriate
Levy, J.D. Campus Labs: Data Driven Innovation. Using rubrics in student affairs: A direct assessment of learning.
Allen, M.J. (2004). Assessing academic programs in higher education. Bolton, MA: Anker.
Brophy, Timothy S. University of Florida Institutional Assessment: Writing Effective Rubrics. http://assessment.aa.ufl.edu/Data/Sites/22/media/slo/writing_effective_rubrics_guide_v2.pdf
Mueller, Jon. Professor of Psychology, North Central College, Naperville, IL. Authentic Assessment Toolbox. http://jfmueller.faculty.noctrl.edu/toolbox/rubrics.htm
Teaching Commons, DePaul University. http://teachingcommons.depaul.edu/Feedback_Grading/rubrics/types-of-rubrics.html


Editor's Notes

  • #14–15: Focus your descriptions on the presence of the quantity and quality that you expect, rather than on their absence. However, at the lowest level it is appropriate to state that an element is “lacking” or “absent” (Carriveau, 2010). Keep the elements of the description parallel from performance level to performance level. In other words, if your descriptors include quantity, clarity, and details, make sure that each of these outcome expectations is included in each performance-level descriptor.
  • #16: When using a rubric for program assessment purposes, faculty members apply the rubric to pieces of student work (e.g., reports, oral presentations, design projects). To produce dependable scores, each faculty member needs to interpret the rubric in the same way. The process of training faculty members to apply the rubric consistently is called “calibration”; it keeps scores accurate and reliable. Reliability here means that scorers apply the rubric consistently, both to each piece of student work (intra-rater reliability) and among themselves (inter-rater reliability).