ARTICLE INFO

Keywords: Assessment as Learning (AaL); Assessment for Learning (AfL); Assessment of Learning (AoL); Scoping review

ABSTRACT

Associations between assessment and learning are widely studied and often organized around the notions of Assessment as Learning (AaL), Assessment for Learning (AfL), and Assessment of Learning (AoL). Although these notions are appealing in theory, they are unclear constructs to comprehend, as both their definitions and their practice are used inconsistently in educational research. We present a synthesis of common characteristics among these notions, based on a scoping review of definitions and descriptions of AaL, AfL, and AoL (131 studies). The synthesis of common characteristics consists of nine themes that refer to how educational assessment relates to learning. The themes are grouped into: 1) Student-teacher roles and relationships within assessment; 2) Assessment learning environment; and 3) Educational outcomes of assessment. We then used the themes within the synthesis to analyze the results of the included empirical studies on their contributions to practice (84 studies). The synthesis provides stakeholders with a clear and integrative view of how educational assessment relates to learning and may help educators support and design their assessment practices. We argue that the notions of AaL, AfL, and AoL should be seen in coherence with one another in order to establish an assessment culture that maximally facilitates students' learning.
* Corresponding author at: P.O. Box 80127, 3508 TC Utrecht, the Netherlands.
E-mail address: [Link]@[Link] (L.H. Schellekens).
[Link]
Received 2 February 2021; Received in revised form 22 July 2021; Accepted 7 October 2021
Available online 20 October 2021
0191-491X/© 2021 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY license ([Link]).
L.H. Schellekens et al. Studies in Educational Evaluation 71 (2021) 101094
formative function became Assessment for Learning (AfL), which in general emphasized the purpose of assessment to improve the learning and teaching process. Assessment with a summative function became Assessment of Learning (AoL), which in general was used to judge performance and measure outcomes after a formal learning activity (e.g. ARG, 1999; *Crooks, 2011; Earl, 2003).

In 2002, the ARG provided the following definition of AfL: "Assessment for Learning is the process of seeking and interpreting evidence for use by learners and their teachers to decide where the learners are in their learning, where they need to go and how best to get there" (ARG, 2002, 1–2). The definition contributed to the notion of closing the learning gap, by monitoring student progress against reference levels of standards and through continuous feedback (*Ninomiya, 2016). However, scholars also argued that the definition could result in practices wherein teachers frequently tested their students to assess the levels they attained against prescribed standards (e.g. Klenowski, 2009). Therefore, in 2009 a second-generation definition of AfL was adopted to make clear that the central focus is on learning, with students and teachers as key agents in this process: "Assessment for Learning is part of everyday practice by students, teachers and peers that seeks, reflects upon and responds to information from dialogue, demonstration and observation in ways that enhance ongoing learning" (Klenowski, 2009, 264).

In 2003, Earl (2003, 2013) added a third notion, denoted as Assessment as Learning (AaL). This notion includes the active involvement of students in self-assessment and self-directed learning as a distinct function to improve the learning process. Earl's intention was to extend the role of AfL by emphasizing the role of the student as the critical connector between the assessment and learning process (*Dann, 2014; Earl & Katz, 2008). In this view, the student is seen as an active and engaged assessor in order to support the development of metacognitive and self-regulated learning skills (Earl, 2013; *Lam, 2016). Some scholars argue that AaL can generally be considered a subsection of AfL (Clark, 2012; Earl, 2013; *Lam, 2018). Others suggest that the practice of AaL represents the final phase of a developmental continuum for improving assessment practice (Tomlinson, 2007). Another perspective is that AaL, AfL and AoL should be seen as integrated entities in coherence with the entire education model in order to facilitate learning maximally (Van der Vleuten, Sluijsmans, & Joosten-Ten Brinke, 2017). Moreover, the notion of AaL is also subject to criticism in the literature (*Fletcher, 2016). *Torrance (2007, 281) refers to AaL as a concept of "procedural compliance", in which assessment procedures and practices dominate students' learning experiences, resulting in student achievement without understanding. Hence, although AaL was originally considered in conjunction with AfL, various meanings exist.

1.1. Problem definition and research questions

The assessment notions AaL, AfL, and AoL reflect different and valuable assessment and learning approaches. In general, AaL represents the active engagement of students in assessment and their learning, AfL represents the identification of learning throughout assessment, and AoL represents the measurement of learning by using assessments (Berry, 2013; *Birenbaum et al., 2015; *Sadeghi & Rahmati, 2017). However, as interest in AaL and AfL has increased over the years, the various definitions of AaL and AfL emphasize different aspects of how assessment relates to learning and are used interchangeably (*McDowell, Wakelin, Montgomery, & King, 2011; *Swaffield, 2011). For example, more recent definitions of AfL emphasize the active role of the teacher and students in the assessment process, while the same characteristics are central in the definition of AaL. Because of the breadth of definitions and the diversity in educational contexts, the notions of AaL, AfL, and AoL are not straightforward to comprehend (*Baird, Andrich, Hopfenbeck, & Stobart, 2017; *Tan, 2017). Consequently, teachers may not clearly understand how assessment should be practised to enhance learning (*Tan, 2017; *Tunku Ahmad et al., 2014). The diversity in definitions and contexts also means that the efficacy of the assessment practices is difficult to research and document (Bennett, 2011; Dunn & Mulvenon, 2009). To regard educational assessment as an overarching coherent construct, it is relevant to understand the underlying notions and how they relate to each other conceptually. The current review aims to provide insight into this issue by systematically reviewing definitions and descriptions of AaL, AfL, and AoL in order to synthesize common characteristics of the assessment notions that refer to student learning. Then, we use the common characteristics as a framework to examine the results of the included empirical studies for their contributions to practice. The following research questions were formulated:

1) What are common characteristics of the definitions and/or descriptions of AaL, AfL, and AoL?
2) What is known about the common characteristics of the assessment notions from findings of the included empirical studies?

1.2. Previous reviews on AaL, AfL and AoL

In two review studies, the notion of AfL (*Wiliam, 2011), and both AfL and AaL (Clark, 2012), are described in relation to the notion of Formative Assessment (FA). *Wiliam (2011) discussed different definitions of the terms FA and AfL and distinguished two requirements of assessment to support learning: 1) the assessment provides information that leads to improved performance; and 2) the assessment engages the learner in actions to improve learning. Clark's (2012) review included 199 studies and focused on the theories and goals of FA in function of the promotion of students' self-regulatory learning strategies. One review study had a specific focus on the notion of AaL (*Lam, 2016) and one on the notion of AfL (*Heitink, van der Kleij, Veldkamp, Schildkamp, & Kippers, 2016). *Lam (2016) reviewed the extent to which AaL supports writing instruction and student learning in higher education. *Heitink et al. (2016) conducted a literature review to reveal the prerequisites for implementation of AfL. Although review studies have been conducted on the assessment notions AaL and/or AfL, these reviews do not provide a synthesis of the common characteristics of the three assessment notions in the last two decades of educational research, nor do they review how the common characteristics are practised. In addition, some reviews were set up in the form of a critical analysis and did not report on features of systematic research reviews, such as conducting a systematic search of the literature or applying inclusion criteria, or did not include all educational sectors.

1.3. Relevance

The aim of this paper is to develop a synthesis of the assessment notions AaL, AfL, and AoL on common characteristics and to examine how these common characteristics are empirically practised. We do so by means of a scoping review. Rather than viewing the three notions of assessment as distinct, we chose to review these definitions alongside one another to prevent a fragmented picture. A common language that relates assessment to learning is a prerequisite to establish an assessment culture that supports the development of students as learners (Medland, 2016). The resulting synthesis can support teachers, faculty development programmes, and institutions in improving their assessment tools, assessment practices, and assessment programmes. In addition, the analysis of the included empirical studies on these common characteristics contributes to a more coherent understanding of the research on AaL, AfL, and AoL that has been conducted in the past two decades across various educational sectors.

2. Methods

2.1. Scoping review

The purpose of a scoping review is to clarify complex concepts and
identify key concepts and gaps in existing literature (Daudt, van Mossel, & Scott, 2013; Levac, Colquhoun, & O'Brien, 2010). Scoping reviews, as opposed to systematic reviews, tend to create an overview of a diverse body of work, regardless of methodological approaches (Pham et al., 2014). The procedure for this scoping review was based on the six-stage methodological framework developed by Arksey and O'Malley (2005). The framework enables replication of the search strategy and increases the reliability of the study findings (Pham et al., 2014). The framework includes the following stages: 1) identifying the research question; 2) identifying relevant studies; 3) selecting studies; 4) charting the data; 5) collating, summarizing and reporting the results; and 6) consulting with stakeholders to validate study findings. The first stage, the identification of the research questions, has been described in the introduction section. The last stage was applied during the course of the whole review process by consulting the research team at each stage and will not be presented separately. Stages 2–5 are elaborated in the next sections.

2.2. Stage 2: identifying relevant studies

We conducted a systematic search for English peer-reviewed articles published between 1999 and 2018. We searched the ERIC, PsycINFO, PubMed and Web of Science databases to find educational research published in the disciplines of social and behavioural sciences, humanities and medicine. We chose 1999 as the year of reference because of the introduction of the notion of AfL around that time. Although a diversity of literature is permitted in scoping reviews, we restricted the type by including research articles only. In addition, because a scoping review does not require an appraisal of the quality of the literature as no methodological restrictions are enforced, we chose to include only peer-reviewed articles to impose a kind of quality appraisal. A librarian was consulted to verify the search strategies. The following keywords were used to search in titles or headings and abstracts: 'assessment as learning' or AaL; 'assessment for learning' or AfL; 'assessment of learning' or AoL. We thus chose not to use terminology or synonyms related to the notions of assessment (e.g., formative assessment or self-regulated learning), because our aim was not to classify related concepts. After the electronic searches, searches were conducted using the Google Scholar search engine, and a snowball method was undertaken by inspecting the reference lists of the included articles.

2.3. Stage 3: selecting studies

The first search resulted in 599 articles, which were imported into RefWorks software to detect exact-match duplicates: 135 duplicates were found, resulting in a total of 464 articles to include for selection based on title and abstract. The first author screened articles on title and abstract. When the title or abstract matched the first two inclusion criteria, the article was included for full-text screening, which was conducted by three authors (LS, HB and LdJ). Studies were included when:

1) The text was a scientific research article written in English.
2) The article focused on learning through assessment to promote students' learning in a regular physical educational context.
3) A description or definition was given of AaL and/or AfL and/or AoL.

See Appendix A for a more thorough description of the inclusion and exclusion criteria.

Of the 464 studies, a total of 290 were excluded based on screening of title and abstract, resulting in 174 articles for selection based on full-text screening. A random sample of 15 per cent was screened by three authors. Each author rated 26 articles, and a generalized kappa statistic for use with multiple raters was calculated, yielding an acceptable kappa coefficient of 0.79. Any discrepancies between the authors were discussed until consensus was reached. The first author screened the remaining articles, resulting in 118 included articles. Finally, a total of 34 studies were found in the second search strategy and screened on the original inclusion criteria, which resulted in 13 additional studies. In Fig. 1, a PRISMA flow diagram reports the flow of the articles included in this review (Moher, Liberati, Tetzlaff, Altman, & The PRISMA Group, 2009).

2.4. Stage 4: charting the data

The first author developed a coding template for data extraction, which was discussed with the whole research team. To include the richness of the data, we decided to use the coding template to record descriptive information only (authors' names, year of publication, aim, study design, country, sector, discipline, participants). Three authors coded 13 articles (10 % of the total sample) and discussed text fragments intensively to agree upon fragments that signified definitions or descriptions of the notions of assessment (research question 1) and that indicated the results of empirical studies (research question 2). Next, the first author read the 131 included studies and uploaded all articles into NVivo 11 Pro for Windows. Text fragments that defined the assessment notions were highlighted as nodes (see Appendix B1). Many articles provided more than one definition of the assessment notion(s), resulting in 569 unique nodes for definitions and descriptions of AaL (n = 85), AfL (n = 451), and AoL (n = 33). To investigate common characteristics of the descriptions, we considered the three assessment notions as a joint unit of analysis by merging the descriptions of all three notions into one unit. We opted for a joint unit of analysis for two reasons. Firstly, to rule out that the descriptions of the assessment notions in the included studies were possibly biased by erroneous understandings of the assessment construct by the author, since terminology may improperly be used interchangeably (e.g. *Swaffield, 2011). Secondly, to enhance our understanding of how assessment relates to learning, we aimed to attain a broad portrayal of this association (Elo & Kyngäs, 2008). To examine the nodes of text fragments that defined and described the notions AaL, AfL, and AoL as a joint unit of analysis, all text that referred to a particular notion was replaced by the term 'assessment' (see Appendix B2 for an example). Similarly, in some definitions of the assessment notions, referrals were made to concepts like formative or summative assessment. These descriptions were also replaced by the term 'assessment' to avoid classification in existing assessment constructs. Then, the text fragments of all nodes were coded using an inductive content analysis approach (Elo & Kyngäs, 2008). First, we densely coded the content of each node because every single node could contain several pieces of information (i.e., codes), depending on the richness of the description or definition of the assessment notion (Cohen, Manion, & Morrison, 2018). Consequently, one node could contain several codes (see Appendix B3 for an example). The same code was given to text fragments that shared the same type of content. The names of the codes were chosen inductively by the first three authors, based on the content and words used in the text fragments, to bear resemblance to the original data (Cohen et al., 2018). To ensure consistency and coverage of the codes, text fragments of the nodes were re-read and codes were re-assigned several times. A code was specified when at least two (parts of) text fragments referred to the same content. A total of 72 codes emerged from the analysis of the nodes. Although some codes were used more frequently than others (e.g., 'practice involves feedback' was coded 95 times, while 'students are passive' was coded two times), only the content of the code was taken into account to group the codes into themes. In an iterative process with the whole team, the codes were examined, compared, and conceptualized on their content in order to categorize the codes into themes until consensus was reached (see Appendix B4). The classification of the codes into themes resulted in nine themes. Text fragments of the codes that were assigned to a theme were used to describe the themes in the results section. Appendix C provides an overview of the codes that were allocated to the themes.

To answer the second research question, only empirical studies were included (n = 84). To include findings of both quantitative and qualitative studies, we converted all empirical results into a qualitative form
by formulating a summary phrase that depicted the findings of the study (Dixon-Woods, Agarwal, Jones, Young, & Sutton, 2005; Van Leeuwen & Janssen, 2019). For each published result in the results section of a particular study, a summary phrase was formulated in alignment with the aim(s) or research question(s) of the empirical study (see Appendix B5). A study could yield multiple summary phrases. The results of three empirical studies could not be included, because no results section was included or because the published results did not match the aim or research question of the study. The 81 empirical studies included generated 221 summary phrases (mean = 2.7 per study). Summary phrases of nine studies (42 phrases) were checked and discussed for accuracy and formulation by the third author until consensus was reached. Finally, all summary phrases were thematically analysed against the outcomes of research question 1 (Dixon-Woods et al., 2005). To assign a summary phrase to a theme, we first classified the subject or outcome of the summary phrase: for example, whether the results related to the teacher's skills, practices, perceptions, or (assessment) knowledge/professionalisation; to the student's skills, perceptions, or (assessment) knowledge; to teacher-student interaction/relationships; to the learning environment; to the outcomes of assessment; or to something else. We then further examined each summary phrase with regard to what processes or objects were studied, to classify the result into a subtheme. We assigned a result to a theme when the subject matched the description of the codes that formed the theme. Consequently, the results that are reported within a theme may have a broader scope than the theme itself. For example, the theme 'teachers adapt to students' needs' includes results that reported outcomes with regard to teachers' practices, teachers' skills, and teachers' perceptions towards assessment, because the teacher is central in both the outcome of the study and in the description of the theme. In the few cases where a summary phrase overlapped with more than one theme, findings were discussed by the authors (LS and LdJ) until consensus was reached. For example, when a result related to 'feedback', we discussed whether feedback was reported as an instructional tool to inform learning and teaching (to assign the result to the theme 'use various sources of information to act upon') or whether feedback was reported, for example, as a teacher skill (to assign the result to 'teachers adapt to students' needs').

3. Results

3.1. Descriptive analysis of the literature

A total of 131 articles were included in this review. The majority of the studies were conducted in Europe (n = 58, 44.2 %). The remainder of the articles were written by authors from Asia (n = 25, 19.1 %), North America (n = 20, 15.3 %), Australia and New Zealand (n = 15, 11.5 %), or by authors from other countries or a mix of countries (n = 13, 9.9 %). With regard to the educational sector that was researched, an almost even distribution was found across primary schools (28.7 %), secondary schools (23.8 %), and higher education (26.7 %). The sector of vocational education was underrepresented in the sample (2.0 %). Various studies included more than one sector in their sample (18.8 %). Empirical research was conducted in 84 (64.1 %) studies. The remaining studies were classified as conceptual (n = 47) and contained theoretical or discussion studies (n = 43) or review studies (n = 4). The majority of the studies included reported research related to the notion of AfL (n = 109, 83.2 %). The notion of AaL was the subject in nine studies (6.9 %), and one study reported findings related to the notion of AoL (0.8 %). In 12 (9.2 %) studies, the subject of the research contained
two or three assessment notions.

3.2. Overview of themes: how assessment relates to learning and its empirical support

The synthesis of common characteristics identified nine themes, based on the inductive analysis of definitions of the three assessment notions. The themes refer to common characteristics of how assessment relates to learning. We do not suggest that the three assessment notions are covered in all themes or that the three notions are equal to each other. The synthesis represents assessment practices that generally matter in relation to student learning. For readability, we will use the term 'educational assessment' as an overall umbrella term that includes the synthesis of common characteristics of AaL, AfL, and AoL. Assessment can be defined as "a wide range of methods for evaluating student performance and attainment" (Gipps, 2011, p. 11). The common characteristics (themes) we found shed light on the range of these methods and emphasize the conditions for successful educational assessment, regardless of whether the method is applied according to the purposes of an AaL, AfL, or AoL approach. We grouped the themes into: 1) Student-teacher roles and relationships within assessment; 2) Assessment learning environment; and 3) Educational outcomes of assessment. See Fig. 2 for an overview of the themes.

Fig. 2. Overview of themes of how assessment relates to learning, based on the analysis of definitions and descriptions of AaL, AfL, and AoL. Note. The numbers in brackets refer to the empirical studies that reported findings within the theme.

In the next sections we will elaborate on each theme. Each theme is described in two parts. The first paragraph contains a description of the theme. The content of the theme's description is generated from the text fragments that described or defined the assessment notions AaL, AfL, and AoL and that were assigned to the particular theme (research question 1). The subsequent paragraph(s) contain the main findings from the empirical studies that matched the particular theme (research question 2), in order to contribute to a more coherent understanding of the research conducted within the theme in the past two decades across various educational sectors. Full details of the studies, assigned to themes, can be found in Appendix D.

3.3. Student-teacher roles and relationships within assessment

3.3.1. Students are actively involved
Educational assessment refers to students who are actively involved in assessment and in their own learning. Active involvement in assessment is stimulated when practices include the possibility for students to develop the skills to assess themselves and their peers, and when activities of self-assessment (i.e., self-reflection, self-evaluation) and peer assessment are offered within the course to practise these skills. Students are actively involved in learning when they have the opportunity to take responsibility for directing their own learning, for example, through activities wherein students can plan, monitor and evaluate their own learning.

Nine empirical studies reported findings with regard to students' involvement in assessment practices. Overall, the majority of students reported very little or no involvement in assessment practices that reflected principles of assessment that support learning, such as the involvement of students in peer- and self-assessment (*DeLuca, Chapman-Chin, LaPointe-McEwan, & Klinger, 2018; *Leirhaug & Annerstedt, 2016). Findings in nine studies focused on the attitudes students have towards assessment approaches that involve them in assessment and learning. A majority of the students expressed positive attitudes towards such assessments (e.g. *Carless, 2002; *McDowell et al., 2011; *Thompson et al., 2017). Students valued that they became more responsible for their own learning, which, in turn, motivated them to learn. Students reported that sharing success criteria, peer support and teacher feedback were helpful to engage them in their learning, as was being given the opportunity to improve work before the final deadline (e.g. *DeLuca et al., 2018; *McDowell et al., 2011; *Newby & Winterbottom, 2011).

3.3.2. Students and teachers have a collaborative relationship
Educational assessment refers to a collaborative relationship between students and teachers. Students and teachers share roles and responsibilities with each other, reflected by a shift from a teacher-centred towards a more student-centred perspective. The teacher acts as a guide of students' learning processes, and students act as partners instead of being passive recipients of teachers' decisions and actions. Practices provide opportunities to collaborate and to participate in two-way dialogues, negotiations and discussions.

Four studies reported findings about student-teacher relationships and their role in the assessment process. A (temporary) process of co-construction, wherein teachers guide students to develop as autonomous and self-regulated learners, facilitates a more central role of the student as a partner in assessment. Co-construction is supported through
a focus on and guidance about the learning to be achieved, shared ownership and understanding, and a safe and supportive learning environment (*Heritage, 2018; *Willis, 2011). Student-teacher relationships may be influenced by the perceptions that teachers and students hold. For example, with regard to students' preferences and perceptions, it was found that students who held learning goals viewed assessment activities as a joint teacher-student responsibility, while students with performance goals viewed assessment as the teacher's sole responsibility (*Cowie, 2005).

3.3.3. Students and teachers use various sources of information to act upon
Educational assessment refers to a flow of meaningful information about the achievement of learning that informs teaching and learning and can be acted upon. Students and teachers continuously collect, interpret and reflect on various sources of information to monitor progress, and use the information to further learning. Practices include opportunities for practice and rehearsal, appropriate and constructive feedback, and low-stakes assessments. Practices also facilitate the uptake of feedback by connecting information on learning across assessments and/or modules.

Six studies presented findings on the presence of meaningful assessment information to inform the teaching and learning process. These studies were all aimed at feedback as a source of information. Overall, findings indicated that different participants may prefer different types of feedback (e.g. *Colby-Kelly & Turner, 2008; *Hargreaves, 2013). Consequently, students may benefit from multiple types of feedback (*Hargreaves, 2013). As for the uptake of feedback, reasons for students not to use the feedback given to them are: 1) when no opportunity is provided to use the feedback to improve the work; and 2) when the feedback is confusing (e.g., too vague or brief) (*Mumm, Karm, & Remmik, 2016). Regarding the uptake of peer feedback, students' preconceptions concerning their peers may affect the usefulness of peer feedback, because peers may be seen as insufficiently critical, too stringent, or not trustworthy (*Colby-Kelly & Turner, 2008; *Mumm et al., 2016).

3.3.4. Students and teachers are literate in assessment
Educational assessment refers to both teachers' and students' development to become literate in and familiar with talking about learning and assessment, and their understanding of the assessment process and of what quality looks like. Practices include the development of classroom conversation, sharing and discussing assessment criteria, and studying models of strong and weak work in order to communicate about and improve student learning. Twenty studies reported findings on subjects related to teachers' and students' understanding of assessment procedures and concepts, and/or their development to become assessment literate. We grouped these findings into research directed at a) understanding of the assessment notions and their definitions; and b) outcomes of teacher professional development programmes.

a) Understanding of the assessment notions and their definitions

Nine studies reported findings on outcomes of teacher development programmes aimed at the enhancement of assessment practices and skills. These programmes had an impact on classroom practice, such as enhanced constructive alignment and improvement in applying assessment strategies (*Jonsson, Lundahl, & Holmgren, 2015; *Wong, 2007). The development programmes also altered teachers' views about teaching, learning and instruction (e.g. *Crossland, 2012; *Harrison, 2005). Another reported outcome of teacher development programmes was a change in classroom culture and in the school's assessment culture (*Jones & Moreland, 2005), for example, through the establishment of local communities of assessment practice (*Reimann & Wilson, 2012).

3.3.5. Teachers adapt to students' needs
Educational assessment refers to the ability of a skilled teacher to modify and adjust ongoing teaching and learning in response to students' individual pedagogical preferences. Teachers meet students at their level of knowledge and support students in how to progress based on their current achievement. Teachers' assessment practices include efficient and innovative teaching, monitoring and scaffolding activities, and differentiation between students.

Thirty studies investigated teachers' assessment practices in their classrooms and the perceptions teachers have towards teaching and learning. We grouped these findings into a) diversity in teachers' assessment practices; b) teachers' use of assessment strategies and their perceived competence; and c) teachers' perceptions towards assessment practices that support learning.

a) Diversity in teachers' assessment practices

Four studies published results about teachers' diversity in assessment practices. Teachers vary in the way they practise assessment (*Dixon, Hawe, & Parr, 2011; *Tolgfors, 2018). For example, *Dixon et al. (2011) noted differences in the degree of student involvement in assessment activities and the amount of control exerted by the teacher. Teachers' assessment practices may also vary in quality, ranging from low quality, wherein assessment is mainly used for grading, to high quality, wherein a variety of assessment tools are used to promote learning (*Birenbaum, Kimron, & Shilton, 2011). In addition, quality can also be viewed in how teachers apply and act upon assessment tools that support learning (*Marshall & Drummond, 2006).

b) Teachers' use of assessment strategies and their perceived competence

Findings of studies that researched the assessment strategies used by teachers in the classroom generally show that teachers are not fully utilizing all available strategies (*Hawe & Parr, 2014; *Wong, 2014). In general, innovative assessment strategies that support learning and adapt to students' needs were not implemented on a regular basis in classrooms (e.g. *Hawe & Parr, 2014; *Marshall & Drummond, 2006). Teachers adhered to conventional practices, such as providing an overemphasis on tests (*Volante, 2010), or only presenting generally the
Only one of the included studies examined student perspectives on learning goals to their students (*Hawe & Parr, 2014). Five studies
the clarity of definitions of the assessment notions AaL, AfL, and AoL. investigated teachers’ perceived competence in assessment practices.
Although most students could grasp the concept of AfL, none of the Teachers felt they were competent in giving oral feedback to students
students could provide a complete definition (*Lorente-Catal”an & Kirk, (*Tunku Ahmad et al., 2014). They perceived themselves as less
2016). Eleven studies reported findings regarding understanding among competent in practices to impose different curricula for different groups
teachers. In general, teachers understood the improvement and of students (*Boyle & Charles, 2010), and in grading students’ individ-
student-focused purpose of assessment (e.g. *Hui, Brown, & Chan, 2017; ual effort and ability (*Bramwell-Lalor & Rainford, 2016; *Tunku
*Leirhaug & MacPhail, 2015). However, in most studies, teachers Ahmad et al., 2014).
showed poor and varied understandings of the different assessment
notions and related practices, indicating no clarity of definition (e.g. c) Teachers’ perceptions towards assessment practices that support
*Boyle & Charles, 2010; *Torrance, 2007; *Volante, 2010). learning
b) Outcomes of teacher professional development programmes Seventeen studies targeted teachers’ perceptions towards assessment
practices. Teachers valued assessment practices that are geared towards improving learning (*Colby-Kelly & Turner, 2008; *Warwick, Shaw, & Johnson, 2015). Teachers least valued practices that have a strong focus on performance orientation (*Warwick et al., 2015). Findings of studies that examined the consistency of teachers' values with their classroom assessment practices revealed a gap between teachers' values and their actual practice. Teachers favoured assessment tasks that improve learning and develop students' engagement in assessment, but practised assessment mainly to measure achievement and for accountability purposes (*James & Pedder, 2006). Furthermore, incongruences were noted between the perceptions of students and of teachers. Teachers generally perceived a higher level of assessment activities meant to promote learning as present in their classrooms than did students (*Leirhaug & Annerstedt, 2016; *Pat-El, Tillema, Segers, & Vedder, 2015).

3.4. Assessment learning environment

3.4.1. Supportive learning environment that engages students
Educational assessment refers to a learning environment wherein students feel safe and are encouraged to engage with the learning process. The focus is on the development of students' confidence and the strengthening of their motivation. Practices include opportunities to make and learn from errors and to help students feel safe to take risks.
Eight studies researched aspects related to a supportive learning environment. A few studies investigated determinants that affect the engagement of students in learning and assessment activities (e.g. *Dijksterhuis, Schuwirth, Braat, Teunissen, & Scheele, 2013; *Lee & Coniam, 2013). With regard to students' confidence, students may experience difficulties in assessment activities that support learning, such as asking appropriate metacognitive questions, or judging themselves or their peers (*Ellery, 2008; *Sadeghi & Rahmati, 2017).

3.4.2. Aligned learning environment at the classroom and the programme level
Educational assessment refers to the design and implementation of an aligned learning environment at both the classroom and the programme level, wherein teaching, learning, and assessment form an iterative relationship. Classroom practices embrace both formally structured and informally spontaneous activities that are spread evenly through the learning process. At the programme level, teaching, learning, and assessment sequences within and between courses are made explicit to various stakeholders.
The assessment learning environment was examined in 18 studies. We grouped these findings into a) cohesion between intended policy and actual classroom practice; b) the design of an assessment environment that supports learning; and c) implementation.

a) Cohesion between intended policy and actual classroom practice

Four studies researched the cohesion of an assessment learning environment between the intended policy on the national level and on the school level, and the actual practice on the classroom level. Policy on the national level showed cohesion with classroom practices (*Hume & Coll, 2009; *Lorente-Catalán & Kirk, 2016). However, there were mixed findings regarding consistency between the school and the classroom policy, indicating a gap between the assessment practices described in the curriculum and their actual use in the classroom (*Colby-Kelly & Turner, 2008; *Fenwick, 2017).

b) The design of an assessment environment that supports learning

Three studies targeted the design of an assessment learning environment that supports learning. For example, studies researched how the design of an assessment framework (*Macphail & Halbert, 2010) or an assessment task (*Davies, Pantzopoulos, & Gray, 2011) aided the learning and teaching process.

c) Implementation

Implementation of an assessment environment that supports learning was researched at the national, school, and classroom levels. These studies generally targeted facilitating and constraining factors that affected implementation. Successful implementation at the national level was enabled when there was trust between the various assessment stakeholders and when the programme was adapted to the local context (*Hopfenbeck, Flórez Petour, & Tolo, 2015). At the school level, facilitating elements for implementation were related to the active involvement of the school principal and an assessment literate team, and to embedding the assessment environment as part of the school's culture (*Hill, 2011; *Nortvedt, Santos, & Pinto, 2016; *Smith & Engelsen, 2013). Findings of seven studies referred to the implementation of an assessment learning environment at the classroom level. In general, implementation was facilitated by teachers' commitment and teachers' growing competence in assessment principles and knowledge. Another enabler concerned the opportunity for teachers to engage in professional development (e.g. *Braund & DeLuca, 2018; *Lee & Coniam, 2013; *Webb & Jones, 2009). In various studies, the establishment of an appropriate classroom assessment culture was seen as crucial for successful implementation (e.g. *Mak & Lee, 2014; *Webb & Jones, 2009).

3.5. Educational outcomes of assessment

3.5.1. Enhance students' learning
Educational assessment refers to a focus on the teaching and learning process in order to enhance learning for all students to the maximum of their ability. Assessment practices are aimed at improving students' achievement and the quality of their work, and at improving the quality of teaching.
Eight studies researched the impact of assessment that supports learning on students' learning skills and strategies. In general, the employment of assessment strategies promoted interest in learning (*Fletcher, 2016; *Tolgfors, 2018; *Torrance, 2007). The impact of assessment on learning depends on how the pedagogical approach is realized (e.g. *Hume & Coll, 2009; *Torrance, 2007) and on intrapersonal factors of students, such as student motivation and interest (*Fletcher, 2016). Various assessment activities may contribute to the development of students' self-regulation and metacognition (e.g. *Baas, Castelijns, Vermeulen, Martens, & Segers, 2015; *Fletcher, 2016; *Hawe & Dixon, 2017), for example, when the assessment facilitates the sharing of learning goals and quality criteria, and provides students with tools that elicit evidence of learning (*Hawe & Dixon, 2017). However, the extensive support of a teacher in sharing criteria for success may also weaken student autonomy, as the more clearly task criteria and requirements are stated, the easier it will be for students to accomplish the task (*Torrance, 2007).

3.5.2. Determine the status of learning achievement
Educational assessment refers to the measuring and judging of learners', teachers' and schools' achievements in order to make informed decisions. Firstly, these decisions relate to purposes of internal accountability, e.g., to get informed about and evaluate what and how much has been learned, and to determine the outcomes of achievement. Secondly, decisions also relate to purposes of external accountability, such as certification and high-stakes assessments.
Nine studies aimed to determine the status of learning achievement. These studies examined the effect of assessment on students' achievement. At the classroom level, the majority of the studies found positive effects through the achievement of higher scores for newly implemented assessment tasks or in learning environments that were meant to support learning (e.g. *Huang, 2015; *Li, 2018; *Wiliam, Lee, Harrison, & Black, 2004). However, at the national level, *Hopfenbeck et al. (2015) reported that despite successful implementation of an assessment learning environment in municipalities in Norway, the researchers did not find
an effect of the assessment programme on students' learning outcomes, measured by national tests in reading and mathematics.

4. Conclusions and discussion

In this scoping review, we analysed and synthesized the various definitions of Assessment as Learning (AaL), Assessment for Learning (AfL), and Assessment of Learning (AoL) published in research studies for common characteristics. Next, we examined what is known about the common characteristics of the assessment notions from findings reported in empirical studies. We chose to synthesize the definitions of the assessment notions because the definitions and descriptions overlap in meaning and are not used unambiguously in practice.
The synthesis of common characteristics of AaL, AfL, and AoL has resulted in nine themes that refer to how educational assessment relates to learning, as displayed in Box 1. For readability, we used the term 'educational assessment' as an umbrella term that includes the common characteristics of AaL, AfL, and AoL. The themes are grouped into: 1) Student-teacher roles and relationships within assessment; 2) Assessment learning environment; and 3) Educational outcomes of assessment. By viewing the assessment notions as a whole, the synthesis of the notions presents a powerful approach to ensure and enhance students' learning (Biggs & Tang, 2011; Lau, 2016; *Taras, 2002). We do not argue that all notions are covered in all themes, or that the three assessment notions are equal to each other. However, we believe that the synthesis provides a more nuanced overview of assessment and learning than the individual descriptions and definitions of the assessment notions do. Although the focus on learning and students' active roles within learning processes is central to definitions of AaL and AfL, the results of this review give more profound insight into the roles and relationships students and teachers have within this process. For example, the synthesis provides insight that both students and teachers need to be literate in assessment. Students need to (learn to) understand the purposes and processes of assessment and need to be able to judge their work to become successful self-regulated learners (Sadler, 1989; Smith, Worsfold, Davies, Fisher, & McPhail, 2013). Teachers should be literate in assessment to be able to understand and differentiate the aims of assessment and to create and use assessment information to teach effectively (Pastore & Andrade, 2019; Xu & Brown, 2016). A more nuanced overview is also highlighted by the themes that relate to the context and outcomes of assessment. For example, the results of this review study provide awareness of the design of an environment that supports learning. The learning environment should facilitate a continuous flow of (feedback) information to inform current teaching and learning, as well as to act upon (e.g., within or across modules, or throughout the curriculum). In our review study, many descriptions of the notion of AfL referred to the giving of feedback as a key strategy to support learning (e.g. Black & Wiliam, 2009). However, giving feedback to students does not improve their skills without those students engaging with and acting upon the feedback (Boud & Molloy, 2013; Winstone, Nash, Parker, & Rowntree, 2017). Consequently, assessment practices should be designed in such a way that feedback is not seen as the end point of the learning process, but rather as the starting point (Burke, 2009).
In the second part of the scoping review, we used the thematic descriptions of the synthesis as a framework to analyse the results of the included empirical studies on their contributions to practice. In this section, conclusions are elaborated and implications for practice discussed. Regarding student-teacher roles and relationships within assessment, the content of the themes emphasizes the active role of the student in assessment and in directing their own learning. However, this conceptualization was underrepresented in the results of the empirical studies in this review. Only a few studies researched the interactive relationship between students and teachers, or reported findings about the shift towards a more active and central role of students in assessment. This indicates a gap between the thematic description that advocates an active role for students in assessment, and current practices that perpetuate a classroom culture wherein the majority of the teachers stick to traditional assessment practices. In practice, this means that students' learning is still dependent on the teacher (*Thompson et al., 2017). To make a shift towards a more student-centric perspective, we believe it is crucial to invest in both student and teacher intervention programmes. The content of such programmes should cover the knowledge and skills needed to become assessment literate and to fulfil the collaborative roles of teachers and students within the assessment process (*Swaffield, 2011). These programmes can help teachers to grow and feel competent in a more supportive role (*Harrison, 2005), and can guide students in their development to take a more active role within the assessment process (*Webb & Jones, 2009).
With regard to the assessment learning environment, the themes refer to a supportive and aligned environment that engages and motivates students in learning and that integrates the various assessment methods and functions at the classroom and the programme level (Lau, 2016; Zeng, Huang, Yu, & Chen, 2018). However, most studies that researched the assessment learning environment were aimed at

Box 1
Synthesis of common characteristics of the notions AaL, AfL, and AoL.
facilitating and constraining aspects of the implementation, such as the role of the principal, a supportive assessment culture, and teachers' professional development (e.g. *Smith & Engelsen, 2013). None of the included studies reported findings about alignment between courses and/or between a course and the curriculum level. This may indicate that current assessment practices are oriented towards enhancing short-term learning (*Tan, 2013). In vocational and higher education, practices are often enabled by a modular degree structure and a system linked to grading (Jessop, Mcnab, & Gubby, 2012). Such practices tend to fix students' attention only on overcoming the hurdle to pass the modular assessment, without awareness of their learning beyond that period (*Tan, 2013). To integrate learning over a period of time and to avoid fragmentation of the curriculum (*Tan, 2011), we would like to emphasize the need to take a more programmatic perspective on the design and implementation of assessment and learning activities. Examples of a programmatic approach are found in health profession education within the concept of programmatic assessment. In programmatic assessment, individual methods of assessment are chosen for their alignment with the curriculum outcomes and their information value for the student and the teacher (Bok et al., 2013; Van der Vleuten et al., 2012; Van der Vleuten, Schuwirth, Driessen, Govaerts, & Heeneman, 2015).
Regarding the educational outcomes of assessment, the themes refer to two purposes of assessment: to enhance students' learning and to determine the status of learning achievement. The two purposes correspond to the traditional 'formative' and 'summative' functions of assessment, and to characteristics of the notions of AfL and AoL respectively (e.g. *Baird et al., 2017). In practice, a tension between these two purposes was often experienced. For example, the dominance of graded assessment tasks within a course may limit time for assessment tasks geared towards enhancing learning (*Mumm et al., 2016). There is agreement among researchers that the two purposes of assessment overlap (e.g. *Bennett, 2010; *Hargreaves, 2005) and that they should be connected with the overall teaching and learning environment (Lau, 2016). To obtain the benefit of each and to develop as a learner and as a learning organization, both purposes should be balanced in the design and implementation of the lesson plan, the course, and the curriculum. More research is needed to examine how the promotion of student learning and decision-making about the status of learning achievement can be balanced in an appropriate way, for example, by viewing assessment not as formative or summative, but as a continuum of low- and high-stakes assessments. Low-stakes assessments (e.g. narrative feedback or assessments that measure progress) continuously provide students and teachers with evidence of students' performance. This information can be used to (self-)regulate students' learning. At the end of a learning trajectory, information from the various low-stakes assessments can be aggregated to make high-stakes decisions for graduation or certification (*Schuwirth & van der Vleuten, 2012).

4.1. Limitations

Firstly, we only included research articles that referred to the assessment notions of AaL, AfL, and AoL, as it was our intention to clarify the complex meanings and practices of these notions in particular. We therefore did not include research that focused on definitions and practices regarding assessments with formative and summative functions only, although these notions are widely used and may also be used interchangeably in practice. Moreover, due to time and source constraints, we did not include books written on this subject either. Consequently, we acknowledge that our results may be biased and that there may be more thematic descriptions that relate to assessment and learning. Further integration of the assessment notions, by including more sources and/or the notions of formative assessment and summative assessment, may be interesting for future research. Secondly, the findings of this study indicate that an assessment culture with a central role for students is still in its infancy. These findings may be affected by the search strategies we used for this review, for example, by not including specific search terms such as 'self-regulated learning' or 'autonomy'. However, a recent study by Winstone et al. (2017) that reviewed how students actively engage with feedback also noted that research on this topic was fragmented and underrepresented. More research is needed in this field. Thirdly, the synthesis provides limited guidance on "actionable practice" to support learning (*Tan, 2017, p. 199). Although the descriptions of the themes include examples of classroom practice, we believe that a successful approach to educational assessment does not rest on techniques that can simply be added to the teacher's repertoire (*James & Pedder, 2006). Assessment that supports learning is an approach which needs to be adapted for each context rather than a general framework that can be directly applied (*Baird et al., 2017; *Bennett, 2010).
Our review shows that many perspectives are important in an assessment culture focused on learning. We believe that the results of this review provide stakeholders with a clear and integrative view of how educational assessment relates to learning, which is a prerequisite for improving the assessment culture. The synthesis we provided can be used as a practical tool for teachers to improve their daily practices with students, for faculty development programs to better train teachers, and for institutions to better organize their assessment structures and programs. By synthesizing the notions of AaL, AfL, and AoL, we challenged the differentiation of the assessment notions in the literature. The synthesis mirrors how the assessment notions relate to learning and emphasizes that "assessment is learning" (*Hayward, 2015, p. 27). The notions of AaL, AfL, and AoL should therefore be seen in coherence with one another in order to establish an assessment culture that facilitates students' learning maximally.

Declaration of Competing Interest

The authors report no declarations of interest.
Note on the inclusion criteria. With regard to the first inclusion criterion, studies were excluded when the article concerned, for example, a commentary or book review. Regarding the second criterion, by 'focus on learning through assessment' we mean a focus on assessment in relation to learning, learning achievement, learning processes, etcetera. For example, studies that were aimed solely at learning strategies or at learning styles, but not at assessment, were excluded. By 'students' we mean the learning of a student; for example, studies that aimed to investigate the learning of a teacher, a surgeon, or other professionals were excluded. By 'a regular physical educational context' we mean that studies were excluded when the context was not regular education, but was aimed at learning disabilities, special needs students, illnesses, or gifted students. With 'physical educational context' we intend that the assessment or examination or evaluation took place in a physical classroom or in a physical educational program in primary, secondary, vocational, or higher education. Thus, studies with a context of online learning, computer-based simulations, or MOOCs were excluded. Finally, with regard to the third criterion, studies were included when the text of the article gave a description or definition of how the notion of assessment related to the learning […]
Example B1 → B2 (RQ1): text fragment descriptions (nodes) and anonymized nodes used for coding.

AaL. Text fragment description (node): "AaL is concept of assessment where students learn, self-correct, and collaborate during the assessment". Anonymized node used for coding: "Assessment is concept of assessment where students learn, self-correct, and collaborate during the assessment".

AfL. Text fragment description (node): "Process of AfL includes classroom interaction, questioning, structured classroom activities, and feedback geared at helping students to bridge learning gaps". Anonymized node used for coding: "Process of Assessment includes classroom interaction, questioning, structured classroom activities, and feedback geared at helping students to bridge learning gaps".

AoL. Text fragment description (node): "AoL constitutes the certification of what and how much students have acquired over the course of learning". Anonymized node used for coding: "Assessment constitutes the certification of what and how much students have acquired over the course of learning".

Example B3 (RQ1): example coding.

Node: "Process of Assessment includes classroom interaction (a), questioning (b), structured classroom activities (c), and feedback (d) geared at helping students to bridge learning gaps (e)"
Codes: (a) practice involves interaction; (b) practice involves questioning; (c) planned process; (d) practice involves feedback; (e) bridge learning gaps.

Example B4 (RQ1); Example B5 (RQ2): example of a summary phrase.

Summary phrase: 'by examining achievement of secondary school students who worked in classrooms wherein teachers were trained in their formative assessment strategies (→ aim), findings indicated that improving formative assessment practices in classrooms produced tangible benefits in terms of achievement of externally mandated assessments with effect sizes 0.2–0.3 (→ result)'
Box 1 (continued). Synthesis of common characteristics of the notions AaL, AfL, and AoL.

Student-teacher roles and relationships

Actively involved students. Codes (8): practice involves self-assessment (68); focus on students' SRL and autonomy (62); students are active agents in own learning (49); development of cognition and metacognition (32); students are active agents in assessment (21); practice involves peer-assessment (18); development of critical thinking and inquiry skills (12); development of life-long learning skills (8).

Collaborative relationship. Codes (11): involves both students and teachers (46); practice involves interaction (26); student-centered perspective (19); interactive process (13); teacher as guide (13); practice involves questioning (11); social cultural perspective (11).

Assessment literacy. Codes (4): practice involves quality criteria (27); assessment literacy (15); assessment as substitute for learning (10); practice involves communication (7).

Teachers adapt to students' needs. Codes (6): adjust ongoing teaching (27); adapt to the needs of students (19); pedagogical approach (11); teacher provides support (9); personalized (8); key didactic skill (7).

The learning environment

Supportive learning environment. Codes (6): enhance motivation (12); practice involves engaging students in learning (11); development of students' confidence (9); practice involves authenticity (5); teachers' belief that students can improve (3); alter students' attitudes (3).

Aligned learning environment at the classroom and programme level. Codes (10): integrated entity (35); practice involves learning goals (30); classroom level (25); collection of instruments, tools, tasks, and practices (25); planned (formal) process (19); assessment at the middle and/or at the end of learning (18); unplanned (informal) continuous process (18); part of everyday practice (15); practice involves design of tasks (7); provide rich learning environment (6).

Educational outcomes of assessment

Enhancing students' learning. Codes (6): improve learning (57); focus on the process of learning (47); improve students' achievement (24); improve teaching (11); focus on the process of teaching (8); improve quality education (3).

Determining the status of learning achievement. Codes (7): measure outcomes (44); accountability (14); practice involves making judgements (12); result in a score or grade (7); differentiating between students (5); achieve high standards (4); high-stakes assessment (3).

Note. Numbers in brackets refer to the number of text fragments (originating from descriptions of the assessment notions) assigned to a category.
Supplementary material related to this article can be found, in the online version, at doi:[Link]
References

Allal, L. (2010). Assessment and the regulation of learning. International Encyclopedia of Education, 3, 172–180.
ARG. (1999). Assessment for learning: Beyond the black box. Cambridge: University of Cambridge School of Education.
ARG. (2002). Assessment for learning: 10 principles. [Link]02600-8_12
Arksey, H., & O'Malley, L. (2005). Scoping studies: Towards a methodological framework. International Journal of Social Research Methodology: Theory and Practice, 8(1), 19–32. [Link]
*Baird, J. A., Andrich, D., Hopfenbeck, T. N., & Stobart, G. (2017). Assessment and learning: Fields apart? Assessment in Education: Principles, Policy and Practice, 24(3), 317–350. [Link]
*Baas, D., Castelijns, J., Vermeulen, M., Martens, R., & Segers, M. (2015). The relation between assessment for learning and elementary students' cognitive and metacognitive strategy use. British Journal of Educational Psychology, 85(1), 33–46. [Link]
*Bennett, R. E. (2010). Cognitively based assessment of, for, and as learning (CBAL): A preliminary theory of action for summative and formative assessment. Measurement, 8(2–3), 70–91. [Link]
Bennett, R. E. (2011). Formative assessment: A critical review. Assessment in Education: Principles, Policy and Practice, 18(1), 5–25. [Link]0969594X.2010.513678
Biggs, J., & Tang, C. (2011). Teaching for quality learning at university. Society for Research into Higher Education & Open University Press. [Link]ctcp.2007.09.003
*Birenbaum, M., DeLuca, C., Earl, L., Heritage, M., Klenowski, V., Looney, A., et al. (2015). International trends in the implementation of assessment for learning: Implications for policy and practice. Policy Futures in Education, 13(1), 117–140. [Link]
*Birenbaum, M., Kimron, H., & Shilton, H. (2011). Nested contexts that shape assessment for learning: School-based professional learning community and classroom culture. Studies in Educational Evaluation, 37(1), 35–48. [Link]stueduc.2011.04.001
Black, P., & Wiliam, D. (2009). Developing the theory of formative assessment. Educational Assessment, Evaluation and Accountability, 21(1), 5–31. [Link]
*Colby-Kelly, C., & Turner, C. E. (2008). AFL research in the L2 classroom and evidence of usefulness: Taking formative assessment to the next level. Canadian Modern Language Review, 64(1), 9–37. [Link]
*Cowie, B. (2005). Pupil commentary on assessment for learning. Curriculum Journal, 16(2), 137–151. [Link]
*Crooks, T. (2011). Assessment for learning in the accountability era: New Zealand. Studies in Educational Evaluation, 37(1), 71–77. [Link]stueduc.2011.03.002
*Crossland, J. (2012). Embedding assessment for learning (AfL) into science. SSR, 93(344), 127–133.
*Dann, R. (2014). Assessment as learning: Blurring the boundaries of assessment and learning for theory, policy and practice. Assessment in Education: Principles, Policy and Practice, 21(2), 149–166. [Link]
Daudt, H. M. L., van Mossel, C., & Scott, S. J. (2013). Enhancing the scoping study methodology: A large, inter-professional team's experience with Arksey and O'Malley's framework. BMC Medical Research Methodology, 13(1), 1–9. [Link]org/10.1186/1471-2288-13-48
*Davies, A., Pantzopoulos, K., & Gray, K. (2011). Emphasising assessment 'as' learning by assessing wiki writing assignments collaboratively and publicly online. Australasian Journal of Educational Technology, 27(5), 798–812.
*DeLuca, C., Chapman-Chin, A. E. A., LaPointe-McEwan, D., & Klinger, D. A. (2018). Student perspectives on assessment for learning. Curriculum Journal, 29(1), 77–94. [Link]
*Dijksterhuis, M. G. K., Schuwirth, L. W. T., Braat, D. D. M., Teunissen, P. W., & Scheele, F. (2013). A qualitative study on trainees' and supervisors' perceptions of assessment for learning in postgraduate medical education. Medical Teacher, 35(8), e1396–e1402. [Link]
*Dixon, H. R., Hawe, E., & Parr, J. (2011). Enacting assessment for learning: The beliefs practice nexus. Assessment in Education: Principles, Policy and Practice, 18(4), 365–379. [Link]
Dixon-Woods, M., Agarwal, S., Jones, D., Young, B., & Sutton, A. (2005). Synthesising qualitative and quantitative evidence: A review of possible methods. Journal of Health Services Research and Policy, 10(1), 45–53. [Link]1355819052801804
Dunn, K. E., & Mulvenon, S. W. (2009). A critical review of research on formative assessment: The limited scientific evidence of the impact of formative assessment in education. Practical Assessment, Research and Evaluation, 14(7), 1–11.
Earl, L. (2003). Assessment as learning: Using classroom assessment to maximize student
10.1007/s11092-008-9068-5
learning. Thousand oaks. CA: Corwin.
Bok, H. G. J., Teunissen, P. W., Favier, R. P., Rietbroek, N. J., Theyse, L. F. H.,
Earl, L. M. (2013). Assessment as learning: Using classroom assessment to maximize student
Brommer, H., … Jaarsma, D. A. D. C. (2013). Programmatic assessment of
learning. Corwin Press.
competency-based workplace learning: When theory meets practice. BMC Medical
Earl, L., & Katz, S. (2008). Getting to the core of learning: Using assessment for self-
Education, 13(1), 123. [Link]
monitoring and self-regulation. In S. Swaffield (Ed.), Unlocking assessment:
Boud, D., & Falchikov, N. (2006). Aligning assessment with long-term learning.
Understanding for reflection and application (pp. 90–104). Abingdon, England; New
Assessment and Evaluation in Higher Education, 31(4), 399–413. [Link]
York: Routledge.
10.1080/02602930600679050
*Ellery, K. (2008). Assessment for learning: A case study using feedback effectively in an
Boud, D., & Molloy, E. (2013). Rethinking models of feedback for learning: The challenge
essay-style test. Assessment and Evaluation in Higher Education, 33(4), 421–429.
of design. Assessment and Evaluation in Higher Education, 38(6), 698–712. [Link]
[Link]
org/10.1080/02602938.2012.691462
Elo, S., & Kyng! as, H. (2008). The qualitative content analysis process. Journal of
*Boyle, W. F., & Charles, M. (2010). Leading learning through assessment for learning?
Advanced Nursing, 62(1), 107–115. [Link]
School Leadership and Management, 30(3), 285–300. [Link]
2648.2007.04569.x
13632434.2010.485184
*Fenwick, L. (2017). Promoting assessment for learning through curriculum-based
*Bramwell-Lalor, S., & Rainford, M. (2016). Advanced level biology teachers’ attitudes
performance standards: Teacher responses in the northern territory of Australia.
towards assessment and their engagement in assessment for learning. European
Curriculum Journal, 28(1), 41–58. [Link]
Journal of Science and Mathematics Education, 4.
09585176.2016.1260486
*Braund, H., & DeLuca, C. (2018). Elementary students as active agents in their learning:
*Fletcher, A. K. (2016). Exceeding expectations: Scaffolding agentic engagement through
An empirical study of the connections between assessment practices and student
Assessment as learning. Educational Research, 58(4), 400–419. [Link]
metacognition. Australian Educational Researcher, 45(1), 65–85. [Link]
10.1080/00131881.2016.1235909
10.1007/s13384-018-0265-z
Gibbs, G., & Simpson, C. (2005). Conditions under which assessment supports students’
Burke, D. (2009). Strategies for using feedback students bring to higher education.
learning. Learning and Teaching in Higher Education, 1, 3–31.
Assessment and Evaluation in Higher Education, 34(1), 41–50. [Link]
Gipps, C. (2011). Beyond testing (classic edition): Towards a theory of educational
10.1080/02602930801895711
assessment. Routledge.
*Carless, D. (2002). The ‘Mini-Viva’ as a tool to enhance assessment for learning.
*Hargreaves, E. (2005). Assessment for learning? Thinking outside the (black) box.
Assessment and Evaluation in Higher Education, 27(4), 353–363. [Link]
Cambridge Journal of Education, 35(2), 213–224. [Link]
10.1080/0260293022000001364
03057640500146880
Cilliers, F. J., Schuwirth, L. W. T., Herman, N., Adendorff, H. J., & van der
*Hargreaves, E. (2013). Inquiring into children’s experiences of teacher feedback:
Vleuten, C. P. M. (2012). A model of the pre-assessment learning effects of
Reconceptualising assessment for learning. Oxford Review of Education, 39(2),
summative assessment in medical education. Advances in Health Sciences Education,
229–246. [Link]
17(1), 39–53. [Link]
*Harrison, C. (2005). Teachers developing assessment for learning: Mapping teacher
Clark, I. (2012). Formative assessment: Assessment is for self-regulated learning.
change. Teacher Development, 9(2), 255–263.
Educational Psychology Review, 24(2), 205–249. [Link]
*Hawe, E., & Dixon, H. (2017). Assessment for learning: A catalyst for student self-
011-9191-6
regulation. Assessment and Evaluation in Higher Education, 42(8), 1181–1192. https://
Cohen, L., Manion, L., & Morrison, K. (2018). Research methods in education. London:
[Link]/10.1080/02602938.2016.1236360
Routledge.
*Hawe, E., & Parr, J. (2014). Assessment for learning in the writing classroom: An
incomplete realisation. Curriculum Journal, 25(2), 210–237. [Link]
10.1080/09585176.2013.862172
*Hayward, L. (2015). Assessment is learning: The preposition vanishes. Assessment in
Education: Principles, Policy and Practice, 22(1), 27–43. [Link]
1 0969594X.2014.984656
References with an asterisk indicate studies included in the scoping review.
12
*Heitink, M. C., van der Kleij, F. M., Veldkamp, B. P., Schildkamp, K., & Kippers, W. B. (2016). A systematic review of prerequisites for implementing assessment for learning in classroom practice. Educational Research Review, 17(February), 50–62. [Link]
*Heritage, M. (2018). Assessment for learning as support for student self-regulation. Australian Educational Researcher, 45(1), 51–63. [Link]
*Hill, M. F. (2011). ‘Getting Traction’: Enablers and barriers to implementing assessment for learning in secondary schools. Assessment in Education: Principles, Policy and Practice, 18(4), 347–364. [Link]
*Hopfenbeck, T. N., Flórez Petour, M. T., & Tolo, A. (2015). Balancing tensions in educational policy reforms: Large-scale implementation of assessment for learning in Norway. Assessment in Education: Principles, Policy and Practice, 22(1), 44–60. [Link]
*Huang, S. C. (2015). Setting writing revision goals after assessment for learning. Language Assessment Quarterly, 12(4), 363–385. [Link]
*Hui, S. K. F., Brown, G. T. L., & Chan, S. W. M. (2017). Assessment for learning and for accountability in classrooms: The experience of four Hong Kong primary school curriculum leaders. Asia Pacific Education Review, 18(1), 41–51. [Link]
*Hume, A., & Coll, R. K. (2009). Assessment of learning, for learning, and as learning: New Zealand case studies. Assessment in Education: Principles, Policy & Practice, 16(3), 269–290. [Link]
*James, M., & Pedder, D. (2006). Beyond method: Assessment and learning practices and values. Curriculum Journal, 17(2), 109–138. [Link]
Jessop, T., Mcnab, N., & Gubby, L. (2012). Mind the gap: An analysis of how quality assurance processes influence programme assessment patterns. Active Learning in Higher Education, 13(2), 143–154. [Link]
*Jones, A., & Moreland, J. (2005). The importance of pedagogical content knowledge in assessment for learning practices: A case-study of a whole-school approach. Curriculum Journal, 16(2), 193–206. [Link]
*Jonsson, A., Lundahl, C., & Holmgren, A. (2015). Evaluating a large-scale implementation of assessment for learning in Sweden. Assessment in Education: Principles, Policy and Practice, 22(1), 104–121. [Link]
Klenowski, V. (2009). Assessment for learning revisited: An Asia-Pacific perspective. Assessment in Education: Principles, Policy and Practice, 16(3), 263–268. [Link]
*Lam, R. (2016). Assessment as learning: Examining a cycle of teaching, learning, and assessment of writing in the portfolio-based classroom. Studies in Higher Education, 41(11), 1900–1917. [Link]
*Lam, R. (2018). Understanding assessment as learning in writing classrooms: The case of portfolio assessment. Iranian Journal of Language Teaching Research, 6(3), 19–36.
Lau, A. M. S. (2016). ‘Formative good, summative bad?’ – A review of the dichotomy in assessment literature. Journal of Further and Higher Education, 40(4), 509–525. [Link]
*Lee, I., & Coniam, D. (2013). Introducing assessment for learning for EFL writing in an assessment of learning examination-driven system in Hong Kong. Journal of Second Language Writing, 22(1), 34–50. [Link]
*Leirhaug, P. E., & Annerstedt, C. (2016). Assessing with new eyes? Assessment for learning in Norwegian physical education. Physical Education and Sport Pedagogy, 21(6), 616–631. [Link]
*Leirhaug, P. E., & MacPhail, A. (2015). ‘It’s the other assessment that is the key’: Three Norwegian physical education teachers’ engagement (or not) with assessment for learning. Sport, Education and Society, 20(5), 624–640. [Link]
Levac, D., Colquhoun, H., & O’Brien, K. K. (2010). Scoping studies: Advancing the methodology. Implementation Science, 5(69), 1–9.
*Li, X. (2018). Self-assessment as ‘Assessment as Learning’ in translator and interpreter education: Validity and washback. The Interpreter and Translator Trainer, 12(1), 48–67. [Link]
*Lorente-Catalán, E., & Kirk, D. (2016). Student teachers’ understanding and application of assessment for learning during a physical education teacher education course. European Physical Education Review, 22(1), 65–81. [Link]
*Macphail, A., & Halbert, J. (2010). ‘We had to do intelligent thinking during recent PE’: Students’ and teachers’ experiences of assessment for learning in post-primary physical education. Assessment in Education: Principles, Policy and Practice, 17(1), 23–39. [Link]
*Mak, P., & Lee, I. (2014). Implementing assessment for learning in L2 writing: An activity theory perspective. System, 47, 73–87. [Link]
*Marshall, B., & Drummond, M. J. (2006). How teachers engage with assessment for learning: Lessons from the classroom. Research Papers in Education, 21(2), 133–149. [Link]
*McDowell, L., Wakelin, D., Montgomery, C., & King, S. (2011). Does assessment for learning make a difference? The development of a questionnaire to explore the student response. Assessment and Evaluation in Higher Education, 36(7), 749–765. [Link]
Medland, E. (2016). Assessment in higher education: Drivers, barriers and directions for change in the UK. Assessment & Evaluation in Higher Education, 41(1), 81–96.
Moher, D., Liberati, A., Tetzlaff, J., Altman, D. G., & Grp, P. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. Annals of Internal Medicine, 151(4), 264–269. [Link]
*Mumm, K., Karm, M., & Remmik, M. (2016). Assessment for learning: Why assessment does not always support student teachers’ learning. Journal of Further and Higher Education, 40(6), 780–803. [Link]
*Newby, L., & Winterbottom, M. (2011). Can research homework provide a vehicle for assessment for learning in science lessons? Educational Review, 63(3), 275–290. [Link]
*Ninomiya, S. (2016). The possibilities and limitations of assessment for learning: Exploring the theory of formative assessment and the notion of ‘Closing the learning gap’. Educational Studies in Japan, 10, 79–91.
*Nortvedt, G. A., Santos, L., & Pinto, J. (2016). Assessment for learning in Norway and Portugal: The case of primary school mathematics teaching. Assessment in Education: Principles, Policy and Practice, 23(3), 377–395. [Link]
Panadero, E., Andrade, H., & Brookhart, S. (2018). Fusing self-regulated learning and formative assessment: A roadmap of where we are, how we got here, and where we are going. The Australian Educational Researcher, 45(1), 13–31. [Link]
Pastore, S., & Andrade, H. L. (2019). Teacher assessment literacy: A three-dimensional model. Teaching and Teacher Education, 84, 128–139. [Link]
*Pat-El, R. J., Tillema, H., Segers, M., & Vedder, P. (2015). Multilevel predictors of differing perceptions of assessment for learning practices between teachers and students. Assessment in Education: Principles, Policy & Practice, 22(2), 282–298. [Link]
Pham, M. T., Rajić, A., Greig, J. D., Sargeant, J. M., Papadopoulos, A., & Mcewen, S. A. (2014). A scoping review of scoping reviews: Advancing the approach and enhancing the consistency. Research Synthesis Methods, 5(4), 371–385. [Link]
*Reimann, N., & Wilson, A. (2012). Academic development in ‘Assessment for learning’: The value of a concept and communities of assessment practice. International Journal for Academic Development, 17(1), 71–83. [Link]
Rust, C. (2002). The impact of assessment on student learning. Active Learning in Higher Education, 3(2), 145–158. [Link]
*Sadeghi, K., & Rahmati, T. (2017). Integrating assessment as, for, and of learning in a large-scale exam preparation course. Assessing Writing, 34(November 2016), 50–61. [Link]
Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18(2), 119–144. [Link]
*Schuwirth, L. W. T., & van der Vleuten, C. P. M. (2012). Programmatic assessment and Kane’s validity perspective. Medical Education, 46(1), 38–48. [Link]
*Smith, K., & Engelsen, K. S. (2013). Developing an Assessment for Learning (AfL) culture in school: The voice of the principals. International Journal of Leadership in Education, 16(1), 106–125. [Link]
Smith, C. D., Worsfold, K., Davies, L., Fisher, R., & McPhail, R. (2013). Assessment literacy and student learning: The case for explicitly developing students’ assessment literacy. Assessment and Evaluation in Higher Education, 38(1), 44–60. [Link]
*Swaffield, S. (2011). Getting to the heart of authentic assessment for learning. Assessment in Education: Principles, Policy and Practice, 18(4), 433–449. [Link]
*Tan, K. (2011). Assessment for learning in Singapore: Unpacking its meanings and identifying some areas for improvement. Educational Research for Policy and Practice, 10(2), 91–103. [Link]
*Tan, K. (2013). A framework for assessment for learning: Implications for feedback practices within and beyond the gap. ISRN Education, 2013, 1–6. [Link]
*Tan, K. H. K. (2017). Asking questions of (What) assessment (Should do) for learning: The case of bite-sized assessment for learning in Singapore. Educational Research for Policy and Practice, 16(2), 189–202. [Link]
*Taras, M. (2002). Using assessment for learning and learning from assessment. Assessment & Evaluation in Higher Education, 27(6), 501–510. [Link]
*Thompson, J., Houston, D., Dansie, K., Rayner, T., Pointon, T., Pope, S., … Grantham, H. (2017). Student & tutor consensus: A partnership in assessment for learning. Assessment and Evaluation in Higher Education, 42(6), 942–952. [Link]
*Tolgfors, B. (2018). Different versions of assessment for learning in the subject of physical education. Physical Education and Sport Pedagogy, 23(3), 311–327. [Link]
Tomlinson, C. A. (2007). Learning to love assessment. Educational Leadership, 65(4), 8–13. [Link]
*Torrance, H. (2007). Assessment as learning? How the use of explicit learning objectives, assessment criteria and feedback in post-secondary education and training can come to dominate learning. Assessment in Education: Principles, Policy and Practice, 14(3), 281–294. [Link]
*Tunku Ahmad, T. B., Zubairi, A. M., Ibrahim, M. B., Othman, J., Rahman, N. S. A., Rahman, Z. A., … Nor, Z. M. (2014). Assessment for learning practices and competency among Malaysian university lecturers: A national study. Practitioner Research in Higher Education, 8(1), 14–31.
Van der Vleuten, C. P. M., Schuwirth, L. W. T., Driessen, E. W., Dijkstra, J., Tigelaar, D., Baartman, L. K. J., et al. (2012). A model for programmatic assessment fit for purpose. Medical Teacher, 34(3), 205–214. [Link]
Van der Vleuten, C. P. M., Schuwirth, L. W. T., Driessen, E. W., Govaerts, M. J. B., & Heeneman, S. (2015). Twelve tips for programmatic assessment. Medical Teacher, 37(7), 641–646. [Link]
Van der Vleuten, C., Sluijsmans, D., & Joosten-Ten Brinke, D. (2017). Competence assessment as learner support in education. Competence-based vocational and professional education (pp. 607–630). Cham: Springer. [Link]
Van Leeuwen, A., & Janssen, J. (2019). A systematic review of teacher guidance during collaborative learning in primary and secondary education. Educational Research Review, 27(February), 71–89. [Link]
*Volante, L. (2010). Assessment of, for, and as learning within schools: Implications for transforming classroom practice. Action in Teacher Education, 31(4), 66–75. [Link]
*Warwick, P., Shaw, S., & Johnson, M. (2015). Assessment for learning in international contexts: Exploring shared and divergent dimensions in teacher values and practices. Curriculum Journal, 26(1), 39–69. [Link]
*Webb, M., & Jones, J. (2009). Exploring tensions in developing assessment for learning. Assessment in Education: Principles, Policy & Practice, 16(2), 165–184. [Link]
*Wiliam, D. (2011). What is assessment for learning? Studies in Educational Evaluation, 37(1), 3–14. [Link]
*Wiliam, D., Lee, C., Harrison, C., & Black, P. (2004). Teachers developing assessment for learning: Impact on student achievement. Assessment in Education: Principles, Policy and Practice, 11(1), 49–65. [Link]
*Willis, J. (2011). Affiliation, autonomy and assessment for learning. Assessment in Education: Principles, Policy and Practice, 18(4), 399–415. [Link]
Winstone, N. E., Nash, R. A., Parker, M., & Rowntree, J. (2017). Supporting learners’ agentic engagement with feedback: A systematic review and a taxonomy of recipience processes. Educational Psychologist, 52(1), 17–37. [Link]
*Wong, M. W. Y. (2014). Assessment for learning, a decade on: Self-reported assessment practices of secondary school music teachers in Hong Kong. International Journal of Music Education, 32(1), 70–83. [Link]
*Wong, M. W. (2007). Assessment for learning and teacher development: The experience of three Hong Kong teachers. Teacher Development, 11(3), 295–312. [Link]
Xu, Y., & Brown, G. T. L. (2016). Teacher assessment literacy in practice: A reconceptualization. Teaching and Teacher Education, 58, 149–162. [Link]
Zeng, W., Huang, F., Yu, L., & Chen, S. (2018). Towards a learning-oriented assessment to improve students’ learning—A critical review of literature. Educational Assessment, Evaluation and Accountability, 30(3), 211–250. [Link]

Further reading

*Akib, E., & Ghafar, M. N. A. (2015). Assessment for learning instrumentation in higher education. International Education Studies, 8(4), 166–172. [Link]
*Basse, R. (2018). Assessment for learning in the CLIL classroom. Journal of Immersion and Content-Based Language Education, 6(1), 113–137. [Link]
*Berry, R. (2013). The assessment as learning (AaL) framework for teaching and learning: The AaL wheel. Assessment and Learning, 2, 51–70. [Link]
*Black, P., Harrison, C., Lee, C., Marshall, B., & Wiliam, D. (2004). Working inside the black box: Assessment for learning in the classroom. Phi Delta Kappan. Phi Delta Kappa Inc. [Link]
*Blanchard, J. (2003). Targets, assessment for learning, and whole-school improvement. Cambridge Journal of Education, 33(2), 257–271. [Link]
*Blandford, S., & Knowles, C. (2012). Assessment for learning: A model for the development of a child’s self-competence in the early years of education. Education 3-13, 40(5), 487–499. [Link]
*Brown, G. T. L., & Gao, L. (2015). Chinese teachers’ conceptions of assessment for and of learning: Six competing and complementary purposes. Cogent Education, 2(1), 1–19. [Link]
*Brown, G. T. L., Harris, L. R., & Harnett, J. (2012). Teacher beliefs about feedback within an assessment for learning environment: Endorsement of improved learning over student well-being. Teaching and Teacher Education, 28(7), 968–978. [Link]
*Carless, D. (2005). Prospects for the implementation of assessment for learning. Assessment in Education: Principles, Policy & Practice, 12(1), 39–54. [Link]
*Chappuis, S., & Stiggins, R. J. (2002). Classroom assessment for learning. Educational Leadership, 60(1), 40–43.
*Chng, L. S., & Lund, J. (2018). Assessment for learning in physical education: The what, why and how. Journal of Physical Education, Recreation and Dance, 89(8), 29–34. [Link]
*Chueachot, S., Srisa-Ard, B., & Srihamongkol, Y. (2013). The development of an assessment for learning model for elementary classroom. International Education Studies, 6(9), 119–124. [Link]
*Dannefer, E. F. (2013). Beyond assessment of learning toward assessment for learning: Educating tomorrow’s physicians. Medical Teacher, 35(7), 560–563. [Link]
*Ferm Almqvist, C., Vinge, J., Väkevä, L., & Zandén, O. (2017). Assessment as learning in music education: The risk of “criteria compliance” replacing “learning” in the Scandinavian countries. Research Studies in Music Education, 39(1), 3–18. [Link]
*Gavriel, J. (2013). Assessment for learning: A wider (classroom-researched) perspective is important for formative assessment and self-directed learning in general practice. Education for Primary Care, 24, 93–96. [Link]
*Gioka, O. (2007a). Assessment for learning in biology lessons. Journal of Biological Education, 41(3), 113–116. [Link]
*Gioka, O. (2007b). Assessment for learning in teaching and assessing graphs in science investigation lessons. Science Education International, 18(3), 189–208.
*Green, A. (2018). Assessment for learning in language education. Iranian Journal of Language Teaching Research, 6(3), 9–18. Retrieved from [Link]
*Gupta, K. (2016). Assessment as learning. The Science Teacher, 83(1), 43–47.
*Hargreaves, E. (2001). Assessment for learning in the multigrade classroom. International Journal of Educational Development, 21(6), 553–560. [Link]
*Hargreaves, E. (2007). The validity of collaborative assessment for learning. Assessment in Education: Principles, Policy & Practice, 14(2), 185–199. [Link]
*Harlen, W. (2005). Teachers’ summative practices and assessment for learning – tensions and synergies. Curriculum Journal, 16(2), 207–223. [Link]
*Heritage, M., & Wylie, C. (2018). Reaping the benefits of assessment for learning: Achievement, identity, and equity. ZDM - Mathematics Education, 50(4), 729–741. [Link]
*Hodgen, J., & Marshall, B. (2005). Assessment for learning in English and mathematics: A comparison. Curriculum Journal, 16(2), 153–176. [Link]
*Hutchinson, C., & Hayward, L. (2005). The journey so far: Assessment for learning in Scotland. Curriculum Journal, 16(2), 225–248. [Link]
*Hutchinson, C., & Young, M. (2011). Assessment for learning in the accountability era: Empirical evidence from Scotland. Studies in Educational Evaluation, 37(1), 62–70. [Link]
*Johnson, M., & Burdett, N. (2010). Intention, interpretation and implementation: Some paradoxes of assessment for learning across educational contexts. Research in Comparative and International Education, 5(2), 122–130. [Link]
*Jones, J. (2010). The role of assessment for learning in the management of primary to secondary transition: Implications for language teachers. Language Learning Journal, 38(2), 175–191. [Link]
*Kirton, A., Hallam, S., Peffers, J., Robertson, P., & Stobart, G. (2007). Revolution, evolution or a Trojan horse? Piloting assessment for learning in some Scottish primary schools. British Educational Research Journal, 33(4), 605–627. [Link]
*Kucey, S., & Parsons, J. (2017). Linking past and present: John Dewey and assessment for learning. Journal of Teaching and Learning, 8(1), 107–116. [Link]
*Kulasegaram, K., & Rangachari, P. K. (2018). Beyond “formative”: Assessments to enrich student learning. Advances in Physiology Education, 42(1), 5–14. [Link]
*Lee, I. (2007). Assessment for learning: Integrating assessment, teaching, and learning in the ESL/EFL writing classroom. The Canadian Modern Language Review, 64(1), 199–213.
*Leirhaug, P. E. (2016). Exploring the relationship between student grades and assessment for learning in Norwegian physical education. European Physical Education Review, 22(3), 298–314. [Link]
*Loyd, G. E. (2008a). Assessment of learning outcomes: Summative evaluations. International Anesthesiology Clinics, 46(4), 85–96. [Link]
*Loyd, G. E., & Koenig, H. M. (2008b). Assessment of learning outcomes: Summative and evaluations. International Anesthesiology Clinics, 46(4), 97–111. [Link]
*Ludwig, M. A., Bentz, A. E., & Fynewever, H. (2011). Your syllabus should set the stage for assessment for learning. Journal of College Science Teaching, 40(4), 20. Retrieved from [Link]. Accessed 10 July 2018.
*Lysaght, Z. (2015). Assessment for learning and for self-regulation. International Journal of Emotional Education, 7(1), 20–34.
*Lysaght, Z., & O’Leary, M. (2013). An instrument to audit teachers’ use of assessment for learning. Irish Educational Studies, 43(2), 217–232. [Link]
*Mui So, W. W., & Hoi Lee, T. T. (2011). Influence of teachers’ perceptions of teaching and learning on the implementation of assessment for learning in inquiry study. Assessment in Education: Principles, Policy and Practice, 18(4), 417–432. [Link]
*Pat-El, R. J., Tillema, H., Segers, M., & Vedder, P. (2013). Validation of assessment for learning questionnaires for teachers and students. British Journal of Educational Psychology, 83(1), 98–113. [Link]
*Rashid, R. A., & Jaidin, J. H. (2014). Exploring primary school teachers’ conceptions of ‘Assessment for learning’. International Education Studies, 7(9), 69–83. [Link]
*Rosemartin, D. S. (2013). Assessment for learning: Shifting our focus. Kappa Delta Pi Record, 49(1), 21–25. [Link]
*San, I. (2016). Assessment for learning: Turkey case. Universal Journal of Educational Research, 4(1), 137–143. [Link]
*Sardareh, S. A., & Saad, M. R. M. (2013). Malaysian primary school ESL teachers’ questions during assessment for learning. English Language Teaching, 6(8), 1–9. [Link]
*Sardareh, S. A., Saad, M. R. M., Othman, A. J., & Me, R. C. (2014). ESL teachers’ questioning technique in an assessment for learning context: Promising or problematic? International Education Studies, 7(9), 161–174. [Link]
*Sicherl Kafol, B., Kordeš, U., & Holcar Brunauer, A. (2017). Assessment for learning in music education in the Slovenian context – from punishment or reward to support. Music Education Research, 19(1), 17–28. [Link]
*Stiggins, R. (2002). Assessment crisis: The absence of assessment for learning. Phi Delta Kappan, 83(10), 758–765. [Link]
*Stiggins, R. (2005). From formative assessment to assessment for learning: A path to success in standards-based schools. Phi Delta Kappan, 87(4), 324–328.
*Stiggins, R., & Chappuis, J. (2006). What a difference a word makes: Assessment “for” learning rather than assessment “of” learning helps students succeed. Journal of Staff Development, 27(1), 10–14. [Link]
*Strauss, P., & Mooney, S. (2017). Assessment for learning: Capturing the interest of diverse students on an academic writing module in postgraduate vocational education. Teaching in Higher Education, 22(3), 288–303. [Link]
*Tillema, H. H., & Smith, K. (2009). Assessment orientation in formative assessment of learning to teach. Teachers and Teaching: Theory and Practice, 15(3), 391–405. [Link]
*Tolgfors, B., & Öhman, M. (2015). The implications of assessment for learning in physical education and health. European Physical Education Review, 22(2), 150–166. [Link]
*Umar, A. T., & Majeed, A. (2018). The impact of assessment for learning on students’ achievement in English for specific purposes. A case study of pre-medical students at Khartoum University: Sudan. English Language Teaching, 11(2), 15. [Link]
*Van der Kleij, F. M., Vermeulen, J. A., Schildkamp, K., & Eggen, T. J. H. M. (2015). Integrating data-based decision making, Assessment for Learning and diagnostic testing in formative assessment. Assessment in Education: Principles, Policy and Practice, 22(3), 324–343. [Link]
*Vlachou, M. A. (2015). Does assessment for learning work to promote student learning? The England paradigm. The Clearing House: A Journal of Educational Strategies, Issues and Ideas, 88(3), 101–107. [Link]
*Woro, R., & Sumarlam, D. (2017). Developing model Assessment for Learning (AfL) to improve quality and evaluation in pragmatic course in IAIN Surakarta. English Language Teaching, 10(5), 97–103. [Link]
*Zandi, H., Kaivanpanah, S., & Alavi, S. M. (2015). Contract learning as an approach to individualizing EFL education in the context of assessment for learning. Language Assessment Quarterly, 12(4), 409–429. [Link]