
Teaching Mathematics and its Applications: An International Journal of the IMA (2023) 42, 266–288

https://doi.org/10.1093/teamat/hrac018 Advance Access publication 11 October 2022

A rubric for assessing mathematical modelling problems in a scientific-engineering context

Zehavit Kohen and Yasmin Gharra-Badran
Faculty of Education in Science and Technology, Technion-Israel Institute of Technology, Haifa 3200003, Israel
∗ Corresponding author. Email: [email protected]

[Received April 2022; accepted September 2022]

Mathematical modelling is a vital competency for students of all ages. In this study, we aim to
fill the research gap concerning valid and reliable tools for assessing and grading mathematical
modelling problems, particularly those reflecting multiple steps of the modelling cycle. We
present in this paper the design of a reliable and valid assessment tool aimed at gauging the
level of mathematical modelling associated with real-world modelling problems in a scientific-
engineering context. The study defines and bases the central modelling processes on the
proficiency levels identified in PISA Mathematics. A two-dimensional rubric was developed,
reflecting the combined assessment of the type and level of a modelling process. We identified
criteria that enable a clear comparison and differentiation among the different levels across each
of the modelling processes. These criteria allow for concrete theoretical definitions of the various
modelling processes, introducing a well-defined mathematical modelling framework from a
didactical viewpoint, which can potentially contribute to promoting modelling competencies
and the understanding of modelling by teachers and students. Theoretical, methodological and
practical implications are discussed.

1. Introduction
Mathematics is one of the most effective and important tools for providing solutions to real-world
situations and challenges, as well as for dealing with everyday situations as citizens of a modern society
(Blum & Niss, 1991; Li, 2013; Maaß et al., 2018). In the context of Science, Technology, Engineering
and Mathematics (STEM), mathematics is considered a basic scientific discipline, making it possible to
solve a wide range of problems arising from real-world situations or relationships related to the STEM
field (Common Core State Standards Initiative, 2010). Mathematical modelling provides a method for
better understanding of real-world situations. It is defined as a cyclic process that involves translating
from mathematics to reality and vice versa (Lesh & Doerr, 2003; Blum & Leiß, 2006; Kaiser & Sriraman,
2006; Niss et al., 2007).
Modelling is used in mathematics education in various ways, reflecting different purposes and
goals. At one end of the spectrum is educational modelling, which uses real-world situations to foster


understanding of mathematical concepts. On the other end of the spectrum is realistic or applied
modelling that aligns with PISA objectives (Giberti & Maffia, 2020). PISA, the Programme for
International Student Assessment of the Organization for Economic Co-operation and Development
(OECD), is one of the well-known assessment tools for mathematical modelling. Since 2012, the
modelling cycle has been particularly emphasized in PISA as an additional reporting category for
student proficiency (Stacey, 2015). The applied modelling view is more pragmatic, claiming that abstract
mathematics can be exploited to better understand real-life situations. In this approach, the modelling
process emphasizes the authentic aspect of the problem, thus presenting to the students the challenge
of using mathematics in the way it may be used in life outside school. Usage of the language (e.g., the
actual problem’s terminology) should suit the real usage in the authentic context, and students should
be able to use the acquired mathematical knowledge in real life, not only at school (De Lange, 2006;
Stacey, 2015). Particularly in the case of PISA, students are assessed on their ability to apply knowledge
and skills to real-life problems rather than on how well they master the school curriculum (Kaiser &
Sriraman, 2006).
However, as one of the underlying foundations for PISA is that students complete a large number
of independent items in a short time, students are mostly involved in only part of the modelling cycle.
Each PISA item is classified according to the modelling process that is most heavily demanded in the solution
process. Thus, researchers (e.g., Blum et al., 2007; Schukajlow et al., 2021) suggest the use of extended
problems that engage students with multiple steps of the modelling cycle.
The mathematical modelling cycle provides a theoretical insight into the process of engagement
with a problem. This cycle can be used empirically to identify the different mathematical actions
taken while solving a modelling problem. However, researchers are still struggling to establish a valid
and reliable tool that serves to assess modelling problems reflecting the full modelling cycle (Blum,
2011; Wess et al., 2021). The proficiency levels identified in PISA Mathematics may be viewed as
a first attempt towards such an assessment tool. The classification made by PISA for one modelling
process for each item is based on recognizing different levels of proficiency in mathematics, defined
by fundamental mathematical capabilities, i.e., actions that students are required to perform in order
to solve mathematical problems. These mathematical capabilities are not activated in isolation, but rather
are referred to across the various proficiency levels, i.e., they are used to describe overall proficiency. The
continuum of proficiency ranges from simple, direct actions in everyday situations to more complex work
requiring higher levels of technical competence. The US report Adding It Up (NRC, 2001) outlines
five aspects of mathematical proficiency. The first two aspects relate to the mathematical content,
particularly conceptual understanding, i.e., comprehension of mathematical concepts, operations and
relations and procedural fluency, i.e., the skill to carry out mathematical procedures efficiently and
accurately. The following two aspects describe the mathematical process, composed of strategic
competence, i.e., the ability to formulate, represent and solve mathematical problems, and adaptive
reasoning, i.e., the capability for justification, logical reasoning and reflection on the solution process.
Finally, the last aspect describes the productive disposition, i.e., the intention to use the mathemat-
ical content and process effectively, meaning the perception of mathematics as rational, useful and
worthwhile, together with an appreciation of the role of effort and one’s own efficacy (Stacey &
Turner, 2015).
However, although the various proficiency levels are precisely described, there is no concrete
definition for each level that allows one to compare them to each other, thus the differences among
the levels are inconclusive. Moreover, the assigning of each proficiency level to a specific modelling
process is challenging and not trivial, specifically when attempting to assess a full modelling cycle.
Finally, the PISA test aims to measure student performance, rather than being an objective measurement
of each PISA item. Thus, it does not shed light on the characteristics of a real-life problem that invites
the application of mathematical modelling competency (Stacey, 2015).

Fig. 1. The modelling cycle (Blum & Leiß, 2006).

In this study, we aim to design a reliable and valid assessment tool to gauge the level of mathematical
modelling associated with authentic modelling problems derived from real-world situations, in particular
within a scientific-engineering context.

2. Theoretical framework: mathematical modelling


Mathematical modelling refers to a complete process of ‘mathematization’ of context from the real world,
i.e., the process of constructing a mathematical model to solve problems from the real world, and is
defined as a sequential process consisting of several stages, as shown in Fig. 1.
The first two stages include comprehension and simplification of an authentic problem from non-
mathematical areas in the real world. The third stage relates to constructing a plan for solving the prob-
lem, including suitable mathematization and development of an appropriate mathematical model. The
fourth stage includes implementation of the plan using mathematical tools, processes and procedures.
The last two stages of the modelling process include interpretation of the mathematical solution in real-
world terms, and its verification through examination of its compatibility with reality. When no such
compatibility exists, certain stages of the modelling process are repeated, or even the entire process
(Lesh & Doerr, 2003; Blum & Leiß, 2006; Kaiser & Sriraman, 2006; Niss et al., 2007).
When viewed from a didactical viewpoint, i.e., when the modelling cycle is considered a tool used to
promote modelling competencies or the understanding of modelling in general by teachers and students,
the modelling cycle should include clearly arranged steps illustrating the transitions between reality and
mathematics (Blum, 2015; Borromeo Ferri, 2018; Maaß, 2006). This model is exemplified in Fig. 2 and
is based on the modelling cycles suggested by Blum (1996), Kaiser (1995) and Maaß (2005).
In this model, reality and mathematics are captured as two separate worlds, and the main features of
mathematical modelling are illustrated simply. The model begins within the real world, by idealizing a
real situation for the goal of constructing a simplified real model.
The next step is mathematization, which enables the transition from an authentic real-world problem
into a mathematically formulated problem, i.e., the construction of a mathematical model that responds
to the real model.

Fig. 2. A modelling cycle from a didactical viewpoint.

When engaging in this process, the learner undergoes a process of extrication of the
mathematics essential for analysis, definition and solution of the problem, using activities such as iden-
tification of the problem’s mathematical aspects, identification of meaningful variables, identification of
constraints and making assumptions and choice of suitable representation or construction of a new one.
Investigation of the model lies entirely within the mathematical world and simply means working
mathematically to solve the mathematical model. This process relates to the ability to implement
mathematical concepts, facts and algorithms in an effort to solve mathematically phrased problems and
reach mathematical conclusions. During the process of investigating the mathematical model, one must
carry out the necessary mathematical processes for getting results and finding a mathematical solution.
These processes include making arithmetic calculations, solving equations, carrying out symbolic
manipulations, extracting mathematical information from tables and figures, working with different
representations and analyzing data.
The final step is the interpretation of the mathematical result(s), making sure it responds well to
the real-world model. The interpretation process includes thinking of solutions and interpretation of
mathematical results. This action relates to the translation of the mathematical solutions discovered at
the end of the investigation process, returning to the problem’s real-life context, and a determination of
whether the results suit the given context. The process includes assessment of solutions or mathematical
deductions according to the context of the problem, and a determination of whether or not the results are
reasonable and logical in the given situation.
This simplified model of the modelling cycle corresponds with modelling cycles presented in
mathematics standards (e.g., Common Core State Standards Mathematics, 2010), and the modelling
cycle used in the PISA Framework (OECD, 2013). The PISA conceptual framework defines
mathematical literacy as ‘an individual’s capacity to formulate, employ and interpret mathematics in a
variety of contexts’ (OECD, 2019, p. 75). This definition overlaps with the three main processes, reflected
in the model presented in Fig. 2. In terms of terminology, the mathematization step displayed in arrow
b corresponds with ‘formulate’, investigation of the model (displayed in arrow c) corresponds with
‘employ’ or ‘compute’, and interpretation (displayed in arrow d) corresponds with ‘interpret’.
The conceptual framework for this study is the simplified model of the modelling cycle. Due to its
strong connection to the PISA conceptual framework, and as the process of rubric design is based on the
proficiency levels identified in PISA mathematics, hereinafter we refer to the main steps of the modelling
cycles as Formulate, Employ and Interpret.

In this paper, we describe the development of a rubric for assessment of mathematical modelling
problems in a scientific-engineering context, i.e., authentic problems emphasizing the applicability of
mathematics in this context, with particular emphasis on technological and engineering developments.
The problems are defined as ‘authentic’, as they are based on existing workplace problems (workplace
mathematics) for scientists and engineers, and ‘simplified’ so that they suit the world of teachers and
students (Bakker, 2014; Kohen & Orenstein, 2021). The examples used to describe the rubric design are
retrieved from a pool of mathematical modelling tasks designed as part of the i-MAT (Integrated Math
& Technology) program, in which mathematics teaching materials suitable for the Israeli curriculum are
developed. Each of these tasks includes a central modelling question that supports students to perform a
mathematization process of authentic contexts from the science and engineering domain. The scientific-
engineering context is explicitly described in the problem, as the assumption is that these problems
are aimed at mathematics lessons, thus the students do not possess the necessary scientific-engineering
knowledge. Then, the mathematization process aims at providing the students with the modelling
competencies necessary to solve the authentic problem. Following the central modelling question and
completion of the mathematization process, the task includes different examples used to practice and
deepen the modelling process. These examples allow for identification of the three central processes of
mathematical modelling—formulation, employment and interpretation. The aim of the rubric presented
in this study is to assess the modelling level of these examples.
When describing the rubric’s reliability process, we present a central example from the Iron Dome
task (Shapir, 2013), allowing for comprehension of the assessment process based on the rubric. Below
we describe the task’s modelling process, with the examples belonging to this task described in
detail later.
The Iron Dome task is intended for ninth grade students (based on the mathematical curriculum for
this grade in Israel, as further explained below). It begins with the presentation of a simplified scientific-
engineering situation, describing how the Iron Dome system works. Iron Dome is a central defense tool
against rocket fire and mortar shell attacks. It has an advanced radar that identifies a rocket launch and
passes the information regarding the rocket trajectory to a control system, which calculates the expected
hit site. A rocket is a type of weapon shaped like a cylinder with an aerodynamic structure (sharpened
head), moving by use of a simple rocket engine that does not have a navigation and guidance system.
For simplification of the problem, we assume that it is a very simple object (i.e., an object not influenced
by air resistance, thus the friction is neglected). The rocket is located by the radar immediately after
its launch, its angle from the ground is determined and its engine pushes it at a constant force from the
moment it ignites to the moment it is turned off. The rocket then continues at a free fall subject to gravity.
In accordance with the laws of free motion in physics, the rocket’s trajectory, from the moment the engine
shuts down, will be in the shape of a parabola whose symmetry axis is parallel to the vertical axis
(see Fig. 3).
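As a brief aside, a sketch of the standard free-fall derivation behind this claim (our addition; it is not
spelled out in the task text): writing x for the horizontal distance and t for the time since engine shutdown,
and denoting by v_x and v_y the velocity components at that moment, eliminating t under the stated
no-air-resistance assumption gives

$$x(t) = v_x t, \qquad y(t) = y_0 + v_y t - \tfrac{1}{2} g t^{2} \;\;\Longrightarrow\;\; y = y_0 + \frac{v_y}{v_x}\, x - \frac{g}{2 v_x^{2}}\, x^{2},$$

i.e., y is a quadratic function of x with a negative leading coefficient, which is why the free-fall part of
the trajectory is a downward-opening parabola.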
Calculation of the rocket’s trajectory is done by the radar during the free fall stage, i.e., after the
engine has been shut down, which can be determined by various factors such as the rocket type. If the
hypothesized hit site is a populated area, an intercepting missile is launched, which explodes near the
rocket while it is still in the air.
At this stage the mathematization stage of the problem is implemented, and the central modelling
question asked—How can the rocket’s parabolic trajectory be predicted? This question allows for the
connection between the scientific-engineering context and the mathematical world, i.e., what is the
mathematical rule that can be used to provide a solution to the problem in the scientific-engineering
context? At this stage, the students should conclude that three non-collinear points (with distinct
horizontal coordinates) determine a unique parabola with a vertical axis of symmetry, thus the Iron Dome
radar only needs to identify three points on the rocket’s parabolic trajectory in order to know its exact
trajectory and hypothesized hit site. This calculation is based on the manner of solving a parabolic
equation, i.e., solving an equation of a quadratic function, which is at the heart of the Israeli ninth grade
curriculum.

Fig. 3. Visualization of the rocket trajectory by Iron Dome.
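To illustrate the mathematical rule concretely, the following minimal Python sketch (our illustration; it is
not part of the i-MAT task materials, and the radar readings below are invented) fits the quadratic
y = ax^2 + bx + c through three points and predicts the hit site as the relevant root of the quadratic equation.

```python
# A minimal sketch (not from the task materials): recover the rocket's quadratic
# trajectory from three radar readings and predict the landing point.

import numpy as np

def fit_parabola(points):
    """Solve the 3x3 linear system for the coefficients (a, b, c) of y = ax^2 + bx + c."""
    xs, ys = zip(*points)
    coeff_matrix = np.array([[x**2, x, 1.0] for x in xs])
    return np.linalg.solve(coeff_matrix, np.array(ys))  # requires three distinct x-values

def predicted_hit_site(a, b, c):
    """Return the larger real root of ax^2 + bx + c = 0, i.e., the predicted landing point."""
    roots = np.roots([a, b, c])
    return max(r.real for r in roots if abs(r.imag) < 1e-9)

if __name__ == "__main__":
    radar_points = [(100.0, 480.0), (200.0, 780.0), (400.0, 980.0)]  # invented readings
    a, b, c = fit_parabola(radar_points)
    print(f"y = {a:.4f}x^2 + {b:.4f}x + {c:.4f}; predicted hit site at x = {predicted_hit_site(a, b, c):.1f}")
```

The linear system has a unique solution precisely because the three points have distinct horizontal
coordinates and do not lie on a straight line, mirroring the conclusion students are expected to reach.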

3. Methodology and findings


In this section, we describe the rubric design process (3.1), validation (3.2) and reliability (3.3). Figure 4
visually displays how the rubric was designed, validated and tested for reliability. In the following, we
detail each of these processes.

3.1. Rubric design process


The rubric is based on the proficiency scale identified in PISA mathematics (OECD, 2018), which
serves as the basis for the three basic processes of modelling a mathematical problem—formulation,
employment and interpretation. The rubric’s aim is to assess the mathematical modelling level of
authentic problems in a scientific-engineering context, based on assessment of each modelling process
separately. To that end, we designed a two-dimensional rubric that allows simultaneous assessment of
modelling level and modelling process type.
The rubric design process encompassed three stages, which were completed by the authors of this
paper. The first stage included mapping of all the sentences describing each of the six proficiency levels
of the different modelling processes. For example, the sentence appearing for level 6 below: ‘At Level
6, students can conceptualize, generalize and utilize information based on their investigations.’, was
categorized as belonging to the formulation process. It is important to mention that not all sentences
were classified as one unit. Some included sub-sentences that could be classified as belonging to more
than one process. In such cases, each sentence part was matched separately to the appropriate process.
The different classifications of the sentences appear in Table 1.

Fig. 4. Rubric design, validation and reliability.

However, as reported in previous research (Stacey, 2015), for some of the levels we found
it difficult to map the different statements to the modelling processes, as some of them did not clearly
belong to a specific process. In addition, when looking at each of the three processes separately, we found
it difficult to clearly differentiate among the sentences classified as belonging to each level, specifically
relating to levels next to each other. These challenges arose since the PISA proficiency levels aim to
represent the increase in skills and abilities by locating cut-off points across levels rather than attributing
them to the modelling processes. Thus, the second stage of the rubric design process included combining
sentences or sub-sentences from each pair of levels expressing similar belonging to a specific process. In
addition, when necessary, we changed the wording of the sentence so that it best expresses its belonging
to a particular process. An example of this can be seen in the formulation process in the move between
Level 1 and Level 2. The sentence classified for the formulation process at level 1 was ‘Students can
answer questions involving familiar contexts where all relevant information is present and defined. They
are able to identify information.’, and the sentence classified for this process at level 2 was ‘Students can
interpret and recognize situations in contexts that require no more than direct inference. They can extract
relevant information from a single source and make use of a single representational mode.’ In this case,
it is difficult to point at a concrete difference between the levels, particularly as the difference between
a familiar context and a context necessitating a direct inference only is not sufficiently clear to point at
two different levels of formulation. Additionally, we found that ‘interpret and recognize’ in the context
of formulating a mathematical model out of a real-world situation can be simply changed to ‘identify’.
Thus, in the second stage of rubric design, the lower level of the formulation process, which represents
the combination of the definitions appearing at level 1 and level 2, was defined as ‘Students can identify
situations involving familiar contexts necessitating a direct inference only. All relevant information is
present and the questions are clearly defined; they are able to extract relevant information from a single
source and make use of a single representational model.’ This stage was finalized with the two lower
levels (1–2) combined into the low level, the third and fourth levels combined into the intermediate level
and the two highest levels (5–6) combined into the high level.
In the third stage of the rubric design, we adapted the sentences so that they expressed the level of
the problem’s impetus for modelling rather than the students’ modelling abilities, since the rubric’s
aim was to assess to what extent a mathematical modelling problem invites the students to implement
modelling competencies, not their modelling abilities. For example, the following sentence, retrieved
from the low level of the formulation process displayed above: ‘Students can identify situations
involving familiar contexts necessitating a direct inference only’, was changed into ‘(the problem invites)
Identification of situations involving familiar contexts, necessitating a direct inference only’. Further,
as in the previous stage, when necessary we changed the wording of the sentence in order to convey
its belonging to a particular process. For example, the concept ‘one source’ was expanded into ‘one
source or representation model’, in order to include cases where the data appear through representations.
Accordingly, we arrived at the following definition, for the second part of the low-level definition for
formulation: ‘Extrication of relevant information from a single source or representation model, where
all relevant information is clearly defined and presented’.
In this way, all the sentences from the proficiency scale were classified to the three various levels for
each of the modelling processes, which yielded a two-dimensional rubric where one dimension included
the three processes of formulation, employment and interpretation, and the other related to three different
levels—low, intermediate and high.

3.2. Rubric validation


The validation process was made up of two stages. In the following, we describe the two stages of the
content validation process.

3.2.1. Rubric validation—phase 1. The first stage included content validity by the authors of this paper
and seven graduate students who had changed their academic or career path from science or engineering
fields to mathematics education and were pursuing an advanced degree in the field. Thus, these students
had the ability to validate the transition from the
scientific-engineering context to the mathematical context and vice versa, for the modelling problems
that are at the base of this study.
The rubric for assessment of mathematical modelling problems in a scientific-engineering context
was validated during the mathematics education seminar course that was conducted during the spring
semester of 2020 and was held online as a result of an extensive transition to online learning in all
educational institutions in Israel due to COVID-19 restrictions. The course included both synchronous
and asynchronous lessons and was taken by graduate students retraining to teach mathematics from
engineering fields at a leading Israeli university. The validation process was carried out during two class
sessions, two academic hours long each. The first session included an exposure to the mathematical
proficiency scale of PISA’s conceptual framework (Table 1, the original proficiency levels) by the instructor of the course
(the first author). The students were then divided into three Zoom breakout rooms (2–3 students per
room) and were asked to match the sentences in the table to one of the three mathematical modelling
processes: formulating, employing or interpreting and the appropriate level—low, intermediate or high,
similarly to the first stage of rubric design. Following this, a class discussion of the classifications ensued,
which ended with all the students in complete agreement. After this process was completed, the rubric
was presented and was found to match the classification the class had agreed upon at a 90% level.

Table 1. The initial stage of rubric design for assessment of mathematical modelling problems

Level 1
The original proficiency levels (OECD, 2018): At Level 1, students can answer questions involving familiar
contexts where all relevant information is present and the questions are clearly defined. They are able to
identify information and to carry out routine procedures according to direct instructions in explicit
situations. They can perform actions that are almost always obvious and follow immediately from the
given stimuli.
Classified to the formulation process: At Level 1, students can answer questions involving familiar
contexts where all relevant information is present and the questions are clearly defined. They are able to
identify information.
Classified to the employment process: ...and to carry out routine procedures according to direct
instructions in explicit situations. They can perform actions that are almost always obvious and follow
immediately from the given stimuli.
Classified to the interpretation process: N/A

Level 2
The original proficiency levels (OECD, 2018): At Level 2, students can interpret and recognize situations
in contexts that require no more than direct inference. They can extract relevant information from a single
source and make use of a single representational mode. Students at this level can employ basic algorithms,
formulae, procedures or conventions to solve problems involving whole numbers. They are capable of
making literal interpretations of the results.
Classified to the formulation process: At Level 2, students can interpret and recognize situations in
contexts that require no more than direct inference. They can extract relevant information from a single
source and make use of a single representational mode.
Classified to the employment process: Students at this level can employ basic algorithms, formulae,
procedures or conventions to solve problems involving whole numbers.
Classified to the interpretation process: They are capable of making literal interpretations of the results.

Level 3
The original proficiency levels (OECD, 2018): At Level 3, students can execute clearly described
procedures, including those that require sequential decisions. Their interpretations are sufficiently sound
to be a base for building a simple model or for selecting and applying simple problem-solving strategies.
Students at this level can interpret and use representations based on different information sources and
reason directly from them. They typically show some ability to handle percentages, fractions and decimal
numbers, and to work with proportional relationships. Their solutions reflect that they have engaged in
basic interpretation and reasoning.
Classified to the formulation process: Their interpretations are sufficiently sound to be a base for building
a simple model. Students at this level can interpret and use representations based on different information
sources and reason directly from them.
Classified to the employment process: At Level 3, students can execute clearly described procedures,
including those that require sequential decisions, for selecting and applying simple problem-solving
strategies. They typically show some ability to handle percentages, fractions and decimal numbers, and
to work with proportional relationships.
Classified to the interpretation process: Their solutions reflect that they have engaged in basic
interpretation and reasoning.

Level 4
The original proficiency levels (OECD, 2018): At Level 4, students can work effectively with explicit
models for complex concrete situations that may involve constraints or call for making assumptions. They
can select and integrate different representations, including symbolic, linking them directly to aspects of
real-world situations. Students at this level can utilize their limited range of skills and can reason with
some insight, in straightforward contexts. They can construct and communicate explanations and
arguments based on their interpretations, arguments and actions.
Classified to the formulation process: At Level 4, students can work effectively with explicit models for
complex concrete situations that may involve constraints or call for making assumptions. They can select
and integrate different representations, including symbolic, linking them directly to aspects of real-world
situations.
Classified to the employment process: Students at this level can utilize their limited range of skills and
can reason with some insight, in straightforward contexts.
Classified to the interpretation process: They can construct and communicate explanations and arguments
based on their interpretations, arguments and actions.

Level 5
The original proficiency levels (OECD, 2018): At Level 5, students can develop and work with models for
complex situations, identifying constraints and specifying assumptions. They can select, compare and
evaluate appropriate problem-solving strategies for dealing with complex problems related to these
models. Students at this level can work strategically using broad, well-developed thinking and reasoning
skills, appropriate linked representations, symbolic and formal characterizations and insight pertaining to
these situations. They begin to reflect on their work and can formulate and communicate their
interpretations and reasoning.
Classified to the formulation process: At Level 5, students can develop and work with models for complex
situations, identifying constraints and specifying assumptions.
Classified to the employment process: They can select, compare and evaluate appropriate problem-solving
strategies for dealing with complex problems related to these models. Students at this level can work
strategically using broad, well-developed thinking and reasoning skills, appropriate linked representations,
symbolic and formal characterizations and insight pertaining to these situations.
Classified to the interpretation process: They begin to reflect on their work and can formulate and
communicate their interpretations and reasoning.

Level 6
The original proficiency levels (OECD, 2018): At Level 6, students can conceptualize, generalize and
utilize information based on their investigations and modelling of complex problem situations and can use
their knowledge in relatively non-standard contexts. They can link different information sources and
representations and flexibly translate among them. Students at this level are capable of advanced
mathematical thinking and reasoning. These students can apply this insight and understanding, along with
a mastery of symbolic and formal mathematical operations and relationships, to develop new approaches
and strategies for attacking novel situations. Students at this level can reflect on their actions and can
formulate and precisely communicate their actions and reflections regarding their findings, interpretations,
arguments and the appropriateness of these to the original situation.
Classified to the formulation process: At Level 6, students can conceptualize, generalize and utilize
information based on their investigations and modelling of complex problem situations, and can use their
knowledge in relatively non-standard contexts. They can link different information sources and
representations and flexibly translate among them.
Classified to the employment process: Students at this level are capable of advanced mathematical
thinking and reasoning. These students can apply this insight and understanding, along with a mastery of
symbolic and formal mathematical operations and relationships, to develop new approaches and strategies
for attacking novel situations.
Classified to the interpretation process: Students at this level can reflect on their actions and can formulate
and precisely communicate their actions and reflections regarding their findings, interpretations, arguments
and the appropriateness of these to the original situation.

Table 2. Definition of the formulation process in the initial validated rubric

Formulation
Low level: Identification of situations involving familiar contexts, necessitating a direct inference only.
Extrication of relevant information from a single source or representation model, where all the relevant
information is presented.
Intermediate level: Working effectively with explicit models that describe complex concrete situations,
with some constraints or assumptions. Connecting between different information sources and/or different
representations in order to extricate the data.
High level: Developing/generalization of models that describe complex situations from unfamiliar or
unusual contexts, while identifying constraints and specifying assumptions. Linking different information
sources/representations so that it takes some effort to extricate the data and interpret them.

3.2.2. Rubric validation—phase 2. The second stage included a process of content validation between the
second author and two mathematics education experts who are part of the R&D group that designs the
problems assessed in this study as part of the i-MAT program.
At this stage, the three experts underwent a similar process to the one the graduate students had
undergone, i.e., mapping of the sentences from the mathematical proficiency scale to one of the three
mathematical modelling processes: formulating, employing or interpreting and the appropriate level—
low, intermediate or high. These experts’ mapping matched, at a 96% level, the classification made by the
authors of this study during the design process of the rubric. The experts further performed the next two
stages of rubric design, i.e., the combining of sentences or sub-sentences from each pair of levels for each
of the modelling processes, and the transition to the terminology that expressed the problem’s impetus for
modelling. At these stages, major changes were made to simplify the definitions from a mathematical
point of view, in order to emphasize and highlight the differences among the levels regarding each process.
Also, when necessary, the wording of the sentences taken from the proficiency scale was updated. At the
end of the process, we had an initial validated rubric. Table 2 presents the section of the rubric describing
the formulation process.
After the validation of the initial rubric, the experts managed to identify criteria that allowed them to
make a clear comparison and differentiation among the three levels—low, intermediate and high. For
example, for the formulation process, four criteria were identified and defined, relating to each of the
three levels. Moreover, the three judges claimed that in some cases, not all criteria need to be observed
in a problem when determining the level of each of the modelling processes. The distribution to various
criteria, therefore, enabled a clear distinction between the various levels, and a refined process of coding
a problem’s level, i.e., observing a high level in one criterion will mark the whole process as high,
regardless of whether other criteria show an intermediate level or a low level.
In this classification, the formulation process was defined on the basis of four criteria. The first
was identification of situations, relating to the degree to which the question invites identification or
generalization of an existing model. The second criterion is identification of constraints and assumptions,
which relates to considering constraints and making appropriate assumptions, if needed. The third criterion
is familiarity with the mathematical context, related to the familiarity and clarity of the mathematical
context. The fourth criterion is type of information extraction, which refers to the extent to which the
information is accessible and clear.

Table 3. Definition of type of information extraction criterion in the formulation process, following the
experts’ content validation stage

Formulation
Low level: Extraction of relevant information from one source or representation, where all the relevant
information is presented explicitly.
Intermediate level: Extraction of clearly presented and relevant information, with direct integration of
different information sources or representations.
High level: Extraction of information, all or some of which is not clearly provided, with integration of
different information sources or representations.

After identification of the joint criterion of information extraction, we updated the definitions. This
process was based on a discussion among the judges regarding examples taken from the Iron Dome task
and others, which allowed one to concretely examine the wording of each criterion at each level, as well
as carry out an assessment of the accuracy of the rubric’s assessment process relating to the different
problems. Accordingly, some sentences were reworded. The major change consisted of rephrasing the
sentence mapped to the intermediate and high level in order to emphasize the method of extraction of
information, so that it starts with the phrase ‘extraction of...,’ as in the low-level definition. Other minor
rephrasing changes were made in order to emphasize the differences when switching between levels and
to make a clear distinction between the levels. For the lowest level, we emphasized that the relevant
information is presented explicitly, in order to highlight that in this level all the information needed is
presented clearly. For the intermediate level, the phrase ‘connection among information sources’ was
rewritten to say ‘direct integration of different information sources’, in order to differentiate between
the integration done at the intermediate level, which is a direct integration of information sources
or representations, and the integration done at the high level, where we need a connection between
information sources/representations, not necessarily through direct integration. For the high level, we cut
out the definition of effort, as it was found to be unclear and/or did not allow for an objective problem
definition. In addition, we added a uniform wording regarding information sources, in order to create
uniformity with the other levels. In this way we created clear cross-sections of the criteria, emphasizing
the differences between the different levels for each criterion. Table 3 presents the definitions made to
the type of information extraction criterion in the formulation process.
The rubric’s validation process ended when all three judges fully agreed regarding all the criteria and
their cross-sections, characterizing the classification into the different levels.

3.3. Rubric reliability process


In order to examine the rubric’s reliability, the three mathematics education experts who participated
in the second stage of the validation process used the rubric to code three examples from the pool of
mathematical modelling tasks designed as part of the i-MAT program; two from the Iron Dome task and
one from another task called Mobileye, dealing with a connection between Mobileye’s revolutionary
technology (prevention of motor accidents) and the topic of triangle similarity (the full tasks coded
appear on the program site, www.imat.org.il).

Fig. 5. Example #3, retrieved from the Iron Dome task.

In an effort to examine the matching among the different codings, the judges were asked to assess the
modelling level of the various examples based on the rubric, i.e., assessing the modelling level for each of
the modelling processes separately, and to mark in the rubric the criteria they used to code each process
at the different levels. When there was no agreement among the judges regarding a particular process,
they used the coded criteria to justify their choice, so that the discussion of the coding took place on the
basis of the different criteria. As a measure of inter-rater reliability, Kappas (Fleiss & Cohen, 1973) were
calculated for each example. The kappa values obtained were in the range of 0.609–1, demonstrating an
intermediate-to-high reliability level. In addition, in order to examine agreement percentages, we carried
out an internal reliability test using the Krippendorff method (Hayes & Krippendorff, 2007), which
incorporates the inter-rater agreement element and accommodates the number of judges and multiple
coding levels. The test was carried out with 10,000 bootstrap samples in SPSS and demonstrated a high
alpha reliability level of 0.84 (Krippendorff’s estimate), with an agreement percentage of 80%.
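For readers who wish to reproduce this kind of agreement analysis, the sketch below implements the
standard Fleiss' kappa formula for several items coded by multiple judges into the three rubric levels. It
is a minimal illustration with hypothetical ratings, not the study's data or analysis script; the reported
Krippendorff's alpha was obtained separately in SPSS with bootstrapping.

```python
# A minimal sketch (not the authors' analysis script): Fleiss' kappa for N items
# rated by n judges into k categories, using the standard formula.

from collections import Counter

def fleiss_kappa(ratings, categories):
    """ratings: one list per item, each holding one category label per judge."""
    n_items = len(ratings)
    n_raters = len(ratings[0])
    # n_ij: number of raters assigning item i to category j
    counts = [[Counter(item)[c] for c in categories] for item in ratings]
    # mean observed agreement across items
    p_bar = sum(
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in counts
    ) / n_items
    # chance agreement from the marginal category proportions
    totals = [sum(row[j] for row in counts) for j in range(len(categories))]
    p_e = sum((t / (n_items * n_raters)) ** 2 for t in totals)
    return (p_bar - p_e) / (1 - p_e)

if __name__ == "__main__":
    levels = ["low", "intermediate", "high"]
    # hypothetical codings: three judges rating four modelling processes
    ratings = [
        ["high", "high", "high"],
        ["intermediate", "high", "intermediate"],
        ["high", "high", "high"],
        ["low", "intermediate", "low"],
    ]
    print(f"Fleiss' kappa = {fleiss_kappa(ratings, levels):.3f}")
```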
Below we present one of the Iron Dome samples analyzed (Example #3) (Fig. 5), its solution (Fig. 6),
and the coding the three judges gave it. It is important to mention that the rubric aims at the assessment
of a full example that includes a few sections rather than each section separately, as the examples are
constructed in a graduated manner on the basis of several inter-connected sections. For example, it is
possible that the solution of section b will rely on the formulating process carried out in section a or a
part of it, so that in such a case it is difficult to determine a different analysis for each section. Thus,
it was decided that the assessment using the rubric will be carried out on a full example, with the final
modelling level determined by adding up the sections, so that if there is a certain section raising the level
of one of the modelling processes, the final level will be determined according to this section.
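A minimal sketch of this aggregation rule (our illustration, not code from the study): each section of an
example is coded on a modelling process, and the example's final level for that process is the highest level
reached by any section.

```python
# Illustrative aggregation of per-section codings into a final level per modelling process.
LEVELS = {"low": 1, "intermediate": 2, "high": 3}

def final_level(section_codes):
    """section_codes: one coded level per section of the example, for a single modelling process."""
    return max(section_codes, key=LEVELS.get)

# hypothetical coding of an example with five sections (a-e) for the formulation process
formulation_by_section = ["intermediate", "intermediate", "intermediate", "high", "high"]
print(final_level(formulation_by_section))  # -> high
```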
Figure 6 presents a suggested solution for this example.
Fig. 6. A solution to example #3, retrieved from the Iron Dome task.

Regarding the suggested solution, we note a rise from an intermediate to a high level of formulation for
the last two sections. In section ‘a’, we need identification and matching between the problem’s data
and a mathematical model (a quadratic function and its symmetry axis); the mathematical context is
familiar and the relevant information is provided clearly. In section ‘b’, the information needed to solve
the question is clearly presented, the mathematical context is familiar and clear, and the only thing
necessary is information identification. In section ‘c’, we need to integrate information received in
former sections and match the data to a quadratic function equation; thus, in this case the level of
formulation is intermediate. In order to solve section ‘d’, two stages are required: the first is understanding
that there is a need to find the point of intersection of the parabola with the x-axis; the second is connecting
the different representations (graphic/algebraic) while considering the constraints of an impossible zero
point and the rocket’s hit site. In addition, in section ‘e’, there is a need to generalize the model, while
considering constraints from several information sources, thus it is a formulation of a model describing
a complex situation.
Table 4 presents the analysis done on the example from the Iron Dome task by the three experts, with
grades they assigned to each modelling process, along with the criteria by which they determined the
coding.
Table 4. Coding of example #3 from the Iron Dome task by three experts

Formulation
Level: Expert #1: High; Expert #2: High; Expert #3: High
Justification: Expert #1: Identification of situations; Identification of constraints and assumptions.
Expert #2: Identification of situations; Identification of constraints and assumptions.
Expert #3: Identification of constraints and assumptions.

Employment
Level: Expert #1: Intermediate; Expert #2: High; Expert #3: Intermediate
Justification: Experts #1, #2 and #3: Mathematical procedures needed.

Interpretation
Level: Expert #1: High; Expert #2: High; Expert #3: High
Justification: Expert #1: Reporting the mathematical solution; Reporting construction of claims and
explanations. Expert #2: Reporting the mathematical solution; Reporting construction of claims and
explanations. Expert #3: Reporting construction of claims and explanations.

As can be seen in Table 4, there was almost complete agreement between experts #1 and #3 regarding
the coding of each of the levels. The difference between them and expert #2 stemmed from the assessment
of the employment level. Expert #2 assessed this level in the example as high, when experts #1 and #3
assessed it as intermediate. After all the assessments were received, the experts met in an effort to discuss
their analysis, regarding both agreements and disagreements.
At the beginning of the meeting a summary of the results received from the different analyses was
presented by the second author of this study (Expert #1). This was followed by a deep discussion
of the elements the experts had agreed upon, making sure that their assessments regarding a specific
level stemmed from the same reason. The next stage included a discussion of their coding of each
of the modelling processes, particularly discussing the point they had disagreed upon, in an effort to
reach full agreement. Regarding the coding of the formulation process, while experts #1 and #2 based
their justification for choosing the high level on both criteria of ‘Identification of constraints and
assumptions’ and ‘Identification of situations’, expert #3 claimed that the criterion ‘Identification of
constraints and assumptions’ was sufficient for her to determine the high level of the formulation process
the problem invites. However, she also agreed that compatibility and generalization of the model were
necessary in order to deal with the situation presented in section ‘e’, where a new quadratic function
should be created, thus an agreement was reached regarding this criterion, leading to similar coding.
Regarding the coding of the employment level, expert #2 assessed the level as high, claiming that it
needs processes necessitating a series of informed decision-making. Experts #1 and #3 claimed that the
question’s employment level was intermediate, as in sections ‘d’ and ‘e’, once the formulation process is
undertaken successfully, the mathematical steps needed to solve these sections need a series of decision-
making and simple strategies (trial and error) that include examination of a number of quadratic functions
until one is found that meets the criteria. As a supportive argument, they offer the following example,
according to which, had the question been worded thus—‘What is the range of possible y values of the
highest point for which the rocket falls in open areas?’, then, indeed, the employment process would
be defined at the high level, as in this situation there is a process of development of new strategies, and
a series of informed decision-making is needed. Expert #2 accepted these claims and agreed that the
employment process is, indeed, at an intermediate level.
Table 5. The rubric for assessing mathematical modelling problems in a scientific-engineering context

Formulation: the extent to which the problem provides the mathematical structure needed for solving it,
inviting the student to use Mathematics. The problem invites . . .
Criterion: Identification of situations
Low level: Identification of situations in contexts that require direct inference only, based on the data
explicitly included in the problem.
Intermediate level: Identification and matching between the data presented in the problem and clear
models describing concrete situations.
High level: Identification of complex situations, and development and/or compatibility and/or
generalization of models describing them.
Criterion: Identification of constraints and assumptions
Low level: N/A
Intermediate level: Identification of constraints.
High level: Identification of constraints and specifying detailed assumptions.
Criterion: Familiarity with the mathematical context
Low level: The mathematical context is familiar and clear.
Intermediate level: The mathematical context is familiar. At times the problem invites different
mathematical contexts.
High level: The mathematical context is unfamiliar or unusual. At times the problem invites different
mathematical contexts.
Criterion: Type of information extraction
Low level: Extraction of relevant information from one source or representation, where all the relevant
information is presented explicitly.
Intermediate level: Extraction of clearly presented and relevant information, with direct integration of
different information sources or representations.
High level: Extraction of information, all or some of which is not clearly provided, with integration of
different information sources or different representations.

Employment: the extent to which the problem demands use of concepts and facts, making calculations
and manipulations in order to reach mathematical solutions. The problem invites . . .
Criterion: Mathematical procedures needed
Low level: Employment of routine mathematical procedures (algorithms, equations or basic conventions),
which are clearly stated or obvious, based on direct instructions in the problem.
Intermediate level: Employment of mathematical procedures necessitating a series of simple
decision-making, including selecting simple strategies or different representations for solving the problem.
High level: Employment of mathematical procedures necessitating a series of informed decision-making,
including selection and assessment, development and generalization of approaches and strategies and/or
new or familiar representations (respectively).
(Continued)

Relating to the coding of the interpretation process, experts #1 and #2 agreed regarding the same two
criteria, which led to the problem being classified as being at a high level. The third expert coded the
level of the interpretation process as high based on one criterion only, which the others agreed with. The
criterion she did not mark was ‘reporting mathematical solution’. In the ensuing discussion, experts #1
and #2 justified the need for this criterion in coding the level of the interpretation process, arguing that
section ‘d’ calls for a re-evaluation of the mathematical solution within the context of the real world, since
one of the solutions of this section needs to be rejected for a reason that comes from the real world rather
than from the mathematical one. Expert #3 accepted this claim and agreed that this criterion also testifies
to a high level of the interpretation process. A similar process was carried out for the other examples,
yielding the final version of the rubric. Table 5 presents the full, final rubric.
Table 5. Continued
(Each level descriptor completes the stem: 'The problem invites ...')

Interpretation: the extent to which the problem necessitates the creation and presentation of explanations and claims in a real-world context, while thinking both about the whole modelling cycle and the results arrived at.

  Criterion: Reporting the mathematical solution
    Low level: Providing a mathematical answer only.
    Intermediate level: Providing a mathematical answer, along with reporting on basic reasoning, and interpretation of the results in light of the data presented in the problem.
    High level: Reporting on a reflective process regarding the solution process and the findings, including assessment of the range and limitations of the solution or the mathematical model in a real-world context.

  Criterion: Reporting construction of claims and explanations
    Low level: Literal interpretation of results.
    Intermediate level: Construction and presentation of short explanations and claims based on certain insights and specific actions leading to the solution.
    High level: Construction and formulation of explanations and reasoning for the thought process which led to the solution.
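
For readers who wish to operationalize the rubric, for example when cataloguing a bank of modelling problems, the following Python sketch illustrates one way the two-dimensional structure (modelling process by level, coded per criterion) could be represented. It is a minimal sketch only: the criterion names follow Table 5, but the data structures, function names and the aggregation rule (taking the highest level coded among a process's criteria) are illustrative assumptions rather than part of the validated instrument.

from dataclasses import dataclass, field

LEVELS = ("low", "intermediate", "high")

# Criteria per modelling process, following Table 5.
RUBRIC = {
    "formulation": (
        "identification of situations",
        "identification of constraints and assumptions",
        "familiarity with the mathematical context",
        "type of information extraction",
    ),
    "employment": (
        "mathematical procedures needed",
    ),
    "interpretation": (
        "reporting the mathematical solution",
        "reporting construction of claims and explanations",
    ),
}

@dataclass
class ProblemAssessment:
    """Stores one level code per (process, criterion) pair for a single problem."""
    codes: dict = field(default_factory=dict)

    def code(self, process: str, criterion: str, level: str) -> None:
        assert criterion in RUBRIC[process], "unknown criterion for this process"
        assert level in LEVELS, "level must be low/intermediate/high"
        self.codes[(process, criterion)] = level

    def profile(self) -> dict:
        # Illustrative aggregation rule (an assumption): a process is assigned the
        # highest level coded across its criteria; uncoded processes remain None.
        result = {}
        for process, criteria in RUBRIC.items():
            coded = [self.codes[(process, c)] for c in criteria
                     if (process, c) in self.codes]
            result[process] = max(coded, key=LEVELS.index) if coded else None
        return result

# Hypothetical coding of a single problem: each process can carry a different level.
problem = ProblemAssessment()
problem.code("formulation", "type of information extraction", "intermediate")
problem.code("employment", "mathematical procedures needed", "high")
problem.code("interpretation", "reporting the mathematical solution", "high")
print(problem.profile())
# {'formulation': 'intermediate', 'employment': 'high', 'interpretation': 'high'}

Such an encoding makes the parallel classification explicit: the same problem can be coded at different levels for different modelling processes, which is precisely the two-dimensional structure the rubric is built around.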

4. Discussion, limitations and study contribution


According to researchers (e.g., Borromeo Ferri, 2018; Schukajlow et al., 2021), mathematics modelling
is an essential competency that students of all ages should acquire. Indeed, efforts are made around
the world to include modelling problems as part of formal mathematical lessons in both primary and
secondary school (Lingefjärd, 2011). Yet, research points to challenges in implementing modelling-based instruction, mainly due to teaching materials that are not closely related to the formal curriculum, the time spent on modelling problems during mathematics lessons, which is not aligned with that spent on other mathematics learning tasks, and the difficulty of assessing and grading mathematical modelling problems (Yu & Chang, 2011; Vos, 2013; Borromeo Ferri, 2018).
The current study aims to respond to one of the aforementioned challenges, presenting a rubric
designed and validated based on the use of extended problems reflecting multiple steps of the modelling
cycle (e.g., Blum et al., 2007; Schukajlow et al., 2021). The rubric as an assessment tool allows one to
evaluate the extent to which authentic mathematical modelling problems with a scientific-engineering
context invite the application of the three central modelling processes, namely formulation, employment
and interpretation. From a didactical viewpoint, these clearly arranged processes are at the base of a full
modelling cycle (Kaiser, 1995; Blum, 1996; Maaß, 2005), and assist in understanding what processes are
involved in modelling applied in educational settings by teachers and students (Blum, 2015; Borromeo
Ferri, 2018; Maaß, 2006). Thus, the study attempts to address the existing research gap regarding the
need for a valid and reliable tool serving to assess modelling problems that reflect the full modelling
cycle (Blum, 2011; Wess et al., 2021).
The study defines and bases the central modelling processes on the proficiency levels identified in
PISA Mathematics. The rubric presents a method of evaluation that transforms the one-dimensional
mapping of proficiency levels (NRC, 2001; Stacey & Turner, 2015) into a two-dimensional one,
reflecting the combined evaluation of the type of modelling process and its level. Transitioning to
a two-dimensional indicator allows for the evaluation of a modelling problem in terms of the full
modelling cycle, while evaluating each of the modelling processes independently—each can be assessed

as reflecting a different modelling level. By connecting the modelling cycle with the assessment of the level of each modelling process based on the PISA proficiency levels, this study offers a theoretical contribution to the relation between educational modelling and applied modelling, which are described in the literature as distinct perspectives (Kaiser & Sriraman, 2006).
Additionally, during the rubric's validation process by experts, criteria were identified that enable a clear comparison and differentiation among the different levels across each of the modelling processes. Although these criteria are addressed in the proficiency scale, they are neither explicitly declared nor separated; thus, differences among the levels are inconclusive, and the assigning of each proficiency level to a specific modelling process is challenging and not trivial (Stacey, 2015). Therefore, these
criteria allow for concrete theoretical definitions for the various modelling processes, introducing a well-
defined mathematical modelling framework from a didactical viewpoint, validated through the design of
an assessment rubric.
Further, the reliability process of the designed rubric demonstrated the applicability of the tool as an objective measure of the extent to which a problem solicits mathematical modelling competency, specifically when applied to a full modelling cycle problem (Stacey, 2015). This implies that the rubric
can serve as a formative assessment tool when designing and developing content, problems or tasks that
invite the application of modelling competencies by students.
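
To make the kind of reliability check referred to above concrete, the sketch below computes a quadratically weighted Cohen's kappa, in the spirit of Fleiss & Cohen (1973), for two raters who coded the level of one modelling process across a set of problems. This is a minimal illustration under stated assumptions: the ratings are hypothetical and the computation is a generic agreement statistic, not a reproduction of the study's actual reliability analysis.

import numpy as np

LEVELS = {"low": 0, "intermediate": 1, "high": 2}

def weighted_kappa(rater_a, rater_b, n_categories=3):
    """Quadratically weighted Cohen's kappa for two ordinal ratings."""
    a = np.array([LEVELS[r] for r in rater_a])
    b = np.array([LEVELS[r] for r in rater_b])

    # Observed joint distribution of the two raters' codes.
    observed = np.zeros((n_categories, n_categories))
    for i, j in zip(a, b):
        observed[i, j] += 1
    observed /= observed.sum()

    # Distribution expected if the raters coded independently.
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0))

    # Quadratic disagreement weights: larger penalty for codes further apart.
    idx = np.arange(n_categories)
    weights = (idx[:, None] - idx[None, :]) ** 2 / (n_categories - 1) ** 2

    return 1 - (weights * observed).sum() / (weights * expected).sum()

# Hypothetical level codes for the employment process across ten problems.
rater_1 = ["high", "intermediate", "high", "low", "intermediate",
           "high", "low", "intermediate", "high", "intermediate"]
rater_2 = ["high", "intermediate", "intermediate", "low", "intermediate",
           "high", "low", "high", "high", "intermediate"]

print(f"Weighted kappa: {weighted_kappa(rater_1, rater_2):.2f}")

Krippendorff's alpha (Hayes & Krippendorff, 2007) could be substituted when more than two raters code the problems or when some codings are missing.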
A unique aspect of this study is the orientation of the rubric toward modelling problems that address the connection between authentic real-world applications and school mathematics (Kohen &
Orenstein, 2021). This authentic context reflects the significant role of modelling as a method for better
understanding the world, which is full of mathematics, particularly in authentic applications retrieved
from the scientific and engineering fields, which are not sufficiently addressed in the literature about
modelling (Damlamian et al., 2013; Borromeo Ferri, 2018; Kaiser et al., 2013; Maaß et al., 2018). Yet, at the same time, this also reflects one of the main limitations of this study, which is that it focuses exclusively on authentic problems within a scientific-engineering context, ignoring other contexts. Specifically, the
rubric does not relate to the idealizing process occurring in the real world only, but solely to the processes
connecting the real world to the mathematical world, i.e., formulation and interpretation, as well as the
process occurring in the mathematical world only, i.e., employment. While the rubric’s reliability is
determined on the basis of mathematical problems in a scientific-engineering context only, the present
study was validated on the basis of the different proficiency levels, which are general and relate to
authentic problems in different contexts. Thus, we believe that the rubric is suitable for assessment of
authentic mathematical problems in different contexts. Nonetheless, we encourage further research in this
regard, for scientifically exploring how the rubric can be applied to various applications and situations.
We suggest, for example, examining this in a follow-up study on the basis of concrete examples from different contexts, such as those mentioned in the PISA conceptual framework: the personal context, relating to aspects of daily life such as preparing food, games, personal health, sports and travel; the social context, relating to community aspects and focusing on problems of a community-related nature, such as voting in elections, public transport and economics; and the employment context, relating to aspects of the labor world, such as pricing, salaries, quality control and setting timetables.
Another limitation of the present study is that the idealizing process was carried out by the problem developers, as the problems were suited for formal mathematics lessons in school and thus focus on the move to the mathematical world (i.e., the formulation process), a mathematical solution matching the curriculum (employment), and the return to the real world (interpretation). Hence, the rubric described in this study is not aimed at assessing the idealizing level, but rather focuses on the three processes of formulation, employment and interpretation. We recommend that a follow-up study expand the rubric, enabling it to also assess 'open' modelling problems, including assessment of the idealizing stage.

To sum up, this study targets stakeholders who seek an understanding of what lies behind authentic mathematical modelling problems, both in their conceptualization and in the practical issues of effective design and implementation of such problems. This requires a useful measurement methodology that
enables clarity regarding what and how to measure in such problems. The present study presents the
design process of an innovative assessment tool—a rubric for assessing the mathematical modelling level
of authentic problems, allowing us to characterize and categorize mathematical modelling problems. This
study can contribute to the theoretical understanding of the connection between authentic mathematical
modelling problems in a scientific-engineering context, reflecting applied modelling, and different mathematical modelling levels expressing the perception of modelling as educational modelling.
The uniqueness of the rubric lies in the possibility of classifying the same problem in parallel according to the different modelling processes, with each process potentially representing a different modelling level. In addition, the
study provides an assessment tool, which may be used in the design and assessment process of authentic
mathematical modelling problems, shedding light on the main characteristics of authentic mathematical
modelling problems at different levels.
As a methodological contribution, the study attempts to define an effective validation and reliability
process of the rubric on the basis of mathematical modelling problems in a scientific-engineering context.
This process was conducted in two stages: first, by experts knowledgeable in the theoretical framework
of mathematical modelling with an academic background in science and engineering suitable for the
assessment of mathematical modelling problems in a scientific-engineering context; and second, by
mathematical education experts who regularly engage in designing mathematical modelling problems
in this context.
Practically, the rubric presented in this paper can benefit teachers by helping them understand how
authentic mathematical problems can be used to improve their students’ mathematical proficiency. An
additional contribution for teachers using these problems is the ability to identify questions or problems that invite the application of modelling competencies by students, as well as to revise existing questions (with the help of the rubric) so as to raise them to a level representing a high mathematical modelling level. The study may also be beneficial for problem designers, who can utilize the assessment rubric to evaluate the modelling level of a problem and revise it, so as to increase or decrease literacy levels as
needed. Moreover, it is also likely to be of relevance to researchers who can use the methodology as
a validated research tool for evaluation studies that aim to promote the body of knowledge regarding
mathematical modelling.

Acknowledgment
We would like to thank Dr. Ortal Nitzan, Mrs. Hadas Handelman, and all the graduate students who contributed to
the design of the rubric. Their role and contribution to this research are greatly appreciated.

References
Bakker, A. (2014) Characterising and developing vocational mathematical knowledge. Educ. Stud. Math., 86,
151–156.
Blum, W. (1996) Anwendungsbezüge im Mathematikunterricht—trends und perspektiven. Schrift. Didakt. Math.,
23, 15–38.
Blum, W. (2011) Can modelling be taught and learnt? Some answers from empirical research. Trends in Teaching
and Learning of Mathematical Modelling (G. Kaiser, W. Blum, R. Borromeo Ferri & G. Stillman eds).
Dordrecht: Springer, pp. 15–30.
Blum, W. & Leiß, D. (2006) “Filling up”—the problem of independence-preserving teacher interventions in lessons with demanding modelling tasks. In M. Bosch (Ed.), Proceedings of the Fourth Congress of the European
Society for Research in Mathematics Education (CERME 4). Barcelona, Spain: Universitat Ramon Llull
Editions, pp. 1623–1633.
Blum, W. & Niss, M. (1991) Applied mathematical problem solving, modelling, applications, and links to other
subjects—state, trends and issues in mathematics instruction. Educ. Stud. Math., 22, 37–68.
Blum, W., Galbraith, P., Henn, H.-W. & Niss, M. (eds.) (2007) Modelling and Applications in Mathematics
Education. New York: Springer.
Blum, W. (2015) Quality teaching of mathematical modelling: What do we know, what can we do? In The Proceedings of the 12th International Congress on Mathematical Education. Cham: Springer, pp. 73–96.
Borromeo Ferri, R. (2018) Learning How to Teach Mathematical Modeling in School and Teacher Education.
Cham: Springer.
Common Core State Standards Initiative. (2010). Common Core State Standards for mathematics. Retrieved
from http://www.corestandards.org/assets/CCSSI_Math%20Standards.pdf.
Damlamian, A., Rodrigues, J. F. & Sträßer, R. (eds.) (2013) Educational Interfaces Between Mathematics and
Industry: Report on an ICMI-ICIAM-Study. New York: Springer.
De Lange, J. (2006) Mathematical literacy for living from OECD-PISA perspective. Tsukuba Journal of Edu-
cational Study in Mathematics. Vol. 25. Special Issue on The APEC-TSUKUBA International Conference
Innovative Teaching Mathematics through Lesson Study. Tokyo, Japan: University of Tsukuba, pp. 13–35.
Fleiss, J. L. & Cohen, J. (1973) The equivalence of weighted kappa and the intraclass correlation coefficient as
measures of reliability. Educ. Psychol. Meas., 33, 613–619.
Giberti, C. & Maffia, A. (2020) Mathematics educators are speaking about PISA, aren’t they? Teach. Math. Appl.
Int. J. IMA, 39, 266–280.
Hayes, A. F. & Krippendorff, K. (2007) Answering the call for a standard reliability measure for coding data.
Commun. Methods Meas., 1, 77–89.
Kaiser, G. (1995) Realitätsbezüge im Mathematikunterricht—Ein Überblick über die aktuelle, und historische
Diskussion. Materialien für einen realitätsbezogenen Mathematikunterricht (G. Graumann et al. eds). Bad
Salzdetfurth: Franzbecker, pp. 66–84.
Kaiser, G. & Sriraman, B. (2006) A global survey of international perspectives on modelling in mathematics
education. Zentral. Didak. Math., 38, 302–310.
Kaiser, G., Bracke, M., Göttlich, S. & Kaland, C. (2013) Authentic complex modelling problems in mathemat-
ics education. Educational Interfaces Between Mathematics and Industry (A. Damlamian, J. F. Rodrigues
& R. Strässer eds). London, United Kingdom: Springer, pp. 287–297.
Kohen, Z. & Orenstein, D. (2021) Mathematical modeling of tech-related real-world problems for secondary
school-level mathematics. Educ. Stud. Math., 107, 71–91.
Lesh, R. & Doerr, H. M. (eds.) (2003) Beyond Constructivism: Models and Modeling Perspectives on Mathematics
Problem Solving, Learning, and Teaching. Mahwah, NJ: Erlbaum.
Li, T. (2013) Mathematical modeling education is the most important educational interface between mathe-
matics and industry. Educational Interfaces Between Mathematics and Industry: Report on an ICMI-
ICIAM-Study (A. Damlamian, J. F. Rodrigues & R. Strässer eds). London, United Kingdom: Springer,
pp. 51–58.
Lingefjärd, T. (2011) Modelling from primary to upper secondary school: findings of empirical research—
overview. Trends in Teaching and Learning of Mathematical Modelling (G. Kaiser, W. Blum, R. Borromeo
Ferri & G. Stillman eds), vol. 1. Dordrecht: Springer Netherlands, pp. 9–14.
Maaß, K. (2005) Modellieren im Mathematikunterricht der Sekundarstufe I. J. Math.-Didak., 26, 114–142.
Maaß, K. (2006) What are modelling competencies? ZDM—Int. J. Math. Educ., 38, 113–142.
Maaß, J., O'Meara, N., Johnson, P. & O'Donoghue, J. (2018) Mathematical Modelling for Teachers: A Practical
Guide to Applicable Mathematics Education. Cham: Springer.
National Research Council (2001) Adding it up: helping children learn mathematics. Mathematics Learn-
ing Study Committee. Center for Education. Division of Behavioral and Social Sciences and Education
(J. Kilpatrick, J. Swafford & B. Findell eds). Washington, DC: National Academy Press.

Niss, M., Blum, W. & Galbraith, P. (2007) Introduction. Modelling and Applications in Mathematics
Education. The 14th ICMI Study (W. Blum, P. Galbraith, H.-W. Henn & M. Niss eds). New York, NY:
Springer Science + Business Media, LLC, pp. 3–32.
Organisation for Economic Cooperation and Development (OECD) (2013). PISA 2012 Assessment and
Analytical Framework. Paris: OECD Publishing, https://doi.org/10.1787/9789264190511-en.
Organisation for Economic Cooperation and Development (OECD) (2018). PISA 2021 Mathematics
Framework (Draft). Retrieved from http://www.oecd.org/pisa/publications/
Organisation for Economic Cooperation and Development (OECD) (2019). PISA 2018 Assessment and Analytical Framework. PISA, OECD Publishing, Paris, https://doi.org/10.1787/b25efab8-en.
Schukajlow, S., Kaiser, G. & Stillman, G. (2021) Modeling from a cognitive perspective: theoretical con-
siderations and empirical contributions. Math. Think. Learn., 1–11. https://doi.org/10.1080/10986065.2021.
2012631.
Shapir, Y. (2013) Lessons from the Iron Dome. Military Strateg. Affairs, 5, 81–94.
Stacey, K. (2015) The real world and the mathematical world. Assessing Mathematical Literacy: The PISA
Experience (K. Stacey & R. Turner eds). New York: Springer, pp. 57–84.
Stacey, K. & Turner, R. (2015) The evolution and key concepts of the PISA mathematics frameworks. Assessing
Mathematical Literacy: The PISA Experience (K. Stacey & R. Turner eds). New York: Springer, pp. 5–34.
Vos, P. (2013) Assessment of modelling in mathematics examination papers: ready-made models and reproductive
mathematising. Teaching Mathematical Modelling: Connecting to Research and Practice (G. A. Stillman
et al. eds). Dordrecht, the Netherlands: Springer, pp. 479–488.
Wess, R., Klock, H., Siller, H. S. & Greefrath, G. (2021) Measuring professional competence for the teaching
of mathematical modelling. Mathematical Modelling Education in East and West (F. K. S. Leung, G. A.
Stillman, G. Kaiser & K. L. Wong eds). Cham: Springer, pp. 249–260.
Yu, S. Y. & Chang, C. K. (2011) What did Taiwan mathematics teachers think of model-eliciting activities and
modelling teaching? Trends in Teaching and Learning of Mathematical Modelling (G. Kaiser, W. Blum, R.
Borromeo Ferri & G. Stillman eds). New York, NY: Springer, pp. 147–156.

Zehavit Kohen, Ph.D., is an Assistant Professor at the Faculty of Education in Science and Technology at the
Technion and the head of the Mathematics Teachers Education & Development lab. Dr Kohen is the academic
director of the i-MAT program. Her research focuses on the design and integration of i-MAT materials in professional
communities of leading teachers and of teachers who integrate these materials in their classes, as well as the effect
on students' learning. Her doctoral research (Summa Cum Laude, 2011) focused on developing pedagogical self-
regulation at preservice teachers in a technological environment, supported by reflection in different foci. During
the academic year 2015–2016, she was a Visiting Scholar at the Center to Support Excellence in Teaching at
Stanford University, where she investigated the professional development of early-career mathematics teachers who
participated in the Hollyhock Fellowship Program. During 2012–2017, she was a researcher at the Technion
Research and Development Foundation and at the Neaman Institute, Israel. Her research work focused on assessment
of STEM education and on the choice of and retention in STEM careers.

Yasmin Gharra-Badran is a member of the i-MAT assessment team. She is an MA student in the Faculty of
Education in Science and Technology and is working on her thesis in mathematics education under the supervision
of Dr Zehavit Kohen. Her research focuses on the development and validation of a unique rubric for assessing the
quality of mathematical modelling problems, which are embedded in hi-tech and technology real-world situations.

Today, Yasmin is a mathematics teacher and coordinator in the Sindiana school in Givat Haviva. She teaches
students in grades 10–12 and prepares them for the 5-point matriculation exam in mathematics. Yasmin has a BSc
in mathematics, statistics and operations research from the Technion.
