Steps in Implementing Responsive Model

This document outlines the steps in the responsive evaluation model: engaging stakeholders through open dialogue, defining the evaluation scope based on their input, understanding the program activities, identifying and conceptualizing issues, determining the necessary data, selecting observers and instruments, collecting and analyzing data, communicating findings, and interpreting results on an ongoing basis.


1. Talking with clients, program staff and audiences.

This is an integral component of the evaluation, since the research questions and scope of the evaluation emerge from these discussions with the different stakeholders. In addition, communication with stakeholders remains pivotal throughout all subsequent steps.
2. Identifying the program/evaluation scope.
This is done from the perspective of the different stakeholders involved in the program through
qualitative discussions.
3. Overview of program activities.
These are gathered through qualitative techniques, which include, but are not limited to, talking to stakeholders and observing program implementation.
4. Discovering purposes and concerns (issues).
This is done primarily through observations or interviews with stakeholders. A significant amount
of time is dedicated to discovering issues and this is done throughout the course of the evaluation.
5. Conceptualizing issues and problems.
This is a key feature of responsive evaluation, where issues raised by stakeholders are used to inform the criteria for the evaluation as they emerge. These issues should be linked to the underlying value systems of the program in order to enable discussion and mutual understanding among stakeholders.
6. Identifying data needs.
Using the conceptualization of issues to guide the data collection plan, the evaluator needs to identify the data that will explain the issues raised. Different stakeholders will have different information needs, and these have to be taken into account in the evaluation structure.
7. Selecting observers and judging instruments.
In this step, the evaluator identifies and selects the appropriate observers and instruments to
gather the necessary data. Observers can include the evaluator, program staff, clients, or other
stakeholders who can provide valuable insights and perspectives on the program. Judging
instruments may consist of interview guides, observation protocols, surveys, or other tools that
align with the identified data needs and help capture the required information. The selection of
observers and instruments should be based on their relevance to the issues and their ability to
provide meaningful data.
8. Observe designated antecedents, transactions, and outcomes.
This step involves conducting the actual observation and data collection process. The evaluator
and selected observers use the chosen instruments to gather information on the program's
antecedents (conditions or events preceding the program), transactions (activities and
interactions within the program), and outcomes (results or changes resulting from the program).
Observations can be made through various methods such as direct observation, participant
observation, or document review. The evaluator should ensure that the observations are
systematic, objective, and well-documented to support the evaluation findings.
9. Thematize: prepare portrayals, case studies.
Thematizing involves preparing depictions of the different issues, drawn from different people's experiences, for specific audiences in order to create an understanding of the issues from various perspectives. The reactions of the audience are included in the evaluator's report.
10. Validation, confirmation or attempts to disconfirm.
This happens throughout the course of the evaluation and gives the evaluator and the stakeholders an opportunity to learn and, most importantly, to adjust the evaluation criteria.
11. Winnow format for audience use.
In this step, the evaluator organizes and presents the evaluation findings in a format that is
accessible and meaningful to the intended audience. This involves reviewing and selecting the
most relevant and significant information from the collected data and thematized portrayals. The
evaluator should consider the audience's needs, preferences, and level of understanding when
deciding on the format. This may include executive summaries, visual presentations, interactive
workshops, or other engaging formats that facilitate understanding and use of the evaluation
results. The goal is to ensure that the findings are clearly communicated and can be easily
understood and applied by the stakeholders.
12. Assemble formal reports if any.
In responsive evaluation, the interpretation and dissemination of findings are ongoing processes and are carried out through appropriate reporting channels, as agreed at all stages of the evaluation process (including the end of the evaluation period).

STEPS IN RESPONSIVE EVALUATION MODEL

The steps below are illustrated with the example of an evaluation of an English language teaching program at a language school; for each step, the concrete example is followed by a fuller description of what the step involves.

1. Talking with clients, program staff, and audiences: The evaluator engages in conversations with
the language school administrators, teachers, students, and parents to gather their perspectives
on the program's goals, curriculum, teaching methods, and learning outcomes.

This step involves engaging in open and inclusive dialogue with all
stakeholders to gather their perspectives, expectations, and concerns
regarding the program. The evaluator should create a safe and welcoming
environment that encourages honest and constructive communication.
Active listening, empathy, and the ability to facilitate meaningful
discussions are crucial skills for the evaluator in this phase. The insights
gained from these conversations will help shape the evaluation questions,
scope, and approach.

2. Identifying the program/evaluation scope: Based on stakeholder input, the evaluator defines the
focus areas, such as assessing the effectiveness of the curriculum in developing students' language
skills, the quality of teaching, and student satisfaction.

Based on the stakeholder discussions, the evaluator works collaboratively with the clients and program staff to define the boundaries and focus areas of the evaluation. This involves clarifying the program's objectives, target population, timeframe, and available resources. The evaluator should also consider any external factors or constraints that may impact the evaluation scope. A clear and agreed-upon scope sets the foundation for a focused and manageable evaluation process.

3. Overview of program activities: The evaluator collects information about the program's key
components, such as classroom instruction, language labs, extracurricular activities, and
assessment practices, through document review, observations, and interviews with staff and
students.

The evaluator gathers information about the program's key activities, interventions, and processes through various qualitative techniques. This may include reviewing program documents, observing program implementation, and conducting interviews with program staff and participants. The evaluator should aim to gain a comprehensive understanding of how the program operates in practice, identifying any gaps between the intended design and actual implementation.

4. Discovering purposes and concerns (issues): Through stakeholder interactions, the evaluator
uncovers issues such as the need for more diverse teaching materials, concerns about student
engagement, or the desire for more individualized feedback on language progress.

Through ongoing interactions with stakeholders, the evaluator identifies and explores the underlying purposes, values, and concerns that drive the program. This step involves deep listening and probing to uncover the stakeholders' motivations, aspirations, and any issues or challenges they face. The evaluator should create an atmosphere of trust and confidentiality to encourage stakeholders to share their honest perspectives. Discovering these issues forms the basis for developing evaluation criteria that are meaningful and relevant to the program context.

5. Conceptualizing issues and problems: The evaluator analyzes the identified issues, linking them
to language acquisition theories, pedagogical approaches, and student motivation and learning
styles.

In this step, the evaluator analyzes and synthesizes the identified issues and
concerns, linking them to the program's underlying value systems. This
involves looking for patterns, themes, and relationships among the various
perspectives and experiences shared by stakeholders. The evaluator should
aim to develop a conceptual framework or map that represents the key
issues and their interconnections. This conceptualization helps to guide the
evaluation design and facilitates a shared understanding among
stakeholders.

6. Identifying data needs: The evaluator determines the data required, such as student language
proficiency scores, teacher qualifications and experience, classroom observation ratings, and
student and parent satisfaction surveys.

Based on the conceptualized issues and evaluation questions, the evaluator determines the specific data required to address the stakeholders' information needs. This involves considering the different types of data (quantitative, qualitative, or mixed), sources (primary or secondary), and methods (surveys, interviews, observations, etc.) that will provide the most relevant and credible evidence. The evaluator should also assess the feasibility and appropriateness of data collection methods within the given context and resources.
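
As a purely illustrative sketch (not part of the responsive model itself), the link between conceptualized issues and data needs can be recorded as a simple plan that the evaluator reviews with stakeholders. The issues, data sources, and methods below are hypothetical examples for the language-program case:

# Illustrative sketch only: a minimal "issue -> data needs" plan kept as plain
# Python data. The issues, sources, and methods are hypothetical examples.

data_needs_plan = [
    {
        "issue": "Student engagement in classroom activities",
        "data": "Classroom observation ratings; student focus-group notes",
        "source": "Teachers, students",
        "method": "Observation protocol, focus groups",
    },
    {
        "issue": "Progress in language proficiency",
        "data": "Pre- and post-course proficiency test scores",
        "source": "Student assessment records",
        "method": "Standardized proficiency test",
    },
    {
        "issue": "Parent satisfaction with feedback on progress",
        "data": "Satisfaction survey responses",
        "source": "Parents",
        "method": "Questionnaire",
    },
]

# Print the plan as a simple checklist the evaluator can review with stakeholders.
for item in data_needs_plan:
    print(f"Issue:  {item['issue']}")
    print(f"  Data:   {item['data']}")
    print(f"  Source: {item['source']}")
    print(f"  Method: {item['method']}")
    print()

Keeping the plan in this explicit form makes it easy to check, with each stakeholder group, that their information needs are covered before data collection begins.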

7. Selecting observers and judging instruments: The evaluator selects language assessment experts
and experienced teachers as observers, using standardized language proficiency tests, classroom
observation protocols, and survey instruments to collect data.

In this step, the evaluator identifies and selects the appropriate observers
and instruments to gather the necessary data. Observers can include the
evaluator, program staff, clients, or other stakeholders who can provide
valuable insights and perspectives on the program. Judging instruments
may consist of interview guides, observation protocols, surveys, or other
tools that align with the identified data needs and help capture the required
information. The selection of observers and instruments should be based on
their relevance to the issues and their ability to provide meaningful data.

8. Observe designated antecedents, transactions, and outcomes: The evaluator and observers
collect data on the program's inputs (e.g., curriculum, teaching resources), processes (e.g.,
classroom interactions, language practice activities), and outcomes (e.g., student language
proficiency, confidence in using English).

This step involves conducting the actual observation and data collection
process. The evaluator and selected observers use the chosen instruments
to gather information on the program's antecedents (conditions or events
preceding the program), transactions (activities and interactions within the
program), and outcomes (results or changes resulting from the program).
Observations can be made through various methods such as direct
observation, participant observation, or document review. The evaluator
should ensure that the observations are systematic, objective, and well-
documented to support the evaluation findings.
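
A minimal sketch of how observation records might be sorted into Stake's three categories is shown below; the records themselves are invented examples rather than findings from any actual evaluation:

# Illustrative sketch only: grouping observation records by Stake's three
# categories (antecedents, transactions, outcomes). The records are invented.

from collections import defaultdict

records = [
    ("antecedent", "Curriculum documents specify a communicative teaching approach"),
    ("antecedent", "Teachers hold recognized language-teaching qualifications"),
    ("transaction", "Students complete paired speaking activities in most lessons"),
    ("transaction", "Language lab sessions run twice per week"),
    ("outcome", "Average proficiency scores rose between intake and end of term"),
    ("outcome", "Students report greater confidence using English outside class"),
]

grouped = defaultdict(list)
for category, note in records:
    grouped[category].append(note)

for category in ("antecedent", "transaction", "outcome"):
    print(category.upper())
    for note in grouped[category]:
        print("  -", note)

Organizing field notes this way keeps the antecedent-transaction-outcome distinction visible when the data are later thematized and reported.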

9. Thematize: prepare portrayals, case studies: The evaluator analyzes the data, identifying themes
such as the effectiveness of communicative language teaching methods, the impact of technology
on language learning, or the role of cultural context in language acquisition. Case studies of
successful language learners could be developed to illustrate the program's impact.

In this step, the evaluator analyzes and organizes the collected data into
meaningful themes, patterns, and stories that illuminate the program's
experiences and outcomes. This may involve developing rich narratives,
case studies, or other portrayals that capture the essence of the stakeholders'
perspectives and the program's impact. The evaluator should strive to
present the findings in a way that is engaging, accessible, and resonates with
the intended audiences. Incorporating the audience's reactions and
feedback into the final report enhances the evaluation's credibility and
usefulness.
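
The sketch below illustrates one simple way coded interview excerpts could be tallied and grouped under broader themes; the codes, themes, and excerpts are hypothetical, and in practice thematizing remains an interpretive rather than a mechanical task:

# Illustrative sketch only: tallying qualitative codes and grouping coded
# interview excerpts under broader themes. All codes, themes, and excerpts
# are hypothetical examples.

from collections import Counter, defaultdict

# (code, excerpt) pairs produced while reviewing interview transcripts
coded_excerpts = [
    ("technology", "The language lab games keep the younger students motivated."),
    ("technology", "Online practice tasks are rarely followed up in class."),
    ("feedback", "I would like more individual comments on my speaking."),
    ("engagement", "Group discussions are the part of class students enjoy most."),
    ("feedback", "Parents said report cards do not explain progress clearly."),
]

# A simple mapping from codes to broader themes
theme_of = {
    "technology": "Role of technology in language learning",
    "feedback": "Individualized feedback on progress",
    "engagement": "Student engagement and motivation",
}

counts = Counter(code for code, _ in coded_excerpts)
print("Code frequencies:", dict(counts))
print()

by_theme = defaultdict(list)
for code, excerpt in coded_excerpts:
    by_theme[theme_of[code]].append(excerpt)

for theme, excerpts in by_theme.items():
    print(f"{theme} ({len(excerpts)} excerpts)")
    for text in excerpts:
        print("  -", text)

A grouping like this can then be turned into the narrative portrayals or case studies presented to the audience.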

10. Validation, confirmation, or attempts to disconfirm: The evaluator shares the preliminary findings
with stakeholders, such as teachers and students, to validate the interpretations and gather
additional insights or alternative explanations.

Throughout the evaluation process, the evaluator actively seeks to validate or confirm the emerging findings and interpretations. This involves sharing preliminary results with stakeholders, soliciting their feedback, and looking for additional evidence that supports or challenges the initial conclusions. The evaluator should remain open to alternative explanations and be willing to revise the findings based on new insights or data. This iterative process of validation helps to strengthen the evaluation's rigor and ensures that the final conclusions are well-grounded and defensible.

11. Winnow format for audience use: The evaluator presents the evaluation results in formats
tailored to different audiences, such as a summary report for school administrators, a
presentation for teachers highlighting effective instructional strategies, or an interactive
workshop for parents on supporting their children's language learning at home.

In this step, the evaluator organizes and presents the evaluation findings in
a format that is accessible and meaningful to the intended audience. This
involves reviewing and selecting the most relevant and significant
information from the collected data and thematized portrayals. The
evaluator should consider the audience's needs, preferences, and level of
understanding when deciding on the format. This may include executive
summaries, visual presentations, interactive workshops, or other engaging
formats that facilitate understanding and use of the evaluation results. The
goal is to ensure that the findings are clearly communicated and can be
easily understood and applied by the stakeholders.

12. Assemble formal reports, if any: The evaluator prepares a comprehensive evaluation report,
including the methodology, findings, conclusions, and recommendations for enhancing the
English language teaching program. The report may address areas such as curriculum design,
teacher professional development, assessment practices, and resource allocation.

In the final step, the evaluator compiles the evaluation findings, insights,
and recommendations into a formal report, if required by the client or
stakeholders. The report should be clear, concise, and tailored to the needs
and preferences of the intended audience. It should include an executive
summary, methodology, key findings, conclusions, and actionable
recommendations. The evaluator should also consider other dissemination
formats, such as presentations, workshops, or interactive dashboards, to
facilitate the effective communication and utilization of the evaluation
results.
