
Republic of the Philippines

DEPARTMENT OF EDUCATION
Region IV-A CALABARZON
DIVISION OF RIZAL

DIVISION GUIDELINES ON THE EVALUATION OF TRAINING EFFECTIVENESS

I. RATIONALE

In pursuit of continuous improvement in the implementation of the Schools Division
Office of Rizal Quality Management System, the Training Education Committee (TEC), in
collaboration with the Human Resource Development (HRD) and School Management Monitoring
and Evaluation (SMME) sections, initiated a series of capacity-building and training activities for
teaching and non-teaching personnel. The SDO Rizal has established quality assurance
monitoring and evaluation that measures training delivery based on the feedback of the
participants. This ensures the quality and effectiveness of learning activities and sustains quality
service delivery. However, there is a need for tools and guidelines that measure
training effectiveness. Hence, SDO Rizal hereby crafts the Division Guidelines on the
Evaluation of Training Effectiveness.
The Division Guidelines on the Evaluation of Training Effectiveness is anchored on
Department of Education Order No. 32, s. 2011, also known as the Policies and Guidelines on
Training and Development Programs and Activities, which states that "at the Division level,
trainings, workshops and conferences shall be conducted to respond to the competencies of
the Schools Division Office (SDO) target personnel"; thus, measuring effectiveness is
imperative.
In view of the foregoing, these guidelines aim to present a tool which can be used
to evaluate trainings/seminars; gather information on how effective the training has been; and
upgrade training practices in accordance with the feedback from the proponents and
participating individuals.

II. SCOPE OF THE POLICY

These guidelines provide information and direction for measuring the effectiveness of
division-initiated trainings, covering trainings for both teaching and non-teaching
personnel. Program proponents from other agencies who opt to utilize this tool may
adopt it upon securing approval from the SDO Rizal.
III. DEFINITION OF TERMS

For the purposes of these guidelines, the following words and phrases are defined
conceptually and operationally:

Competence and Skills Tool refers to an instrument which measures training
effectiveness in terms of the participants' self-management, professionalism and ethics,
teamwork, innovation, and service orientation.

Customer refers to the end-user.

Customer Satisfaction refers to the end-user’s feedback on the performance and service
delivery of the training-participant.

Customer Satisfaction Tool refers to an instrument which measures the performance and
service delivery of the training-participant based on the feedback of the customer.

Indicators refer to the concrete, observable, and measurable performance of the training-
participants.

Program Proponent refers to the person who spearheads the training program.

Training Effectiveness refers to the extent to which training objectives are achieved and
carried out.

Training Effectiveness Monitoring and Evaluation Team (TEMET) refers to a group of
people who monitor, consolidate, and analyze the results of the accomplished training
effectiveness tools.

Training Effectiveness Tools refer to the instruments which measure the participants'
performance and service delivery. These include the Competence and Skills Tool, Customer
Satisfaction Tool, and Service Delivery Tool.

Quality Management System (QMS) refers to the collection of processes focused on
consistently meeting customers' requirements and enhancing their satisfaction.

Service Delivery Tool refers to the instrument which measures the participants' service
delivery, as evaluated by the immediate superior.

IV. POLICY STATEMENT

This Division Guidelines on the Evaluation of Training Effectiveness adheres to the
SDO Rizal Quality Policy, as it involves processes which promote the continuous
improvement of basic education service delivery to meet the needs and expectations of the
customer.
To sustain the quality of service delivery, SDO Rizal crafted the training effectiveness
tools, namely: the Competence and Skills Tool, Customer Satisfaction Tool, and Service Delivery
Tool. These instruments measure the extent to which training objectives are achieved and
carried out.

V. PROCEDURES

To facilitate the conduct of the evaluation of training effectiveness, the following provisions
shall be observed:

Provision: Duration
Description: Evaluation shall commence six months after the training and shall end within
fifteen working days.

Provision: Number of Target Participants
Description: 1-30 participants (100%); 31 and above (at least 75%, using convenience sampling)

Provision: Composition of TEMET
Description: Chairperson, Deputy Chairperson, Members

These guidelines provide comprehensive procedures for the conduct of the evaluation of
training effectiveness, as reflected below:

Step 1: Inform the concerned authority on the conduct of evaluation.

Step 2: Evaluate the training-participant/personnel.

Step 3: Get the data from the accomplished training effectiveness tools.

Step 4: Submit the result of the evaluation to the Training Proponent.


Step 5: Present the Certification on the conduct of Evaluation of Training
Effectiveness with the final results.

Process Map on the Evaluation of Training Effectiveness (See Appendix __ ).

VI. CONTENTS AND GUIDELINES

This Division Guidelines on the Evaluation of Training Effectiveness (DGETE) presents


three different tools as follows:

 Competence and Skills Tool (See Appendix 1)
 Service Delivery Tool (See Appendix 2)
 Customer Satisfaction Tool (See Appendix 3)

Who will evaluate?

The designated QATAME Team/Associate of a particular training conducted will carry
out the evaluation of training effectiveness using the customized tools.

When to evaluate?

These guidelines put forward a timeline for when a training shall be evaluated
utilizing the customized tools. As directed, each training shall be evaluated for its
effectiveness three months after it is conducted.

What percentage of training participants shall be evaluated?

For this purpose, the following percentages of participants, based on the total number of
participants, shall be evaluated:

Number of Participants   Percent
1 – 30                   100%
31 and above             at least 75%
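The sampling rule above can be sketched as a small computation. This is a minimal illustration, not part of the guidelines: the helper name `required_evaluees` is hypothetical, and rounding the 75% case up to the next whole participant is an assumption, since the guidelines do not state how fractional counts are handled.

```python
import math

def required_evaluees(participants: int) -> int:
    """Minimum number of training participants to evaluate.

    Per the guidelines: 1-30 participants -> all of them (100%);
    31 and above -> at least 75%, via convenience sampling.
    Ceiling rounding is an assumption (not stated in the guidelines).
    """
    if participants <= 30:
        return participants
    return math.ceil(participants * 0.75)

print(required_evaluees(25))  # 25 (all participants evaluated)
print(required_evaluees(40))  # 30 (75% of 40)
```

For example, a training with 31 participants would require at least 24 evaluees under this reading, since 75% of 31 is 23.25, rounded up.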

VII. MONITORING AND EVALUATION

Effective compliance with the implementation of this Division Guidelines on the
Evaluation of Training Effectiveness shall be ensured. Thus, the Schools Division Office of Rizal,
through the Quality Management System - Training Education Committee and the School
Governance and Operations Division - Human Resource Development unit, is directed to
conduct monitoring, document the process, and provide technical assistance.

VIII. EFFECTIVITY
This Division Guidelines on the Evaluation of Training Effectiveness shall take effect
immediately upon issuance of the Division Memorandum of widest dissemination and shall
remain in force and effect unless sooner repealed, amended, or rescinded.

IX. REFERENCES

RA 10533, Enhanced Basic Education Act of 2013

DepEd Order No. 42, s. 2017, National Adoption and Implementation of the Philippine
Professional Standards for Teachers (PPST)

DepEd Order No. 35, s. 2016, The Learning Action Cell as a K to 12 Basic Education
Program School-Based Continuing Professional Development Strategy for the Improvement
of Teaching and Learning

DepEd Order No. 11, s. 2015, Results-based Performance Management System (RPMS)

DepEd Order No. 32, s. 2011, Policies and Guidelines on Training and Development
(T&D) Programs and Activities

DepEd Order No. 44, s. 2010, Adoption of KRT 3: Quality Assurance and Accountability
Framework (QAAF)
PROCESS MAP ON THE EVALUATION OF TRAINING EFFECTIVENESS

FLOWCHART: START

FLOWCHART: Inform the concerned authority on the conduct of evaluation
PERSONNEL INVOLVED: TEMET; Training Proponent
DETAILS:
 Informs the concerned authority (i.e., PSDS, School Head, Unit Head, etc.) on the
conduct of the evaluation on training effectiveness through the issuance of a notice
(See Appendix 4: Notice of Conduct of Evaluation of Training Effectiveness).

FLOWCHART: Decision: Will the evaluation push through?
PERSONNEL INVOLVED: Training Proponent
DETAILS:
 If YES, the Training Proponent submits and presents the Confirmation Form
(See Appendix 4).
 If NO, the Training Proponent submits a request for rescheduling within five
working days.

FLOWCHART: Evaluate the training-participant/personnel
PERSONNEL INVOLVED: TEMET; Training Proponent; Evaluators; Concerned Authority
DETAILS:
 Evaluate the training-participant/personnel using the training effectiveness tools:
- Competence and Skills Tool (See Appendix 1): self-evaluation
- Service Delivery Tool (See Appendix 2): to be evaluated by the immediate
superior (PSDS, School Head, Unit Head, etc.)
- Customer Satisfaction Tool (See Appendix 3): to be evaluated by the customers
(peers, learners, stakeholders, teachers, etc.)

FLOWCHART: Get the data from the accomplished training effectiveness tools
PERSONNEL INVOLVED: TEMET; Concerned Authority; Evaluators
DETAILS:
 Conduct monitoring using the training effectiveness tools.
 Get the data of the accomplished training effectiveness tools from the evaluators.

FLOWCHART: Submit to the Training Proponent
PERSONNEL INVOLVED: TEMET; Training Proponent
DETAILS:
 Submit the data gathered from the evaluation.
 Analyze the data with the training proponent.
 Submit recommendations for further improvement of the training
(See Appendices 5.1, 5.2, 5.3).

FLOWCHART: Present the Certification on the conduct of Evaluation of Training
Effectiveness with the final results
PERSONNEL INVOLVED: TEMET; Training Proponent
DETAILS:
 Presents the Certification to the Training Proponent (See Appendix 6).

FLOWCHART: END
APPENDICES

Appendix 1 – Competence and Skills Tool

Appendix 2 – Service Delivery Tool

Appendix 3 – Customer Satisfaction Tool

Appendix 4 – Notice of Conduct of Evaluation of Training Effectiveness and Confirmation Form

Appendices 5.1, 5.2, 5.3 – Recommendation Forms for the Further Improvement of Training

Appendix 6 – Certification on the Conduct of Evaluation of Training Effectiveness