
Cooking Up Rubrics

CONSTRUCTING USEFUL AND HIGH QUALITY RUBRICS




What Is A Rubric?

Rubrics provide the criteria for assessing students' work. They can be used to
assess virtually any product or behavior, such as essays, research reports,
portfolios, works of art, recitals, oral presentations, performances, and group
activities. Judgments can be self-assessments by students; or judgments can be
made by others, such as faculty, other students, fieldwork supervisors, and
external reviewers. Rubrics can be used to provide formative feedback to
students, to grade students, and/or to assess courses and programs.

There are two major types of scoring rubrics:
Holistic rubrics: one global, holistic score for a product or behavior
Analytic rubrics: separate, holistic scoring of each specified characteristic of a
product or behavior


Rubrics have many strengths, including:

Complex products or behaviors can be examined efficiently.


Developing a rubric helps to precisely define faculty expectations.
Well-trained reviewers apply the same criteria and standards.
Rubrics are criterion-referenced, rather than norm-referenced. Raters ask,
"Did the student meet the criteria for level 5 of the rubric?" rather than "How
well did this student do compared to other students?" This approach is more
compatible with cooperative and collaborative learning environments than
competitive grading schemes and is essential when using rubrics for program
assessment because you want to learn how well students have met your standards.
Ratings can be done by students to assess their own work, or they can be done
by others, e.g., peers, fieldwork supervisors, or faculty.

Sharon Green & Julie Marty-Pearson, WASC ARC 2014


Used with permission from Mary Allen & Amy Driscoll, WASC ALA 2011

Rubrics Can:

Speed up grading
Clarify expectations to students
Reduce student grade complaints
Make grading and assessment more efficient and effective by focusing the
faculty member on important dimensions
Help faculty create better assignments that ensure that students display what
you want them to demonstrate


Two Common Ways to Assess Learning Outcomes Using Rubrics

1. Assess while grading.
2. Collect evidence and assess in a group session.
____________________________________________________

Assessment vs. Grading Concerns

Grading rubrics may include criteria that are not related to the learning outcome
being assessed. These criteria are used for grading but are ignored for assessment.
Grading requires more precision than assessment.
If multiple faculty members will use the rubric for grading or assessment, consider
calibrating them so they apply the criteria consistently. This is especially
important when doing assessment.



DEVELOPING USEFUL RUBRICS:
QUESTIONS TO ASK AND ACTIONS TO IMPLEMENT

1. What criteria or essential elements must be present in the student's work to
ensure that it is high in quality?
Action: Place the criteria in rows and label them.

2. How many levels of achievement do I want to use?
Action: Place the levels as columns and label them.

3. What is a clear description of performance for each criterion at each level?
Action: Place the descriptions in the appropriate cells.

4. What are the consequences of performing at each level of quality?
Action: Include the consequences in the descriptions of the criteria.

5. What is the weighting scheme for grading with the rubric?
Action: Indicate the weights next to the criteria.

6. When I use the rubric, what aspects work well? What aspects need improvement?
Action: Revise accordingly.
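The six questions above describe a grid: criteria as labeled rows, achievement levels as labeled columns, a description in every cell, and weights attached to the criteria. As a minimal sketch of that structure (the criterion names, level labels, weights, and descriptions here are hypothetical examples, not prescribed by this handout):

```python
# Rubric as a grid: criteria in rows, achievement levels in columns,
# a performance description in every cell. All names and weights below
# are illustrative placeholders.
levels = ["Below Expectation", "Satisfactory", "Exemplary"]

rubric = {
    "Organization": {
        "weight": 0.30,  # question 5: weighting scheme for grading
        "descriptions": {
            "Below Expectation": "No apparent organization.",
            "Satisfactory": "The presentation has a focus.",
            "Exemplary": "The presentation is carefully organized.",
        },
    },
    "Content": {
        "weight": 0.50,
        "descriptions": {
            "Below Expectation": "Inaccurate or overly general.",
            "Satisfactory": "Generally accurate, but incomplete.",
            "Exemplary": "Accurate and complete.",
        },
    },
    "Delivery": {
        "weight": 0.20,
        "descriptions": {
            "Below Expectation": "Anxious; reads notes.",
            "Satisfactory": "Relaxed, but relies on notes.",
            "Exemplary": "Relaxed; engages listeners.",
        },
    },
}

# Question 3 calls for a clear description in every cell; flag any gap
# before the rubric is used (question 6: revise accordingly).
for criterion, row in rubric.items():
    for level in levels:
        assert row["descriptions"].get(level), (criterion, level)
```

Keeping the grid explicit like this makes question 6's revision step concrete: an empty cell, or weights that do not sum to 100%, shows up immediately.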




Creating a Rubric

1. Adapt an already-existing rubric.
2. Build one from scratch using the analytic method described below.

Drafting the Rubric

We generally find it easier to start at the extremes when drafting the criteria in
the rubric's cells, and then move up and down to draft the levels in the middle.
Starting at the lowest and highest cells, you ask:

What are the characteristics of an unacceptable product: the worst product
you could imagine, a product that results when students are very weak on the
outcome being assessed?
What are the characteristics of a product that would be exemplary: one that
would exceed your expectations and would result when the student is an expert
on the outcome being assessed?

Some words we find helpful:
(in)complete, (in)accurate, (un)reasonable, detailed, thorough, creative, original,
subtle, sophisticated, synthesizes, integrates, analyzes, minor/major conceptual
errors, flexibility, adaptability, complexity of thought, clarity, well-documented,
well-supported, professional, organized, insightful, relevant


How to Prepare Rubrics
Four Key Stages

First, we must choose the recipe we want to prepare and review it.
Do we have all of the ingredients?
What prep work will be needed?

Stage 1 Reflection (Pre-Design)*
Questions: Why was this assignment created?
Have I given it before?
How does it relate to the rest of the course?
What skills do students need to successfully complete the
assignment?
What are the parts of the assignment task?
What are the highest expectations I have for student
achievement of this assignment?
What would be the worst example of student achievement
of this assignment?

Second, we must do any preparation work the recipe calls for.
Are there items to be washed, chopped, or prepared ahead of time?

Stage 2 Specify Learning Outcomes Expected (1st step)*
Questions: What are the learning outcomes that are to be
demonstrated?
What are the skills, understandings, and attitudes of the
learning outcomes required to complete the assignment?
What has been the preparation for this task?
What is the course emphasis?
What are my highest expectations of this evidence of
achievement of the learning outcomes?

Third, we begin to put the ingredients together, mixing the dry
ingredients in one bowl and the wet ingredients in a separate bowl.


Stage 3 Grouping/Labeling/Organizing Expectations

Using criteria as categories for performance expectations,
organize similar expectations into groups with criteria as
labels for each (Analytic).

Another approach is to organize the performance
expectations under different learning outcomes with
those LOs as the labels and levels of performance as the
columns (Holistic).

Finally, after mixing all of the ingredients together, we put it in the pan
and then into the oven to bake!

Stage 4 Applying Criteria and Descriptions

Using the criteria, place the descriptions of performance
into lists under each criterion and place those in a grid for
use in constructing a scoring guide.

From there, descriptions (standards) may be written at
different performance levels and placed under labels such
as Exemplary, Competent, or Beginning, or A, B, C, D, F, or
Excellent, Satisfactory, and Unsatisfactory.

*Stages 1 & 2 are very powerful when conducted with learners! Everyone
learns from the processes.
Some find it easier to begin with the highest expectations and some find it
easier to describe the lowest performance descriptions first.

Experiment with your rubrics, and use rubrics of others to begin your
processes.

Stevens, S. & Levi, A. (2005). Introduction to Rubrics. Sterling, VA: Stylus.



Rating Scales
Typical Four-Point Rubric Levels

1. Below Expectations. Student's demonstrated level of understanding clearly
does not meet our expectations. Major ideas may be missing, inaccurate, or
irrelevant to the task.
2. Needs Improvement. Student needs to demonstrate a deeper understanding
to meet our expectations, but does show some understanding; student may
not fully develop ideas or may use concepts incorrectly.
3. Meets Expectations. Student meets our expectations, performs at a level
acceptable for graduation, demonstrates good understanding, etc.
4. Exceeds Expectations. Student exceeds our expectations, performs at a
sophisticated level, identifies subtle nuances, develops fresh insights,
integrates ideas in creative ways, etc.



Rubric Category Labels

Below Expectations, Developing, Acceptable, Exemplary
Novice, Apprentice, Proficient, Expert
Emerging, Developing, Proficient, Insightful
Below Basic, Basic, Proficient, Advanced (AAC&U Board of Directors, Our
Students' Best Work, 2004)



Preparing Students to Succeed in Assessment
The Value of Rubrics

Consider assessment central to the purposes of teaching and learning.

Help students understand assessment.

Make assessment criteria clear.

Promote student confidence.

Provide safe opportunities for practice.

Give students feedback on their processes, strategies, and work tasks.

Help students manage their time.

Explain different forms of assessment.

Be ready to talk openly about assessment.

Adapted from Brown, Race, & Smith, 1996.


Involving Learners in Rubric Construction: Advantages

Clarity (prevents misunderstandings and misinterpretations)
Ownership (students become stakeholders in the assessment process)
Feedback (immediately assesses student learning)
Efficiency (students help with the task; the task is both pedagogical and
assessment focused)
Motivation (greater student involvement in assignment tasks)

Suggestions for Using Rubrics in Courses

1. Hand out the grading rubric with the assignment so students will know your expectations
and how they'll be graded.
2. Use a rubric for grading student work and return the rubric with the grading on it.
3. Develop a rubric with your students for an assignment or group project. Students can then
monitor themselves and their peers using agreed-upon criteria that they helped develop.
Many faculty find that students will create higher standards for themselves than faculty
would impose on them.
4. Have students apply your rubric to some sample products before they create their own.
Faculty report that students are quite accurate when doing this, and this process should
help them evaluate their own products as they are being developed. The ability to evaluate,
edit, and improve draft documents is an important skill.
5. Have students exchange paper drafts and give peer feedback using the rubric, then give
students a few days before the final drafts are turned in to you. You might also require that
they turn in the draft and scored rubric with their final paper.
6. Have students self-assess their products using the grading rubric and hand in the self-
assessment with the product; then faculty and students can compare self- and faculty-
generated evaluations.

REFLECTION TO DETERMINE EFFECTIVENESS OF RUBRIC

Does the rubric help me to distinguish among the levels of quality in
students' work?

Are there too many or too few levels of achievement specified?

Are the descriptions of performance incomplete or unclear?

Are there important aspects of the task missing from the rubric?

Do the criteria reflect the content or mastery of the knowledge associated
with the student work?

Is the process of achieving the learning outcome reflected in the rubric?

Will the rubric help students be successful in the learning and assessment
processes?

Will the rubric help students understand the assessment and evaluation
process?

Will the rubric provide useful guidance and feedback to students?

Adapted from Huba, M., & Freed, J. (2000). Learner-Centered Assessment on
College Campuses. Boston, MA: Allyn & Bacon.

COLLABORATIVE DEVELOPMENT OF RUBRICS BY
FACULTY SCHOLARS:

1. Begin with reviews of disciplinary and professional organizations' current
literature on student learning.

2. Review the research literature on learning in a discipline or field of study to
determine what current research reveals.

3. Derive criteria from previously submitted student work: work that represents
exemplary criteria and levels of achievement.

4. Use student work to develop a continuum of learning with markers
determining levels of achievement or developmental levels.

5. Interview students to probe their learning experiences. Let their experiences
inform the rubric.

6. Discuss colleagues' experiences and analyze their observations of student
learning for descriptions to place in the rubric.

7. Have colleagues take turns paraphrasing or summarizing the descriptions.

8. Pilot the rubric with more than one sample of student work and meet to
revise the rubric.
Adapted from Maki, P. (2004). Assessing for
Learning. Sterling, VA: Stylus.


Managing Group Readings

Before inviting colleagues to a group reading,
1. Collect the assessment evidence and remove identifying information.
2. Develop and pilot test the rubric.
3. Select exemplars of weak, medium, and strong student work.
4. Consider pre-programming a spreadsheet so data can be entered and analyzed
during the reading and participants can discuss results immediately.
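The pre-programmed spreadsheet in step 4 can be as simple as a frequency tally: count how often each rubric level is assigned on each dimension, so the distribution is ready to display the moment the reading ends. A sketch in Python (the dimension names and ratings are invented stand-ins):

```python
from collections import Counter

# One rating sheet per product read: dimension -> level assigned (1-3).
# These values are hypothetical stand-ins for what readers would enter.
ratings = [
    {"Organization": 3, "Content": 2, "Delivery": 3},
    {"Organization": 2, "Content": 2, "Delivery": 1},
    {"Organization": 3, "Content": 1, "Delivery": 2},
]

# Tally: for each dimension, how many products received each level.
summary = {}
for sheet in ratings:
    for dimension, level in sheet.items():
        summary.setdefault(dimension, Counter())[level] += 1

# Display the distribution for the group discussion.
for dimension, counts in summary.items():
    print(dimension, dict(counts))
```

Entering each sheet as it is handed in gives the group an up-to-date distribution at any point during the reading, which is all the "pre-programming" the handout asks for.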

Group Orientation and Calibration

1. Describe the purpose for the review, stressing how it fits into program
assessment plans. Explain that the purpose is to assess the program, not
individual students or faculty, and describe ethical guidelines, including
respect for confidentiality and privacy.
2. Describe the nature of the products that will be reviewed, briefly summarizing
how they were obtained.
3. Describe the scoring rubric and its categories. Explain how it was developed.
4. Explain that readers should rate each dimension of an analytic rubric
separately, and they should apply the criteria without concern for how often
each category is used.
5. Give each reviewer a copy of several student products that are exemplars of
different levels of performance. Ask each volunteer to independently apply the
rubric to each of these products, and show them how to record their ratings.
6. Once everyone is done, collect everyone's ratings and display them so
everyone can see the degree of agreement. The facilitator generally asks raters
to raise their hands when their score is announced, and results are displayed in
a simple chart.
7. Guide the group in a discussion of their ratings. There will be differences, and
this discussion is important to establish standards. Attempt to reach consensus
on the most appropriate rating for each of the products being examined by
inviting people who gave different ratings to explain their judgments. Usually
consensus is possible, but sometimes a split decision is developed, e.g., the
group may agree that a product is a 3-4 split because it has elements of both
categories.
8. Once the group is comfortable with the recording form and the rubric,
distribute the products and begin the data collection.
9. If you accumulate data as they come in and can easily present a summary to
the group at the end of the reading, you might end the meeting with a
discussion of five questions:
a. Are results sufficiently reliable?
b. What do the results mean? Are we satisfied with the extent of student
learning?
c. Who needs to know the results?
d. If we're disappointed with the results, how might we close the loop?
e. How might the assessment process, itself, be improved?
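For question (a), a quick first check is exact percent agreement: of all pairs of readers who rated the same product, what fraction gave identical scores? A sketch with invented scores, assuming every product was rated by several readers (chance-corrected indices such as Cohen's kappa are stricter, but this is an easy start):

```python
from itertools import combinations

# scores[product] = the ratings given by different readers (hypothetical data).
scores = {
    "essay_01": [3, 3, 2],
    "essay_02": [4, 4, 4],
    "essay_03": [2, 3, 2],
}

agree = total = 0
for product_ratings in scores.values():
    for a, b in combinations(product_ratings, 2):  # every pair of readers
        agree += (a == b)
        total += 1

# Share of reader pairs that matched exactly.
print(round(agree / total, 2))  # 0.56
```

If the rate is low, that is a signal to return to calibration (steps 5-7) before trusting the results.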


Adapting Assessment Rubrics for Assessing and Grading

Here's an assessment rubric: an analytic rubric with three dimensions for
assessing oral presentation skills.

Rubric for Assessing Oral Presentations


Organization
Below Expectation: No apparent organization. Evidence is not used to support
assertions.
Satisfactory: The presentation has a focus and provides some evidence which
supports conclusions.
Exemplary: The presentation is carefully organized and provides convincing
evidence to support conclusions.

Content
Below Expectation: The content is inaccurate or overly general. Listeners are
unlikely to learn anything or may be misled.
Satisfactory: The content is generally accurate, but incomplete. Listeners may
learn some isolated facts, but they are unlikely to gain new insights about the
topic.
Exemplary: The content is accurate and complete. Listeners are likely to gain
new insights about the topic.

Delivery
Below Expectation: The speaker appears anxious and uncomfortable, and reads
notes, rather than speaks. Listeners are largely ignored.
Satisfactory: The speaker is generally relaxed and comfortable, but too often
relies on notes. Listeners are sometimes ignored or misunderstood.
Exemplary: The speaker is relaxed and comfortable, speaks without undue
reliance on notes, and interacts effectively with listeners.

Alternative Format 1.
Points are assigned and used for grading, as shown below, and the categories
(Below Expectation, Satisfactory, Exemplary) can be used for assessment.
Faculty who share an assessment rubric might:
assign points in different ways, depending on the nature of their courses
decide to add more rows for course-specific criteria or comments.

Notice how this rubric allows faculty, who may not be experts on oral
presentation skills, to give detailed formative feedback to students. This feedback
describes present skills and indicates what students should do to improve.
Effective rubrics can help faculty reduce the time they spend grading and
eliminate the need to repeatedly write the same comments to multiple students.
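Because each point range in this format sits inside one assessment category, a single set of scores serves both purposes: add the points for a grade, and read each dimension's category off its range for assessment. A sketch using the point ranges shown in this format (the student's scores are hypothetical):

```python
# Point ranges per dimension, in category order:
# Below Expectation, Satisfactory, Exemplary.
ranges = {
    "Organization": [(0, 4), (5, 6), (7, 8)],
    "Content": [(0, 8), (9, 11), (12, 13)],
    "Delivery": [(0, 5), (6, 7), (8, 9)],
}
categories = ["Below Expectation", "Satisfactory", "Exemplary"]

def assess(dimension, points):
    """Map a grading score back to its assessment category."""
    for (low, high), label in zip(ranges[dimension], categories):
        if low <= points <= high:
            return label
    raise ValueError(f"{points} out of range for {dimension}")

# Hypothetical student: total points feed the grade, per-dimension
# categories feed the program assessment.
scores = {"Organization": 6, "Content": 12, "Delivery": 5}
print(sum(scores.values()))                    # 23 of a possible 30
print(assess("Content", scores["Content"]))    # Exemplary
print(assess("Delivery", scores["Delivery"]))  # Below Expectation
```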

Rubric for Grading Oral Presentations


Organization (Score: ____)
Below Expectation (0-4 points): No apparent organization. Evidence is not used
to support assertions.
Satisfactory (5-6 points): The presentation has a focus and provides some
evidence which supports conclusions.
Exemplary (7-8 points): The presentation is carefully organized and provides
convincing evidence to support conclusions.

Content (Score: ____)
Below Expectation (0-8 points): The content is inaccurate or overly general.
Listeners are unlikely to learn anything or may be misled.
Satisfactory (9-11 points): The content is generally accurate, but incomplete.
Listeners may learn some isolated facts, but they are unlikely to gain new
insights about the topic.
Exemplary (12-13 points): The content is accurate and complete. Listeners are
likely to gain new insights about the topic.

Delivery (Score: ____)
Below Expectation (0-5 points): The speaker appears anxious and uncomfortable,
and reads notes, rather than speaks. Listeners are largely ignored.
Satisfactory (6-7 points): The speaker is generally relaxed and comfortable,
but too often relies on notes. Listeners are sometimes ignored or misunderstood.
Exemplary (8-9 points): The speaker is relaxed and comfortable, speaks without
undue reliance on notes, and interacts effectively with listeners.

Total Score: ____

Alternative Format 2.
Weights are used for grading; categories (Below Expectation, Satisfactory,
Exemplary) can be used for assessment. Individual faculty can determine how to
assign weights for their course grading. Faculty may circle or underline material
in the cells to emphasize criteria that were particularly important during the
assessment/grading, and they may add a section for comments or other grading
criteria.
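With this format a numeric grade can be computed as a weighted average of level scores. A sketch using the 30/50/20 weights from this format; scoring the levels 1-3 is an assumption borrowed from Alternative Format 4, not something this format specifies:

```python
# Weights per dimension (from the grading rubric). Levels are assumed to
# be scored 1 = Below Expectation, 2 = Satisfactory, 3 = Exemplary.
weights = {"Organization": 0.30, "Content": 0.50, "Delivery": 0.20}

def weighted_score(level_scores):
    """Weighted average of per-dimension level scores (1-3)."""
    assert set(level_scores) == set(weights), "score every dimension"
    return sum(weights[dim] * score for dim, score in level_scores.items())

# Hypothetical student: Satisfactory organization and delivery,
# Exemplary content; the result lands between 1.0 and 3.0.
print(round(weighted_score({"Organization": 2, "Content": 3, "Delivery": 2}), 2))
```

Individual faculty can change the weights for their own course grading without touching the shared assessment categories.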

Rubric for Grading Oral Presentations


Organization (Weight: 30%)
Below Expectation: No apparent organization. Evidence is not used to support
assertions.
Satisfactory: The presentation has a focus and provides some evidence which
supports conclusions.
Exemplary: The presentation is carefully organized and provides convincing
evidence to support conclusions.

Content (Weight: 50%)
Below Expectation: The content is inaccurate or overly general. Listeners are
unlikely to learn anything or may be misled.
Satisfactory: The content is generally accurate, but incomplete. Listeners may
learn some isolated facts, but they are unlikely to gain new insights about the
topic.
Exemplary: The content is accurate and complete. Listeners are likely to gain
new insights about the topic.

Delivery (Weight: 20%)
Below Expectation: The speaker appears anxious and uncomfortable, and reads
notes, rather than speaks. Listeners are largely ignored.
Satisfactory: The speaker is generally relaxed and comfortable, but too often
relies on notes. Listeners are sometimes ignored or misunderstood.
Exemplary: The speaker is relaxed and comfortable, speaks without undue
reliance on notes, and interacts effectively with listeners.

Comments: ____

Alternative Format 3.
Some faculty prefer to grade holistically, rather than through assigning numbers.
In this example, the faculty member checks off characteristics of the speech and
determines the grade based on a holistic judgment. The categories (Below
Expectation, Satisfactory, Exemplary) can be used for assessment.

Rubric for Grading Oral Presentations


Organization
Below Expectation: No apparent organization. Evidence is not used to support
assertions.
Satisfactory: The presentation has a focus. Student provides some evidence
which supports conclusions.
Exemplary: The presentation is carefully organized. Speaker provides
convincing evidence to support conclusions.

Content
Below Expectation: The content is inaccurate or overly general. Listeners are
unlikely to learn anything or may be misled.
Satisfactory: The content is generally accurate, but incomplete. Listeners may
learn some isolated facts, but they are unlikely to gain new insights about the
topic.
Exemplary: The content is accurate and complete. Listeners are likely to gain
new insights about the topic.

Delivery
Below Expectation: The speaker appears anxious and uncomfortable. Speaker
reads notes, rather than speaks. Listeners are largely ignored.
Satisfactory: The speaker is generally relaxed and comfortable. Speaker too
often relies on notes. Listeners are sometimes ignored or misunderstood.
Exemplary: The speaker is relaxed and comfortable. Speaker speaks without
undue reliance on notes. Speaker interacts effectively with listeners.

Comments: ____

Alternative Format 4.
Combinations of Various Ideas. As long as the nine assessment cells are used
in the same way by all faculty members, grading and assessment can be done
simultaneously. Additional criteria for grading can be added, as shown below.

Rubric for Grading Oral Presentations


Organization (Weight: 20%)
Below Expectation (1): No apparent organization. Evidence is not used to
support assertions.
Satisfactory (2): The presentation has a focus. Speaker provides some evidence
which supports conclusions.
Exemplary (3): The presentation is carefully organized. Speaker provides
convincing evidence to support conclusions.

Content (Weight: 40%)
Below Expectation (1): The content is inaccurate or overly general. Listeners
are unlikely to learn anything or may be misled.
Satisfactory (2): The content is generally accurate, but incomplete. Listeners
may learn some isolated facts, but they are unlikely to gain new insights about
the topic.
Exemplary (3): The content is accurate and complete. Listeners are likely to
gain new insights about the topic.

Delivery (Weight: 20%)
Below Expectation (1): The speaker appears anxious and uncomfortable. Speaker
reads notes, rather than speaks. Listeners are largely ignored.
Satisfactory (2): The speaker is generally relaxed and comfortable. Speaker
too often relies on notes. Listeners are sometimes ignored or misunderstood.
Exemplary (3): The speaker is relaxed and comfortable. Speaker speaks without
undue reliance on notes. Speaker interacts effectively with listeners.

References (Weight: 20%)
Below Expectation (1): Speaker fails to integrate journal articles into the
speech.
Satisfactory (2): Speaker integrates 1 or 2 journal articles into the speech.
Exemplary (3): Speaker integrates 3 or more journal articles into the speech.

