
Portfolio Project

EDUC 765: Trends and Issues in Instructional Design


By: Kristen Trader

Submitted 25 February 2016

PROJECT PROPOSAL
Project Title
Streamlining Instructor Feedback on Student Papers Using CMS Rubrics & Feedback Catalogs

Sponsoring Organization
UW Colleges English Department, Waukesha Campus
The mission of the UW Colleges English Department, and specifically the first-year writing
program, is to help students transition from high school to college-level writing, preparing them
to be successful communicators for the rest of their college education. Instructors within the
FYW program are tasked with collecting and assessing several papers a semester for each course
they teach, providing effective feedback to guide student writing improvement.

Project Description
Providing feedback on student writing is one of the most valuable ways an instructor can offer
individualized guidance to students in order to improve their writing ability. Yet this process is
also the most time-consuming and least efficient element of writing instruction as it has been
traditionally practiced. Most instructors teaching today were taught to collect papers by hand,
marking up the margins with commentary to help the student revise that particular paper, and
then offering a few summative comments at the end of the paper on the main issues to be
addressed. Based on the instructor's assessment, a grade is also assigned to the paper before it is
returned to the student. Such a method is inefficient for the instructor and often ineffective for
the student.
Instructors put in unpaid overtime writing redundant feedback on each student's paper, and the
time taken results in a delayed turnaround for students who need the feedback to improve future
writing performance. With a course load of 3-4 sections, each with a course maximum above the
industry-recommended cap of 20 students, the instructional academic staff in the UW-Waukesha
English Department devote this excessive time to grading 80-100 papers for each assignment,
with an average of four assignments given in each course.
Moreover, the feedback offered through this method tends to focus on fixing the current
paper rather than offering guidance for improving the student's writing long-term. Directly
marking the paper shows where that particular text can and should be revised to make it better,
but it also encourages students to simply edit the paper to correct the specific errors noted by the
instructor, as if they were isolated occurrences.
Thus the opportunity for training includes:
- Improving the efficiency of delivering feedback on student papers to save the instructor time and decrease turnaround time for revision opportunities
- Increasing the relevance and quality of the feedback offered to help students improve their writing throughout the course
- Helping instructors manage a heavy workload of student writing to comment on and assess

Aim
Decrease burden on writing instructors and increase the timeliness and quality of instructor
feedback for students to improve the success of writing courses at UW-Waukesha.

Target Audience
The audience is primarily instructional academic staff, both female and male (though tenured
faculty would be invited to participate as well), with advanced education in English though not
necessarily formal training in writing pedagogy. They are solely responsible for the content,
structure, and execution of their own courses, including designing and evaluating
assessments. Yet they also hold a contingent position within the institution, with their
employment subject to enrollment changes from semester to semester. Thus they are invested in
improving their teaching to increase their chances of retention, but they also honestly struggle
with investing more time and energy than is warranted given the inadequate job security and
compensation of the position. Most invest the time, but the result is a noticeable turnover in staff
who cannot sustain the workload of such tenuous employment.

Delivery Options
Instruction targeting the above audience would likely be hybrid, involving a face-to-face
workshop during our common hour, when no one teaches, as well as an online module to enhance
the workshop and offer a follow-up opportunity to practice the training individually.

FRONT-END ANALYSIS: INSTRUCTIONAL NEED


Instructional Need
The problem described above calls for instructional intervention, as it addresses a performance
issue (one of Rossett's four opportunities) that has compromised the efficiency and long-term
applicability of instructor feedback on student papers.
In UW-Waukesha's English Department and the UW Colleges Writing Program, each
composition instructor is expected to provide demonstrable guidance for students to improve
their writing as a central responsibility of the position. There is no explicit requirement to
provide written feedback on student papers in order to give this guidance, but doing so is
considered a best practice in the writing studies discipline. Best practice also suggests that
this feedback should be delivered to the student in a timely manner to serve as a useful part of
their learning experience in a given course; delayed feedback is of limited use because it cannot
be applied forward to the next writing opportunity. Thus conscientious instructors who
understand how valuable feedback can be for student writers feel the pressure to conform to
these practices.
However, a performance gap occurs because many instructors are still using a method of
providing feedback on papers that cannot produce a timely turnaround for the large number of
papers submitted while maintaining high quality in the commentary itself. Many instructors
still provide feedback on student writing by collecting paper copies and marking up each one by
hand with commentary in the margins and/or at the end of the paper. Handwriting feedback on
every single student paper is inordinately time-consuming and often results in writing the same
comments on multiple papers as students make similar errors.
Moreover, to be considered a full-time instructor receiving the most compensation available off
the tenure track, each instructor is expected to teach and assess 80-100 students (on average 22
students per course) for each assignment given during a semester. Such a
workload further increases the amount of time an instructor takes to write and return feedback on
student papers for any given class. Yet the instructor must not take too much time in order to stay
on schedule for the semester calendar. Thus instructors sacrifice their own personal time in the
evenings and on weekends to provide commentary on papers while still staying on schedule.
Moreover, this amount of personal sacrifice can lead more quickly to burnout as
instructors see less improvement than hoped for in some students' writing, either because the
students ignore the feedback or because they apply it only to that particular paper and not to
future writing in the course.
The instructional intervention, then, is to educate instructors on an alternative method of
offering feedback on student writing that doesn't require them to invest as much time or sacrifice
the pedagogical value of that feedback. This cannot be addressed by a policy change, because
instructors need to be shown how to implement the new method and incorporate it into their
teaching effectively in order to improve their own workflow.

FRONT-END ANALYSIS: LEARNER CHARACTERISTICS MODULE 3


Learner Analysis
Primary Audience
- Instructional academic staff in the UW-Waukesha English Department
- Faculty in the UW-Waukesha English Department
Secondary Audience
- Anyone teaching writing development who provides feedback on student papers
- Instructional staff and faculty teaching writing at all UW Colleges campuses
General Learner Characteristics
- Highly educated (the majority have a Master's degree, some have a PhD) with previous teaching experience in writing courses
- Mostly women and some men, ranging from mid-20s to early 60s (most in their late 20s or early 40s)
- Most are white European American; all are Midwestern in origin
Entry Characteristics
- Prior experience with writing assessment to discern what requires feedback for improvement (though not necessarily extensive formal training in providing such assessment)
- Ability to articulate feedback in line with program learning outcomes and national standards of effective student writing
- Potentially resistant attitude toward changing assessment technique due to habits of former practice, but motivated to decrease time invested in this particular activity (conflicted)

Contextual Analysis
Orienting Context
- Learners are likely hoping to reduce the amount of time they put into providing feedback on student writing without compromising the quality of this personalized form of instruction; they want to decrease their time invested without jeopardizing the quality of their teaching (and thus their jobs)
- Learners would expect to receive strategies for streamlining their writing assessments, but the transition might be seen as being of future utility rather than immediate benefit for their current writing courses
- Learner accountability is not tangible (e.g. no grade or certificate), and the training cannot be made mandatory
- Learners likely misperceive feedback as necessarily customized to each student's individual paper rather than to the general writing practices used by most successful writers; the learners have likely focused on the paper (and making it right) rather than on the student, whose writing needs to improve through general reflection and altered practice across different writing situations

Instructional Context
- The training session would likely need to be scheduled during the noon common hour when no one is teaching class. But some instructors teach only on MWF and others on TTh, so this might require two separate sessions for those two different teaching schedules.
- Lighting in one of the computer lab classrooms can be adjusted to focus attention on the front screen during a demonstration but otherwise kept bright enough to maintain alertness in the learners
- Noise can be minimized by closing the door or reserving a room in a less populated corridor during the noon hour
- Temperature can't be controlled and will be either stuffy or very chilly depending on the room reserved
- Seating is organized in a semi-circle with learners facing each other for discussions and turning outward to apply the training on a computer
- Learners are allowed to bring lunch, as it is mealtime for most of them, but they need to be cautious to keep food and beverages away from computer equipment; the food might be a distraction but is a compromise given the limited opportunities in the schedule
- Computer classroom with a smart board projection system and computers for each learner, all with internet access to the CMS (D2L) and requisite software (Word or Excel)
- No extra transportation required, as this will happen during the semester when instructors are already on campus to teach
Technology Inventory
- Computers in their offices, and likely all have a personal computer at home they use for work or to access the internet (and thus the CMS)
- Knowledge of and access to the CMS, but only if they have requested D2L sites for their classes (of no use to those who are not using the CMS at all for their courses)
- Familiarity with Word as a standard document program, though perhaps not with all of its advanced functionality (enough to copy/paste text to and from a Word doc)
- The training room would be a computer classroom (see above)
Transfer Context
- Learners can transfer the knowledge to help improve their job performance and make their jobs easier and less time-consuming. The only delay might be in setting up the CMS to collect assignments so the comments feature can be activated.
- Learners would have ongoing opportunities to employ the new method for providing feedback every time they assigned and collected new writing from their students during the current and future semesters, and this will occur frequently throughout each course/term
- Learners are fairly independent in their work but would likely receive support from colleagues who are also employing the new method. They would not likely meet resistance to this method from administrators except in terms of the quality of the comments they develop through it.

INSTRUCTIONAL IMPACT BASED UPON LEARNER CHARACTERISTICS


Application of Learning Theories
My learners are adults with advanced education, so andragogy is certainly the correct approach
for developing theory-informed instruction. These learners are self-directed, and most of
their work is already independent, so I expect that they will embrace and apply the training to the
degree that they want to implement it on their own. Certainly the focus of this instruction is on
relevant, applied knowledge that solves an immediate problem, one not only felt but expressed
by the learners themselves in prior casual conversations.
Therefore, my instruction will focus on problem solving that relies on the learners themselves as
resources in developing some of the best strategies for streamlining feedback on student papers. I
plan to use the strategies outlined in the overview from Rochester Institute of Technology's
Teaching and Learning Services website. The entire training session will be premised on
applying new knowledge to the learners' existing experience: the struggle to comment on papers
efficiently to improve the speed and quality of assignment turnaround. Thus elements of
constructivism also inform this approach, as the new method will only be adopted and take on
meaning for the learners if they find some way to connect it to their existing body of knowledge
about providing commentary. In practice, then, I will offer open-ended questions that draw out
expertise from the learners themselves and encourage group work among the participants in the
session as they attempt to develop feedback for a particular scenario that could be applied across
other papers. This will tap into the learners' prior knowledge and experience with providing
feedback and interacting with students in and out of class.
To reach different learners, I could provide a mini-lecture and visual demonstration of the new
method for commenting and include group work that requires hands-on production of material to
use with the new method. But it is not appropriate to develop more kinesthetic or auditory work,
since the training is focused on methods that are visual. To overcome the likely skepticism of
these particular adult learners (particularly those who have become habituated to their method of
commenting), I would rely on testimony from those using the method as well as
application exercises in developing a feedback catalog that can be copied and pasted as comments
for multiple students.
Yet I assume the best way to convert a skeptic attached to collecting physical papers is to show
them just how easy it is to create and use an electronic feedback catalog. Thus I would be
adopting ideas from social learning theory, particularly observational learning, by modeling the
process of using the new method. Moreover, I would ideally have the learners practice the new
method during the session. In this way I can also inspire them to see themselves as competent in
this new area of ability, reinforcing a self-efficacy that motivates them to spend the time
developing this method on their own after the session.

Application of Motivational Theories


Aside from the theoretical approaches noted above, I could also apply John Keller's ARCS
model to motivate the learners, overcome skepticism, and inspire them to continue applying this
new method well beyond the time frame of the session.

Attention: Perhaps my best approach for gaining and holding learner attention will be through
active, hands-on learning with the material from the training session. I would also include
different methods of instruction (as described above) and likely a fair amount of humor,
given my personality and the social context of the learners as colleagues who already know each
other fairly well.
Relevance: This motivational outcome is already the central focus of the session and will be
consistently and frequently reinforced throughout through overt references to learner
benefit, usefulness, and meeting needs. The best way to prove this, however, will likely be
modeling, to show the learners how the new method for commenting isn't all that different from
what they already do.
Confidence: This area of motivation is likely directly related to the learners' relative comfort
with the technology used in the new method. Each software program is familiar to them (Word
and our CMS), but it's the way I combine them that will require a new level of confidence. Thus
I would build skills incrementally, working from where learners are with each technology and then
sequentially building toward combining their features.
Satisfaction: Ideally, if time permits, the learners will be able to apply the new method
themselves, individually or in groups, to practice it and experience how it saves
them time and energy when commenting. This should inspire satisfaction, but sustaining that
satisfaction will only come from having the learners apply the new method on their own to their
own students' papers.

Impact of a Diverse Audience on Instruction


Since my audience is diverse only in age and area of study within English Studies, not in
ethnicity or educational background, I don't anticipate significant issues arising from diverse
learning needs. The greatest disparity would likely be in each learner's knowledge of and
comfort with the technology used to implement the new method of commenting. There may be
mistrust of technology as divorced from the interpersonal communication of providing feedback,
or a general discomfort with using technology in new ways.
If I were developing this for an international audience of instructors, however, I would likely
request a longer session to draw out key cultural differences in how instructors perceive
students, how they currently approach feedback, and how the new method of commenting would
fit into those different perspectives on teaching. But presumably I would have assessed my
international audience sooner to rule out anyone who forgoes commenting on student writing
entirely, because those individuals would not be invested in the training at all.

TASK/GOAL/PERFORMANCE ANALYSIS MODULE 5


Task Analysis Method
Procedural Analysis

Task Analysis
There are three main procedures required for the successful completion of the new feedback
method:
1) Setting up a rubric in the CMS
2) Creating a feedback catalog in a Word file
3) Applying the rubric and catalog when evaluating student writing
However, procedures #2 and #3 could happen simultaneously during the first execution of this
new method, which will likely complicate the analysis and development of instructional
strategies. I could combine the two procedures later, but here I fully analyze all three.
PROCEDURE FOR SETTING UP A RUBRIC IN THE CMS

Step 1. Determine the criteria for evaluating the assignment, based on lessons and course outcomes, and how much each criterion is worth in the overall points value of the assignment.
   Knowledge needed: Standards of writing practices; how to identify appropriate criteria for evaluation.
   Cue/feedback: Results in a set of distinct characteristics the resulting student writing should contain.
   Difficulty: Lack of familiarity with developing overt rubrics or identifying assessable criteria in an objective manner for students.

Step 2. Log in to the course management system (Brightspace by D2L) and select the course where the assignment appears.
   Knowledge needed: Assumes the instructor has already requested a D2L site for the course, including the assignment, and knows how to do basic navigation within the site.
   Cue/feedback: The instructor homepage opens with courses listed along the left column or selectable from a dropdown menu at the top of the screen.
   Difficulty: Has not requested a D2L site for their courses or has no familiarity with the CMS.

Step 3. Click on My Tools (to the far right in the main navigation menu) and select Edit Course.
   Knowledge needed: Same as above.
   Cue/feedback: A new screen opens titled Course Administration with a menu of different site components.
   Difficulty: Same as above.

Step 4. Scroll down to the menu section labeled Assessment and click on the Rubrics icon.
   Knowledge needed: Same as above.
   Cue/feedback: A new screen opens titled Rubrics.
   Difficulty: Finding the icon in the array of options.

Step 5. Click on the New Rubric button.
   Knowledge needed: Same as above.
   Cue/feedback: Opens a new screen titled New Rubric.
   Difficulty: None anticipated, since the button is so prominent on the screen.

Step 6. Type in a name for the rubric and set the status to Published from the dropdown menu below the Name textbox. Optional: Write in a description of the rubric by identifying what it will be used to assess.
   Knowledge needed: Understanding of how to properly name instructional tools to help distinguish one assessment from another.
   Cue/feedback: The title appears in the Name textbox.
   Difficulty: N/A

Step 7. Leave the Rubric Type as Analytic. NOTE: The Holistic rubric type is not appropriate for the method covered in this training because it does not allow the instructor to distinguish criteria that can be used to provide precise summative feedback to the student.
   Knowledge needed: Understanding of the difference between analytical and holistic assessment; the learner can click on a link below the dropdown menu that explains the difference between those rubric types.
   Cue/feedback: Analytic should appear under Rubric Type, and Initial # of Criteria will remain on the screen (the latter disappears when a user selects Holistic instead).
   Difficulty: No difficulty other than confusing the types of rubrics.

Step 8. Type in the numbers for the Initial # of Levels (levels of performance you will be grading on) and the Initial # of Criteria (the number of criteria you created in Step 1).
   Knowledge needed: Prior understanding of grade levels; typically 5 is used for the standard letter grade scale of A, B, C, D, and F.
   Cue/feedback: The correct numbers will now appear in the small textboxes.
   Difficulty: Overlooking this step or not typing in the right numbers.

Step 9. Select a Scoring Method from the dropdown menu. Click on the link for "What are scoring methods?" to learn the differences among these types and which one suits your approach to assessment. NOTE: This training uses the recommended custom points to make grade calculation more precise.
   Knowledge needed: Knowledge of and experience using different scoring methods for evaluating writing; again, prior experience with or knowledge of rubrics as assessment tools.
   Cue/feedback: The correct scoring method will appear in the dropdown menu window.
   Difficulty: Determining which scoring method best fits their evaluation preferences; aside from the provided link about the types, I might need to offer my take on what approaches align with the different scoring methods (e.g. holistic graders might use Text Only, and those who weigh all criteria equally could use the Points method).

Step 10. Scroll back up to the top and click on Levels and Criteria. NOTE: The changes on the New Rubric page save automatically when you click over to the other tab.
   Knowledge needed: N/A
   Cue/feedback: A new screen opens with Edit Rubric [Title] at the top and a table of unidentified criteria and levels.
   Difficulty: N/A

Step 11. Click on the arrow next to Criteria and select Edit Criteria Group from the menu that pops up next to the arrow.
   Knowledge needed: N/A
   Cue/feedback: A new screen opens with Edit Criteria Group [Title] and lists of textboxes for Levels and Criterion names.
   Difficulty: Locating the arrow next to the label or finding the correct link in the pop-up menu.

Step 12. Type in the titles you want to give each of the Levels and each of the Criteria, then click Save at the bottom of the screen. Optional: You could also change the name of the criteria group, but it's not necessary and can be left as Criteria.
   Knowledge needed: Understanding of how to scale adjectives that reflect the different levels of performance being assessed (e.g. Excellent v. Passing) and how to succinctly label criteria used for evaluation with or without qualifications (e.g. Paragraph Development v. Coherent Paragraph Development).
   Cue/feedback: The new labels appear in each of the textboxes, and you return to the Edit Rubric screen with the labels added to the table.
   Difficulty: Selecting clear language for labeling the different criteria.

Step 13. Click on the arrow next to the title of a criterion and select Edit Criterion from the menu that pops up.
   Knowledge needed: N/A
   Cue/feedback: A new screen opens titled Edit Criterion [Name of Criterion] with a large table identifying score, description, and feedback options for each level of performance.
   Difficulty: N/A

Step 14. Assign a score to each level of performance for this criterion.
   Knowledge needed: Scaling grade levels that align with institutional standards and the instructor's own approach to assessment.
   Cue/feedback: The scores will appear in the textboxes.
   Difficulty: Not properly calculating the scores, particularly if creating the rubric while setting it up in the CMS.

Step 15. Write different measurable outcomes for each grade level into the Description textbox of each performance level. Optional: You can also add feedback comments for each level if you want these to appear in your commentary, but there will be another method for providing feedback beyond the rubric assessment that will be less likely to overwhelm students.
   Knowledge needed: How to scale criteria across different levels of performance and develop distinct performance outcomes.
   Cue/feedback: Descriptions will now appear in all of the textboxes on the Edit Criterion page, showing modifications of the criterion across the five grade levels with clear distinctions between each level.
   Difficulty: No prior experience writing out measurable outcomes, even though these standards are implicitly applied in their own teaching; resistance to the amount of work required to create 5 scaled versions of assessment for each criterion.

Step 16. Click on Save to return to the Edit Rubric page.
   Knowledge needed: N/A
   Cue/feedback: Users are taken back to the Edit Rubric page, where they will see the criterion fleshed out with points and descriptions for each level of performance.
   Difficulty: N/A

Step 17. Repeat Steps 13-16 for each of the criteria identified in the rubric.
   Knowledge needed: Prior knowledge from having worked through Steps 13-16 already.
   Cue/feedback: Same as above.
   Difficulty: Potentially overlooking a criterion or not attending to the details of Steps 13-16.

Step 18. Click on the arrow next to Overall Score and select Edit Levels.
   Knowledge needed: N/A
   Cue/feedback: Opens a screen with a table with places for identifying the overall levels of performance and grade ranges.
   Difficulty: N/A

Step 19. Type in titles for the Overall Levels and, for each level, the lowest grade that corresponds with the minimum percentage/points you would award for a final grade on the paper, then click Save at the bottom. Recommendation: Match the levels used at the top of the rubric so you are using consistent terminology for assessment.
   Knowledge needed: Knowledge of how to calculate baseline percentages for each grade range, e.g. 90% as an A- and the lowest percentage to earn an A-range grade.
   Cue/feedback: The titles and start-range scores will appear in the textboxes and at the bottom of the table on the Edit Rubric page when the user returns to it upon hitting Save.
   Difficulty: Calculating start-range scores.

Step 20. Click Close at the bottom of the screen to exit the Edit Rubric page.
   Knowledge needed: N/A
   Cue/feedback: Returns to the list of rubrics created for the course (for the first rubric, only one will appear).
   Difficulty: N/A

Step 21. Go to the dropbox of the assignment for which you are using the rubric and click on Edit Folder.
   Knowledge needed: Prior knowledge of how to navigate over to the dropbox for an assignment, having already set up the dropbox for the assignment (though I am not sure I can assume this of an instructor who doesn't typically use D2L).
   Cue/feedback: The Edit Folder screen opens with editable textboxes for the title and score of the dropbox.
   Difficulty: Not knowing how to navigate the course site to find a dropbox, or no dropboxes were set up for the assignments.

Step 22. Click on Add Rubric, select the corresponding rubric from the pop-up menu, and click Add Selected.
   Knowledge needed: The title of the rubric created for this dropbox assignment.
   Cue/feedback: A menu pops up with rubric options listed; after clicking Add Selected, the rubric title appears with a red X under the Add Rubric section of the Edit Folder page.
   Difficulty: Failed to set the rubric status to Published, so the rubric doesn't appear in the list of options; didn't successfully save the rubric in Step 19.

Step 23. Click Save and Close to stop editing the dropbox.
   Knowledge needed: N/A
   Cue/feedback: The user is taken back to the Dropbox Folders page listing all dropboxes for the course.
   Difficulty: N/A

Step 24. Go to the grade item for the assignment and repeat Steps 22 & 23.
   Knowledge needed: Prior knowledge of how to navigate to the Grades page on the CMS site; successfully completed Step 22 for the dropbox.
   Cue/feedback: A menu pops up with rubric options listed; after clicking Add Selected, the rubric title appears with a red X under the Add Rubric section of the Edit Grade Item page.
   Difficulty: No prior experience navigating the site, locating the grades, or setting up the grade items for the assignments to be collected through the CMS.
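The "start range" logic in Step 19 (matching a paper's total score to the lowest percentage that earns each overall level) can be sketched in a few lines. This is a minimal illustration, not D2L's actual implementation; the level names and cutoff percentages below are invented examples, not the department's real grading scale.

```python
# Hypothetical start-range table: each overall level is defined by the minimum
# percentage needed to earn it, listed from highest cutoff to lowest. These
# names and numbers are example values only.
LEVEL_CUTOFFS = [
    (90.0, "Excellent"),
    (80.0, "Good"),
    (70.0, "Satisfactory"),
    (60.0, "Passing"),
    (0.0, "Unsatisfactory"),
]

def overall_level(points_earned: float, points_possible: float) -> str:
    """Return the overall level whose start range the paper's score falls into."""
    percentage = 100.0 * points_earned / points_possible
    for cutoff, level in LEVEL_CUTOFFS:
        if percentage >= cutoff:
            return level
    return LEVEL_CUTOFFS[-1][1]

print(overall_level(45, 50))  # 45/50 = 90%, the lowest percentage in the top range
```

The point of the sketch is that each level needs only its lowest qualifying score, which is exactly what Step 19 asks the instructor to enter.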

PROCEDURE FOR CREATING A FEEDBACK CATALOG

Step 1. Open a new Word document and create a table with at least two columns, one for the criteria assessed in the assignment and one for the feedback you would typically offer if that criterion isn't met.
   Knowledge needed: Previous knowledge of Word, its menu system, and the Table feature; understanding of how to set up a basic table.
   Cue/feedback: A basic two-column table with the same number of rows as criteria selected from the rubric.
   Difficulty: Not familiar with the Table feature in Word or how to set up a table.

Step 2. Add one extra row to the table and write up a neutral framework that sets up the feedback and closes the commentary with those generalities that don't change from student to student.
   Knowledge needed: Prior experience writing summary statements at the end of student papers.
   Cue/feedback: The framework appears in the right column of the table with a general setup of the feedback and a closing phrase including the instructor's name.
   Difficulty: Resistance to offering such generic language for students because of a belief in customizing everything.

Step 3. Write formative feedback in the second column that covers actions to improve student writing, as you would typically write to a student who has not met that criterion. OR: Copy/paste feedback that you have actually given to a student that might apply to another student, and build the catalog as you assess the first submissions for the assignment. IMPORTANT: Generalize the feedback so it doesn't apply to just one student's paper but incorporates language from the lessons, course outcomes, or criteria. NOTE: Since the rubric provides summative feedback, focus this commentary on formative feedback, even if that is to be applied forward on the next assignment and not as a revision of this one.
   Knowledge needed: Expertise in providing written formative feedback to students with guidance on how to improve their ability; ability to develop feedback tailored to different errors and areas of improvement within a criterion category.
   Cue/feedback: Text will begin filling up the second column with phrasing that can then be readily accessed for different student submissions on the CMS.
   Difficulty: Resistance to the work involved in drafting or copy/pasting the feedback, as well as skepticism about making the feedback generic enough to be used for multiple students and still be helpful.

Step 4. Leave room in each row to add more as you apply the catalog to more student papers.
   Knowledge needed: N/A
   Cue/feedback: Extra space in each cell of the feedback column encourages more contributions to the catalog as the instructor continues to discover new areas needing commentary.
   Difficulty: N/A

Step 5. Save the Word file with a clear title indicating what assignment this feedback is for so you can access it quickly in the future.
   Knowledge needed: N/A
   Cue/feedback: The file is saved to the appropriate folder to be accessed the next time the assignment is given.
   Difficulty: N/A
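The catalog built in the steps above is essentially a mapping from criteria to reusable comments, plus a neutral opening and closing. The following sketch models that structure in code to make the copy/paste workflow concrete; all criterion names, comment text, and the `assemble_feedback` helper are invented for illustration and are not part of the training materials.

```python
# Hypothetical model of the two-column Word table: criterion -> reusable
# formative comments, with the "neutral framework" row stored as an opening
# and closing. All text here is an example, not actual course feedback.
feedback_catalog = {
    "_opening": "Thanks for your submission. A few areas to focus on going forward:",
    "_closing": "Please apply these suggestions to your next assignment. -- Instructor",
    "Thesis": [
        "Your thesis should make an arguable claim rather than state a fact.",
    ],
    "Paragraph Development": [
        "Develop each paragraph around one main idea, supported by evidence.",
    ],
}

def assemble_feedback(unmet_criteria: list[str]) -> str:
    """Stand-in for the copy/paste step: build one comment from the catalog
    entries for the criteria a given student has not yet met."""
    parts = [feedback_catalog["_opening"]]
    for criterion in unmet_criteria:
        parts.extend(feedback_catalog.get(criterion, []))
    parts.append(feedback_catalog["_closing"])
    return "\n\n".join(parts)

comment = assemble_feedback(["Thesis"])
```

Because the framework rows never change, only the middle section varies per student, which is why the method scales across 80-100 papers.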

PROCEDURE FOR APPLYING RUBRIC WITH CATALOG

1. Action: Open the corresponding feedback catalog file in Word.
Knowledge Needed: Knowing where the file was stored and how to access it.
Cues/Feedback: File opens.
Difficulty?: Forgot where the file was stored.

2. Action: Click on the dropbox for the assignment you are going to grade and/or provide
feedback on.
NOTE: The feedback catalog can be used without the rubric when just providing feedback on
ungraded drafts (or if the instructor prefers holistic grading).
Knowledge Needed: Familiarity with the CMS and where the assignment dropbox is located.
Cues/Feedback: Opens a screen listing all student submissions to the dropbox, with default
organization by student last name.
Difficulty?: Lacking familiarity with the CMS.

3. Action: Click on Evaluate for the student submission you want to grade.
Knowledge Needed: N/A
Cues/Feedback: Opens an Evaluate Submission screen identifying the student and showing
links for what the student submitted to the dropbox.
Difficulty?: Lack of familiarity with the CMS to know where the Evaluate link is on the screen.

4. Action: Click on the link for the student's submission and read through the submitted paper.
Knowledge Needed: Expertise in critically reading and evaluating student writing.
Cues/Feedback: A PDF-type document viewer embedded in the CMS will show the submitted
paper along the left column.
Difficulty?: Not knowing how to pull up a student submission in the CMS, or struggling with a
file type that isn't viewable within the CMS.

5. Action: Click on the link for the rubric in the right-hand column; the link should be the title
you gave to the rubric.
Recommendation: Open the dropbox in two different windows so you can switch back and
forth between viewing the student's paper and viewing the pop-up window for the rubric.
Knowledge Needed: N/A
Cues/Feedback: Opens a pop-up Assess Rubrics window that floats over the student's
submission.
Difficulty?: Lack of familiarity with the CMS.

6. Action: Click on the boxes corresponding to the student's level of performance for each of
the criteria in the rubric.
Knowledge Needed: Knowledge of and experience in assessing student writing accurately
given standards that the instructor herself/himself established.
Cues/Feedback: Boxes for the criteria and levels selected will change to yellow, and the circle
at the top of the box will be filled in.
Difficulty?: Resistance or difficulty in placing the student in one of the 5 performance levels
rather than on a more gradual continuum of quality.

7. Action: Click on Save and Record to transfer the rubric grade to the dropbox score.
NOTE: The total score is automatically calculated if you use a Custom Points rubric. Once the
rubric is filled out, the overall performance will automatically fill in for the Points rubric as
well. You will need to manually select an overall level for the Text Only rubric scoring method.
Knowledge Needed: N/A
Cues/Feedback: The rubric window closes, and a grade for the rubric will now appear along
with a number in the Score textbox that corresponds with the calculated grade. NOTE: Those
using holistic or text-only methods will have to input the score manually.
Difficulty?: N/A

8. Action: Type the student's name into the Feedback textbox and copy/paste the general
framework feedback from the feedback catalog Word document.
Knowledge Needed: N/A
Cues/Feedback: Student's name and framework should now appear in the Comments textbox.
Difficulty?: N/A

9. Action: Personalize the framework with compliments on the student's strengths (1-2) at the
beginning and/or at the end of the frame.
Knowledge Needed: Prior knowledge of how to "sandwich" feedback by buffering bad news
with some good.
Cues/Feedback: Additional text is typed into the Comments textbox, with room left in the
middle of the framework for additional advice.
Difficulty?: May struggle to find something positive to say about the student's writing, or may
feel this entire prescription is unnecessary.

10. Action: Copy/paste just 3-4 comments from the feedback catalog that would be most
helpful for the student to improve their writing in this paper and in future papers.
Optional: Customize the pasted feedback with one example each of where their current text
could be improved for that issue.
Knowledge Needed: Knowledge of how to distinguish higher-order from lower-order concerns
when evaluating student writing; ability to inform students about these different levels of
revision.
Cues/Feedback: A list should appear in the middle of the frame from Step 8, with 3-4 bullet
points addressing the most to least important areas for improvement.
Difficulty?: May not be familiar with distinguishing higher-order from lower-order concerns
when assessing student writing, and thus may offer a list of 3-4 items that are not the most
critical for improving the student's writing.

11. Action: Click Publish to post the student's grade and feedback for the submitted
assignment.
Knowledge Needed: N/A
Cues/Feedback: Refreshes the same page but with the text in; if the grade item is checked, the
student's grade will now appear there.
Difficulty?: N/A
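Steps 8-10 of the procedure above amount to assembling a "feedback sandwich": the student's name and the general framework, one or two compliments on strengths, then at most 3-4 catalog comments ordered from most to least important. A minimal sketch of that assembly, with invented names and helper functions (nothing below comes from the CMS or the actual training materials):

```python
def assemble_feedback(student, framework_open, framework_close, strengths, catalog_comments):
    """Compose the Comments-textbox text: student's name and framework opening,
    1-2 strengths, then at most 4 catalog comments as bullet points, and the
    framework's closing line."""
    bullets = "\n".join("- " + c for c in catalog_comments[:4])  # cap at 3-4 items
    parts = [
        student + ",",
        framework_open,
        "Strengths: " + "; ".join(strengths[:2]),
        bullets,
        framework_close,
    ]
    return "\n".join(parts)

comment = assemble_feedback(
    "Jordan",
    "Here is some feedback to carry forward into your next paper.",
    "Best, Your Instructor",
    ["a clear thesis", "strong use of sources"],
    ["Vary your sentence openings.",
     "Check the citation format for paraphrases.",
     "Tighten paragraph transitions."],
)
print(comment.count("- "))  # 3
```

The cap on bullet points reflects the procedure's advice to select only the 3-4 comments most helpful for this student rather than everything in the catalog.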


INSTRUCTIONAL OBJECTIVES - MODULE 5

Project (Instructional) Goal
Learners will streamline their assessment and commentary on student writing by using the
CMS rubric and a digital feedback catalog.

Terminal Objectives and Enabling Objectives
Learners will be able to do the following by the end of training:

Build a Rubric in the CMS to Assess Student Writing
o Select clear criteria for a rubric-based assessment (Cognitive)
o Describe measurable levels of performance for each criterion (Cognitive)
o Create a fully-functional rubric within the CMS for the criteria selected
(Cognitive & Psychomotor)

Create a Digital Feedback Catalog Corresponding to Rubric Criteria
o Distinguish formative feedback from summative feedback (Cognitive)
o Synthesize old feedback or create new feedback on student writing as generalized
advice for improvement (Cognitive)
o Organize writing advice in a Microsoft Word document for each corresponding
criterion (Psychomotor)

Apply Both the CMS Rubric and Digital Catalog to Assess Student Writing
o Combine summative and formative assessment of student writing (Cognitive)
o Evaluate student submissions using the rubric feature of the CMS (Cognitive &
Psychomotor)
o Assess the type of commentary the student requires from the digital feedback
catalog (Cognitive)
o Copy commentary from the digital feedback catalog into the feedback textbox of
the CMS evaluation screen (Psychomotor)

ENABLING OBJECTIVES MATRIX & SUPPORTING CONTENT - MODULE 6

Title of the unit/module: Creating an Assignment Rubric in D2L
Terminal Objective: Build a Rubric in the CMS to Assess Student Writing
Pre-instructional Strategy: Model Assignment Review (to prepare the trainees for selecting
and developing criteria for a rubric-based assessment)

Enabling Objective: Select clear criteria for a rubric-based assessment
Level on Bloom's Taxonomy: Understand & Analyze
Fact, concept, principle, rule, procedure, interpersonal, or attitude?: Procedure
Learner Activity: Determine 3-5 criteria based on the requirements of the assignment
(reviewed in the pre-instructional activity)
Delivery Method: Small group discussion (in F2F session) OR self-paced individual tutorial
(online)

Enabling Objective: Describe measurable levels of performance for each criterion
Level on Bloom's Taxonomy: Understand & Evaluate
Fact, concept, principle, rule, procedure, interpersonal, or attitude?: Procedure
Learner Activity: Write up qualitative statements that describe different levels of performance
for each criterion selected above
Delivery Method: Collaborative group authoring (in F2F session) OR individual completion of
the task (online)

Enabling Objective: Create a fully-functional rubric within the CMS for the criteria selected
Level on Bloom's Taxonomy: Apply & Create
Fact, concept, principle, rule, procedure, interpersonal, or attitude?: Procedure
Learner Activity: Walk through the instructional procedures to set up a functional rubric in the
CMS
Delivery Method: Work in groups or alone as the presenter walks through the process (in F2F
session) OR complete the procedure individually through an online tutorial (online)

NOTE: I have two optional delivery methods here because I foresee that some instructors will
want the face time to work with others to learn the material, while other instructors will prefer
the flexibility of doing an online tutorial. So there is the potential to offer both methods.
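The rubric the learners build can be pictured as criteria crossed with performance levels; with a Custom Points scheme, the overall score noted in the application procedure is simply the sum of the points for the level selected in each criterion. A sketch with invented criteria and point values (this is an illustration, not D2L's actual data model):

```python
# Invented example of a Custom Points rubric: criterion -> level -> points.
rubric = {
    "Thesis":       {"Excellent": 20, "Good": 16, "Fair": 12, "Weak": 8, "Missing": 0},
    "Organization": {"Excellent": 20, "Good": 16, "Fair": 12, "Weak": 8, "Missing": 0},
    "Evidence":     {"Excellent": 20, "Good": 16, "Fair": 12, "Weak": 8, "Missing": 0},
}

def total_score(rubric, selections):
    """Sum the points for the level selected in each criterion, as a Custom
    Points rubric does automatically on Save and Record."""
    return sum(rubric[criterion][level] for criterion, level in selections.items())

print(total_score(rubric, {"Thesis": "Good",
                           "Organization": "Excellent",
                           "Evidence": "Fair"}))  # 48
```

This is why the Text Only scoring method, which attaches no points to levels, requires the instructor to enter an overall score manually.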


Supporting Content
To support the first terminal objective and the overall training, students will be provided with
the following tip sheet to remind them how to critically reflect on assignments in order to
identify criteria to use in a rubric. They will then use this method to complete the first enabling
objective of the training program, which sets them up to complete the rest of the first module:
Creating a Rubric in the CMS.
Here is a screen shot of the tip sheet:


REFERENCES
Adult Learners. (n.d.) Innovative Learning Institute: Teaching and Learning Services. Retrieved
from: http://www.rit.edu/academicaffairs/tls/course-design/instructional-design/adult-learners
Blaga, Alexandra. (2014) What is the difference between competencies and behaviors when
establishing performance criteria? Performance Magazine. Retrieved from:
http://www.performancemagazine.org/what-is-the-difference-between-competencies-and-behaviors-when-establishing-performance-criteria/
Clark, Don. (2015a) Bloom's Taxonomy of Learning Domains. Big Dog and Little Dog's
Performance Juxtaposition. Retrieved from:
http://www.nwlink.com/~donclark/hrd/bloom.html
Clark, Don. (2015b) John Keller's ARCS Model of Motivational Design. Big Dog and Little
Dog's Performance Juxtaposition. Retrieved from:
http://www.nwlink.com/~donclark/hrd/learning/id/arcs_model.html
Constructivism. (2016a) Funderstanding. Retrieved from:
http://www.funderstanding.com/theory/constructivism/
Constructivism. (2016b) Learning-Theories.com. Retrieved from:
http://www.learning-theories.com/constructivism.html
Hampton Roads ISPI. (2013) What is HPT? Hampton Roads International Society for
Performance Improvement. Retrieved from:
http://www.hrispi.org/#!about-hampton-roads-ispi/c1enr
Kearsley, G., and Culatta, R. (2013a) Andragogy (Malcolm Knowles). Instructional Design.
Retrieved from: http://www.instructionaldesign.org/theories/andragogy.html
Kearsley, G., and Culatta, R. (2013b) Component Display Theory (David Merrill). Instructional
Design. Retrieved from: http://www.instructionaldesign.org/theories/component-display.html
Kearsley, G., and Culatta, R. (2013c) Conditions of Learning (Robert Gagne). Instructional
Design. Retrieved from: http://www.instructionaldesign.org/theories/conditions-learning.html
Kearsley, G., and Culatta, R. (2013d) Constructivist Theory (Jerome Bruner). Instructional
Design. Retrieved from: http://www.instructionaldesign.org/theories/constructivist.html
Kearsley, G., and Culatta, R. (2013e) Kemp Design Model. Instructional Design. Retrieved
from: http://www.instructionaldesign.org/models/kemp_model.html
Kearsley, G., and Culatta, R. (2013f) Social Learning Theory (Albert Bandura). Instructional
Design. Retrieved from: http://www.instructionaldesign.org/theories/social-learning.html
McGovern, C., and Bray, B. (2007) Robert Gagne. Instructional Development Timeline.
Retrieved from: http://my-ecoach.com/project.php?id=12152&project_step=28465
Self-Efficacy. (2016) Funderstanding. Retrieved from:
http://www.funderstanding.com/educators/self-efficacy/
Social Cognitive Theory. (2013) Behavioral Change Models. Retrieved from:
http://sphweb.bumc.bu.edu/otlt/MPH-Modules/SB/SB721-Models/SB721-Models5.html

