
Assessment Portfolio
Washington State University
Ed Psy 510 Assessment of Learning
Kassie Swenson
December 5, 2013

Reflections
Reflection #1: The Data Pyramid
Reflection #2: Chief Academic Officer Brian Lowney
Reflection #3: Principal Nick Hedman
Reflection #4: Trevor Greene, Professional Development Specialist, AWSP
Reflection #5: Dr. Gene Sharratt, Washington Student Achievement Council
Reflection #6: Elizabeth City, Harvard School of Education
Reflection #7: The Main Idea, Driven By Data: A Practical Guide to Improve Instruction
Reflection #8: TPEP Student Growth Goals
Reflection #9: Taking Data to New Depths, by Nancy Love
Reflection #10: Data Integration, Education Week video featuring Rudy Crew





Kassie Swenson
Reflection #1
Classroom Activity: The Data Pyramid, Figures 4.1 and 4.2 from The Data Coach's Guide

The class activity around the data pyramid from Love, Stiles, Mundry and
DiRanna (2008) was an excellent activity to discuss the assessment schedule for
each of our sites. The recommended data pyramid (Figure 4.1) is a helpful tool for planning assessments throughout the year. Later, when we saw Figure 4.2, the inverted data pyramid, it was rather eye-opening, since most of us felt that is where much of the assessment focus actually lies. In essence, it was like comparing a hoped-for model to an actual model. This activity provided an opportunity for school leaders to turn the inverted pyramid upside down in their schools and follow a more comprehensive approach to a data-driven culture focused on multiple types of assessments at various frequencies throughout the year.


As seen in the data pyramid, daily-weekly formative assessments in the classroom are where the bulk of assessment should be taking place. How is this done at this frequency at the school level? Next, formative common assessments should be gathered 1-4 times a month; these could be writing samples, journals, etc. Again, how is this planned for and organized: at the individual teacher level, building level, or district level? Third, benchmark common assessments are given at the end of a unit or quarterly. Next, taking place approximately 2-4 times a year according to the pyramid is "data about people, practices, perceptions" (Love et al., p. 129). This was an interesting one to discuss in class, since some of the groups had placed it at differing levels of the pyramid. These types of assessment reminded me of Jim Ball's story about a principal who seemed to have used this information on a regular basis in the "war room". Utilizing attendance sheets, demographics, surveys, and observation data can assist with many types of data collection needs, including, but not limited to, reaching the 20% of parents who don't come to conferences or using attendance sheets as early warning signs for kids at risk of failing classes or assignments. Last, the area that tends to get most of the focus when it comes to instructional time is the summative district and state assessments. The state assessments are standard, but summative district assessments can vary from district to district.









Kassie Swenson
Reflection #2
Speaker: Chief Academic Officer Brian Lowney

Hearing the perspective of a high school principal on how data is used according to the school improvement plan and goals was helpful for understanding how a principal makes data-informed decisions. One important goal Brian set at the beginning of his Emerald Ridge principalship was to reduce the large number of goals to a handful, for better intentional focus on achieving them. He would also carry a backpack with a data file in order to have it handy for discussions around instruction and for making decisions aligned to the school's goals and what the data showed. Brian also talked about how relationships are the lever needed to achieve results toward the goals.

Renee brought up a good question as to who came up with the five goals: the principal, the staff, or both? Since much time had passed, it was difficult to recall who came up with them. Now, with a new principal at the school, it may be beneficial to take a look at the data and reassess the school's goals, building upon the existing data culture and inviting staff to take it to the next level. The timing is also ripe with TPEP for the principal and staff to further their data literacy and set specific goals around what the data has to say to increase student achievement and growth.

How is data shared and accessed at the school? Who is on the data
leadership team to collect, review, collaborate, and make decisions around what
the data says? I think there are many exciting opportunities to further the work
that Brian did at Emerald Ridge, as well as additional professional development
opportunities in areas of student growth identified in the TPEP criteria at the
school. Packaging the data is also important. Is there a data integration platform
in place for each department to look at their data at various time spans and see
the effectiveness and areas of growth in the students? Is there a need for
additional data to be collected as part of the data inquiry taking place? Do
assessments need to be created or identified to seek out this information? Using
data to analyze, identify, and solve problems can be an important lever to align
with the school improvement plan and goals.

If the culture is ripe to deepen its use of data, and given the positive relationships Brian built with staff and students, hopefully the new principal and leadership team will be able to stay focused on the needs of students at Emerald Ridge, continue to build upon the achievements made thus far, and raise the bar even higher.






Kassie Swenson
Reflection #3
Speaker: Nick Hedman, Principal

Data can only take school improvement plans so far. Principals need to
engage with all stakeholders and move data into action to achieve SIP
goals. Nick Hedman, Principal at Wilkeson Elementary in White River School
District, utilizes four critical questions in the Professional Learning Community
(PLC) including:
1. What do we want each student to learn?
2. How will we know if each student is learning it?
3. How will we respond when a student is experiencing a difficulty with learning
it?
4. How will we respond if the student already knows it?

Response to Intervention (RTI) depends on formulating a bigger picture: looking at academics and behavior, and at how positive behavior supports academics. Looking at both sides helps identify and build systems to assess and monitor academic and behavior-related data. At Wilkeson Elementary, the
principal and staff use the Positive Behavioral Interventions and Support (PBIS)
system and screen all the kids. Principal Hedman and the RTI team, which includes the counselor, psychologist, special education, LAP and Title staff, and the speech and language pathologist, put their eyes on every single kid, with systems in place to collaboratively analyze data and monitor how each student is doing. Every student is accounted for on the data boards that the staff uses to
progress monitor and intervene as needed to help students overcome learning
difficulties. Hedman will bring this information to the School Board as needed if
more resources are necessary to get kids where they need to be. The data provides the evidence of what students need help with.

There is a strong focus on the behavioral intervention at Wilkeson
Elementary. SWIS is used to determine the intervention and behavior plan, and
is shared out at staff meetings. When behavioral challenges were prevalent at
the early grade levels, the staff stepped in to teach kids a three-step system in
which they could take the initiative to deal independently with classmate discipline problems. Kids are also taught the school rules, and benchmarks are used to assess what percentage of the kids know these rules.

What I enjoyed most about the practices at Wilkeson Elementary was the
focus on each and every child, which is evident in the data boards Hedman
demonstrated. Each child is shown on the board, with their picture and their
individual data. Additionally, stories were posted on the back of the board about
what some of the kids are facing outside of the classrooms. The child is more
than a test score; there are other factors that contribute to the student's
achievement. Additionally, this was the first time I had really seen the behavioral
data being used intentionally and effectively to address behavioral problems that
may be affecting student learning in the classrooms.
Kassie Swenson

Reflection #4
Speaker: Trevor Greene, Professional Development Specialist, AWSP, formerly
the Principal of Toppenish High School (2007-2013)

"Until the story of the hunt is told by the lion, the tale of the hunt will always glorify
the hunter." African Proverb

Trevor Greene is a true leader in education in Washington State. The
achievements he and the Toppenish High School staff made to engage students
and reduce the dropout rate increased opportunities for Toppenish High School
students. The data painted a picture, and it was up to Trevor and his staff to alter
the picture by building a culture focused on individual students and creating
courses that elevated the education and career opportunities for a high poverty
population. Culture-wise, Trevor's hill (the hill that you would die on - everyone
needs to know what their hill is) is student advisory. Trevor's belief is that
everyone deserves an adult advocate, and school may be the only place that the student will have adult interaction, whether positive or negative.
As an educational leader advocating for students, Trevor believes "Principals must have the courage of a lion, the heart of an angel, and the hide of a rhino." This is a powerful message that guided his work as a principal, as he was
strongly committed to making positive impacts for students. At Toppenish,
Trevor created Advisory, changed the counseling structure, changed
requirements for contacting parents (face-to-face, voice-to-voice), and
immediately put in STEM.

STEM was a huge success at Toppenish High School, and Trevor and his key staff put it in place immediately ("the right people are your most important asset"). The results that soon came brought in additional stakeholder support,
leading to community partnerships. Year by year, demand for additional STEM
courses increased, further engaging students in science, math, engineering and
technology in ways previous courses did not offer. From the beginning and throughout, key staff at Toppenish High School attended powerful professional development opportunities that allowed for rigorous courses. Furthermore, Trevor believes that taking on only 1-2 new programs a year is the most effective way of seeing programs through to success, and this intentionality contributed to the success of STEM at Toppenish.

Sharing a pearl of wisdom from his principal experience, Trevor learned
the hard way that he couldn't solve all the problems at the school on his own. In
the beginning, it felt as if the principal had to come up with the answers and
enact a solution. An effective strategy Trevor used with his staff was admitting
he didn't know all the answers and that the teacher(s) knew the subject better
than he did; therefore, the staff member bringing the problem needed to offer up
two solutions. The outcome of this strategy was that people came to him only when they had solutions, and complaining was greatly reduced. This was another strategy Trevor utilized to build a culture focused on solving problems, thereby creating a solution-focused, positive, results-driven culture.

In summary, having the right people and supports, making tough decisions, putting students first, and knowing your core beliefs, what drives you, and the hill you would die on are the key points Trevor offered to our group of
educational leaders. Trevor Greene offered powerful lessons on how to change
systems and culture to impact data, which will serve him well in his current role
with the Association of Washington State Principals (AWSP), providing
professional development to other principals across the state.



Kassie Swenson
Reflection #5
WSU Principal Program, November 21, 2013
Dr. Gene Sharratt, Executive Director, Washington Student Achievement Council
The Washington Student Achievement Council: Moving the attainment needle
forward in partnership with K-12 leadership
http://www.slideshare.net/fullscreen/GEMalone/wsu-principals-presentation-
november-21-2013/1

The trip to the capitol to hear from Marcie Maxwell and Dr. Gene Sharratt
was an excellent opportunity to experience how educational policy is made and
driven in Washington state. What happens in Olympia has a huge impact on
what happens locally, especially with the adoption of TPEP, Common Core State
Standards, and how monies allocated to various educational programs impact
student attainment. Dr. Gene Sharratt is the Executive Director of the
Washington Student Achievement Council (WSAC), leading the council to
propose educational goals, improvements and innovations, and advocating for
postsecondary education, focusing on educational attainment goals. In the
presentation, many statistics were shared about the current reality of education in Washington state today. This data is presented to stakeholders, who decide on next steps and act upon them.

WSAC Achievements and Challenges:
- Washington is #1 in need-based aid ($1,077 WA; $482 U.S. average)
- In participation, Washington ranks:
  - 47th in 4-year public higher education at the undergraduate level
  - 48th in public graduate education
  - 49th in per-student (state) funding
  - Washington imports talent
- 2023 Attainment Goals:
  - At least 70% of Washington adults will have a postsecondary credential.
  - All adults in WA will have a high school diploma or equivalent.

Pearls of wisdom from Dr. Sharratt's presentation:
"Sometimes, you just have to bow your head, say a prayer, and weather the storm."
"Leaders build capacity, not dependency."
Leaders at all levels "lead in transformation of people and organizations."
"Leaders are responsible for building the capacity in individuals, teams and organizations so that everyone is a leader." (Hirsh and Killion, 2009)
"It's very important to look at how people have handled failure. Did it break them? Did they start whimpering and blaming others? Or did they get up and get going again?" (David McCullough)
How do you handle failure? You will get knocked down and will have to get back up. People will be watching you.
Kassie Swenson
Reflection #6:
Video: Elizabeth City, Harvard School of Education, What Do You See in These
Data?
http://m.youtube.com/watch?v=j13Gd7MSs7o


The "What Do You See in These Data?" video by Elizabeth City is a valuable tool providing steps on what to do with data once it has been collected. This brief, concise video would be great to use with staff, as it offers guidelines for working through the data. First, it is important to have protocols in order to have structured conversations around the data. Since common planning time is limited, protocols can be extremely useful. During this time, it's important to take it slow by asking questions and allowing the inquiry process to take place before jumping to solutions. Questions such as "What do you see? What do you notice? What other data should we explore before we dig in further? What are our next steps when looking at the data?" are important to use when working through this process.

Additionally, before the common planning time with staff, it's beneficial to have someone process the data and synthesize the numbers into pictures, such as bar graphs. Presenting the numerical data visually will help make the meeting more efficient. The staff will be able to discuss
questions such as, "What do you see? What patterns do you see? Is there a
storyline?"


Next, Liz recommends thinking like a scientist and, ultimately, asking the question "Why?" five times to test whatever hypotheses you and the staff come up with, drilling down to what the data is telling us. Additionally, what further data would the staff like to collect to test the hypothesis? In the video, Liz offers excellent examples of how the "why" may be used to find additional ways to explore the data and improve student learning.


These helpful hints from the video can be embedded into a school culture. A culture that uses structured protocols, processes data in advance of staff meetings, and maintains inquiry-based conversations, asking "Why?" and "What other data would we like to see?", allows sufficient time for inquiry to take place before jumping to solutions.



Kassie Swenson
Reflection #7
Article: The Main Idea
Driven By Data: A Practical Guide to Improve Instruction

The focus on data-driven school culture has become increasingly relevant, not only because of the accountability measures that have driven some of the attention, but more importantly because cultures focused on data can and have achieved positive results for students and teachers. As the article states, the change in a data-driven culture can be seen in the shift of focus from what was taught to what was learned. The benefit of the Driven By Data text is that it's based on actual school practices that produced dramatic results in student achievement and performance.

Balancing on four building blocks (assessment, analysis, action, and culture), these four principles build a foundation supporting data-driven instruction. Assessment is the means of providing the data needed for analysis. Once the data is gathered, it's time to analyze the results and look at what the data is telling us: the causes, the strengths, and the areas for growth. Next, the action piece comes in: after the inquiry has taken place, it's time to fill in the gaps where support is needed to help students learn. Last but not least, the culture aspect cannot go unnoticed, as it must be continuously developed and nurtured to keep the data-driven focus alive and thriving in the school.

Before getting started, or in order to improve the data-driven culture of your school, it's important to know not only what to do, but also what NOT to do. Knowing the common mistakes that undermine successful implementation of a data-driven culture is extremely important. Common mistakes indicated in the article include:
1. Inferior interim assessments
2. Secretive interim assessments
3. Infrequent assessments
4. Curriculum-assessment disconnect
5. Delayed results
6. Separation of teaching and analysis
7. Ineffective follow-up
8. Not making time for data


With the many competing demands on a principal, incorporating a leadership team to build the four building blocks of assessment, analysis, action, and culture takes intentional time and long-term commitment to sustain a data-driven culture. The Driven By Data book offers practical knowledge, resources, and professional development to support building the foundation in a school to positively impact instruction and student achievement.


Kassie Swenson
Reflection #8
Website Resource: TPEP Student Growth Goals
http://tpep-wa.org/student-growth-overview/

The new TPEP evaluation system's focus on student growth emphasizes the need for a data-driven culture to produce growth in student learning on a school-wide basis that includes teachers AND principals. Student growth goals illustrate the outcomes that determine the effectiveness of a school's instruction and leadership practices, committing the school to continuous improvement.

The student growth overview on the TPEP-WA.org website is as follows: for school leaders, there are three components of student growth, embedded in criteria three, five, and eight. They are also identical across both of the approved leadership frameworks. The components are:
SG 3: Provides evidence of student growth that results from the school improvement planning process.
SG 5: Provides evidence of student growth of selected teachers.
SG 8: Provides evidence of growth in student learning.

The principal ensures that goals align with what the data says and helps teachers align theirs as well. Together, the principal and teachers can align their student growth goals. One example:

Teacher student growth goals:
- Student Growth 3.2 (Achievement of Student Growth Goals for Individuals or Subgroups): Specifically, raise the achievement of 7th grade black students meeting standards in math by 10%.
- Student Growth 6.1 (Establish Student Growth Goals for Whole Class): Identify necessary supports and implement differentiation for students below target in 7th grade math. Regularly use common assessments to monitor progress.
- Student Growth 8.1 (Establish Student Growth Goals as Part of a PLC): Meet the target AMO for all 7th grade math students.

Aligned principal criteria:
- 5.4 (Assists staff to use data to guide, modify and improve classroom teaching and student learning): Principal is in the classroom, providing feedback and celebrating accomplishments. Leads the discussion around cultural proficiency and social justice to increase achievement among the black student population.
- 5.5 (Assists staff in implementing effective instruction and assessment practices): Principal conducts walkthroughs, revisits data every 6 weeks with teachers, and offers PD to support. Engages with parents and communities to build support for students below target.
- 8.3 (Provides evidence of growth in student learning): Principal works with the PLCs using data to identify goals and to guide instruction towards student growth for all students. Principal and teacher align their student growth goals.
The value of TPEP is growth and innovation: identifying areas of alignment and areas to critically examine and improve. Highly effective organizations are always raising the bar and aiming high to reach optimum levels. In the public sector, with multiple stakeholders involved, second-order change can be difficult, but it also challenges us to see and do work in new ways that otherwise may not have been possible. Mandates and initiatives exist to aim towards student growth; the alignment of all human resources toward a common vision, mission and goals, utilizing data, can greatly boost student achievement.


Kassie Swenson
Reflection #9
Article: Taking Data to New Depths
By Nancy Love
http://courseweb.hopkinsschools.org/pluginfile.php/63192/mod_resource/content/
0/CFA/Data_to_new_depths.pdf

Love addresses an important point when she says, "Superficial data analysis can be worse than none" (p. 22). By making a snap decision and not digging deeper into the data, irresponsible choices can be made that are not in students' best interests and may even be harmful. Additionally, not having a process in place when making data-driven decisions means continuing to superficially "attempt" to address a problem without knowing what the problem is or how best to address it. Love proposes that "educators use data continuously, collaboratively, and effectively to improve teaching and learning mathematics and science" (p. 23). Placing an emphasis on collective responsibility to analyze the data, collaborate, and have conversations focused on improving teaching and learning leads to a school culture willing to dive deeper into the data. As a principal, engaging teachers in these data-driven dialogues at a deeper level generates momentum for teachers to participate actively and to continuously construct new ways to improve student learning in the areas needed most.

TERC's Using Data Process is a multi-step process with an intentional focus on a data-rich environment; it can be effective with data facilitators, a school's leadership team, and a collaborative staff, allowing data to be taken to a deeper level. Of course, this also requires a focus on trusting relationships, in which the leader is responsible for setting the tone of trust. The principal provides an environment of professional growth, supporting teachers as they identify and generate solutions to achieve student growth and continually improve student learning with the use of data.

Kassie Swenson
Reflection #10
Video: Data Integration - Education Week video featuring Rudy Crew
http://edpsych510.blogspot.com/2013/09/data-integration-education-week-
video.html

In the video, Rudolph Crew, President of Global Partnership Schools, talks about how to avoid the worst uses of data, as well as the successes that data can bring. To be successful, it is important to create a demand for data and to stand firm in the objective that what you are looking for is what is happening for children in the classroom. The data can be a source for allocating support, and isn't meant to be a "gotcha." A valuable point Rudy made that really sticks with me is the benefit of how data is packaged: valuable, timely and immediate. Packaging the data this way leads to leadership and strategy development.

Data integration is also an important aspect of creating valuable, timely and immediate use of the data, packaged in a way that provides a view of the composite of each student's footprint. The central office is responsible for the data integration so that schools can use the data to implement strategies. In the past, siloed systems measured different kinds of data that were fragmented and hard to put together. The systems didn't talk to each other, whereas now there are data integration platforms available to combine all of the data to create a big picture within a designated timeline. The technology has come a long way and can be of enormous benefit in supporting data-driven cultures for schools and districts.

There is no doubt that data can improve instruction, and with the technology, leadership and a focused plan, data can be used in powerful ways to improve instruction and change student behavior. Additionally, as Rudy says, it can also be powerful in building relationships. Sticking to what the data says and gathering additional data to identify student-learning problems can be an exciting venture in which, through collaboration, the level of instruction improves. Still, the data can only go so far; as Rudy says, "There is no substitute for good leadership or supervision."


Artifacts
Data Principles and Safety Regulations
Implementation Rubric: Data-Driven Instruction & Assessment
Data Readiness Assessment
Data Drill Down: A Snapshot
Causal Analysis Form: Why? Why? Why?
The Data Pyramid
Multiple Measures
WAC 180-16-220 Supplemental Basic Education Program Approval Requirements
The Main Idea. Driven By Data: A Practical Guide to Improve Instruction
The Data Wise Improvement Process
Using Data
!" #$%&'($) $+$()$,)& $- ./01 23 !"# %&!& '(&'")* +,-%# !( -./0(1-2+ 3#&02-2+ 4(0 &33 *!,%#2!* 456'2#0 7 42'8(3 9'&--: ;<<!0
!"#$%#&'() +, -,,(%.#/( 01.1 2)(
- !" $%&'() *%+, +,- .(+(/ 01-(+- )(12-3 0")"14')3 (5. &%67)- .%&7)(8& "4 .(+( +" (%.
'5.-1&+(5.%52
- U:e ccIc Ic Lui|c uncer:Icncing cnc cwner:hip cf prcL|em:. Engcge in cic|cgue
*%+, .(+( &" +,- +-(6 "*5& +,- 71"9)-6 (5. -691(0-& +,- &")'+%"5& +"2-+,-1:
- Hcng cuI in uncerIcinIy": Icke Iime Ic |ecrn c: much c: pc::iL|e frcm Ihe ccIc.
Ihe fr:I :c|uIicn mighI ncI Le Ihe Le:I cne.
- SepcrcIe cL:ervcIicn frcm inference. Fu||y exp|cre whcI i: Ihere Ic Le |ecrnec
9-4"1- %67"&%52 %5+-171-+(+%"5& "5 +,- .(+(:
- Fcy cIIenIicn Ic Ihe prcce::: ccrefu||y :IrucIure DcIc Iecm meeIing: Ic mcximize
-52(2-6-5+3 )-(15%523 (++-5+%"5 +" -;'%+8 %&&'-&3 (5. +,- %5+-21%+8 (5. &(4-+8 "4
+,- 21"'7:
- /::ure IhcI civer:e vcice: cre LrcughI inIc Ihe cnc|y:i:. Mu|Iip|e per:pecIive:
71"$%.- +,- 1%0,-&+ %54"16(+%"5:
3+4( 531,(.6 7(89'1.#+$): .+ ;9#<( .=( 2)( +, 01.1
- <"5=+ '&- .(+( +" 7'5%&, >(.6%5%&+1(+"1&3 +-(0,-1&3 &+'.-5+&3 &0,"")&?:
- <"5=+ '&- .(+( +" 9)(6- &+'.-5+& "1 +,-%1 0%10'6&+(50-&:
- DcnI jump Ic ccnc|u:icn: wiIhcuI cmp|e ccIc.
- DcnI u:e ccIc c: cn excu:e fcr uick fxe:. Fccu: cn imprcving in:IrucIicnl
01.1 !"#$%#&'() 1$< 31,(.6 7(89'1.#+$)
!"#$%"%&'('!)& +,-+!.
/('(0/+!1%& !&2'+,.'!)& 3 (22%22"%&'
!"#$ &"'()*+,-."/0121 3 456 75"85)9 :1) 456 .+;11$9

<;5 )#()*+ *9 */05/858 01 (5 #958 01 "99599 0;5 =)595/0 90"05 1: 8"0"-8)*>5/ */90)#+0*1/ "/8 "99599'5/0 */ " 9+;11$? <;5
)#()*+ 9=5+*:*+"$$2 0")@509 */05)*' "99599'5/09 "/8 0;5 ,52 8)*>5)9 $5"8*/@ 01 */+)5"958 90#85/0 "+;*5>5'5/0?

A B CD5'=$")2 E'=$5'5/0"0*1/ F B !)1:*+*5/0 E'=$5'5/0"0*1/ G B &5@*//*/@ E'=$5'5/0"0*1/ H B 41 E'=$5'5/0"0*1/
/('(0/+!1%& .,$',+%
45 I*@;$2 "+0*>5 $67869:;<= '67>? :"+*$*0"05 05"+;5)-$5"85) 8"0" "/"$29*9 '550*/@9 ":05)
5"+; */05)*' "99599'5/0 "/8 '"*/0"*/ :1+#9 1/ 0;5 =)1+599 0;)1#@;1#0 0;5 25")
@5 !AB9C8DEBC9F #9CG6::<CA7H /6I6HC=>6ABJ 05"+;5)9 "/8 $5"85)9 ")5 5::5+0*>5$2
*/0)18#+58 01 8"0"-8)*>5/ */90)#+0*1/K0;52 #/85)90"/8 ;16 */05)*' "99599'5/09
85:*/5 )*@1) "/8 5D=5)*5/+5 0;5 =)1+599 1: "/"$2L*/@ )59#$09 "/8 "8"=0*/@ */90)#+0*1/
J5 !>=H6>6AB7B<CA .7H6A879? &5@*/ 9+;11$ 25") 6*0; " 850"*$58 +"$5/8") 0;"0 */+$#859
0*'5 :1) "99599'5/0 +)5"0*1/M"8"=0"0*1/N *'=$5'5/0"0*1/N "/"$29*9N =$"//*/@ '550*/@9N
"/8 )5-05"+;*/@ OGH6K<LH6 5/1#@; 01 "++1''18"05 8*90)*+0 +;"/@59M'"/8"059P
M5 )ANC<AN #9CG6::<CA7H /6I6HC=>6AB? !Q +"$5/8") *9 "$*@/58 6*0; 8"0"-8)*>5/
*/90)#+0*1/"$ =$"/J */+$#859 '185$*/@ "99599'5/0 "/"$29*9M"+0*1/ =$"//*/@ "/8 *9
:$5D*($5 01 "8"=0 01 90#85/0 $5")/*/@ /5589
O5 -D<H8 LF -C99CP<AN? E85/0*:2 "/8 *'=$5'5/0 (590 =)"+0*+59 :)1' ;*@;-"+;*5>*/@
05"+;5)9 3 9 9;")5 3 8*995'*/"05 )591#)+59M90)"05@*59 +;11$9J >*9*0 9+;11$9M+$"99)11'9N

MA
MA
MA
MA
MA
(22%22"%&'2
45 .C>>CA !AB69<> R99599'5/09 A-S 0*'59M25")
@5 '97A:=796AB 2B79B<AN #C<AB? 05"+;5)9 955 0;5 "99599'5/09 "0 0;5 (5@*//*/@ 1: 5"+;
+2+$5T 0;52 85:*/5 0;5 )1"8'"= :1) 05"+;*/@
J5 (H<NA68 BC :B7B6 B6:B: 7A8 ECHH6N6 9678<A6::
M5 (H<NA68 BC <A:B9DEB<CA7H :6QD6AE6 1: +$5")$2 85:*/58 @)"85 $5>5$M+1/05/0
5D=5+0"0*1/9
O5 +60(::6:: =)5>*1#9$2 0"#@;0 90"/8")89
$<B5 "7B;
MA MA
MA MA
MA MA
MA MA
MA MA
(&($R2!2
45 !>>68<7B6 0#)/")1#/8 1: "99599'5/0 )59#$09 O*85"$$2 AU;)9P
@5 ,:690G9<6A8HFS :DEE<AEB 8"0" )5=1)09 */+$#85J <B6>0H6I6H "/"$29*9N :B7A8798:0H6I6H
"/"$29*9 3 LCBBC> H<A6 )59#$09
J5 '67E;690CPA68 "/"$29*9 :"+*$*0"058 (2 5::5+0*>5 $5"85)9;*= =)5=")"0*1/
M5 '6:B0<A0;7A8 "/"$29*9 (50655/ 05"+;5)O9P 3 */90)#+0*1/"$ $5"859
O5 /66=? '1>59 (521/8 V6;"0W 90#85/09 @10 6)1/@ "/8 "/965)9 V6;2W 0;52 @10 *0 6)1/@

MA
MA
MA
MA
MA
(.'!)&
45 #H7A A6P H6::CA: +1$$"(1)"0*>5$2 01 85>5$1= /56 90)"05@*59 ("958 1/ 8"0" "/"$29*9
@5 E'=$5'5/0 5D=$*+*0 B67E;69 7EB<CA =H7A: */ 6;1$5-+$"99 */90)#+0*1/N 9'"$$ @)1#=9N
0#01)*"$9N "/8 (5:1)5M":05)-9+;11$ 9#==1)09
J5 )ANC<AN 7::6::>6ABJ #0*$*L5 */-0;5-'1'5/0 +;5+,9 :1) #/85)90"/8*/@ "/8 */-+$"99
"99599'5/0 01 5/9#)5 90#85/0 =)1@)599 (50655/ */05)*' "99599'5/09
M5 (EECDAB7L<H<BF? */90)#+0*1/"$ $5"85)9 )5>*56 $5991/M#/*0 =$"/9 "/8 @*>5 1(95)>"0*1/
:558("+, 8)*>5/ (2 0;5 "+0*1/ =$"/ "/8 90#85/0 $5")/*/@ /5589
O5 %AN7N68 2BD86AB: ,/16 0;5 5/8 @1"$N ;16 0;52 8*8N "/8 6;"0 "+0*1/9 0;52 ")5 0",*/@
01 *'=)1>5

MA
MA
MA
MA
MA

')'($J MHXX
Revised- November 10, 2008

Data Readiness Assessment
Exploring the Core Values and Beliefs of Your School


Directions: Each item below represents the extremes on a
continuum. Find the number on the 0-100 continuum that BEST describes the current position of your
own personal values and beliefs (circle). Next find the number that BEST describes the values and
beliefs of your district/school (X). Finally, place a (star) where you WISH your district/school were
operating relative to these values and beliefs.

Schools generate, provide and/or collect data
for the staff.
Staff members and stakeholders are active
data users and creators.

0 10 20 30 40 50 60 70 80 90 100

What evidence do you have? What makes you think this? Do you have an example that illustrates
your perspective?



Staff uses data alone and in isolation. Staff and stakeholders collaborate to make
sense of and use data.

0 10 20 30 40 50 60 70 80 90 100

What evidence do you have? What makes you think this? Do you have an example that illustrates
your perspective?


Staff and stakeholders rely on one test to
document results.
Staff and stakeholders use multiple measures
and multiple sources of data.

0 10 20 30 40 50 60 70 80 90 100

What evidence do you have? What makes you think this? Do you have an example that illustrates
your perspective?


For the most part, data is looked at
superficially and in the aggregate.
An in-depth analysis of multiple levels of data
is conducted, from aggregated trends to
student work.

0 10 20 30 40 50 60 70 80 90 100

What evidence do you have? What makes you think this? Do you have an example that illustrates
your perspective?




Data is used infrequently, usually only on an
annual basis.
Data is embedded throughout the
improvement process.

0 10 20 30 40 50 60 70 80 90 100

What evidence do you have? What makes you think this? Do you have an example that illustrates
your perspective?


Data is used to sort students. Data use is focused on the improvement of
student learning. Data is used to serve the
students.

0 10 20 30 40 50 60 70 80 90 100

What evidence do you have? What makes you think this? Do you have an example that illustrates
your perspective?


Data literacy is limited to a special few. Data literacy and data use is widespread.

0 10 20 30 40 50 60 70 80 90 100

What evidence do you have? What makes you think this? Do you have an example that illustrates
your perspective?


Staff and stakeholders are not required to use
data. No support for data use is provided.
Staff and stakeholders are required to make
use of the data for instructional improvement.
Professional development for effective use of
data is provided and support is continuous.

0 10 20 30 40 50 60 70 80 90 100

What evidence do you have? What makes you think this? Do you have an example that illustrates
your perspective?





Contact TERC's Using Data to arrange a complimentary consultation about your school's data readiness: 617-873-9608.
Data Drill Down - A Snapshot

Student Learning Data Findings

Type of Data / Data Source:
1. State CRT
2. Student Work
3. Common Assessments (or other local data)

Drill-down levels: Aggregate/Summary; Disaggregate; Strand/Cluster; Item (if available via state CRT); Student Work (if available via state CRT)

Student Learning Problem Statement: _________________________________

usingdata.terc.edu
Christopher-Gordon Publishers, Inc.


!"#$"% '("%)$*$ +,-./ 01)2 01)2 01)2
3333333333333333333333333333333333333333333333333
ul8LC1lCnS: lor each learnlng problem conflrmed by your daLa, ask Why?" aL leasL
Lhree Llmes. 8ecord Lhe answer, beglnnlng wlLh 8ecause _________." SLop asklng
Why?" when Lhe Leam reaches consensus on Lhe rooL cause of Lhe problem.

Problem/Barrier/Issue #1
_________________________________

Why? Because: _________________________________

Why? Because: _________________________________

Why? Because: _________________________________

Problem/Barrier/Issue #2
_________________________________

Why? Because: _________________________________

Why? Because: _________________________________

Why? Because: _________________________________
The Data Coach's Guide to Improving Learning for
All Students
Unleashing the Power of Collaborative Inquiry
Handout H5.3 The Data Pyramid

Multiple Measures
by Victoria L. Bernhardt
Bernhardt, V. L. (1998, March). Invited Monograph No. 4. California Association for Supervision and Curriculum Development (CASCD).

Let's talk about multiple measures. Many state and federal regulations now require schools to report multiple measures: multiple measures of student achievement, that is. While we applaud these
changes from the old method of using one
standardized achievement score to make decisions
about how well a school is doing, multiple measures
of student learning alone are not sufficient for
comprehensive school improvement, and, in fact,
can be misleading in this regard.
Many educators believe that over 50 percent of
student achievement results can be explained by
other factors. That being true, if we want to change
the results we are getting, we have to understand the
other 50 percent to know why we are getting the
results we are getting. Then we need to change what
we do in order to get different results.
Any definition of multiple measures should include
four major measures of data: not just student
learning, but also demographics, perceptions, and
school processes. Analyses of demographics,
perceptions, student learning, and school processes
provide a powerful picture that will help us
understand the school's impact on student
achievement. When used together, these measures
give schools the information they need to improve
teaching and learning to get positive results.
In the figure that follows, the four major measures
are shown as overlapping circles. The figure
illustrates the type of information that one can gain
from individual measures and the enhanced levels
of analyses that can be gained from the intersections
of the measures.
One measure by itself gives useful information.
Comprehensive measures, used together and over
time, provide much richer information. Ultimately,
schools need to be able to predict what they must do
to meet the needs of all students they have, or will
have in the future. The information gleaned from
the intersections of these four measures
(demographics, perceptions, student learning, and
school processes) helps us to define the questions
we want to ask, and focuses us on what data are
necessary in order to find the answers.
Demographic data provide descriptive information
about the school community, such as enrollment,
attendance, grade level, ethnicity, gender, and native
language. Demographic data are very important for
us to understand. They are the part of our
educational system over which we have little or no
control, but with which we can observe trends and
glean information for purposes of prediction and
planning. Demographic data assist us in
understanding the results of all parts of our
educational system through the disaggregation of
other measures by demographic variables.
Perceptions data help us understand what students,
parents, teachers, and others think about the
learning environment. Perceptions can be gathered
in a variety of ways: through questionnaires,
interviews, and observations. Perceptions are
important since people act in congruence with what
they believe, perceive, or think about different
topics. It is important to know student, teacher, and
parent perceptions of the school so school
personnel know what they can do to improve the
system. Perceptions data can also tell us what is
possible.
Student Learning describes the results of our
educational system in terms of standardized test
results, grade point averages, standards assessments,
and authentic assessments. Schools use a variety
of student learning measurements, usually separately, and sometimes without thinking about how these measurements are interrelated.
[Figure: Multiple Measures of Data. Four overlapping circles (Demographics, Perceptions, Student Learning, School Processes) illustrate what each measure tells us on its own, over time, and at each intersection, culminating in the prediction of the actions/processes/programs that best meet the learning needs of all students. Note: Adapted from Data Analysis for Comprehensive Schoolwide Improvement (p. 15), by Victoria L. Bernhardt, 1998, Larchmont, NY: Eye on Education. Copyright 1998 Eye on Education, Inc. Reprinted with permission.]

Schools
normally think of multiple measures as looking
only at different measures of student learning,
rather than including demographics, perceptions,
and school processes.
School Processes define what teachers are doing to
get the results that we are getting. For example, how
is reading being taught at grade two, or math at
grade six? School Processes include programs,
instructional strategies, and classroom practices.
This is the measure that seems to be the hardest for
teachers to describe. Most often, teachers say they do
what they do intuitively, and that they are too busy
doing whatever they do to systematically document
and reflect on their processes. To change the results
schools are getting, teachers and school personnel
must begin to document these processes and align
them with the results they are getting in order to
understand what to change to get different results,
and to share their successes with others.
A Snapshot of the Measures
Looking at each of the four measures separately, we
get snapshots of data in isolation from any other
data at the school level. At this level we can answer
questions such as
How many students are enrolled in the
school this year? (Demographic)
How satisfied are parents, students,
and/or staff with the learning
environment? (Perceptions)
How did students at the school score on
a test? (Student Learning)
What programs are operating in the
school this year? (School Processes)
By looking over time we can answer questions such
as, but not limited to:
How has enrollment in the school
changed over the past five years?
(Demographics)
How have student perceptions of the
learning environment changed over
time? (Perceptions)
Are there differences in student scores on
standardized tests over the years?
(Student Learning)
What programs have operated in the
school in the past five years? (School
Processes)
Intersection of Two Measures
Crossing two measures, we begin to see a much
more vivid picture of the school, allowing us to
answer questions such as:
Do students who attend school every day
perform better on the state assessment
than students who miss more than five
days per month? (Demographics by
Student Learning)
What strategies do third-grade teachers
use with students whose native
languages are different from that of the
teacher? (Demographics by School
Processes)
Is there a gender difference in students'
perceptions of the learning
environment? (Perceptions by
Demographics)
Do students with positive attitudes
about school do better academically, as
measured by the state assessment?
(Perceptions by Student Learning)
Are there differences in how students
enrolled in different programs perceive
the learning environment? (Perceptions
by School Processes)
Do students who were enrolled in active
hands-on content courses this year
perform better on standardized
achievement tests than those who took
the content courses in a more traditional
manner? (Student Learning by School
Processes)
Looking at the interaction of two of the measures
over time allows us to see trends as they develop
(e.g., standardized achievement scores disaggregated by ethnicity over the past three years can help
us see if the equality of scores, by ethnicity, is truly a
trend or an initial fluctuation.) This interaction also
begins to show the relationship of the multiple
measures and why it is so important to look at all
the measures together.
Intersection of Three Measures
As we intersect three of the measures at the school
level (e.g., student learning measures disaggregated
by ethnicity compared to student questionnaire
responses disaggregated by ethnicity), the types of
questions that we are able to answer include the
following:
Do students of different ethnicities
perceive the learning environment
differently, and are their scores on
standardized achievement tests
consistent with these perceptions?
(Demographics by Perceptions by Student
Learning)
What instructional process(es) did the
previously non-English-speaking
students enjoy most in their all-English
classrooms this year? (Perceptions by
Demographics by School Processes)
Is there a difference in students' reports
of what they like most about the school
by whether or not they participate in
extracurricular activities? Do these
students have higher grade point
averages than students who do not
participate in extracurricular activities?
(Perceptions by Student Learning by
School Processes)
Which program is making the biggest
difference with respect to student
achievement for at-risk students this
year, and is one group of students
responding better to the processes?
(School Processes by Student Learning by
Demographics)
Looking at three measures over time allows us to
see trends, to begin to understand the learning
environment from the students' perspectives, and to
know how to deliver instruction to get the desired
results from and for all students.
Intersection of Four Measures
Our ultimate analysis is the intersection of all four
measures, at the school level (e.g., standardized
achievement tests disaggregated by program, by
gender, within grade level, compared to
questionnaire results for students by program, by
gender, within grade level). These interactions allow us to answer questions such as:
Are there differences in achievement
scores for eighth-grade girls and boys
who report that they like school, by the
type of program and grade level in
which they are enrolled? (Demographics
by Perceptions by School Processes by
Student Learning)
It is not until we intersect all four circles, at the
school level, and over time that we are able to
answer questions that will predict if the actions,
processes, and programs that we are establishing
will meet the needs of all students. With this
intersection, we can answer the ultimate question:
Based on whom we have as students and
how they prefer to learn, and what
programs they are in, are all students
learning at the same rate? (Student
Learning by Demographics by Perceptions
by School Processes)
Focusing the Data
Data analysis should not be about gathering data. It
is very easy to get analysis paralysis by spending time
pulling data together and not spending time using
the data. School level data analysis should be about
helping schools understand if they are achieving
their purpose and guiding principles and meeting
the needs of all students, and, if not, why not? A
good way to avoid analysis paralysis is to consider
using key questions and building your analyses
around the answers to these questions.
This type of data analysis is easy when schools are
clear on their purpose and what they expect
students to know and be able to do. These analyses
comfortably flow from questions that teachers and
administrators naturally ask themselves to learn if
these purposes are being met. The good news is that
by looking at trends of the intersected four major
measures, schools do not have to conduct
complicated program evaluations or needs analyses.
These intersections can tell them just about
everything they would want to know, and the data
are fairly readily available.
Summary
The moral of the story is, if we want to get different
results, we have to change the processes that create
the results. Just looking at student achievement
measures focuses teachers only on the results; it does
not give them information about what they need to
do to get different results.
By asking for student achievement measures alone,
state and federal program officers can never use
these data because the context is missing. This
request might also mislead schools into thinking
they are analyzing student learning in a
comprehensive fashion. Just looking at student
learning measures could in fact keep teachers from
progressing and truly meeting the needs of students.
When we focus only on student learning measures,
we see school personnel using their time figuring
out how to look better on the student learning
measures. We want school personnel to use their
time figuring out how to be better for all students.
WAC 180-16-220
Supplemental Basic Education Program Approval Requirements

The following requirements are hereby established by the state board of
education as related supplemental condition to a school district's
entitlement to state basic education allocation funds, as authorized by
RCW 28A.150.220(4).

(1) Current & Valid Certificates.
Every school district employee required by WAC 181-79A-140 to possess
an education permit, certificate, or credential issued by the superintendent
of public instruction for his/her position of employment, shall have a current
and valid permit, certificate or credential. In addition, classroom teachers,
principals, vice principals, and educational staff associates shall be
required to possess endorsements as required by WAC 181-82-105, 181-
82-120, and 181-82-125, respectively.

(2) Annual School Building Approval.
(a) Each school in the district shall be approved annually by the school
district board of directors under an approval process determined by the
district board of directors.
(b) At a minimum the annual approval shall require each school to have a
school improvement plan that is data driven, promotes a positive impact on
student learning, and includes a continuous improvement process that
shall mean the ongoing process used by a school to monitor, adjust, and
update its school improvement plan. For the purpose of this section
"positive impact on student learning" shall mean:
(i) Supporting the goal of basic education under RCW 28A.150.210, "...to provide students with the opportunity to become responsible citizens, to contribute to their own economic well-being and to that of their families and communities, and to enjoy productive and satisfying lives...";
(ii) Promoting continuous improvement of student achievement of the state
learning goals and essential academic learning requirements; and
(iii) Recognizing nonacademic student learning and growth related, but not
limited to: Public speaking, leadership, interpersonal relationship skills,
teamwork, self-confidence, and resiliency.
(c) The school improvement plan shall be based on a self-review of the
school's program for the purpose of annual building approval by the
district. The self-review shall include active participation and input by
building staff, students, families, parents, and community members.
(d) The school improvement plan shall address, but is not limited to:
(i) The characteristics of successful schools as identified by the
superintendent of public instruction and the educational service districts,
including safe and supportive learning environments;
(ii) Educational equity factors such as, but not limited to: Gender, race,
ethnicity, culture, language, and physical/mental ability, as these factors
relate to having a positive impact on student learning. The state board of
education strongly encourages that equity be viewed as giving each
student what she or he needs and when and how she or he needs it to
reach their achievement potential;
(iii) The use of technology to facilitate instruction and a positive impact on
student learning; and
(iv) Parent, family, and community involvement, as these factors relate to
having a positive impact on student learning.

(3) Nothing in this section shall prohibit a school improvement plan from
focusing on one or more characteristics of effective schools during the
ensuing three school years.

(4) School involvement with school improvement assistance under the
state accountability system or involvement with school improvement
assistance through the federal Elementary and Secondary Education Act
shall constitute a sufficient school improvement plan for the purposes of
this section.

(5) Nonwaiverable requirements. Certification requirements, including
endorsements, and the school improvement plan requirements set forth in
subsection (2) of this section may not be waived.

See www.TheMainIdea.net to learn more or subscribe. © The Main Idea 2010. All rights reserved.











S.O.S. (A Summary Of the Summary)

The main ideas of the book are:
Implemented well, data-driven instruction has the power to dramatically improve student performance.
This book presents the four building blocks of data-driven instruction used by effective data-driven schools and
provides the professional development activities to develop them.

Why I chose this book:
In my annual Survey Monkey survey, the number one topic subscribers wanted to learn more about was data-driven instruction. I was waiting for the right book to come along, and this is it. Paul Bambrick-Santoyo describes the four basic components that you
need to put in place to be truly data-driven:

Assessment        Action
Analysis          Data-Driven Culture

Also, the book provides the type of concrete tools to put data-driven instruction into practice that is rarely found in books. At the end of the
first four chapters are implementation suggestions for teachers, principals, and district leaders. Furthermore, the ENTIRE second
part of the book (over 50 pages!) outlines specific workshop activities to conduct with staff and the CD-ROM contains the materials
for these workshops. Note that these could not be summarized and are only found in the book.


The Scoop (In this summary you will learn)

- The eight common mistakes schools make when implementing data-driven instruction

- The key factors in designing or selecting interim assessments that lie at the heart of data-driven instruction

- How to analyze assessment results without getting overwhelmed by the data

- How to make sure that teachers use assessment results to actually make changes in their classroom practice

- The necessary components to create a data-driven culture


PROFESSIONAL DEVELOPMENT BUILT RIGHT INTO THE BOOK
NOTE: The Main Idea does not provide professional development suggestions because there are so many right in the book!

Take a look at the following which are not included in the summary:
1. See the Reflection Questions at the end of the introduction and first four chapters; these help the school leader or leadership team to prepare for implementation of data-driven instruction.
2. See the Application section at the end of the first four chapters; these outline concrete steps teachers, principals, and district leaders can take to implement data-driven instruction in their schools/districts.
3. See Part Two of the book, which outlines workshop activities you can conduct to train staff in the four components of data-driven instruction. The CD-ROM provides the materials needed to conduct these workshops.
Driven by Data: A Practical Guide to Improve Instruction
By Paul Bambrick-Santoyo (Jossey-Bass, 2010)

Introduction: What Is Data-Driven Instruction All About?

Education articles have captured numerous stories about schools that have improved their instruction based on data-driven practices
and achieved outstanding results within a few years. In fact, data-driven instruction has become one of the most discussed new
topics in education. However, at the same time, it is one of the most misunderstood topics. Some people believe data-driven schools
simply conform to NCLB dictates. Others believe that these schools forgo authentic learning and instead merely teach to the test.
Given this confusion, some leaders hope that they can bypass this "data craze" with the idea that "this too shall pass."

However, it would be a mistake for leaders to give up on data. When conducted properly, using data to inform teaching practice is one
of the most effective ways to help students achieve success. Data-driven instruction involves changing a school's focus from "what was taught" to "what was learned." This book outlines exactly how you create such a data-driven culture in order to achieve academic
excellence. The ideas presented in Driven by Data are not based on a theoretical model, but rather come from the practices of schools
that, using data-driven instruction, have achieved dramatic gains in student performance.

There are many vignettes throughout the book describing how actual schools achieved impressive results using a data-driven
approach. For example, at Fort Worthington Elementary School, a school in which 85 percent of the students receive free or reduced
lunch and 98 percent are African American, the principal put the components of data-driven instruction in place and saw the following
tremendous gains. Note that these are more than numbers; these represent hundreds of additional students reaching proficiency.
Subject        | English and Language Arts | Mathematics
               | Grade 3  Grade 4  Grade 5 | Grade 3  Grade 4  Grade 5
2005-06        |   49%      50%      42%   |   44%      43%      44%
2006-07        |   55%      62%      55%   |   74%      71%      74%
2007-08        |   88%      92%      86%   |   86%      88%      86%
Overall Gains  |   +39      +42      +44   |   +42      +45      +42

So how exactly did these schools, and many others that used this approach, get such remarkable results? They were able to implement
the four fundamental building blocks of effective data-driven instruction. These four principles are:
1. Assessment -- Create rigorous interim assessments that provide meaningful data.
2. Analysis -- Examine the results of assessments to identify the causes of both strengths and shortcomings.
3. Action -- Teach effectively what students most need to learn based on assessment results.
4. Culture -- Create an environment in which data-driven instruction can survive and thrive.

If there are so few fundamental principles, why haven't more schools succeeded? Most schools have assessments and do some kind of
analysis, so shouldn't they see dramatic results as well? The truth is, while all schools make mistakes, there are certain mistakes when
it comes to data-driven instruction that make it difficult to succeed. Below is a description of those mistakes.

Eight Mistakes That Impede Successful Implementation of Data-Driven Instruction
Schools that implement data-driven instruction effectively avoid the following common pitfalls:

1. Inferior interim assessments -- Many schools fail to get results when they use interim assessments that set the bar too low, do not
align to other required tests, or neglect to include open-ended questions.

2. Secretive interim assessments -- Interim assessments are only useful if teachers and schools see them before they teach. For these
assessments to drive rigor, teachers must know the end goals before they plan instruction.

3. Infrequent assessments -- Some schools give these assessments only once every three to four months. This is not frequent enough
to provide the data needed to improve instruction.

4. Curriculum-assessment disconnect -- A common mistake that occurs is when the curriculum does not match the content of the
interim assessment. These assessment results have nothing to do with what happened in the classroom.

5. Delayed results -- Interim assessments are useless unless they are graded and analyzed promptly so teachers can make adjustments.

6. Separation of teaching and analysis -- Another problem occurs when teachers hand over the data analysis to a data team. Teachers
need to analyze the results themselves in order to take ownership over the process.

7. Ineffective follow-up -- One serious shortcoming is when there is only a vague commitment to make adjustments after analyzing
the results. If there is no specific plan for improvement that is scheduled to happen at a specific time, no real changes will be made.

8. Not making time for data -- Some schools fail to make time for assessments, data analysis, and follow-up. Schools are busy places, and if no time has been set aside in the calendar to make data-driven improvement a priority, it simply won't happen.


Part I: The Four Building Blocks of Effective Data-Driven Instruction

The Four Building Blocks of Effective Data-Driven Instruction
1. Assessment 2. Analysis 3. Action 4. Culture

The 1st Building Block: ASSESSMENT

Assessment is the first of the four building blocks of data-driven instruction. Assessments are crucial in defining exactly what
instruction should take place. Consider this example below:

A principal intern brought a math teacher's worksheet into Bambrick-Santoyo's office and asked, "What do you notice?"
Bambrick-Santoyo responded, "This looks like a basic review of fractions."
"Exactly," the intern responded, "but the interim assessment we just gave asks students to solve word problems with fractions, and in addition, those fractions are more complex."

There was clearly a disconnect between what the teacher was teaching and what was being assessed on the interim assessment. The above example shows one of the reasons assessments are so important: they help to clarify what students should be learning. Without an assessment, teachers are often left with vague standards like the following:

"Understand and use ratios, proportions and percents in a variety of situations."
-- New Jersey Core Curriculum Content Standards for Mathematics, Grade 7, 4.1.A.3

Different teachers could choose many different ways to teach this standard and would assess it in very different ways. Look at the
varying types of assessments you might see from different teachers:
1. Identify 50% of 20
2. Identify 67% of 81
3. Shawn got 7 correct answers out of 10 questions on his science test. What percent did he get correct?
4. In the NCAA, J.J. Redick and Chris Paul were competing for best free-throw shooting percentage. Redick made 94% of his first
103 shots, while Paul made 47 out of 51 shots.
a. Which one had a better shooting percentage?
b. In the next game, Redick made only 2 out of 10 shots while Paul made 7 of 10. What are their new overall percentages?
c. Who is the better shooter?

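As a quick check on how much more demanding item 4 is, here is a worked solution; it assumes the intended reading that Redick's "94% of his first 103 shots" rounds to 97 made shots:

```latex
% Worked solution for sample item 4 (assumption: 94% of 103 shots is about 97 made)
\text{(a)}\quad \text{Redick: } \tfrac{97}{103} \approx 94.2\%, \qquad \text{Paul: } \tfrac{47}{51} \approx 92.2\% \quad\Rightarrow\quad \text{Redick leads.}
\text{(b)}\quad \text{Redick: } \tfrac{97+2}{103+10} = \tfrac{99}{113} \approx 87.6\%, \qquad \text{Paul: } \tfrac{47+7}{51+10} = \tfrac{54}{61} \approx 88.5\%
\text{(c)}\quad \text{After the second game, Paul's overall percentage is now higher.}
```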
While these all align to the state standard, they are quite different in scope, difficulty, and design. This shows that standards are meaningless until you define how you will assess them. The types of questions students are expected to answer determine the level at which students will learn. This may seem counterintuitive, but instead of standards determining the type of assessments used, the type of assessments used actually defines the standard that will be reached. So what does this mean for schools that wish to implement data-driven instruction? They should create rigorous tests and then provide the type of instruction needed to meet those standards. This chapter outlines the five crucial elements, or drivers, of effective assessments:

ASSESSMENT: Five Core Drivers
1. Common and interim
2. Transparent starting point
3. Aligned to state tests and college readiness
4. Aligned to instructional sequence
5. Re-assessed previously taught standards

Core Driver 1: Assessments Must Be Common and Interim
In effective data-driven instruction the most important assessments are interim assessments: formal written tests taken every six to
eight weeks. More than a traditional scope and sequence, interim assessments provide a roadmap to rigorous teaching and learning.
Then carefully analyzing interim assessment results on a regular basis provides the feedback teachers need to improve their teaching
rather than waiting for the results of a year-end test. Interim assessments hold teachers and principals accountable for student learning
by accurately measuring student performance without the teacher support normally given in a classroom. Furthermore, rather than
have individual teachers decide their own level of rigor, data-driven schools create rigorous interim assessments that are common to all
grade-level classes in each content area.

Core Driver 2: Assessments Must Be The Starting Point and Must Be Transparent
Traditionally, assessments are designed at the end of the quarter or semester and what is assessed is based on what is taught. In
effective data-driven instruction this process must be reversed such that interim assessments are created before the teaching begins. It
is the rigor of the assessment that drives the rigor of what is taught. In addition, everyone (teachers, school leaders, parents, and community members) should know what skill level students are expected to reach and the necessary steps to get there.

Core Drivers 3 and 4: Assessments Must Be Aligned
All public and many private schools must take high-stakes tests. At the primary level these might be state or district exams. At the
secondary level it could include SAT/ACT or AP/IB assessments. To help students succeed on these tests, interim assessments should
be aligned to those tests in format, content, and length. The interim assessments should also help prepare students for college and
therefore be aligned to college readiness standards as measured by SAT/AP/IB exams, research papers, and other measures. Of course
the assessments should also be aligned to the schools clearly defined grade level and content expectations so teachers are teaching
what will be assessed.

Core Driver 5: Assessments Must Re-Assess Previously Taught Standards
If interim assessments only assessed what was taught during one period of time, they would serve more as end-of-unit tests than interim assessments. Including material that was previously taught helps ensure that students retain that material and also provides an opportunity for teachers to see if their re-teaching efforts were successful. This is a common mistake that schools make: they fail to review past material.

WRITING OR SELECTING THE RIGHT INTERIM ASSESSMENT
Some schools that effectively implement data-driven instruction create their own interim assessments while others select from those
already available. Either process can lead to success as long as one applies the following core principles:

Core Principles in Writing/Selecting Effective Interim Assessments
* Start from the end-goal exam -- When designing or selecting interim assessments, make sure they are based on the exams students must take at the end of the year (state, district, SAT, etc.) and not on the vague standards discussed earlier.
* Align the interim assessments to the end-goal test -- Make sure interim assessments are aligned to the end-goal test not only in content, but in format and length as well.
* If acquiring assessments from a third party, be sure to see the test -- Don't take the word of sales reps; ask to see the actual tests to verify whether they align to the end goals. This step is often overlooked.
* Assess to college-ready standards -- Be aware that the skills needed to pass state tests are often insufficient to ensure postsecondary success. High schools have an easier time with this because they can align with the SAT or the demands of a college research paper. For elementary and middle schools, consider increasing the rigor of your interim assessments by demanding higher levels. For example, rather than expecting kindergarteners to meet the equivalent of the Fountas-Pinnell Level B, push for Level C or D. In math, one school, North Star Elementary, using TerraNova as a guide, established interim assessments for kindergarteners that measure all of the kindergarten standards and half of the first grade standards. First grade then measures all of the first and second grade math standards, and so on. In middle school math, include more in-depth algebra, and in middle school reading, demand a closer reading of texts.
* Design the test to reassess earlier material -- Reviewing past material is essential in creating effective interim assessments. One way to do this is to create longer and longer tests as the year progresses. Another way is to assess all of the material from the start, and then track progress as students actually learn the concepts.
* Give teachers a stake in the assessment -- Teachers included in the writing or selecting of interim assessments will be much more invested in making sure they are effective.

FIRST STEPS FOR TEACHERS AND LEADERS
Each of the first four chapters contains first steps that teachers, school leaders, and district leaders can take to help implement the
building block introduced in that chapter. Take a look at these sections for implementation suggestions.



The Four Building Blocks of Effective Data-Driven Instruction
1. Assessment 2. Analysis 3. Action 4. Culture

The 2nd Building Block: ANALYSIS

Assessment, the first building block of effective data-driven instruction, points to the ultimate goals of instruction. Analysis, the second
building block, is what helps teachers reach those goals. Analysis involves systematically examining interim assessment data
thoroughly to determine students' strengths and weaknesses and then taking the necessary steps to address their needs. This chapter
outlines the five core drivers of successful analysis and emphasizes the importance of looking closely at the data along the way.

Imagine a swimmer who needs feedback from her coach to improve, but the coach does not go to her meets. The swimmer goes to her first competition, but does not win. Because the coach did not see her swim, he will probably read the results in the newspaper and only be able to give her the vague advice to "swim faster." If he had had a view from the pool, he would have seen that she was the fastest swimmer, but she was the last one off the starting block. Unless educators look directly and carefully at their students' assessment results, like the coach, they may diagnose their students' problems incorrectly and therefore provide an inaccurate remedy. Below are the five core drivers of effective data-driven analysis that would help prevent this situation:


ANALYSIS: Five Core Drivers
1. User-friendly data reports
2. Test-in-hand analysis
3. Deep analysis
4. Immediate turnaround of assessment results
5. Planned analysis meetings between teachers and leader

Core Driver 1: Analysis Must Include User-Friendly Reports
Great analysis is only possible if data is recorded in a useful form. Interim assessments yield a tremendous amount of raw data, but
unless it is put into a form that is teacher-friendly, the data may be rendered useless. Schools don't need lots of fancy data reports in
order to effect change. In fact, the more pages in an assessment report, the less likely teachers will be to actually use it! Instead,
schools need realistic templates (the best ones are one page per class) that allow for analysis at four important levels:
! Question level   ! Standard level   ! Individual student level   ! Whole class level

What might a template that helps teachers analyze results at these four levels look like? One sample from North Star Academy is
excerpted below. Note that it contains the results for one class and fits on one page. In the multiple-choice section, each letter
represents the wrong answer a student chose and blank spaces represent correct answers. The school color codes the chart: above 75%
is green, 60 to 75% is yellow, and less than 60% correct is coded red. Below is a modified excerpt. The full template is on p. 43.

Student   | MC % correct | OE % correct | Combined Proficiency Score | Q1 (Std 1: Computation, + and - decimals & money) | Etc. | Q5 (Std 4: Fractions, + and - mixed numbers) | Etc. | Q10 (Std 7: Estimation & Rounding, division)
Moet      | 82% | 81% | 81% |     |      |     |      |
Terrell   | 79% | 42% | 69% |     |      |  C  |      |
Aziz      | 74% | 42% | 65% |     |      |  A  |      |  B
Kabrina   | 63% | 38% | 56% |     |      |  A  |      |  B
Etc.      |     |     |     |     |      |     |      |
Total Class % Correct | | |      | 95% | Etc. | 40% | Etc. | 60%
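How the class percentages and color codes in such a report might be generated is easy to sketch. The following is a minimal illustration (not from the book), assuming a hypothetical layout in which each student's row stores None for a correct answer and the wrong letter chosen otherwise:

```python
# Minimal sketch of a one-page data report's class-level calculations.
# Hypothetical layout: student -> answers per question (None = correct,
# otherwise the wrong choice the student selected).
results = {
    "Moet":    [None, None, None],
    "Terrell": [None, "C", None],
    "Aziz":    [None, "A", "B"],
    "Kabrina": [None, "A", "B"],
}

def color(pct):
    """The report's color bands: above 75% green, 60-75% yellow, below 60% red."""
    return "green" if pct > 75 else ("yellow" if pct >= 60 else "red")

for q in range(3):
    correct = sum(row[q] is None for row in results.values())
    pct = 100 * correct / len(results)
    print(f"Question {q + 1}: {pct:.0f}% correct ({color(pct)})")
```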





Core Driver 2: Analysis Must be Conducted With Test in Hand
It is essential that analysis is done "test-in-hand," with teachers constantly referring to the completed data report template. The data report doesn't mean anything on its own; it is like the coach reading the newspaper with the swimmer's results.

Core Driver 3: Analysis Must be Deep
Good analysis means digging into the test results and moving beyond "what" students got wrong to answer "why" they got it wrong.
This involves finding trends in student errors or trends among groups of students. Combined with the above strategies of using clear
data reports and having the test in hand, performing deep analysis can quickly surface weaknesses the teacher needs to act upon.
Below are some suggestions to approach deep analysis.

Do Question-Level Analysis and Standard-Level Analysis Side by Side
It's often not sufficient to look at overall results alone. In examining results at the standard level, consider the example below. On one assessment, students scored 70% overall on Ratio-Proportion questions. If the analysis stopped here, the teacher would assume most students are doing well and that about a third need remediation. However, if the teacher had looked at a breakdown of the standard, a different picture would emerge:

Ratio-Proportion, General (Questions 12, 21): 82% correct
Ratio-Proportion, Rates (Questions 22, 30): 58% correct

After looking more closely, the teacher might now conclude that it is necessary to re-teach rates. However, by drilling even deeper into the data and looking at the actual questions (35% got Question 22 correct while 80% got Question 30 correct), the teacher learns more:

22. Jennifer drove 36 miles in an hour. At this rate, how far would she travel in 2½ hours?
    A. 72 miles (chosen most)   B. 80 miles   C. 81 miles   D. 90 miles
30. If a machine can fill 4 bottles in 6 seconds, how many bottles can it fill in 18 seconds?
    A. 24   B. 12   C. 8   D. 7

The question reveals that students knew how to calculate a rate in Question 22, but they stopped after multiplying 36 and 2 because they got stuck on multiplying by a mixed number. Without deeper analysis, the teacher would have wasted valuable time by re-teaching the general topic of proportions or, just as ineffectively, re-teaching rates.
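Spelled out, the step students missed (using the 2½-hour reading of Question 22 reconstructed above):

```latex
36 \times 2\tfrac{1}{2} \;=\; 36 \times 2 \;+\; 36 \times \tfrac{1}{2} \;=\; 72 + 18 \;=\; 90 \text{ miles}
```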

Additional rows from the same North Star report excerpt:
Repeated 6-1 Standards: Computation, +/- decimals/money (Question 1): 95% correct; Multiply/divide in context (Questions 6, 8, 9): 87% correct; Etc.
Whole Class: Multiple-Choice 69% correct; Open-Ended 47% correct; Combined 63% correct.

Search by Separators
Look for questions on which the stronger students outperform their peers. These questions that "separate" students point to areas
where smaller groups or pullout groups could benefit from targeted instruction. For example, if the top third of the class answered
Question 11 correctly, they could be given a stretch assignment while the teacher re-teaches that concept to the rest.
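A simple way to automate this scan is sketched below (illustrative only, with a hypothetical data layout); it flags questions where the top third of the class, ranked by total score, outperforms everyone else by a chosen margin:

```python
# Illustrative sketch: find "separator" questions. Assumes at least three
# students; `totals` maps student -> total score, and `correct` maps
# student -> the set of question numbers that student answered correctly.
def separators(totals, correct, gap=0.4):
    ranked = sorted(totals, key=totals.get, reverse=True)
    cut = len(ranked) // 3
    top, rest = ranked[:cut], ranked[cut:]
    flagged = []
    for q in sorted(set().union(*correct.values())):
        top_rate = sum(q in correct[s] for s in top) / len(top)
        rest_rate = sum(q in correct[s] for s in rest) / len(rest)
        if top_rate - rest_rate >= gap:
            flagged.append(q)   # candidate for stretch work plus re-teaching
    return flagged
```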

Scan by Student
Another way to dig deeply into the data is to look at individuals. Consider Kenyatta's results: in the original one-page report, her row across the 23 questions shows mostly blanks (correct answers) in the first half, with her twelve wrong answers concentrated in the second half: A, D, C, D, A, B, D, D, D, C, D, A.

Kenyatta's overall score was the lowest in the class. Without looking at her individual results, a teacher would miss that she outperformed her peers in the first half of the assessment. Perhaps she is a slow test taker, or perhaps she fell asleep. What these results do not represent is a lack of academic skill. Without carefully examining individual results, a teacher might miss this.
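This kind of per-student scan is also easy to express in code. The example below (hypothetical data, shaped like Kenyatta's row) compares each student's accuracy on the two halves of a test and flags a large drop-off:

```python
# Sketch: flag students whose accuracy collapses in the second half of a
# test -- a pacing or fatigue signal rather than a skill signal.
def half_split(answers):
    """answers: list of booleans in question order (True = correct)."""
    mid = len(answers) // 2
    return sum(answers[:mid]) / mid, sum(answers[mid:]) / (len(answers) - mid)

# Hypothetical 23-question row like Kenyatta's: strong early, weak late.
kenyatta = [True] * 10 + [False] * 12 + [True]
first, second = half_split(kenyatta)
if first - second > 0.3:
    print(f"Anomaly: {first:.0%} correct early vs. {second:.0%} late "
          "-- investigate pacing, not skill")
```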

Below are some questions to help with the process of digging deeply into the data results:

Larger Picture Questions
* How well did the class do as a whole?
* What are the strengths and weaknesses in different standards?
* How did the class do on old versus new standards taught?
* How were the results in the different question types (multiple choice vs. open-ended, reading vs. writing)?
* Who are the strong and weak students?
Dig in Questions
* Bombed questions: did students all choose the same wrong answer? Why or why not?
* Break down each standard: did students do similarly on each question within the standard? Why?
* Sort data by students' scores: are there questions that separate proficient and nonproficient students?
* Look horizontally by student: are there any anomalies occurring with certain students?

Core Driver 4: Results From Analysis Must Be Turned Around Immediately
If assessment results are not turned around in a timely manner, they can't be effective. Schools need to put systems into place to make
sure that insights learned from data analysis are put into practice quickly. Schools should try to design their calendars such that interim
assessments are analyzed within 48 hours of being scored. For example, at Greater Newark Academy, they set aside several half days
for analysis after giving each interim assessment.

Core Driver 5: Analysis Must Include Effective Analysis Meetings
A key component of effective data analysis is the analysis meeting. These are meetings between teachers and instructional leaders that
focus on the results of interim assessments. These are crucial meetings because unlike meeting with a teacher about an observation
from a specific day, these meetings cover months of student learning. Furthermore, they are essential to changing a school's culture from one focused on what is taught to one focused on what students have actually learned.

These meetings ideally should be conducted by the principal, but in large schools this responsibility may be shared with other
instructional leaders such as assistant principals, coaches, team leaders, and head teachers. Conducting both one-on-one and group
meetings can be effective. Group meetings allow teachers to share best practices while individual meetings let teachers focus on their
own unique needs. This chapter focuses on individual meetings.

Preparing for the Meeting
Schools often assume that simply sitting down with the data is enough to conduct an effective meeting. In fact, both leadership and teacher training are necessary to make the meeting a success. The second half of the book provides training suggestions for modeling effective and ineffective meetings. Preparation also contributes to the effectiveness of the meeting. Below are some suggestions to prepare:

Before Giving the Interim Assessment
! For each question, teachers predict student performance by choosing one of the following:
a. Confident they'll get it right   b. Not sure   c. No way they'll get it right
! Teachers receive professional development on how to do data analysis and how to complete an action plan, and they see a model of effective and ineffective analysis meetings (PD workshops are outlined in the second part of the book)

Immediately After Giving the Interim Assessment
! Teachers analyze results before the meeting, trying to understand why the students did not learn
! Teachers complete an action plan based on the results from the assessment
! Leader analyzes the assessment results personally to prepare for the meeting
! Leader reviews the teachers action plan


At the Meeting
It can be challenging to know how to begin an analysis meeting. Below are some tried-and-true ways to start:
"So what's the data telling you?"
"Congratulations on your improvement in _________; you must be very proud!"
"So the [paraphrase the teacher's main frustration; for example, geometry scores did not improve]. I'm sorry to hear that. So where should we begin our action plan?"

Then from this point on, to help the meeting run effectively, there are several principles to adhere to:

Let the data do the talking: Rather than tell teachers what to do, point to the data and ask them what it means.
Let the teacher do the talking: Teachers must own the assessment and analysis, and they will do so if they find answers on their own.
Go back to specific test questions: Have copies of the assessment at the meeting.
Know the data yourself: By knowing the data, school leaders can ensure meetings will be productive.
Make sure the analysis is connected to a concrete action plan: Insights are meaningless unless written down as part of a plan.

Below are some phrases leaders can use to ground analysis meetings in these principles:
"Let's look at question ____. Why did the students get it wrong?"
"What did the students need to be able to do to get that question right?"
"What's so interesting is that they did really well on question ___, but struggled on question ___ on the same standard. Why do you think that is?"
"So, what you're saying is [paraphrase and improve good responses]..."
"So, let's review your action plan and make sure we have incorporated all of these ideas."

It may take a while, but these analysis meetings can become part of the leader's repertoire of tools to help improve teaching and
learning. These meetings are a powerful way to propel the process from examining data to taking action.



The Four Building Blocks of Effective Data-Driven Instruction
1. Assessment 2. Analysis 3. Action 4. Culture


The 3rd Building Block: ACTION

After implementing assessments and conducting deep analysis, the next step is to take action to address student strengths and
weaknesses. Without using what was learned from the assessments in actual classrooms, this data-driven approach is worthless.
Therefore it is crucial to develop and implement an effective action plan. As with the other components of data-driven instruction,
there are five core drivers that make it effective:

ACTION: Five Core Drivers
1. Planning
2. Implementation
3. Ongoing assessment
4. Accountability
5. Engaged students

Core Driver 1: Action Must Involve a Plan
Action plans describe how teachers will apply what they've learned from assessment results in the classroom. For this to be
successful, it is imperative that the analysis itself is sound, that new strategies are used in re-teaching, and that there is a specified date
and time for implementation to make sure it happens. Below is a modified excerpt from an action plan designed by Amistad Academy
and Achievement First. See pp.73-74 for more details.

Action Plan: Results Analysis
RE-TEACH STANDARDS: What standards need to be re-taught to the whole class?
ANALYSIS: Why didn't the students learn it?
INSTRUCTIONAL PLAN: What techniques will you use to address these standards?
6-Week Instructional Plan
For each week (Week 1 Dates: ___ through Week 6 Dates: ___), the plan lists the Standards for Review & Re-teach and the New Standards to be taught.

Sample Action Plan
Which Standards for Review, by week (Date: 10/27-10/31; Date: 11/3-11/7; Date: 11/10-11/14) and by lesson component:
In Do Now: (e.g., 10/27 Multiplication; 10/30 Exponents; etc.)
In Mini Lesson: (etc.)
In Heart of Lesson: (etc.)
In Checking for Understanding:
In Assessment:
In Homework:

Core Drivers 2 and 3 -- Successful Implementation and Ongoing Assessment are Key to the Action Plan
The idea behind an action plan is to get teachers to change their actual classroom practices. If teachers store the action plan in a binder
and do their lesson planning at home, those action plans will gather dust. Instead, teachers should have their action plan in hand when
designing lessons. There are a number of strategies to ensure that the action plan will be implemented in the next cycle of lessons.
Below are some examples of these strategies:

* Re-write and tighten objectives: Teachers should use assessment results to focus the objectives of future lessons on the areas where students need improvement. The more specific the objective, the better.
* Do Nows: The quick 5- to 10-minute assignment that starts the class is a perfect time to review those standards outlined in the action plan that require more attention. Teachers can include questions students struggled with on the last assessment.
* Differentiation: When the action plan calls for targeting certain groups in the class with specific needs, differentiation can be a good
strategy to work with those groups while others work independently.
* Ongoing assessment: Constantly checking for understanding (for example, having all students write an answer on a white board and
hold it up to show if they understand) is an effective way for teachers to see if the action plan is achieving results.
* Peer-to-peer support: A student who has mastered a standard can help another student identified from the assessment results as needing help. For example, the helping student can use flash cards to help another student with sight words.
* Homework: Re-design homework to target those areas that students need to review according to the action plan.
* Outside of the classroom: Have students who struggled on the assessment come to a breakfast club for extra practice or provide
afterschool tutors with assessment results so they can help students with their specific weaknesses.
* Increase rigor throughout the lesson: To ensure students will learn when re-teaching standards, there is a list of over 80 strategies
from the highest achieving teachers at North Star Academy on pp. 81-84 to help teachers increase rigor.

The Results Meeting Protocol
When students have particular trouble with a standard on the interim assessment, teachers may need suggestions from other teachers
as to how they can teach this standard in a more effective way. To share ideas, it is useful to conduct a results meeting and use the
results meeting protocol to do so. This protocol takes about 55 minutes, keeps the meeting on task, and ends with action steps teachers
will take. To make sure these meetings are effective, it is helpful to focus on one standard at a time, model a results meeting for
teachers before implementing it, and make sure all suggestions are specific.

Excerpt of RESULTS MEETING PROTOCOL (Full protocol on p.92)
* Identify roles: timer, facilitator, recorder (2 minutes)
* Identify objective to focus on (recommended to focus on one standard) (3 minutes)
* What teaching strategies worked so far, or what did you try so far? (5 minutes)
* Chief challenges (5 minutes)
* Brainstorm proposed solutions (10 minutes)
* Reflect on feasibility of each idea (5 minutes)
* Put in calendar: when will tasks happen? When will re-teaching happen? (10 minutes)

Core Driver 4: Accountability is Necessary for Successful Action
School leaders play a vital role in ensuring that action is taking place. Here are a few ways to do this:
Observe with assessment results in mind: One principal, when she observes, brings the actual spreadsheet with her to the
classroom to see which students are struggling with which standards and to make sure they get to practice those standards.
One time she noticed the teacher primarily asked prediction questions in reading when the results showed the students had
already mastered that skill.
Review lesson and unit plans with the action plan in mind: The principal can look for whether class lessons, units, and
assessments match the rigor of the interim assessments.

Change the focus of teacher-principal meetings: Rather than pre- and post-observation meetings, conduct pre- and post-
assessment meetings referring to the action plan for guidance.
Keep track of observations and plans: Consider using some kind of chart to keep track of which teacher is focusing on which
goals from their action plans.

Core Driver 5: Actions Must Engage Students
In a truly effective data-driven school the students will be engaged in improving their own learning. This can only happen when
students know the goal, how they are doing, and what they can do to improve. There are different ways to have students chart their
own performance (so they know how they are doing) and reflect on it (so they can understand what they need to improve). One way is
with the reflection template used at Williamsburg Collegiate School and which is excerpted here (full version is on pp. 97-98):

STUDENT REFLECTION TEMPLATE

For each question, note the standard/skill tested, whether you got the question right or wrong, and, if wrong, why you got it wrong. Be honest.

Question  | What skill was tested?               | Right / Wrong | Careless mistake / Didn't know how to solve
1         | Algebra substitution: add            |               |
2, etc.   | Algebra substitution: add 3 numbers  |               |

Using your test reflections, please fill out the following table:

Type of Error    | Careless Errors | Did Not Know How to Solve
Number of Errors |                 |

If you have more careless errors than "don't knows," you are a RUSHING ROGER.
In class you: are one of the first to finish; want to say your answer before writing; often don't show work; are frustrated when you get assessments back.
During class you should: SLOW DOWN!; ask the teacher to check your work, or check with a partner; push yourself for perfection, don't just tell yourself "I get it."
During assessments you should: SLOW DOWN, since you know you tend to rush; really double-check your work, since you know you make careless errors; use inverse operations when you have time.

If you have more "don't knows" than careless errors, you are a BACK-SEAT BETTY.
In class you: are not always sure that you understand how to do independent work; are sometimes surprised by your quiz scores.
During class you should: ask questions about homework if you're not sure it's perfect; do all of the problems with the class at the start of class; use every chance to check in with teachers and classmates.
During assessments you should: do the problems you're SURE about first; take your time on the others and use everything you know; ask questions right after the assessment while they're fresh in your mind.

1. If you are a Rushing Roger and you make careless errors, what should you do in your classwork and homework?
2. If you are a Back-Seat Betty, what should you do when you get a low score on a quiz?
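The template's tally-then-profile logic can be expressed compactly. Here is a hedged sketch (the labels mirror the template, but the function itself is illustrative, not from the book):

```python
# Illustrative sketch: map a student's error tallies from the reflection
# template to the profile labels the template uses.
def profile(careless, dont_know):
    if careless > dont_know:
        return "Rushing Roger: slow down and double-check your work"
    if dont_know > careless:
        return "Back-Seat Betty: ask questions and check in with teachers"
    return "Mixed profile: work on both accuracy and understanding"

print(profile(careless=5, dont_know=1))   # -> the Rushing Roger advice
```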



The Four Building Blocks of Effective Data-Driven Instruction
1. Assessment 2. Analysis 3. Action 4. Culture

The 4th Building Block: CULTURE

In Bambrick-Santoyo's school there was a great deal of resistance from a veteran teacher when they wanted to implement data-driven instruction. She was well respected by peers and had a great deal of influence over them. Even though she was invited to join the leadership team to plan the initiative, she was not prepared for her students' poor results on the first interim assessment. Over the next two years her students made dramatic gains in achievement, and finally she was willing to buy in to data-driven instruction.

Faculty buy-in for data-driven instruction is not a prerequisite to start implementing it. Building a data-driven culture takes time and
this process usually goes through several phases before everyone sees the benefits of this approach. Below is an overview of what
those phases might look like:

Phase 1: Confusion and overload -- "This is too much!"
Phase 2: Feeling inadequate and distrustful -- "How can two questions on a test possibly establish mastery of an objective?"
Phase 3: Challenging the test -- "That is a poor question. Answer b is a trick answer."
Phase 4: Examining the results objectively and looking for causes -- "Which students need extra help and what topics need re-teaching?"
Phase 5: Accepting the data as useful information, seeking solutions, and modifying instruction

So how does a school leader go about building a data-driven culture? By putting into place the three other components of data-driven
instruction: assessment, analysis, and action. Actually implementing data-driven instruction improves student achievement and this is
what helps to create teacher buy-in. When it is implemented effectively, data-driven instruction does not require teacher buy-in; it creates it. Below are additional structures to help ensure buy-in for a data-driven culture.


CULTURE: Five Core Drivers
1. Highly active leadership team
2. Implementation calendar
3. Build by borrowing
4. Introductory professional development
5. Ongoing professional development

Core Driver 1: Data-Driven Schools Must Have the Right Leadership Team
If the right people are identified for the leadership team, they can serve as "bridges" to help win over the rest of the faculty. Most
leadership teams already have members with expertise (as a leader or an instructor). However, it is also important to include teachers
who are trusted by the faculty: those teachers to whom others turn for personal support. Once these staff members are chosen, they
should be involved in every aspect of implementing data-driven instruction.

Core Driver 2: Data-Driven Schools Need an Implementation Calendar
Whatever makes it onto the school calendar takes precedence over other activities that come later. To ensure that data-driven
instruction lies at the heart of a schools culture, it must be placed on the calendar first. Time for assessment, analysis, and action
should be prioritized on the school calendar. Without regular time set aside for these activities, they are likely to be overshadowed by
other pressing commitments. Below are a few tips to help with this:
Make time for all parts of the process: Schools often block out time to take the assessments but no time to score and analyze them. In addition, schools need to block out time for re-teaching, such as the week following the assessment (this does not mean the whole week is spent in review; teachers should integrate old and new material).
Take state/national tests into account: Find out when state/national tests will occur, and then plan interim assessments every six to eight weeks leading up to them (a small scheduling sketch follows this list).
Plan time for professional development: Carve out time before and after each round of interim assessments to provide content-focused PD in areas identified by the assessment.
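The scheduling sketch mentioned above is straightforward (illustrative only; the state test date is hypothetical): it simply counts backward from the end-goal exam in six-week steps.

```python
# Illustrative sketch: place interim assessments every six weeks leading up
# to a (hypothetical) state test date, per the calendar tip above.
from datetime import date, timedelta

def interim_dates(state_test, rounds=4, weeks_between=6):
    return sorted(state_test - timedelta(weeks=weeks_between * i)
                  for i in range(1, rounds + 1))

for d in interim_dates(date(2011, 5, 2)):
    # Each date should also have scoring, analysis, and re-teach time
    # blocked out in the days that follow it.
    print(d)
```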

Core Driver 3: Data-Driven Schools Build by Borrowing
A key component of a data-driven culture is the habit of identifying and adapting best practices from other successful data-driven schools. This is referred to as "building by borrowing," and all of the high-achieving schools mentioned in the book have visited schools that were more successful than theirs and borrowed any idea that would help to improve their own results. Leaders must build
the type of culture in which seeking out best practices is accepted and pursued regularly. Furthermore, seeing successful data-driven
instruction in action will show teachers how it looks and help provide hope that if implemented well, it can have tremendous impact.

Core Drivers 4 and 5: Effective Professional Development is Necessary to Prepare Teachers and Leaders
Providing effective training for both teachers and leaders is probably the most important element of building a data-driven culture, and the entire second part of the book is devoted to mapping out specific professional development plans for this purpose.


Part II: Leading Professional Development on Data-Driven Instruction

The second half of the book outlines workshops to train teachers and leaders to put into practice the four core principles of data-driven
instruction: assessment, analysis, action, and developing a data-driven culture. The first chapter describes the key ingredients
necessary for effective professional development, and the subsequent chapters outline the specific activities to conduct each
professional development workshop.

Foundations of Effective Adult Professional Development
Traditionally, professional development often involves "teaching by talking," poorly structured large-group sharing, and other approaches that do not lead to real adult learning. For professional development to be effective, adults need to generate the content
they are learning to be truly invested in it and to retain it longer. Below are the key components of what effective professional
development should include:
1. Activity: Design airtight activities that provide a learning experience (such as case studies, video clips, and role-plays) so participants can come to the right conclusions.
2. Reflection: Facilitate reflection (individual, small group, and large group) that allows participants to draw conclusions from the activity.
3. Framing: Use the vocabulary of the new principles to frame the participants' conclusions so they can share one common language.
4. Applying: Provide opportunities for participants to apply the learning in simulated and real-world experiences.

Overall, manage time well and inspire by sharing a vision of success so participants can see that it can be done!


Workshop Agendas, Activities, and Materials
The book provides fully fleshed-out workshops on assessment, analysis, action, and developing a data-driven culture and the
accompanying CD-ROM contains the materials needed to carry out these workshops. Because of the challenge of summarizing
workshops, below is just an overview. For implementation purposes, see pages 175-233 for the fully fleshed-out workshops.

Workshop Overview: SETTING THE RATIONALE FOR DATA-DRIVEN INSTRUCTION
Activity 1: Setting the Rationale (15 minutes)
Objectives: Participants identify the core challenges facing urban education.
Participants identify schools (from a graph) that have succeeded despite the odds.
Participants agree on a common goal for the workshop: to drive student achievement upward.
Activity 2: Highlighting Pitfalls (1 hour 20 minutes)
Objectives: Presenter pre-assesses participants' prior knowledge of data-driven instruction.
Participants analyze a case study on failed implementation of data-driven instruction and identify false drivers of student achievement.
Participants understand that even with the best intentions, data-driven instruction can still fall short.
Workshop Overview: ASSESSMENT
Activity 3: Principles of Assessment (50 minutes)
Objectives: Participants analyze actual assessment questions to understand how standards are meaningless until you define how to assess them.
Participants identify the key principles of assessment.
Workshop Overview: ANALYSIS
Activity 4: Introduction to Analysis (15 minutes)
Objectives: Presenter hooks the audience with the core principles of data-driven instruction through a film clip.
Participants identify the key foundations that make data-driven instruction powerful.
Activity 5: Teacher-Principal Role-Play of Assessment Analysis (1 hour 45 minutes)
Objectives: Participants implement the key principles of deep analysis.
Participants read and correctly interpret an assessment data report.
Participants identify effective and ineffective assessment analysis.
Participants compare the differences between traditional post-observation teacher conferences and interim assessment analysis meetings.
Participants identify the ways in which interim assessments drive change.
Activity 6: Leading Effective Analysis Meetings (55 minutes)
Objectives: Presenter introduces the teacher-principal data meeting as a critical system behind data-driven instruction.
Participants understand what is needed for teacher-principal analysis meetings to be effective.
Participants lead effective teacher-principal analysis meetings.
Participants overcome obstacles likely to arise during teacher-principal data conferences.
Workshop Overview: ACTION
Activity 7: Introduction to Action (45 minutes)
Objectives: After watching a movie clip, participants generate conclusions as to how effective analysis goes beyond "what is happening" to "why it has happened."
Participants make the connection between deep analysis and effective action.
Activity 8: Results Meeting Protocol
Objectives: Participants learn the results meeting protocol for effective team meetings.
Participants implement the results meeting protocol.
Participants develop an explicit action plan to address a challenge facing their school during implementation of data-driven instruction.
Workshop Overview: CULTURE
Activity 9: Building a Data-Driven Culture (1 hour 20 minutes)
Objectives: Participants analyze a second case study on data-driven instruction.
Participants distinguish the core drivers of data-driven success from false drivers.
Participants understand that complete faculty buy-in is not needed for data-driven instruction to be effective.
Participants identify the key principles of data-driven culture.
Activity 10: Start-Up Scenarios (25 minutes)
Objectives: Participants overcome obstacles likely to arise in start-up stages of implementing data-driven instruction.



INTRODUCTION
Kathryn Parker Boudett, Elizabeth A. City, and Richard J. Murnane
The package containing data from last spring's mandatory state exam landed with a thud on principal Roger Bolton's desk. The local newspaper had already published an article listing Franklin High (for the second year) as a school "in need of improvement" for failing to increase the percentage of tenth graders scoring well enough on the English language arts and mathematics exams to receive high school diplomas. Now this package from the state offered the gory details. Roger had five years of packages like this one, sharing shelf space with binders and boxes filled with results from the other assessments required by the district and state. The sheer mass of paper was overwhelming.
Frustrated as a teacher by how little Franklin expected of its students academically, Roger had vowed that when he became principal he would make it his mission to "get the learning up." But now, this heavy package reminded him that he would be judged primarily by whether he could get the test scores up. He wanted to believe that there was something his faculty could learn from all these numbers that would help them increase student learning and get the scores up. But he didn't know where to start.
Many school leaders across the nation share Roger's frustration: a lack of knowledge about how to transform mountains of data on student achievement into an action plan that will improve instruction and increase student learning. Others have made some progress in responding to this challenge, but have become stymied along the way. Some have learned to identify patterns in student assessment results, but have not figured out what to do next. Some have not been able to convince their colleagues of the value of this work. Some have developed action plans, but have not been able to implement them. Some have implemented plans for improving instruction, but do not know how to evaluate their effectiveness. The goal of this book is to help educators in all of these positions to learn how to analyze data in a manner that contributes to improved instruction and increased student learning.
When we use the term data, we mean not only scores on high-stakes tests, but also the broad array of other information on student skills and knowledge typically available in schools. For example, a growing number of districts administer benchmark assessments to gauge students' readiness for high-stakes exams. Some districts also administer end-of-course exams. Some schools assess student achievement with science fairs or exhibitions at which student projects are graded using agreed-upon rubrics. Then, of course, there are the classroom tests, projects, and homework that individual teachers assign to students as they work their way through the curriculum. These are just some of the kinds of data that educators can fruitfully examine in targeting areas for instructional improvement.
When we use the term school leaders, we mean not only principals, but also the teacher leaders, directors of instruction, department heads, and coaches who are committed to engaging their colleagues in improving instruction at their school. A central premise underlying this book is that a good school is not a collection of good teachers working independently, but a team of skilled educators working together to implement a coherent instructional plan, to identify the learning needs of every student, and to meet those needs. We believe that the process of learning from data contributes to building an effective school and to helping the school continue to improve its performance.
A NEW CHALLENGE
The long-term evidence from the National Assessment of Educational Progress (NAEP) shows that average reading and math scores of today's 9-, 13-, and 17-year-olds are a little higher than they were in the 1970s. This is consistent with the view of most educators that they are working as hard as they can and are accomplishing at least as much as their colleagues did 30 years ago. So why the enormous external pressure to improve schools, as embodied in state accountability systems and the adequate yearly progress (AYP) requirements of the federal No Child Left Behind (NCLB) legislation?

To a large extent, the answer lies in changes in the economy that have dramatically reduced earnings opportunities for Americans who leave school without strong reading, writing, and math skills and the knowledge of how to use these skills to acquire new knowledge and solve new problems. These striking long-term changes in the American economy provide much of the motivation for the standards movement and for the pressure American schools face to improve student learning.
A complementary source of pressure is the persistent and sizable gap between the
average academic skills of white students and those of students of color. Unless this
gap is closed, workers of color will increasingly be denied access to the growing
number of jobs that require problem-solving and communication skills and that pay
enough to support a family. This achievement gap helps to explain why pressure to
improve education is particularly great in urban schools that serve high percentages
of students of color.
Although the economic changes that provided the impetus for the standards movement were not created by the nation's educators, educators are under great pressure to respond to them by dramatically improving the quality of instruction children receive in school. We believe that the ideas in this book will help educators improve instruction and increase student learning. Moreover, we see this as a worthy goal not only because it will help the next generation of Americans earn enough to support their children, but also because it will give them the skills to contribute to civic life in a democracy beset by a host of problems.
What effective schools look like is not a mystery. They have a coherent instructional program well-aligned with strong standards. They have a community of adults committed to working together to develop the skills and knowledge of all children. They have figured out how to find the time to do this work and are acquiring the skills to do it well. This book is written for those educators who are committed to this work. We maintain that analyzing a variety of student assessment results can contribute to fulfilling their goals, if careful attention is paid to the limitations of tests and the technical challenges in interpreting student responses.
When students receive consistent high-quality instruction, scores on high-stakes tests rise. However, the converse need not be true. Faced with pressure to improve test scores, some educators analyze student assessment results to identify students who need just a few more points to pass a graduation exam, with the intent of improving these students' test-taking skills. Preparing students to pass the exams required for high school graduation is clearly important. However, it is more important that the time be spent helping students develop the skills they will need after graduation.

Some educators examine tests to identify frequently used questions and item formats so they can devote instructional time to helping students do well on particular tests. Familiarizing students with the format of high-stakes tests makes sense. So does explaining strategies to improve scores, such as answering every open-ended response question. However, the line between ensuring that students are test savvy and focusing scarce instructional time on preparing for a particular high-stakes test is a thin one. While "drill and kill" may lead to improved scores, it will not prepare students to thrive in our increasingly complex society.
STRUCTURING IMPROVEMENT: A ROAD MAP
For school leaders like principal Roger Bolton, the barriers to constructive, regular use of student assessment data to improve instruction can seem insurmountable. There is just so much data. Where do you start? How do you make time for the work? How do you build your faculty's skill in interpreting data sensibly? How do you build a culture that focuses on improvement, not blame? How do you maintain momentum in the face of all the other demands at your school? This book addresses all of these questions, providing strategies and tools for identifying possible explanations for strong and weak student performance, for examining the importance of alternative explanations, and for planning and executing instructional strategies to improve teaching and learning.
We have found that organizing the work of instructional improvement around a
process that has specific, manageable steps helps educators build confidence and skill in
using data. This process includes eight distinct activities school leaders can do to use
their student assessment data effectively. Each activity is the focus of one chapter. We see
the eight activities as falling into three categories: Prepare, Inquire, and Act.
We use the Data Wise Improvement Process graphic (summarized below) to illustrate the cyclical nature of the work. Initially, schools engage in a set of activities (i.e., prepare) to establish a foundation for learning from student assessment results. They then inquire, and subsequently act on what they learned. They then cycle back to further inquiry.
Prepare is about putting in place the structure for data analysis and looking at existing data from standardized tests. Chapter 1 describes tasks involved in organizing for collaborative work, including setting up a data team and taking stock of existing data. Chapter 2 explains key elements of assessment literacy that are critical to interpreting test results correctly.

Inquire is about acquiring the knowledge necessary to decide how to increase student learning. Chapter 3 describes the tasks involved in creating a data overview, especially how to construct graphic displays that will allow school faculty to readily identify patterns in the results of standardized assessments. Chapter 4 explains how to dig into student work, first in a single data source and then in other data sources, with the goal of identifying and understanding a student learning problem. Chapter 5 shows how to examine instruction in order to understand what current practice looks like and how it relates to effective practice for the student learning problem.
Act is about what to do to improve instruction and to assess whether the changes put
in place have made a difference. Chapter 6 describes the tasks involved in designing an
effective action plan. Chapter 7 addresses planning a process to assess whether students
are learning more. A key message is that the assessment strategy and the action plan
should be developed at the same time. Chapter 8 describes the key tasks involved in
making an action plan come alive in classrooms, and in assessing implementation and
effectiveness along the way. Chapter 9 describes steps school district central offices can
take to support school-based educators' efforts to make constructive use of student
assessment results. It is designed to be a resource for school superintendents and other
district leaders committed to helping schools become data wise.
[Figure: The Data Wise Improvement Process. A circular diagram of eight steps in three phases. Prepare: (1) Organize for Collaborative Work; (2) Build Assessment Literacy. Inquire: (3) Create Data Overview; (4) Dig into Student Data; (5) Examine Instruction. Act: (6) Develop Action Plan; (7) Plan to Assess Progress; (8) Act and Assess.]
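For readers who find a structural summary helpful, the eight steps and their phases can be written down as a small data structure. The Python sketch below is purely illustrative; in particular, the choice to return to step 3 after step 8 is one reading of the statement that schools cycle back to further inquiry, not something the book specifies in code.

    from enum import Enum

    class Phase(Enum):
        PREPARE = "Prepare"
        INQUIRE = "Inquire"
        ACT = "Act"

    # The eight Data Wise steps, in order, each tagged with its phase.
    STEPS = [
        (1, "Organize for Collaborative Work", Phase.PREPARE),
        (2, "Build Assessment Literacy", Phase.PREPARE),
        (3, "Create Data Overview", Phase.INQUIRE),
        (4, "Dig into Student Data", Phase.INQUIRE),
        (5, "Examine Instruction", Phase.INQUIRE),
        (6, "Develop Action Plan", Phase.ACT),
        (7, "Plan to Assess Progress", Phase.ACT),
        (8, "Act and Assess", Phase.ACT),
    ]

    def next_step(current: int) -> int:
        # After step 8 the work cycles back to inquiry rather than restarting
        # preparation (an assumption based on the description above: the
        # foundation persists, while inquiry and action repeat).
        return 3 if current == 8 else current + 1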
WHY START WITH HIGH-STAKES TESTS?
Although this book shows schools examining many types of evidence on student
achievement, chapters 2 and 3 focus on lessons for examining student performance on
externally imposed tests, such as state-mandated standardized tests or district-required
tests of basic competencies. One reason to start here is that under NCLB and state and
district accountability systems, schools are responsible for improving students' scores on
these assessments. By looking carefully at what students are doing well and not so well
on these tests (keeping in mind that there are often many possible explanations for
poor performance on any given question), educators can begin to see connections
between what they are doing in the classroom and how students are performing on
external assessments.
Another reason for starting with results of externally imposed exams is that all faculty
members recognize them as important to their school, whether they like them or not. In
places like Franklin High, school leaders are often searching for ways to get teachers to really communicate with colleagues from other departments and grade levels, and a set of results that everyone already acknowledges as important gives those conversations a natural starting point.
A final reason for beginning with results on externally imposed tests is that by their
very nature, these exams offer a measure of student achievement that is independent of
the judgments of the teachers within the building. Although we do not dispute the argument that teachers are in the best position to understand their students' performance, having an external checkpoint against which to measure students' skills can catalyze
fruitful discussions about standards.
Of course, how much educators can learn from the results of externally imposed
standardized tests depends on the quality of the tests and on what information about
results is made available. More can be learned from results on tests that are tightly
aligned with state learning standards than from off-the-shelf tests used across the
country. More can be learned when educators can see individual questions and
responses or subscores indicating the degree of mastery of particular skills than when
they only have an aggregate score for each student.
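The value of item-level detail is easy to show in code. The Python sketch below takes hypothetical item-level results, with each test item tagged by the skill it assesses, and aggregates them into per-skill subscores. Names, skills, and responses are all invented for illustration.

    import pandas as pd

    # Hypothetical item-level results: one row per student per test item,
    # with each item tagged by the skill it assesses.
    items = pd.DataFrame({
        "student": ["Ana", "Ana", "Ana", "Ana", "Ben", "Ben", "Ben", "Ben"],
        "skill":   ["fractions", "fractions", "geometry", "geometry",
                    "fractions", "fractions", "geometry", "geometry"],
        "correct": [1, 1, 0, 1, 1, 0, 0, 0],
    })

    # Percent correct by skill across the group: the subscore view.
    subscores = items.groupby("skill")["correct"].mean().mul(100).round(1)
    print(subscores)
    # skill
    # fractions    75.0
    # geometry     25.0
    # Name: correct, dtype: float64

Seen this way, the group is strong on fractions and weak on geometry, a pattern that the overall aggregate (50 percent of items correct) would never reveal.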
A central premise of this book, therefore, is that it is important to examine a wide
range of data, not just results on standardized tests. Indeed, we will show that an analysis
of standardized test results raises more questions than it answers. Examining other types
of evidence on students' skills and knowledge is needed to answer these questions.
HOW TO USE THIS BOOK
Every chapter focuses on particular tasks school leaders face, tools to accomplish these
tasks, and lessons from schools that have done this work. The book ends with references
that readers can consult for more specialized knowledge on particular topics, and a few
protocols to use to structure conversations.
To bring alive the descriptions of these tasks, we have woven vignettes from two case
study schools throughout the book: Franklin High School, with students in grades 9-12,
and Clark K-8 School, with students in kindergarten through grade 8. Both of our case
study schools are working to improve student learning, not simply to improve test
scores. Clark faces the challenge of how to build a sense of urgency for continuous
improvement, rather than to accept as satisfactory the moderately strong performance
most of its students show on standardized tests. Franklin High School faces a different,
very difficult challenge: how to respond constructively to the enormous pressure to
reduce dropout rates and failure rates on the state graduation exam. Each chapter
describes the choices and challenges these schools face at each step of their respective
journeys and illustrates the messiness of applying the improvement process in practice. When we need to provide a broader range of responses than these two cases can
offer, we supplement our examples with brief descriptions of approaches taken by other
schools we have worked with.
For leaders relatively new to the process of using data, we recommend skimming the
whole book first and then working through the chapters sequentially with a group of
committed faculty. In a sense, each chapter can be read as a to-do list of the tasks that
will help move the work forward. By following the progress of the two case study schools
as they work their way around the improvement cycle, your group will see how other
schools handle these tasks. By using the protocols, exercises, and templates offered in
the chapters, you should find it relatively straightforward to plan effective faculty meetings on each topic.
For school leaders with considerable experience in using data, it may not be necessary to follow the chapter sequence. Each chapter is designed to stand alone, allowing
practitioners to focus on learning strategies that deal with the parts of the process that
they find most challenging. Alternatively, school leaders can pick up this book at the
point in the cycle where they find themselves, knowing they eventually will work their
way around the entire circle.
District-level or independent professional developers and graduate school faculty
may find this book useful in planning a year-long course that addresses one chapter per
month. In our experience, schools learn a lot by working through the material in a particular chapter on their own and then coming together with people from other schools
to share their work, discuss their concerns, and receive technical and moral support from
instructors. School leaders are often energized by opportunities to show their schools' work to colleagues from other schools and relish the chance to borrow good ideas.
For central office personnel and others who want to learn more about how to
support school-level improvement, we recommend reading through the first eight chapters to develop an understanding of the challenges school-based educators face in
attempting to learn from student assessment results. Then focus on chapter 9, which
recommends actions district central offices can take to support school faculties' efforts
to make constructive use of student assessment results.
Database designers can use the book to help think through the processes that their
software needs to support. Test developers can use it as a window into what school-level
people need from assessments (especially formative ones) and what they can do with
results once they get them. Finally, policymakers at all levels can use this book to help
understand how hard the work of using assessment data to improve schools is, how long
it takes, and how worthwhile it can be.