
Measuring Quality: Choosing Among Surveys and Other Assessments of College Quality

Victor M. H. Borden
with Jody L. Zak Owens

Victor M. H. Borden is associate vice chancellor for information management and institutional research, and associate professor at Indiana University Purdue University Indianapolis.

Jody L. Zak Owens is a research assistant and graduate student in higher education and student affairs at Indiana University.

American Council on Education
Center for Policy Analysis

Association for Institutional Research
for Management Research, Policy Analysis, and Planning

Copyright © 2001

American Council on Education
One Dupont Circle NW
Washington, DC 20036-1193

Association for Institutional Research
for Management Research, Policy Analysis, and Planning
114 Stone Building
Florida State University
Tallahassee, FL 32306-4462

All rights reserved. No part of this book may be reproduced or transmitted in any form or by any means electronic or mechanical, including photocopying, recording, or by any information storage and retrieval system, without permission in writing from the publisher.

Single copies of this publication may be purchased from the American Council on Education for $15.00 each plus shipping and handling. Multiple copies are available for $10.00 each. Orders may be paid by VISA®, check, or money order (made payable to the American Council on Education) and sent to:

ACE Fulfillment Service
Department 191
Washington, DC 20055-0191
(301) 604-9073
Secure fax line: (301) 604-0158
PMDS Item Number: 309113
Table of Contents

Introduction
General Issues
National Assessments of Institutional Quality
Using Assessment Results Effectively
Conclusion
Table 1. Instrument, Administrator, Purpose, Use of Data, History, and Information Collected
Table 2. Target Institutions and Samples, Participation, Format, Administration Procedure, and Timeline
Table 3. Reporting, Data Availability, Local Items, Costs, and Contact Information


Introduction

Dear College President:

As the public and political leaders have come to perceive higher education as both more important and more expensive than ever, demand has grown for accountability data and consumer information on the relative quality of individual colleges. The Survey of College and University Quality (SCUQ) was developed by leaders in the field of higher education assessment to help postsecondary institutions meet the demands of governing boards, accrediting agencies, and other stakeholders. Participating institutions have found this to be a rich source of data for marketing and recruitment as well. Perhaps most importantly, college and university faculty have embraced the results of the SCUQ as the most credible and useful evidence of student learning in college.

We invite you to join the hundreds of leading colleges and universities that participate in this survey . . .

Does this fictional solicitation sound familiar? In recent years, a proliferation of national assessments of institutional quality has emerged in response to increasing demands for accountability and consumer information. Many of these assessments rely on survey responses from current and former students. They operate on a cooperative system in which campuses pay to participate in the survey and, in return, receive one or more reports of the results and, sometimes, the raw data for further local analysis. Other assessments use data already collected from students when they take college entrance examinations. Standardized tests of college-level critical thinking and subject area achievement also are available, allowing faculty and administrators to compare the performance of students progressing through their programs with that of students at other colleges and universities. College presidents and provosts often decide to participate in these efforts, but they may do so with little information on how best to evaluate these assessments and how to determine which assessment will provide the most useful information for their campus.

The purpose of this guide is to articulate a set of questions and issues that campus leaders can review when deciding whether to participate in a given survey or use a specific assessment instrument. The guide also describes some of the major national surveys and assessments. Although the guide does not rate or recommend these services, it suggests the criteria that campus leaders should employ to determine the use and usefulness of any such instrument or service, based on specific campus needs, capabilities, and goals.

This guide is divided into three major sections. The first section poses some general questions that are important to consider before deciding whether to participate (or continue to participate) in a national assessment. The second section provides common descriptive information for some of the national assessments that were popular when the guide was written. The third section reviews more specific questions and issues regarding the choice of a specific instrument or service and how to optimize participation.

The appendix provides a tabular comparison of the major instruments and services reviewed in the guide. New products and services likely will become available and existing ones transformed or even discontinued after publication of this guide. The Association for Institutional Research will maintain an updated version of the appendix tables on its web site at http://www.airweb.org.

The next section of this guide poses some general questions to consider before engaging in any of these assessment efforts.

General Issues

Do these assessments live up to their promises?

As with all assessment efforts, the value of a national assessment survey or service depends on whether faculty, administrators, and staff members can use the results to support their ongoing processes and activities. Even if they see the questions and results as interesting and informative, they may not find the information useful. This guide's descriptive information regarding specific instruments and services may help an institution determine whether a particular instrument is relevant to its needs. However, the guide cannot answer questions regarding an institution's capability for most effectively using any particular type of assessment information.

The assessments described in this guide can be considered information tools for both accountability and improvement. Their usefulness depends on three general criteria:

■ The appropriateness of the tool for the specific job at hand.

■ The skills and experiences of users.

■ The availability of sufficient financial, personal, and material resources.

How do we determine which survey is best suited to our purposes?

If you can ask this question effectively, you are halfway to choosing an appropriate instrument. The key to determining which instruments and assessments will work best for your institution is articulating a shared purpose among those most likely to use the results. For example, finding out about entering students' expectations and attitudes can help only if that information can be used by academic and student support service managers to develop, refine, and evaluate support programs; by marketers and recruiters to improve strategies for attracting students; or by external affairs staff to develop print and electronic publications.

Who at my institution needs to be involved?

In addition to involving the faculty, staff, and administrators most likely to use the results, you should consider involving faculty and staff with expertise in institutional research, higher education assessment, public opinion polling, or related areas. If you have an institutional research or assessment office, it is especially important that you confer with those staff prior to committing to a national assessment. The proliferation of surveys and assessment instruments at all levels—in the classroom, by departments and programs, by campus offices, and so forth—has resulted in exceptionally high demand for student time and attention. This demand is beginning to compromise students' responsiveness as well as the quality of those responses. Staff in a centralized institutional research or assessment office often can best manage the overall load of assessment activity and help determine the technical merits of a specific instrument or service.

What is generally involved in participating?

Participation requirements vary among the national assessments considered in this guide, but there is always some significant institutional commitment beyond the cost of participation. For survey instruments, campus staff must generate at least a sample of students to be queried or tested. In many cases, campus staff administer the instrument, either in a group setting (for example, at freshman orientation) or through classroom, electronic, or mail distribution. Typically, the supplier processes completed instruments and prepares a report for the institution. The supplier also may provide customized reports for an additional cost. Often the most useful information comes from the subsequent analyses that campus faculty and staff perform for specific decision applications. The direct cost of participation is less than half of the total resource commitment required. However, the cost of participating in one of these assessments is likely to be small compared to the cost of improvements in programs, services, or accountability that may be facilitated by assessment results. As you consider participation, it is advisable to think about the resources that you are willing to commit to follow-up activities based on the results of the assessment.

The next section introduces some of the most popular national assessment surveys and services currently available for institutional use. It also provides an overview of the detailed information presented in the appendix tables.

National Assessments of Institutional Quality

The three tables at the end of this guide summarize the characteristics of 27 national
assessment instruments and services. The first 21 instruments and services
assess the attitudes, experiences, and learning goals and gains of entering students
(6), various groups of enrolled undergraduates (8), student proficiencies and
learning outcomes (5), and alumni (2). Two services offer a series of instruments for
students at varying points in their academic careers. The final four instruments and
services assess institutional and program effectiveness through the views of various
constituents, including faculty, administrators, students, and board members.

Profiles of entering students


UCLA’s Higher Education Research Institute (HERI) has been conducting the
Cooperative Institutional Research Program (CIRP) Freshman Survey for more than
30 years. The resulting annual national report receives considerable press attention
as the primary indicator of trends in new college student attitudes, expectations,
and experiences. By some measures, this is the most widely used freshman survey,
with more than 1,700 institutional participants since 1966. Any type of college or
university can use the CIRP Freshman Survey, but HERI also offers the Entering
Student Survey (ESS), tailored to the needs of two-year public and private colleges.
Many college and university presidents do not realize that most students complete
one of two entering student surveys before they begin college. When students take
the SAT or ACT college entrance exams, they complete an extensive information
form that includes questions about their expectations, attitudes, and past academic
behaviors. Moreover, the results of these surveys are available to colleges and

American Council on Education/Association for Institutional Research 5


universities at little or no cost. Schools Experiences of enrolled undergraduates
that use ACT COMPASS placement test HERI’s College Student Survey (CSS)
services also can inexpensively incorpo- and the CSEQ (administered by the
rate the student profile questionnaire Indiana University Center for
into their data collection processes. Postsecondary Research and Planning
[CPRP]) are two relatively long-standing
Unlike the CIRP survey, the ACT and
assessments of the undergraduate stu-
SAT profile questionnaires are adminis-
dent’s college experiences. As a follow-
tered at varying points in time, accord-
up to the freshman survey, the CSS
ing to when students take their entrance
focuses on students’ level of satisfaction
exams (which can range from sopho-
with various aspects of their college
more to senior year in high school). The
experiences. Although the CSEQ also
data from these profiles can be useful,
includes a satisfaction index, this survey
but they do not replace a true survey of
focuses more on students’ views of their
entering students that is administered
learning experiences. The CSEQ is
immediately prior to, or at the beginning
guided by the principle that students
of, a student’s college career.
learn best when actively engaged in col-
The College Board offers two versions lege activities and experiences.
of an entering student survey—the
Both the CSS and CSEQ focus on the
Admitted Student Questionnaire (ASQ)
four-year baccalaureate experience.
and the ASQ Plus—that focus on students’
However, a Community College Student
experiences with the college admissions
Experiences Questionnaire (CCSEQ)
process. The ASQ provides feedback on
also is available through the University
the marketing and recruitment functions
of Memphis Center for the Study of
and processes for college admissions and
Higher Education. The CCSEQ follows
so most directly serves enrollment man-
CSEQ principles but targets the nontra-
agement operations.
ditional, commuter students who typi-
It is often useful to track changes in cally attend community colleges. The
students’ attitudes, expectations, and American Association of Community
experiences as they progress through Colleges (AACC) and American College
college. HERI’s College Student Survey, Testing (ACT) have teamed up to pro-
described in the next section, provides duce Faces of the Future, a survey that
this possibility as a follow-up to the CIRP captures the background characteristics
freshman survey. The last survey consid- and academic interests of community
ered in this section, the College Student college students taking either credit-
Expectations Questionnaire (CSXQ), bearing or non-credit classes.
allows for the same type of tracking.
Recently, faculty at Indiana
The CSXQ is a prequel to the longer-
University’s CPRP and UCLA’s HERI
standing College Student Experiences
have contributed to the development of
Questionnaire (CSEQ) reviewed in the
new undergraduate student assessments
next section. The CSXQ gathers baseline
that are closely tied to national efforts at
information from students regarding
transforming undergraduate education.
their expectations for their forthcoming
The IU Center now administers the
educational experiences, as assessed
National Survey of Student Engagement
subsequently in the CSEQ.

Recently, faculty at Indiana University's CPRP and UCLA's HERI have contributed to the development of new undergraduate student assessments that are closely tied to national efforts at transforming undergraduate education. The IU Center now administers the National Survey of Student Engagement (NSSE), which was developed by a panel of leading assessment scholars as a model for quality in undergraduate education. The NSSE uses principles similar to those that guided the development of the CSEQ; however, the CSEQ is a longer instrument that covers specific outcomes and experiences in more detail. The NSSE, although relatively new, has a larger base of institutional participants than the CSEQ.

HERI's newest assessment instrument was developed in collaboration with the Policy Center on the First Year of College at Brevard College. Your First College Year (YFCY) is both a follow-up to the CIRP freshman survey and an assessment of students' experiences with first-year programs such as learning communities, residential interest groups, and introductory courses. Similar to the NSSE, the YFCY focuses on specific types of programs and student behaviors that have emerged from higher education literature as best practices in undergraduate learning.

During the past 10 years, many institutions have adopted the Noel-Levitz Student Satisfaction Inventory (SSI) as part of a strategic enrollment management initiative. The Noel-Levitz instrument uses a gap analysis technique to array students' satisfaction against their perceived importance of various aspects of the college experience. Noel-Levitz also has recently released a version of the SSI tailored to the needs of adult learners, called the Adult Student Priorities Survey (ASPS).
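The gap analysis technique is easy to illustrate. The sketch below is a minimal, hypothetical example in Python rather than Noel-Levitz's actual scoring procedure: the items, the 1-7 scales, and all ratings are invented. It computes each item's performance gap as mean importance minus mean satisfaction, so the largest positive gaps flag aspects that students value highly but rate poorly.

```python
# Hypothetical importance-vs-satisfaction gap analysis.
# Each item maps to (importance ratings, satisfaction ratings), both on
# an invented 1-7 scale.
ratings = {
    "academic advising": ([6, 7, 6, 5], [4, 3, 5, 4]),
    "campus safety":     ([7, 6, 7, 7], [6, 6, 5, 7]),
    "registration ease": ([5, 6, 4, 5], [5, 6, 5, 6]),
}

def mean(values):
    return sum(values) / len(values)

# Gap = mean importance - mean satisfaction. Large positive gaps mark
# likely improvement targets; near-zero or negative gaps do not.
gaps = {item: mean(imp) - mean(sat) for item, (imp, sat) in ratings.items()}

for item, gap in sorted(gaps.items(), key=lambda pair: -pair[1]):
    print(f"{item}: gap = {gap:+.2f}")
```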
Student proficiencies and learning outcomes

The enrolled student surveys reviewed to this point focus on attitudinal and behavioral aspects of the student experience. Faculty at a number of colleges and universities now use one of several assessment instruments focusing on student learning outcomes in specific content areas and general education. ACT's Collegiate Assessment of Academic Proficiency (CAAP) consists of a set of modules that assess student proficiency in writing, reading, math, science reasoning, and critical thinking. Users may customize these modules based on institutional needs. The Academic Profile developed by the Educational Testing Service (ETS) provides a similar assessment of general education skills and proficiencies. ETS also offers a specific assessment for critical thinking as well as a set of major field tests in 14 academic subject areas, such as biology, economics, music, and psychology. The Project for Area Concentration Achievement Testing (PACAT), housed at Austin Peay State University, offers flexible content in eight subject-specific Area Concentration Achievement Tests (ACAT), which it can customize for individual institutions. PACAT also offers three additional subject area tests that do not yet have the flexible content option.

Institutions typically use the general education and subject area instruments from ACT, ETS, and PACAT for internal assessment and improvement purposes. However, many institutions tap into the rich source of learning outcomes information from these instruments to demonstrate their effectiveness to accrediting agencies. College administrators, faculty, and staff should familiarize themselves with the different options for assessing institutional effectiveness and the college student experience. Using too many assessment instruments, especially in uncoordinated ways, can undermine efforts to assess institutional quality by compromising the quality of student participation in these efforts.
Alumni status and achievement

Graduates can provide valuable information about how their experiences in college served them in pursuing their postgraduate goals and objectives. The Comprehensive Alumni Assessment Survey (CAAS), produced by the National Center for Higher Education Management Systems (NCHEMS), is available for this purpose. Alumni surveys also are offered as a part of several comprehensive survey programs, which are described in the next section.

More recently, Peterson's, the publisher of college guidebooks and a provider of web-based college search services, has incorporated into its services the College Results Survey (previously known as the Collegiate Results Instrument). This instrument uses real-life scenarios to assess alumni perceptions of their preparedness for their jobs, community participation, and civic responsibilities. Several institutions participated in a comprehensive pilot of this instrument. Anyone who accesses the Peterson's web site now can complete this survey. Visitors self-select their alma mater and then complete a four-section survey. Peterson's has not yet analyzed or reported on these unverified responses, and will determine how best to deploy this assessment in consultation with institutions.

Tracking changes in student attitudes and behaviors

Many of the organizations already mentioned offer multiple instruments for different populations, such as entering and continuing student surveys. However, several assessment programs provide a series of instruments that use common questions to help assess changes in student attitudes and behaviors over time. These survey programs are often appealing to colleges and universities that seek to implement a comprehensive range of survey assessments.

The Student Outcomes Information System (SOIS), offered by NCHEMS, includes surveys of entering, continuing, and noncontinuing students, and recent and previous graduates. ACT offers a set of 15 standardized instruments for colleges and universities, under the name Evaluation and Survey Services (ESS), which includes surveys for entering, continuing, noncontinuing, and graduated students. The ACT program includes specific instruments for assessing functions such as academic advising and for assessing the needs of both traditional and adult learners.

Faculty and other constituent views of institutional programs

The last section of each table in the appendix lists several instruments that assess institutional quality by soliciting the views of faculty and other constituents about the campus climate for living, working, and learning. HERI's Faculty Survey explores the attitudes and opinions of college and university faculty about their work environment. The Noel-Levitz Institutional Priorities Survey (IPS) enables an institution to explore similarities and differences among the priorities of various constituents, including students, faculty, and staff. NCHEMS offers an instrument, in both two- and four-year institution versions, that asks faculty, students, and staff to answer similar sets of questions about institutional performance and effectiveness. Finally, ETS offers the Program Self-Assessment Service, in both undergraduate and graduate versions, which assists academic programs engaging in a self-guided review.

The set of instruments and services reviewed in this section is by no means an exhaustive representation of all the assessments currently available. They offer a sampling of the various tools that college and university faculty and administrators can use in their assessment and accountability efforts. Too often, institutions see the results of one of these assessments as the endpoint of the process. The next section considers the questions and issues that determine whether such assessment efforts provide useful information for planning, evaluation, and decision-making processes that promote accountability and improvement.



Using Assessment Results Effectively

How well do assessments reflect student experiences on our campus?


Before acting upon the results of any assessment, it is important to understand how well the results reflect actual student experiences. Answering this question requires institutions to examine several technical issues, including the representativeness of the sample (what is the response rate and response bias?), the reliability of the instrument (would it yield the same results if administered to a different but equally representative group?), and the validity of the instrument (does it actually measure what it purports to measure?). For several reasons, an assessment instrument created by a nationally recognized organization likely will be more reliable and valid than one developed locally. However, measurement in higher education and in the social sciences is by no means exact.

The assessment instruments that this guide describes have a respectable level of reliability, but reliability is the easiest measurement characteristic to achieve. Sampling representativeness is entirely related to how the instrument is administered, which in many cases the institution partly or entirely determines. Validity is the thorniest issue. It encompasses questions related to the simplicity or complexity of what is being measured (for example, where students live while attending college versus their engagement in the academic community), as well as how well students' recollections reflect their actual experiences.
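Two of these checks are easy to run locally on raw response data. The sketch below is a minimal illustration in Python, not a procedure prescribed by this guide: the counts and Likert answers are invented, and Cronbach's alpha is used as one common internal-consistency estimate of reliability.

```python
# Hypothetical response-rate and reliability check for a short
# battery of 1-5 Likert items. All numbers are invented.

invited, completed = 1200, 444
print(f"response rate = {completed / invited:.1%}")

# Rows = respondents, columns = items in the battery.
scores = [
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 5],
]

def variance(values):
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / (len(values) - 1)

k = len(scores[0])  # number of items
item_variances = [variance([row[i] for row in scores]) for i in range(k)]
total_variance = variance([sum(row) for row in scores])

# Cronbach's alpha = k/(k-1) * (1 - sum of item variances / total variance)
alpha = (k / (k - 1)) * (1 - sum(item_variances) / total_variance)
print(f"Cronbach's alpha = {alpha:.2f}")
```

A high alpha alone does not settle the harder questions of response bias and validity discussed above.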
Validity also relates to how institutions interpret assessment results. It is unclear whether students' responses to questions about college experiences reveal more about student or institutional differences. For example, business majors' responses to questions about required general education courses may differ from those of liberal arts majors taking the same courses at the same institution. Therefore, the institutional differences reflected through these assessments may well reflect differences in student profiles rather than differences in the quality of institutional programs and services.

For this reason, it is important to consider how comparative institutional benchmarks are generated; the assessment instruments and services described in this guide vary greatly in this regard. Some offer only global national comparisons. Others allow survey administrators to select comparison groups according to general institutional characteristics, such as selectiveness or Carnegie classification type. Still others allow for more flexible choices, including designating a comparison group among a minimum set of institutions (e.g., at least eight) from among all participants.

Despite all the limitations presented by issues of reliability, sample representativeness, and validity, the results of these assessments still can be quite useful for internal improvements and external accountability. But campus administrators need to understand these limitations to make informed decisions. The next two sections offer some ways to maximize these uses.

How can we use the data for assessment and improvement?

Several voices need to come together to ensure that institutions can put the results of these assessments to good use. As mentioned previously, those who are in a position to impact the quality of relevant institutional programs and processes must be at the table when choosing assessment instruments. The results of the assessments must be shared with these same individuals. Given the technical issues raised in the previous section, it is equally important to involve individuals who understand the technical and contextual limitations of such assessments, such as institutional researchers or faculty with expertise in assessment or survey research.

The respondents also can help institutions use assessment results effectively. Discussions with relevant student, alumni, and faculty groups often can provide keen insights into how questions were interpreted and, therefore, what the results may mean. Respondents' interpretations of results provide an additional perspective that helps the information user further understand the context and limitations of the results.

It is important to consider carefully where and at what level assessment results will be used before engaging in the effort. Given the size and variability of many college student bodies, data often must be disaggregated into meaningful subgroups based on characteristics such as class level or major before they can be valuable for program improvement. Survey administrators must draw samples in a way that allows generalization to the subgroups for which results are desired. Unfortunately, comparative institutional data are not always available for subgroups that may be meaningful to a particular campus.
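The disaggregation itself is mechanical; the harder problems are the sample sizes and comparability noted above. As a minimal illustration (plain Python, with invented records and field names), the sketch below breaks one satisfaction item out by class level and reports subgroup sizes alongside the means:

```python
# Hypothetical subgroup breakdown of a single survey item.
from collections import defaultdict

responses = [
    {"class_level": "freshman", "major": "business",     "satisfaction": 4},
    {"class_level": "freshman", "major": "liberal arts", "satisfaction": 5},
    {"class_level": "senior",   "major": "business",     "satisfaction": 3},
    {"class_level": "senior",   "major": "liberal arts", "satisfaction": 4},
    {"class_level": "senior",   "major": "business",     "satisfaction": 2},
]

groups = defaultdict(list)
for record in responses:
    groups[record["class_level"]].append(record["satisfaction"])

for level, scores in sorted(groups.items()):
    # Report n with each mean: tiny subgroups are exactly where
    # generalization to the subgroup breaks down.
    print(f"{level}: n={len(scores)}, mean={sum(scores) / len(scores):.2f}")
```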
Most of the national assessments introduced in this guide make provisions for including local items in the survey administration, thereby customizing the instrument for a particular campus or group of institutions. However, campus officials must consider some important limitations. The number of local items is often limited. Furthermore, adding to the length of any assessment instrument detracts from the response rate and response quality. Response formats for local items usually are limited to five-point Likert-type scales (for example, strongly disagree, disagree, neutral, agree, and strongly agree). The institution may be able to vary the response scale, but doing so often detracts from the reliability of the instrument.
How can campuses, governing boards, policy makers, or other constituents use the results for public accountability?

With all the inherent limitations of any particular assessment, college and university presidents must concern themselves with who receives the results of these assessments and how their institution packages and disseminates these results to external audiences. The data from most of the instruments described in this guide are considered the property of the institution. As such, the administrators, faculty, and staff of the institution control the way the information is packaged and presented.

Several of the instruments discussed in this guide were designed with public accountability as a primary purpose. One explicit goal of the NSSE is to impact national rankings of colleges and universities. Peterson's College Results Survey, which students and alumni can complete outside the control of their institution, may become a centerpiece of Peterson's consumer information services. An increasing number of state and university systems are using common assessment surveys to benchmark institutional effectiveness. Because of this trend, the use and control of results from some of these national assessments for public accountability may become more complicated in the future. In the short run, these instruments provide a valuable source of campus-level accountability information for governing boards, public constituencies, and accreditation agencies. In the long run, the use of these and other types of assessments for internal planning and improvement may be the best support for external accountability. Colleges and universities that aggressively evaluate their programs and services—and act on that information to improve those programs and services—will gather a rich body of evidence to support their claims of institutional effectiveness.

What are the advantages and disadvantages of using national surveys, compared to local instruments?

National surveys are a relatively cost-effective way to gather assessment information. The organizations that develop these instruments often devote greater technical and financial resources than can an individual institution. They also test these instruments on a broader population than can a single institution. The availability of comparative data from other institutional participants is an important advantage to using a national assessment, but, as mentioned earlier, comparative benchmarks may be of limited use if the comparison group does not include institutions with student profiles similar to the target institution or if student profile differences otherwise are not taken into account. Moreover, comparative data often are not available at disaggregate levels, where results can be most potent.

Local assessment instruments provide much greater control and attention to local issues, and survey administrators can more closely integrate such instruments across populations and samples. A local assessment of decent quality typically costs more to administer than a national assessment but often yields results that are more directly applicable to the sponsoring campus. The lack of comparative data is a major limitation, but institutions working in data-sharing consortia can circumvent this limitation by introducing common items and methods among local instruments.

Developing quality local assessments requires a significant commitment of time and resources that some colleges and universities cannot afford; however, effective use of national assessments requires a similar significant commitment. The reports produced by the service providers are informative and have some direct use, but far more use comes from local follow-up analyses that address issues of immediate concern to individuals and groups working on improvement efforts.

Do we need to use the human subjects review process when administering one of these assessments?

The use of human subjects in assessment research is coming under increased scrutiny from two sources. In recent years, federal agencies that monitor the use of human subjects in research have tightened the enforcement of their regulations. The privacy of student and faculty records also is garnering increased attention among both state and federal agencies, as guided by the Family Educational Rights and Privacy Act (FERPA).

Technically, all research on human subjects requires review by a sanctioned human subjects review board. Institutional and program evaluation efforts are, in themselves, not considered research. Many college and university review boards define research according to the use of the results. Once faculty and staff prepare information for dissemination to an audience external to the institution, it becomes research; however, this does not always require that the initial data collection effort be reviewed. Faculty and staff who prepare articles or presentations to professional organizations that use data from institutional assessments should consult with their local review boards about the proper procedures to follow.

Regardless of whether assessment results are disseminated beyond the institution, survey administrators should treat data collected through national or local assessment instruments as confidential student records. Many colleges and universities develop specific policies regarding the use and storage of student and faculty data. In some cases, institutional policies need to be revisited when considering assessment data, since these data usually are not stored in the same information systems environment as operational student and faculty data.

How do these national assessments compare with other ways to assess institutional quality?

Assessments of institutional quality can take a variety of forms, including classroom-based activities, student performance assessments, program reviews, focus groups, exiting senior interviews, and so forth. The national surveys, standardized tests, and other assessment services considered in this guide can be an important part of these assessments. Colleges and universities that have well-developed assessment programs typically use a variety of assessments, including but not limited to national instruments. No single method offers the best approach. Each has benefits and limitations, which is why a comprehensive assessment program usually includes multiple, convergent methods. For institutions that do not have well-established assessment programs, the national surveys described in this guide can be catalysts for further development. Specific findings can lead to action. Institutions can use perceived limitations as points of departure to identify the focus of future assessments.
Do these assessments encompass the views of all major student and other constituent groups?

Most of the instruments reviewed in this guide draw information about the college student experience directly from students: new, continuing, and graduated. A few instruments gather input on institutional quality from faculty, administrators, and staff. One voice that is completely absent from these assessments is that of the individuals and organizations who interact with and hire college and university graduates: employers, graduate school administrators, social agencies, and so forth. Several specialized accrediting agencies require college program administrators to survey employers, but there is, as yet, no popular national instrument.

Many of the popular surveys of the college student experience (for example, the CIRP Freshman Survey, CSEQ, and NSSE) originate from research on traditional-aged students attending residential universities. Some instruments are tailored for two-year colleges, such as the CCSEQ and Faces of the Future. Among the 15 instruments included in ACT's Evaluation and Survey Services program is an assessment designed to evaluate the experiences of adult learners (for example, nontraditional-aged and commuter students). Noel-Levitz's new Adult Student Priorities Survey (ASPS) also is tailored toward this group. However, the research base still is somewhat skewed toward the definitions of quality that emerge from the traditional college experience.


Conclusion

The American system of higher education is characterized by institutional diversity, and the increasing variety of national assessments is only beginning to reflect this diversity. As colleges and universities respond to the increasing demand for public accountability and consumer information, we hope to see an attendant increase in the number of assessment instruments that better reflect the complete range of college and university missions and clientele. College and university presidents can contribute to this end by initiating discussions on their campuses and among similar types of institutions about the kinds of evidence that would best reflect the quality of their institutions. The most exemplary local and consortia efforts to assess quality in terms of institutional mission likely will exert the strongest influence on the future developments of national assessments.


TABLE 1. Instrument, Administrator, Purpose, Use of Data, History, and Information Collected

ENTERING UNDERGRADUATES

Cooperative Institutional Research Program (CIRP) Freshman Survey/Entering Student Survey (ESS)
Administrator: Higher Education Research Institute (HERI) at UCLA and American Council on Education (ACE)
Purpose: Collects demographic and attitudinal information on incoming students. Serves as baseline for longitudinal follow-up. Measures trends in higher education and characteristics of American college freshmen.
Use of Data: Admissions and recruitment; academic program development and review; self-study and accreditation; public relations and development; institutional research and assessment; retention studies; longitudinal research about the impacts of policies and programs.
History: Established in 1966 at ACE, the CIRP was transferred to HERI at UCLA in 1973.
Information Collected: Demographic characteristics; expectations of the college experience; secondary school experiences; degree goals and career plans; college finances; attitudes, values, and life goals; reasons for attending college.

Freshman Class Profile Service
Administrator: American College Testing (ACT)
Purpose: Summarizes the characteristics of ACT-tested enrolled and nonenrolled students by institution.
Use of Data: Institutional marketing and recruitment: knowledge of competitors, characteristics of enrolled students, feeder high schools, etc.
History: Student Profile Section (SPS) is a set of 189 items included in the ACT Assessment Program. Certain items are updated every few years for currency (e.g., racial/ethnic categories). The current format of the SPS was developed in 1973.
Information Collected: Demographics: background information; high school characteristics and evaluation; needs assessment; career interests; college plans; and achievement test scores.

Student Descriptive Questionnaire (SDQ)
Administrator: The College Board
Purpose: Provides a basic profile of students who took the SAT.
Use of Data: Admissions and recruitment; institutional research and assessment; retention studies.
History: Not available.
Information Collected: Prior academic record; high school course-taking patterns; student demographics; and family background.

Admitted Student Questionnaire (ASQ) and Admitted Student Questionnaire Plus (ASQ Plus)
Administrator: The College Board
Purpose: Studies students' perceptions of their institution and its admissions process. Facilitates competitor and overlap comparisons.
Use of Data: Recruitment; understanding of market position; evaluation of institutional image; calculation of overlap win/loss; evaluation of financial aid packaging.
History: ASQ was developed in 1987, ASQ Plus in 1991.
Information Collected: Student assessment of programs, admissions procedures, literature, institutional image; financial aid packages; common acceptances; comparative evaluations. ASQ Plus provides specific institutional comparisons.

College Student Expectations Questionnaire (CSXQ)
Administrator: Center for Postsecondary Research and Planning (CPRP) at Indiana University
Purpose: Assesses new students' expectations upon matriculation. Findings can be compared with student reports of their actual experiences as measured by the College Student Experiences Questionnaire (CSEQ).
Use of Data: Comparison with CSEQ data to identify areas where the first-year experience can be improved. Also can be used for campus research and assessment initiatives.
History: Originally developed in 1997 for a FIPSE-funded project, CSXQ is an abbreviated version of CSEQ. Second edition available since 1998.
Information Collected: Background information; expectations for involvement in college activities; predicted satisfaction with college; and expected nature of college learning environments.

ENROLLED UNDERGRADUATES

College Student Survey (CSS)
Administrator: HERI
Purpose: Evaluates students' experiences and satisfaction to assess how students have changed since entering college. Can be used longitudinally with the CIRP Freshman Survey.
Use of Data: Student assessment activities; accreditation and self-study; campus planning; policy analysis; retention analysis; and study of other campus issues.
History: The CSS was initiated in 1993 to permit individual campuses to survey undergraduates at any level and to conduct follow-up studies of their CIRP Freshman Survey respondents.
Information Collected: Satisfaction with college experience; student involvement; cognitive and affective development; student values, attitudes, and goals; degree aspirations and career plans; Internet, e-mail, and other computer uses.

Faces of the Future
Administrator: American Association of Community Colleges (AACC) and ACT
Purpose: Assesses the current state of the community college population and explores the role community colleges play in students' lives.
Use of Data: Community college students; benchmarking; comparisons to national data; tracking of trends in student population.
History: Developed in 1998 and piloted in early 1999, the survey is now in its third year of administration.
Information Collected: Background information (general, employment, education); current college experiences (access and purpose, learning and satisfaction, expected outcome and intent, transitions).

College Student Experiences Questionnaire (CSEQ)
Administrator: CPRP
Purpose: Measures quality of students' experiences inside and outside the classroom, perceptions of environment, satisfaction, and progress toward 25 desired learning and personal development outcomes.
Use of Data: Outcomes of college; accreditation review; institutional research, evaluation, and assessment; student recruitment and retention; assessment of undergraduate education.
History: Developed by C. Robert Pace in the 1970s, CSEQ is in its fourth edition (second edition, 1983; third edition, 1990). Since 1994 George Kuh (Indiana University) has directed the research program.
Information Collected: Background information; level of student engagement in learning activities; student ratings of college learning environment; estimate of student gains toward learning goals; index of student satisfaction with the college.

Community College Student Experiences Questionnaire (CCSEQ)
Administrator: University of Memphis, Center for the Study of Higher Education
Purpose: Measures students' progress and experiences.
Use of Data: Self-study and accreditation review; assessment of institutional effectiveness; evaluation of general education, transfer, and vocational programs; use of technology; measurement of student interest, impressions, and satisfaction.
History: Co-authored by Jack Friedlander, C. Robert Pace, Patricia H. Murrell, and Penny Lehman (1991, revised 1999).
Information Collected: Amount, breadth, and quality of effort expended in both in-class and out-of-class experiences; progress toward educational outcomes; satisfaction with community college environment; demographic and background characteristics.

National Survey of Student Engagement (NSSE)
Administrator: CPRP
Purpose: Gathers outcomes assessment, undergraduate quality, and accountability data. Measures students' engagement in effective educational practices (level of challenge, active learning, student-faculty interaction, supportive environment, etc.).
Use of Data: Institutional improvement and benchmarking; monitoring of progress over time; self-studies and accreditation; and other private and public accountability efforts.
History: Designed in 1998 by a group of assessment experts chaired by Peter Ewell, NCHEMS. Project director is George Kuh, Indiana University.
Information Collected: Student reports of quality of effort inside and outside the classroom, including time devoted to various activities and amount of reading and writing, higher order thinking skills, quality of interactions, educational and personal gains, and satisfaction.

Your First College Year (YFCY)
Administrator: HERI and Policy Center on the First Year of College at Brevard College
Purpose: Designed as a follow-up survey to the CIRP Freshman Survey. Assesses student development during the first year of college.
Use of Data: Admissions and recruitment; academic program development and review; self-study and accreditation; public relations and development; institutional research and assessment; retention studies; longitudinal research; first-year curriculum efforts.
History: Administered by HERI in partnership with the Policy Center on the First Year of College, and funded by The Pew Charitable Trusts, YFCY was pilot-tested in spring 2000. Second pilot is scheduled for spring 2001. Full administration will begin in 2002.
Information Collected: One-third of items are CIRP post-test items. Remaining questions address students' academic, residential, and employment experiences; self-concept and life goals; patterns of peer and faculty interaction; adjustment and persistence; degree aspirations; and satisfaction.

Student Satisfaction Inventory (SSI)
Administrator: Noel-Levitz
Purpose: Measures students' satisfaction.
Use of Data: Student retention; student recruitment; strategic planning and institutional effectiveness.
History: SSI was piloted in 1993 and became available to institutions in 1994.
Information Collected: Ratings on importance of and satisfaction with various aspects of campus. The survey covers most aspects of student experience.

Adult Student Priorities Survey (ASPS)
Administrator: Noel-Levitz
Purpose: Measures satisfaction of students age 25 and older.
Use of Data: Student retention; student recruitment; strategic planning and institutional effectiveness.
History: ASPS was piloted and became available to institutions in 2000.
Information Collected: Ratings on importance of and satisfaction with various aspects of campus. The survey is specific to the experience of adult students.

STUDENT PROFICIENCIES AND LEARNING OUTCOMES

Collegiate Assessment of Academic Proficiency (CAAP)
Administrator: ACT
Purpose: Assesses college students' academic achievement in general education skills.
Use of Data: Document levels of proficiency; compare local populations via user norms; establish eligibility requirements; report educational outcomes for accountability and accreditation; improve teaching; and enhance student learning.
History: CAAP was introduced in 1988.
Information Collected: Assessment of proficiency in core general education skills, including writing (objective and essay), reading, math, science reasoning, and critical thinking.

Academic Profile
Administrator: Educational Testing Service (ETS) and The College Board
Purpose: Assesses college-level general education skills.
Use of Data: Describe performance of individuals and groups; measure growth in learning; use data as a guidance tool and performance standard.
History: Introduced in 1992 to assist institutions with accreditation, accountability, and program improvement.
Information Collected: Norm-referenced and criterion-referenced scores measure college-level reading, college-level writing, critical thinking, and mathematics.

Tasks in Critical Thinking
Administrator: ETS
Purpose: Assesses proficiency in college-level, higher order thinking skills.
Use of Data: Each student receives a confidential report on skills performance. Data can help institution learn more about teaching and program effectiveness.
History: Introduced in 1992 to address the need to assess college-level higher order thinking skills and to improve teaching and learning.
Information Collected: Measures college-level inquiry, analysis, and communication skills.

Major Field Tests
Administrator: ETS
Purpose: Assesses students' academic achievement in major field of study.
Use of Data: Measure student academic achievement and growth and assess effectiveness of departmental curricula for planning and development.
History: These tests originally were based on the GRE subject tests and are jointly sponsored by ETS and the GRE Board. Each test is periodically updated to maintain currency with standard undergraduate curricula.
Information Collected: Factual knowledge; ability to analyze and solve problems; ability to understand relationships; and ability to interpret material. Available for 15 disciplines; see www.ets.org/hea for listing.

Area Concentration Achievement Tests (ACAT)
Administrator: Project for Area Concentration Achievement Testing (PACAT) at Austin Peay State University
Purpose: Assesses outcomes and provides curriculum-specific feedback on student achievement.
Use of Data: Provide specific program analysis.
History: Established in 1983 and expanded in 1988 by a FIPSE grant, ACAT is a nationally normed instrument with items written by faculty in the various disciplines.
Information Collected: Discipline-specific surveys cover agriculture, biology, criminal justice, geology, history, neuroscience, political science, psychology, art, English literature, and social work.

ALUMNI

Comprehensive Alumni Assessment Survey (CAAS)
Administrator: National Center for Higher Education Management Systems (NCHEMS)
Purpose: Measures evidence of institutional effectiveness and reports on alumni personal development and career preparation.
Use of Data: Help clarify institutional mission and goals and assist in developing new goals. Evaluate mission attainment and impact of general education programs, core requirements, and academic support services.
History: Not available.
Information Collected: Employment and continuing education; undergraduate experience; development of intellect; achievement of community goals; personal development and enrichment; community participation; demographic and background information.

College Results Survey (CRS)
Administrator: Peterson's, a Thomson Learning Company
Purpose: Identifies personal values, abilities, occupations, work skills, and participation in lifelong learning of college graduates. Uses alumni responses to establish a unique institutional profile.
Use of Data: Peterson's uses data collected online for consumer information at http://www.bestcollegepicks.com. Institutions use data collected in collaboration with Peterson's for self-study.
History: Formerly the College Results Instrument, CRS was developed by Robert Zemsky at the University of Pennsylvania with support from the U.S. Department of Education and the Knight Higher Education Collaborative.
Information Collected: Lifelong learning; personal values; confidence; occupation and income; and work skills.

SERIES OF INSTRUMENTS

Student Outcomes Information Survey (SOIS)
Administrator: NCHEMS
Purpose: Collects information about students' needs and reactions to their educational experiences.
Use of Data: Longitudinal assessment of students' experiences and opinions.
History: In use since 1978.
Information Collected: Background, personal goals, and career aspirations; factors influencing college choice; satisfaction with college experience; activities while in college; educational plans and accomplishments; career choices; career successes.

Evaluation/Survey Services
Administrator: ACT
Purpose: Assesses needs, development, attitudes, and opinions of students and alumni.
Use of Data: Accreditation; program and service assessment; outcomes assessment; retention; alumni follow-up; institutional self-study.
History: Established in 1979.
Information Collected: Fifteen standardized instruments include alumni surveys, outcomes assessment surveys, satisfaction surveys, opinion surveys, entering student surveys, and nonreturning student surveys. See www.act.org/ess/index.html for complete list of available surveys.

FACULTY AND INSTITUTIONAL SURVEYS

Faculty Survey
Administrator: HERI
Purpose: Collects information about the workload, teaching practices, job satisfaction, and professional activities of collegiate faculty and administrators.
Use of Data: Accreditation and self-study reports; campus planning and policy analysis; faculty development programs; benchmarking faculty characteristics.
History: In seven faculty surveys conducted since 1969, HERI has collected data on more than 500,000 college faculty at more than 1,000 institutions. The next faculty survey is scheduled for 2001–02.
Information Collected: Background characteristics; teaching practices and research activities; interactions with students and colleagues; professional activities; faculty attitudes and values; perceptions of the institutional climate; job satisfaction.

Institutional Performance Survey (IPS)
Administrator: NCHEMS
Purpose: Assesses institutional performance and effectiveness.
Use of Data: Self-study; marketing.
History: IPS is a by-product of a national research study to assess how various institutional conditions are related to the external environment, strategic competence, and effectiveness.
Information Collected: More than 100 items measure eight dimensions of institutional performance.

Institutional Priorities Survey (IPS)
Administrator: Noel-Levitz
Purpose: Assesses faculty, staff, and administrative perceptions and priorities (recommended with the SSI to determine where priorities overlap with those of students).
Use of Data: Student retention; student recruitment; strategic planning and institutional effectiveness. Institutions can pinpoint areas of consensus on campus.
History: IPS was developed as a parallel instrument to the Noel-Levitz SSI. IPS was piloted and made available in 1997.
Information Collected: Perceptions on the importance of meeting various student expectations, and level of agreement that the institution actually is meeting these expectations.

Program Self-Assessment Service (PSAS) and Graduate Program Self-Assessment Service (GPSAS)
Administrator: ETS
Purpose: Assesses students' opinions on undergraduate and graduate programs.
Use of Data: Used by departments for self-study and as additional indicators of program quality for accreditation purposes.
History: GPSAS was developed in conjunction with the Council of Graduate Schools in the 1970s. PSAS was developed in the 1980s using GPSAS as a model.
Information Collected: Quality of teaching; scholarly excellence; faculty concern for students; curriculum; students' satisfaction with programs; resource accessibility; employment assistance; faculty involvement; departmental procedures; learning environment.


TABLE 2. Target Institutions and Samples, Participation, Format, Administration Procedure, and Timeline

ENTERING UNDERGRADUATES

Cooperative Institutional Research Program (CIRP) Freshman Survey/Entering Student Survey (ESS)
Administrator: Higher Education Research Institute (HERI) at UCLA and American Council on Education (ACE)
Target Institutions/Samples: All types/Incoming students (ESS specifically designed for two-year institutions).
Participation Rates: Since 1966, 1,700 institutions and 10 million students have participated. In fall 2000, 717 institutions and 404,000 students participated.
Format: Four-page paper survey.
Administration Procedure: Colleges order surveys from HERI, administer them on campus, and return completed surveys for processing. Most campuses administer the survey in proctored groups.
Timeline: Register for the survey in the spring. Surveys are administered in the summer and fall, usually during orientation. Report available in December.

Freshman Class Profile Service
Administrator: American College Testing (ACT)
Target Institutions/Samples: All types/All ACT test-takers.
Participation Rates: Over 1 million high school students are tested each year. This service includes more than 550,000 enrolled students from 900 institutions each year.
Format: Responses are collected via paper as part of the registration materials for the ACT Assessment and are later electronically combined with assessment results for reporting and research purposes.
Administration Procedure: Students complete the profile when registering for the ACT. Responses and ACT scores are sent to schools and institutions.
Timeline: Institutions register in July of each year. Enrollment information is sent to ACT from September through June; reports are produced within 30 to 60 days.

Student Descriptive Questionnaire (SDQ)
Administrator: The College Board
Target Institutions/Samples: All types/All SAT test-takers.
Participation Rates: All students who take the SAT complete the SDQ. Responses are sent only if the student indicates "Yes" to being included in the Student Search Service.
Format: Paper-and-pencil instrument completed as part of the test registration process.
Administration Procedure: Students complete the questionnaire prior to taking the SAT. Responses and SAT scores are sent to schools.
Timeline: Tapes and/or diskettes are sent to institutions six times per year as part of SAT test reports.

Admitted Student Questionnaire (ASQ) and Admitted Student Questionnaire Plus (ASQ Plus)
Administrator: The College Board
Target Institutions/Samples: All types/All admitted students.
Participation Rates: Every year, 220 institutions participate and 400,000 students are surveyed.
Format: Each program has matriculating and nonmatriculating student versions of a standardized paper survey. An optional web version also is available.
Administration Procedure: Colleges administer and collect surveys, then send them to The College Board for processing. ASQ Plus asks colleges to identify their major competitors, and students rate their college choice against other top choices.
Timeline: Institutions determine when to mail surveys, but The College Board recommends doing so as soon as they know who will enroll (usually mid- to late May). Follow-up strongly recommended.

College Student Expectations Questionnaire (CSXQ)
Administrator: Center for Postsecondary Research and Planning (CPRP) at Indiana University
Target Institutions/Samples: Four-year public and private institutions/Incoming students.
Participation Rates: More than 33,000 students at two dozen different types of colleges and universities participate.
Format: Four-page paper survey; web version under development. Takes 10 minutes to complete. Demo version at www.indiana.edu/~cseq.
Administration Procedure: Institutions administer surveys and return completed instruments to CPRP for processing. The web version is administered via a server at Indiana University; institutions provide student contact information.
Timeline: Most institutions administer the survey during fall orientation. To compare student expectations with actual experiences, colleges administer the CSEQ to the same students the following spring.

ENROLLED UNDERGRADUATES

College Student Survey (CSS)
Administrator: HERI
Target Institutions/Samples: All types/All students.
Participation Rates: CSS has collected data from more than 230,000 students at 750 institutions.
Format: Four-page paper survey.
Administration Procedure: Campuses administer surveys and return them to the data processing center. Campuses may choose to survey students who completed the CIRP for purposes of longitudinal study.
Timeline: Register January 1 or May 1. Two administration periods are available: January through June and July through December. Reports from the first period are available in the fall; reports from the second period are available in February of the subsequent year.


TABLE 2 (continued). Target Institutions and Samples, Participation, Format, Administration Procedure, and Timeline

ENROLLED UNDERGRADUATES

College Student Experiences Questionnaire (CSEQ)
Administrator: CPRP
Target Institutions/Samples: Four-year public and private institutions/All students.
Participation Rates: More than 500 colleges and universities and approximately 250,000 students have participated since 1983 (when the second edition was published).
Format: Eight-page paper survey; identical web survey also available. Takes 20 to 25 minutes to complete. Demo version at www.indiana.edu/~cseq.
Administration Procedure: Institutions administer surveys and return completed instruments to CPRP for processing. The web version is administered via a server at Indiana University; institutions provide student contact information.
Timeline: Most institutions administer the survey at the mid-point or later in the spring term so that students have enough experience on campus to provide valid, reliable judgments. For research purposes, the CSEQ also can be administered at other times.

Community College Student Experiences Questionnaire (CCSEQ)
Administrator: University of Memphis, Center for the Study of Higher Education
Target Institutions/Samples: Community colleges/All students.
Participation Rates: The 1991 edition collected data from 45,823 students at 57 institutions. The 1999 edition collected data from 18,483 students at 40 institutions.
Format: Paper survey, self-report (Likert scale).
Administration Procedure: Instruments can be mailed to students or distributed in classes, through student organizations, or at other student assemblies. Completion takes 20 to 30 minutes.
Timeline: The Center provides surveys upon receipt of an order. Scoring is completed and results are mailed two to three weeks after colleges return instruments to the Center.

Faces of the Future
Administrator: American Association of Community Colleges (AACC) and ACT
Target Institutions/Samples: Community colleges/All students (credit and noncredit).
Participation Rates: In fall 1999, more than 100,000 students at 250 institutions participated in the survey.
Format: Paper survey.
Administration Procedure: Colleges order materials from ACT and administer surveys on campus. Completed surveys are returned to ACT for scoring and processing.
Timeline: Surveys can be administered during the AACC/ACT fall administration (October) for a reduced cost, or at other times at regular cost.

National Survey of Student Engagement (NSSE)
Administrator: CPRP
Target Institutions/Samples: Four-year public and private institutions/First-year and senior students.
Participation Rates: After a 1999 field test, the first national administration took place in spring 2000 with 195,000 students at 276 institutions. CPRP annually surveys approximately 200,000 students at 275 to 325 colleges and universities.
Format: Students can complete either a four-page paper survey or the identical online version. Students at one-fifth of participating schools complete the web survey. Demo version at www.indiana.edu/~nsse.
Administration Procedure: Schools send student data files, letterhead, and invitation letters to CPRP, which handles data collection, including random sampling, sending surveys to students, and conducting follow-ups. Students return surveys to CPRP.
Timeline: Institutions send data files to CPRP in late fall. Surveys are mailed to students in late winter and early spring. Follow-ups continue through the spring. CPRP sends institutional reports and data to schools in late summer.

Your First College Year (YFCY)
Administrator: HERI and Policy Center on the First Year of College at Brevard College
Target Institutions/Samples: All types/Students near the end of the first year of college.
Participation Rates: A total of 58 institutions and 19,000 first-year students will participate in the spring 2001 pilot. Participation is expected to be open to all institutions in spring 2002.
Format: Four-page paper survey; web survey also available.
Administration Procedure: HERI oversees administration of the paper or web-based survey instrument; students return completed survey forms to the data processing center.
Timeline: Institutions register for the survey in the fall and administer it in the spring. Reports are available in late summer.

Student Satisfaction Inventory (SSI)
Administrator: Noel-Levitz
Target Institutions/Samples: All types (four-year, two-year, and career school versions are available)/All students.
Participation Rates: SSI is used by more than 1,200 colleges and universities. More than 800,000 student records are in the national database.
Format: Paper survey. In spring 2001, the survey also will be available on the web; the web version takes 15 to 20 minutes.
Administration Procedure: SSI is generally administered in a classroom setting and takes 25 to 30 minutes. For the web version, the URL is e-mailed to students along with a specific student numeric password to enter the survey area.
Timeline: Students can complete the survey anytime during the academic year. Surveys generally arrive on campus within one week of ordering. Institutions send completed surveys to Noel-Levitz for processing. Reports are ready for shipment in 12 to 15 business days.

Adult Student Priorities Survey (ASPS)
Administrator: Noel-Levitz
Target Institutions/Samples: All types/All students 25 years and older.
Participation Rates: ASPS was piloted by more than 30 institutions and more than 4,000 students in spring 2000.
Format: Paper survey. In spring 2001, the survey will be available on the web; web completion takes 15 to 20 minutes.
Administration Procedure: ASPS is administered in a classroom setting and takes 25 to 30 minutes. For the web version, the URL and password are e-mailed to students.
Timeline: Students can complete the survey anytime during the academic year. Surveys generally arrive on campus within one week of ordering. Institutions send completed surveys to Noel-Levitz for processing. Reports are ready for shipment in 12 to 15 business days.


TABLE 2 (continued). Target Institutions and Samples, Participation, Format, Administration Procedure, and Timeline

STUDENT PROFICIENCIES AND LEARNING OUTCOMES

Collegiate Assessment of Academic Proficiency (CAAP)
Administrator: ACT
Target Institutions/Samples: All types/All students.
Participation Rates: More than 600 institutions have used CAAP since 1988. More than 450,000 students tested between 1998 and 2000.
Format: Demographic questions are collected on paper with the assessment battery. Users may add up to nine additional items; they also may design their own assessment battery by choosing from the six different skill modules.
Administration Procedure: Colleges order the assessment battery from ACT, administer it during a locally determined two-week test period, and return it to ACT for processing.
Timeline: Flexible administration schedule. Each assessment module can be administered within a 50-minute class period. Institutions must order assessments at least two weeks prior to the administration period.

Academic Profile
Administrator: Educational Testing Service (ETS) and The College Board
Target Institutions/Samples: All types/All students.
Participation Rates: This instrument has been used by 375 institutions and 1 million students.
Format: Paper survey (long and short forms). The long form contains 108 multiple-choice questions and takes 100 minutes; the short form contains 36 questions. An optional essay is available.
Administration Procedure: Colleges order materials from ETS, administer them to students, and return tests to ETS for scoring.
Timeline: Institutions administer tests on their own timeline. Tests are scored weekly, and reports are issued approximately three weeks after ETS receives tests.

Tasks in Critical Thinking
Administrator: ETS
Target Institutions/Samples: All types/All students.
Participation Rates: This instrument is administered by 35 institutions to 200 to 500 students at each institution.
Format: Open-ended, performance-based 90-minute "tasks" in humanities, social sciences, or natural sciences. The score range for each skill is 1 to 6, with 4 as the core score.
Administration Procedure: Colleges order materials from ETS and administer them to students. ETS trains faculty to score students' responses, or ETS scores the tasks. There are nine separate tasks; three tasks can be used for assessing fewer than 100 students.
Timeline: Colleges decide whom and when to test. Faculty set the scoring schedule, or ETS provides a three- to four-week turnaround for issuing a report.

Major Field Tests
Administrator: ETS
Target Institutions/Samples: Four-year colleges and universities/Senior students.
Participation Rates: In the 1999–2000 academic year, more than 1,000 departments at 606 higher education institutions administered nearly 70,000 tests. Current national comparative data include accumulated scores from 96,802 seniors.
Format: Paper-and-pencil test.
Administration Procedure: Institutions order tests, administer them onsite to students, and return them to ETS for processing.
Timeline: Must order three to four weeks prior to administration for standard shipping. Answer sheets received by the beginning of each month are scored that month (no scoring in January or September). Reports are mailed three weeks after scoring.

Area Concentration Achievement Tests (ACAT)
Administrator: Project for Area Concentration Achievement Testing (PACAT) at Austin Peay State University
Target Institutions/Samples: Two- and four-year public and private institutions/Generally seniors, although ACAT can serve as a pre-test.
Participation Rates: Approximately 300 institutions and more than 50,000 students have participated.
Format: Paper survey. Multiple-choice test requiring 48 to 120 minutes, depending on content.
Administration Procedure: Institutions order tests, administer them to students, and return them to PACAT for scoring and analysis.
Timeline: Must order at least 15 days prior to the administration date. PACAT scores surveys during the last full working week of the month and mails reports the first working week of the month.

ALUMNI

Comprehensive Alumni Assessment Survey (CAAS)
Administrator: NCHEMS
Target Institutions/Samples: All types (two-year and four-year versions available)/Alumni.
Participation Rates: Information not available.
Format: Paper survey.
Administration Procedure: Colleges order surveys from NCHEMS, administer them, and return them to NCHEMS for scoring.
Timeline: NCHEMS mails results three weeks from the date surveys are returned for scoring.

College Results Survey (CRS)
Administrator: Peterson's, a Thomson Learning Company
Target Institutions/Samples: Bachelor's degree-granting institutions/Alumni, preferably four to 10 years following degree attainment. Recommended sample size is 2,000.
Participation Rates: The pilot study included 80 institutions and 40,000 instruments. The web-based survey is open to any graduate; there is no limit on the number of participants.
Format: Web-based survey comprising four sections. Takes 15 to 20 minutes to complete.
Administration Procedure: Alumni visit a web site to complete the survey. Institutions identify alumni cohorts, whom Peterson's then contacts and directs to the online instrument. Models for working with individual institutions are under development.
Timeline: Unlimited online availability, or as arranged.


TABLE 2 (continued). Target Institutions and Samples, Participation, Format, Administration Procedure, and Timeline

SERIES OF INSTRUMENTS

Student Outcomes Information Survey (SOIS)
Administrator: NCHEMS
Target Institutions/Samples: All types (two- and four-year versions available)/Questionnaires for entering students, continuing students, former students, graduating students, recent alumni, and long-term alumni.
Participation Rates: Information not available.
Format: Paper survey.
Administration Procedure: Colleges order surveys from NCHEMS, administer them, and return them to NCHEMS for scoring.
Timeline: NCHEMS mails results two weeks from the date surveys are returned for scoring.

Evaluation/Survey Services
Administrator: ACT
Target Institutions/Samples: All types/New students, enrolled students, non-returning students, and alumni.
Participation Rates: Since 1979, 1,000 institutions have administered more than 6 million standardized surveys nationwide.
Format: Most surveys are four-page paper documents; one is two pages in length.
Administration Procedure: Administration procedures are established at the discretion of the institution.
Timeline: Institutions mail completed surveys to ACT for processing. Scanning occurs every second and fourth Friday; ACT produces and mails reports three to four weeks after scanning.

FACULTY AND INSTITUTIONAL SURVEYS

Faculty Survey
Administrator: HERI
Target Institutions/Samples: All types/Full-time undergraduate faculty and academic administrators.
Participation Rates: In 1998–99, data were collected from more than 55,000 faculty at 429 colleges and universities.
Format: Four-page paper survey.
Administration Procedure: Faculty surveys are sent to campuses in the fall, and campuses are responsible for distribution. HERI provides outgoing envelopes and pre-addressed, postage-paid return envelopes that respondents mail directly to HERI's survey processing center.
Timeline: Institutions register in the spring and summer. HERI administers surveys in the fall and winter and issues campus profile reports the following spring and summer.

Institutional Performance Survey (IPS)
Administrator: NCHEMS
Target Institutions/Samples: All types (two-year and four-year versions available)/Faculty, administrators, and board members.
Participation Rates: Information not available.
Format: Paper survey.
Administration Procedure: Colleges order surveys and distribute them. Surveys include a postage-paid return envelope so that respondents can return the survey directly to NCHEMS, maintaining anonymity.
Timeline: NCHEMS returns results three weeks after an institutionally determined cut-off date.

Institutional Priorities Survey (IPS)
Administrator: Noel-Levitz
Target Institutions/Samples: All types (two-year and four-year versions available)/Faculty, administrators, and staff.
Participation Rates: More than 400 institutions have used the IPS.
Format: Paper survey. In spring 2001, the survey also will be available on the web.
Administration Procedure: The paper survey takes about 30 minutes and can be distributed via various methods on campus, including campus mail, face-to-face distribution, and staff meetings. The web version takes about 20 minutes; the URL and password can be e-mailed to staff.
Timeline: Institutions can administer the IPS anytime during the academic year. Surveys generally arrive on campus within a week of ordering. Institutions return completed surveys to Noel-Levitz for processing. Reports are ready for shipment within 12 to 15 business days.

Program Self-Assessment Service (PSAS) and Graduate Program Self-Assessment Service (GPSAS)
Administrator: ETS
Target Institutions/Samples: College and university programs/Students, faculty, and alumni (separate questionnaires for each group). GPSAS has separate questionnaires for master's and Ph.D. programs.
Participation Rates: In 1999–2000, 65 institutions and 12,000 students, faculty members, and alumni participated.
Format: Paper survey.
Administration Procedure: Institutions purchase and administer the questionnaires and send completed questionnaires back to ETS for reporting.
Timeline: Processing begins the first working day of each month. ETS ships reports about three weeks after the start of processing.


TABLE 3. Reporting, Data Availability, Local Items, Costs, and Contact Information

ENTERING UNDERGRADUATES

Cooperative Institutional Research Program (CIRP) Freshman Survey/Entering Student Survey (ESS)
Administrator: Higher Education Research Institute (HERI) at UCLA and American Council on Education (ACE)
Report Information: Paper report with local results and aggregate results for similar institutions derived from the national norms. Separate profiles for transfer and part-time students. Special reports and data file available for a fee.
National Data Available? Yes—national results included in standard report.
Local Items & Consortia Options: Contains up to 21 additional local questions. Consortia analyses available for a fee.
Cost: Participation fee of $400 plus $1 per returned survey for processing.
Contact Information/URL: Higher Education Research Institute, UCLA Graduate School of Education and Information Studies, 3005 Moore Hall—Box 951521, Los Angeles, CA 90095-1521. Phone: 310-825-1925. Fax: 310-206-2228. E-mail: heri@ucla.edu. www.gseis.ucla.edu/heri/cirp.htm

Freshman Class Profile Service
Administrator: American College Testing (ACT)
Report Information: Paper report containing an executive summary; college attractions; academic achievement, goals, and aspirations; plans and special needs; high school information; competing institutions; and year-to-year trends. A range of free and for-fee reports available.
National Data Available? Yes—national user data and college student profiles available.
Local Items & Consortia Options: By providing additional data, campuses can use this service to summarize variables at all stages of the enrollment funnel: students who submitted their ACT scores, those who applied, those who were admitted, and those who enrolled.
Cost: There is no cost for the basic information.
Contact Information/URL: Freshman Class Profile Service Coordinator. Phone: 319-337-1113. www.act.org/research/services/freshman/index.html

Student Descriptive Questionnaire (SDQ)
Administrator: The College Board
Report Information: Institutions receive SDQ responses for students who indicate "yes" to the Student Search Service on the registration form.
National Data Available? Yes—national and state-level benchmark reports available on paper and on The College Board web site.
Local Items & Consortia Options: None.
Cost: No cost.
Contact Information/URL: Educational Testing Service. Phone: 609-771-7600. E-mail through: www.collegeboard.org/html/communications000.html#SAT. Information about data tapes: www.collegeboard.org/sat/html/admissions/serve013.html

Admitted Student Questionnaire (ASQ) and Admitted Student Questionnaire Plus (ASQ Plus)
Administrator: The College Board
Report Information: Highlight report (executive summary), detailed report with all data, competitor report (ASQ Plus only), and norms report with national data. Data file also available.
National Data Available? Yes—included in standard report.
Local Items & Consortia Options: Standard overlap with all common acceptances in both surveys; a specific overlap analysis includes five competitor schools in ASQ Plus. Both surveys can be customized by specifying characteristics of interest to the school. Limited local questions are available.
Cost: ASQ $600; ASQ Plus $925. Questionnaire printing fee: ASQ $.55 per form; ASQ Plus $.60 per form. Processing fee: ASQ $2.00 per form returned; ASQ Plus $2.25 per form returned.
Contact Information/URL: Phone: 800-927-4302. E-mail: info@aes.collegeboard.org. www.collegeboard.org/aes/asq/html/index000.htm

College Student Expectations Questionnaire (CSXQ)
Administrator: Center for Postsecondary Research and Planning (CPRP) at Indiana University
Report Information: Computer diskette containing raw institutional data file and output file with descriptive statistics. Schools also receive a hard copy of the output file. Additional analyses available for a fee.
National Data Available? No—tentative norms are under development and will be available summer 2001. Norms reports will include relevant comparison group data by Carnegie type.
Local Items & Consortia Options: Additional local questions and consortia analyses are available.
Cost: For the regular paper survey administered by the institution, $125 plus $.75 per survey and a $1.50 scoring fee per completed questionnaire.
Contact Information/URL: College Student Expectations Questionnaire, Center for Postsecondary Research and Planning, Indiana University, Ashton Aley Hall Suite 102, 1913 East 7th St., Bloomington, IN 47405-7510. Phone: 812-856-5825. Fax: 812-856-5150. E-mail: cseq@indiana.edu. www.indiana.edu/~cseq


TABLE 3 (continued). Reporting, Data Availability, Local Items, Costs, and Contact Information

ENROLLED UNDERGRADUATES

College Student Survey (CSS)
Administrator: HERI
Report Information: The Campus Profile Report includes the results of all respondents. The Follow-up Report contains matched CIRP and CSS results for easy comparison. Special reports and data files available for a fee.
National Data Available? Yes—national aggregates for similar institutions. Complete national aggregates available from HERI.
Local Items & Consortia Options: Local questions are available. Consortia analyses are available for a fee.
Cost: $450 participation fee plus $1 for each survey returned for processing.
Contact Information/URL: Higher Education Research Institute, UCLA Graduate School of Education and Information Studies, 3005 Moore Hall—Box 951521, Los Angeles, CA 90095-1521. Phone: 310-825-1925. Fax: 310-206-2228. E-mail: heri@ucla.edu. www.gseis.ucla.edu/heri/cirp.htm

College Student Experiences Questionnaire (CSEQ)
Administrator: CPRP
Report Information: Computer diskette containing raw institutional data file and output file with descriptive statistics. Schools also receive a hard copy of the output file. Additional analyses available for a fee.
National Data Available? No—an annual national report is not planned; however, norms reports are regularly updated, and institutional reports include relevant aggregated comparison group data by Carnegie type.
Local Items & Consortia Options: Local questions are available for a $250 charge. Consortia analyses are available.
Cost: For regular paper administration, a $125 institutional registration fee plus $.75 per survey ordered and a $1.50 scoring fee per completed questionnaire. Web administration costs a $495 institutional registration fee plus $2.25 per completed survey.
Contact Information/URL: College Student Experiences Questionnaire, Center for Postsecondary Research and Planning, Indiana University, Ashton Aley Hall Suite 102, 1913 East 7th St., Bloomington, IN 47405-7510. Phone: 812-856-5825. Fax: 812-856-5150. E-mail: cseq@indiana.edu. www.indiana.edu/~cseq

Community College Student Experiences Questionnaire (CCSEQ)
Administrator: University of Memphis, Center for the Study of Higher Education
Report Information: A diskette containing all student responses and scores and a summary computer report are available for a fee of $75.
National Data Available? Yes—national data can be found in the CCSEQ manual, which is available for $12.
Local Items & Consortia Options: Up to 20 local questions are available. CCSEQ can be used in statewide assessment efforts to provide data for strategic planning and staff development.
Cost: $.75 per survey purchased and $1.50 per survey for scoring; $75 for print report and data on diskette.
Contact Information/URL: Center for the Study of Higher Education, 308 Browning Hall, The University of Memphis, Memphis, TN 38152. Phone: 901-678-2775. Fax: 901-678-4291. E-mail: ccseqlib@memphis.edu. www.people.memphis.edu/~coe_cshe/CCSEQ_main.htm

Faces of the Future
Administrator: American Association of Community Colleges (AACC) and ACT
Report Information: Participating schools receive national results, an individualized report with information about their student population, a report comparing their data to the national data, and a data file.
National Data Available? Yes.
Local Items & Consortia Options: Colleges may add up to 10 local items. Statewide administration is available.
Cost: AACC/ACT administration: $.75 per survey (includes scoring) plus $50 processing and reporting fee. Standard administration: $13.65 per 25 surveys plus $.80 each for scanning, $50 processing fee, and $50 reporting fee.
Contact Information/URL: Kent Phillippe, Senior Research Associate, AACC. Phone: 202-728-0200, ext. 222. E-mail: kphillippe@aacc.nche.edu. www.aacc.nche.edu/initiatives/faces/f_index.htm

National Survey of Student Engagement (NSSE)
Administrator: CPRP
Report Information: Comprehensive institutional profile, aggregated comparison data for similar schools, and national benchmark report. Includes data file, means and frequency distributions on all items, and significance tests. Special analyses available for a fee.
National Data Available? Yes—aggregated comparative information included in standard institutional report and annual national report.
Local Items & Consortia Options: Schools or state systems of similar type (e.g., urban, research, selective privates) may form a consortium of at least eight institutions and can ask up to 20 additional consortium-specific questions.
Cost: $275 participation fee plus per-student sampling fee based on undergraduate enrollment. Total cost varies from approximately $2,500 to $5,500. Targeted over-sampling is available for an additional per-student fee.
Contact Information/URL: National Survey of Student Engagement, Center for Postsecondary Research and Planning, Indiana University, Ashton Aley Hall Suite 102, 1913 East 7th St., Bloomington, IN 47405-7510. Phone: 812-856-5824. Fax: 812-856-5150. E-mail: nsse@indiana.edu. www.indiana.edu/~nsse


TABLE 3 (continued). Reporting, Data Availability, Local Items, Costs, and Contact Information

ENROLLED UNDERGRADUATES

Your First College Year (YFCY)
Administrator: HERI and Policy Center on the First Year of College at Brevard College
Report Information: Paper report provides an in-depth profile of first-year students by sex, and comparative data for similar institutions. Data file also available.
National Data Available? Yes.
Local Items & Consortia Options: Not available during pilot stages. Local items and consortia options will be available with the full-scale administration beginning in 2002.
Cost: No fees during pilot stages.
Contact Information/URL: Higher Education Research Institute, UCLA Graduate School of Education and Information Studies, 3005 Moore Hall—Box 951521, Los Angeles, CA 90095-1521. Phone: 310-825-1925. Fax: 310-206-2228. E-mail: yfcy@ucla.edu. www.gseis.ucla.edu/heri/yfcy and www.brevard.edu/fyc/Survey/YFCYsurvey.htm

Student Satisfaction Inventory (SSI)
Administrator: Noel-Levitz
Report Information: The standard campus report includes the mean data for all students alongside national averages. Optional reports and raw data are available for an additional fee.
National Data Available? Yes—four national comparison groups are standard, are available based on institution type, and are updated twice a year.
Local Items & Consortia Options: Special comparison group reports are available for a fee.
Cost: $50 processing and setup fee plus $1.50 to $1.95 per survey, depending on the quantity ordered.
Contact Information/URL: Julie Bryant, Program Consultant: julie-bryant@noellevitz.com, or Lisa Logan, Program Consultant: lisa-logan@noellevitz.com. Phone: 800-876-1117. www.noellevitz.com

Adult Student Priorities Survey (ASPS)
Administrator: Noel-Levitz
Report Information: The standard campus report includes the mean data for all students alongside national averages. Optional reports and raw data are available for an additional fee.
National Data Available? Yes—the national comparison group includes data from four-year and two-year institutions and is updated twice a year. As of May 2000, the national group included 4,063 students from 32 institutions.
Local Items & Consortia Options: Special comparison group reports are available for a fee.
Cost: $50 processing and setup fee plus $1.50 to $2.95 per survey, depending on the quantity ordered.
Contact Information/URL: Julie Bryant, Program Consultant: julie-bryant@noellevitz.com, or Lisa Logan, Program Consultant: lisa-logan@noellevitz.com. Phone: 800-876-1117. www.noellevitz.com

STUDENT PROFICIENCIES AND LEARNING OUTCOMES

Collegiate Assessment of Academic Proficiency (CAAP)
Administrator: ACT
Report Information: Institutional summary report and two copies of each student's score report. Certificate of achievement for students scoring at or above the national average on one or more modules. Supplemental reports and data file available for a fee.
National Data Available? Yes—for freshmen or sophomores at two- or four-year, public or private institutions.
Local Items & Consortia Options: Nine optional local questions may be added at no additional charge.
Cost: $330 participation fee plus $8.95 to $16.55 per student, depending on the number of students and the number of modules purchased (includes instruments, scoring, and reporting).
Contact Information/URL: ACT, Outcomes Assessment, P.O. Box 168, Iowa City, IA 52243-0168. Phone: 319-337-1053. Fax: 319-337-1790. E-mail: outcomes@act.org. www.act.org/caap/index.html

Academic Profile
Administrator: Educational Testing Service (ETS) and The College Board
Report Information: Summary score report contains both criterion-referenced proficiency levels and norm-referenced scores. Scores vary slightly from long form to short form. Data diskette included in fee.
National Data Available? Yes—provided by class level and by Carnegie classification.
Local Items & Consortia Options: Up to 50 local questions are available. Institutions can customize comparison groups from a list of participating schools (minimum of eight per group).
Cost: $300 annual institutional fee. Price varies by form and number purchased ($9 to $11.25 for the short form and $14.50 to $16.75 for the long form). $2.25 each for the optional essay (includes scoring guide). Minimum order of 50 tests.
Contact Information/URL: Jan Lewis at 609-683-2271. Fax: 609-683-2270. E-mail: jlewis@ets.org. www.ets.org/hea/heaweb.html

Tasks in Critical Thinking
Administrator: ETS
Report Information: Scores are reported as the percentage of students demonstrating proficiency in each of the three skill areas—inquiry, analysis, and communication, as measured by the tasks.
National Data Available? No.
Local Items & Consortia Options: None.
Cost: $16.50 each for the first 30 to 100 tests.
Contact Information/URL: Jan Lewis at 609-683-2271. Fax: 609-683-2270. E-mail: jlewis@ets.org. www.ets.org/hea/heaweb.html


TABLE 3 (continued). Reporting, Data Availability, Local Items, Costs, and Contact Information

STUDENT PROFICIENCIES AND LEARNING OUTCOMES

Major Field Tests
Administrator: ETS
Report Information: Reports include individual scaled scores, a departmental summary with department mean scaled scores, and demographic information. Special score reports available for an additional fee.
National Data Available? Yes—for each test. Percentile tables for all seniors taking the current form of each test are published each year. Departments may obtain custom comparative data for an additional fee.
Local Items & Consortia Options: Group scores are reported for up to 50 locally written questions.
Cost: $23.50 per test ($23 for 100 or more), plus shipping. Includes Test Administration Manual, standard processing, and national comparative data.
Contact Information/URL: Dina Langrana at 609-683-2272. E-mail: dlangrana@ets.org. www.ets.org/hea

Area Concentration Achievement Tests (ACAT)
Administrator: Project for Area Concentration Achievement Testing (PACAT) at Austin Peay State University
Report Information: Schools receive two copies of the score report for each student. Standard scores compare students to a five-year national sample. Raw percentage scores of items correct also included. Additional analyses and data file available for a fee.
National Data Available? Yes.
Local Items & Consortia Options: Schools can customize most tests to model the test after major requirements. Art and Literature in English cannot be customized. Social work customization will be available in June 2001.
Cost: $4 to $11 per student survey, depending on discipline, pre-test vs. senior test, and two-year vs. four-year school. Price includes use of materials, scoring, two copies of the score report, and long-term maintenance of score histories.
Contact Information/URL: PACAT, Box 4568, Austin Peay State University, Clarksville, TN 37044. Phone: 931-221-7451. Fax: 931-221-6127. E-mail: pacat@pacat.apsu.edu. http://pacat.apsu.edu/pacat

ALUMNI

Comprehensive Alumni Assessment Survey (CAAS)
Administrator: National Center for Higher Education Management Systems (NCHEMS)
Report Information: Analysis includes one analytical report. Data file available for a fee.
National Data Available? No.
Local Items & Consortia Options: Up to 20 local questions are available for a data entry fee of $1.25 per question.
Cost: $.85 per questionnaire plus shipping and handling. $200 for analysis (includes one analytical report).
Contact Information/URL: NCHEMS, P.O. Box 9752, Boulder, CO 80301-9752. Clara Roberts at 303-497-0390. E-mail: clara@nchems.org. www.nchems.org/surveys/caas.htm

College Results Survey (CRS)
Administrator: Peterson's, a Thomson Learning Company
Report Information: Institutions receive a data file of responses in spreadsheet format for analyses. Analytic tools for institution-based analyses and peer comparisons are being explored.
National Data Available? No. Analytic tools for peer comparisons have been developed and are available to participating institutions at a secure web site.
Local Items & Consortia Options: Collaborative administration among institutions can be explored.
Cost: There is no respondent cost to complete the online CRS. Costs for institutional applications of the CRS are being explored as collaborative models are identified.
Contact Information/URL: Rocco P. Russo, VP, Research, Peterson's, a Thomson Learning Company, Princeton Pike Corporate Center, 2000 Lenox Drive, P.O. Box 67005, Lawrenceville, NJ 08648. Phone: 609-896-1800 ext. 3250, toll-free: 800-338-3282 ext. 3250. Fax: 609-896-4535. E-mail: rocco.russo@petersons.com. www.petersons.com/collegeresults

SERIES OF INSTRUMENTS

Student Outcomes Information Survey (SOIS)
Administrator: NCHEMS
Report Information: Analysis includes one analytical report. Data file available for a fee.
National Data Available? No.
Local Items & Consortia Options: Up to 15 local questions are available for a data entry fee of $1.25 per question.
Cost: $.30 per questionnaire plus shipping and handling. $150 for analysis, which includes one analytical report.
Contact Information/URL: NCHEMS, P.O. Box 9752, Boulder, CO 80301-9752. Clara Roberts at 303-497-0390. E-mail: clara@nchems.org. www.nchems.org/sois.htm

Evaluation/Survey Services
Administrator: ACT
Report Information: Basic reporting package includes a summary report, graphics report, and normative report. Other reports and data file available for a fee.
National Data Available? Yes.
Local Items & Consortia Options: Up to 30 local questions are available. Consortia reports are available for a fee.
Cost: $14.35 for 25 four-page surveys. $.84 per survey returned for processing. $168 for the basic reporting package (summary report, graphics report, and normative report).
Contact Information/URL: ACT, Postsecondary Services, Outcomes Assessment, P.O. Box 168, Iowa City, IA 52243-0168. Phone: 319-337-1053. Fax: 319-337-1790. E-mail: outcomes@act.org. www.act.org/ess/index.html


TABLE 3 (continued). Reporting, Data Availability, Local Items, Costs, and Contact Information

FACULTY AND INSTITUTIONAL SURVEYS

Faculty Survey
Administrator: HERI
Report Information: Campus profile report includes faculty responses by gender. Separate profiles of teaching faculty and academic administrators also are provided. Normative profile includes national data by institutional type. Data file is also available.
National Data Available? Yes—in normative profile report.
Local Items & Consortia Options: Local questions are available. Consortia analyses are available for a fee.
Cost: $325 plus $3.25 per returned survey.
Contact Information/URL: Higher Education Research Institute, UCLA Graduate School of Education and Information Studies, 3005 Moore Hall—Box 951521, Los Angeles, CA 90095-1521. Phone: 310-825-1925. Fax: 310-206-2228. E-mail: heri@ucla.edu. www.gseis.ucla.edu/heri/cirp.htm

Institutional Performance Survey (IPS)
Administrator: National Center for Higher Education Management Systems (NCHEMS)
Report Information: Report contains data for the total campus, total faculty, and targeted populations.
National Data Available? No.
Local Items & Consortia Options: Up to 20 local questions are available.
Cost: $1,600 for 100 questionnaires. Includes survey, pre-paid return postage, standard analyses, and report summary. After the first 100 questionnaires, $150 for each additional 50.
Contact Information/URL: NCHEMS, P.O. Box 9752, Boulder, CO 80301-9752. Clara Roberts at 303-497-0390. E-mail: clara@nchems.org. www.nchems.org/surveys/ips.html

Institutional Priorities Survey (IPS)
Administrator: Noel-Levitz
Report Information: The standard campus report includes the mean data for all respondents alongside national averages for like-type institutions. Optional reports (including IPS/SSI reports) and raw data are available for an additional fee.
National Data Available? Yes—three national comparison groups are standard, are available based on institution type, and are updated twice a year.
Local Items & Consortia Options: Special comparison group reports are available for a fee.
Cost: $140 processing and setup fee plus $1.50 to $2.95 per survey, depending on the quantity ordered.
Contact Information/URL: Julie Bryant, Program Consultant: julie-bryant@noellevitz.com, or Lisa Logan, Program Consultant: lisa-logan@noellevitz.com. Phone: 800-876-1117. www.noellevitz.com

Program Self-Assessment Service (PSAS) and Graduate Program Self-Assessment Service (GPSAS)
Administrator: ETS
Report Information: Summary data report includes separate analyses for faculty, students, and alumni. Optional subgroup reports and data file available for a fee.
National Data Available? No.
Local Items & Consortia Options: Local questions are available.
Cost: $37 for 25 questionnaires plus shipping and handling (minimum purchase of 75 questionnaires). $150 for summary data report plus $3.99 per booklet processed.
Contact Information/URL: Karen Krueger at 609-683-2273. Fax: 609-683-2270. E-mail: kkrueger@ets.org. www.ets.org/hea/heaweb.html#psas
