

Yardstick White Paper Series

Stage 3: Assembling exam form(s)


Authors: Greg Sadesky, PhD, Vice President, Examination Services; Greg Pope, MSc, Vice President, Operations

Copyright 2011 Yardstick Software Inc.

Table of contents
Introduction
1. Using Test Specifications to Set Constraints
2. Assembling first drafts of the exam form(s)
3. Validation and feedback of draft exam forms
4. Reassembling of forms based on feedback
5. Setting new item development scope and targets
Next stage: Test administration
6. Closing
Information about the authors


Introduction
The Assessment Life Cycle is a way of organizing the processes involved in creating sound assessments into a series of easy-to-understand stages. The life cycle looks like this:

Stage 3 of the Assessment Life Cycle is where we construct one or more exam forms from the quality items that have been authored and reviewed in the item bank. Sometimes this is as easy as finding the right items from the right subject areas, but often, and particularly for certification exams, this step requires meticulous effort! This white paper focuses on Stage 3 in the Assessment Life Cycle (Assembling Your Test), detailing:

- Using the test specifications from Stage One to build constraints
- Assembling first drafts of the exam form(s)
- Validation and feedback of draft exam forms
- Reassembling of forms based on feedback
- Setting new item development scope and targets

An overview of Stage 3 is provided below:

1. Using Test Specifications to Set Constraints


The first process in this step involves taking the constraints determined in Stage One (remember the blueprint?) and assembling a first draft of the exam form(s). Constraints are item and exam attributes that are must-have or must-not-have requirements. Consider a practical example: when buying a vehicle, you place requirements (constraints) on the vehicle you will purchase. The following might be constraints for buying a new vehicle:

- Vehicle type: not a truck or SUV
- Vehicle color: only red, blue, white, or black
- Fuel economy: must get at least 30 MPG (7.8 L per 100 km)
- Number of passengers: must be able to carry at least 4

As you have probably experienced, there are many constraints on the purchase of a new vehicle. In fact, just about anything in life is the same way: whether one is building a new home, finding a school for children, or choosing a holiday, we all have preferences (constraints) that we place on selecting the perfect home, school, or holiday. Two main factors come into play: selecting the right constraints and having the right pool of things (vehicles, homes, schools, holidays) to choose from. We can have exactly the right set of constraints, but if there simply are no red, blue, white, or black vehicles on the lot that get at least 30 MPG and can carry at least 4 passengers, then we will walk away empty handed. The same is true of examination forms: to build the perfect form(s) we must select the right mixture of constraints and have the right items in the bank to choose from. Below is a list of common constraints that may be placed on the assembly of one or more examination forms.
Keep in mind that constraints are tailored to the needs of an organization (and testing model, e.g., Computerized Adaptive Testing) and can be almost anything:

- Number of parallel forms: 3
- Test length: total number of items is 100
- Item difficulty: p-values between 0.4 and 0.7
- Item discrimination: greater than 0.300
- Blueprint coverage: number of questions per blueprint category
- Cognitive level coverage: number of questions in knowledge and skills categories based on the blueprint
- Item images: items with images must be balanced within blueprint category
- Passages: number of passages balanced overall on the exam
- Number of questions associated with passages: 4-5 items per passage
- Item friends: items that need to be presented on an exam form with each other
- Item enemies: certain items that cannot be presented on an exam form with each other

In practice there is generally some back and forth and fine-tuning of the constraints during the actual assembly of the forms. Once the list of constraints has been clearly defined, we can assemble the first drafts of the exam forms!
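To make the idea of constraints concrete, the checks above can be expressed as a small function that audits a draft form against a few of the listed constraints. This is only an illustrative sketch: the field names (`p_value`, `discrimination`, `category`) and thresholds are hypothetical, not from any particular item-banking product.

```python
def check_form(items, blueprint_targets, p_range=(0.4, 0.7),
               min_discrimination=0.300, test_length=100):
    """Return a list of constraint violations for a draft exam form."""
    violations = []
    # Test length constraint
    if len(items) != test_length:
        violations.append(f"form has {len(items)} items, need {test_length}")
    for item in items:
        # Item difficulty constraint (p-value window)
        if not (p_range[0] <= item["p_value"] <= p_range[1]):
            violations.append(f"item {item['id']}: p-value {item['p_value']} out of range")
        # Item discrimination constraint
        if item["discrimination"] <= min_discrimination:
            violations.append(f"item {item['id']}: discrimination too low")
    # Blueprint coverage: count items per category against targets
    counts = {}
    for item in items:
        counts[item["category"]] = counts.get(item["category"], 0) + 1
    for cat, target in blueprint_targets.items():
        if counts.get(cat, 0) != target:
            violations.append(f"category {cat}: {counts.get(cat, 0)} items, need {target}")
    return violations
```

An empty return list means the draft satisfies these particular constraints; anything else tells the assembler exactly what to fix.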


2. Assembling first drafts of the exam form(s)


As the excitement builds at the prospect of assembling your test forms, the brakes need to be lightly applied. The size and composition of your item bank is going to be the limiting factor when assembling your exam forms. If you don't have enough questions with the right characteristics, then you will not be able to assemble the exam forms you need. Realistically, you should never reach the exam assembly stage without your item bank in good order. The item development and item bank management in Stage Two should have ensured that the item bank is complete and robust enough to meet the exam form assembly requirements, including all constraints.

The example constraint list above calls for 3 exam forms, each with 100 questions. There are no constraints listed for overlap of questions (e.g., an anchor set of questions that is the same across forms, to be used for statistical equating), so we assume that each form contains 100 unique questions. This means that our item bank must have at least 300 questions. In practice it is very difficult to have an item bank in which every item perfectly meets every criterion and can be used to build perfectly parallel forms, so the bank would likely need to be 30-50% larger than that (i.e., 390-450 items), with items distributed in the right proportions across the right blueprint categories with the right statistical characteristics. As we can see, item bank managers have their work cut out for them to orchestrate the right mix of banked questions to meet the test assembly requirements.

Assuming the item bank manager did their job and the bank is ready, we can move to the actual assembly process. Often this process is done manually: an organization wades through the item bank, pulling questions one by one that meet the appropriate criteria. As you can imagine, doing this by hand can be an extremely time-consuming and iterative process. For the example above (3 forms, 100 questions per form), trying to meet all the constraints for all forms would truly be a long-term project with much trial and error. And when the goal is to create the BEST POSSIBLE form from these items with respect to average item discrimination or a target test average, the manual process quite simply falls short. Fortunately, the bright people in the psychometrics industry have come up with automated approaches for assembling test forms, appropriately called Automated Test Assembly (ATA). The ATA approach uses sophisticated optimization algorithms that take into account all of the constraints and build the right number of test forms in seconds, making the process more accurate and much less time consuming.
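Production ATA engines solve this as an integer programming problem over all constraints simultaneously; a toy version of the idea can be sketched with a greedy rule: for each blueprint category, take the most discriminating items that fall inside the difficulty window. This is a simplification for illustration only (real ATA optimizes across categories and forms at once), and all field names and data are invented.

```python
def assemble_form(bank, quota_per_category, p_range=(0.4, 0.7)):
    """Greedily pick the most discriminating eligible items per category."""
    form = []
    for category, quota in quota_per_category.items():
        # Keep only items in this category whose p-value is in the window
        eligible = [it for it in bank
                    if it["category"] == category
                    and p_range[0] <= it["p_value"] <= p_range[1]]
        # Prefer the most discriminating items in each category
        eligible.sort(key=lambda it: it["discrimination"], reverse=True)
        if len(eligible) < quota:
            raise ValueError(f"bank cannot fill category {category}")
        form.extend(eligible[:quota])
    return form
```

Note how the `ValueError` branch mirrors the earlier point: no algorithm, however clever, can assemble a form the bank cannot support.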
Perhaps most importantly, this approach allows the Exam Policy Group and the SMEs to focus their effort on test validation, rather than the grind of just getting the tests assembled. With this focus on validation, there is more time available to fine-tune the exam by replacing particular questions, adding questions from other content areas, and so on. Sometimes the validation process also brings to light a constraint that wasn't previously considered, making future tests better.

One of the most exciting aspects of ATA is that it can also be used to set item bank development targets. This process involves using ATA to determine the smallest number of questions that must be developed to meet all the constraints. This becomes invaluable while item development and form assembly are underway, as one can see where the gaps in the bank are being filled and where they are not. When this process is used, the item development requirements are usually very specific with respect to all the possible characteristics of questions (e.g., competency, cognitive level, image or not, etc.). The item bank manager can then use this information to guide item development in areas where there is a deficit of appropriate items and stop development in areas where there is a surplus.
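The development-target idea can be sketched as a simple gap analysis: compare what the bank holds per blueprint category against what the planned forms require, inflated by the 30-50% buffer for imperfect items mentioned above. The category names and the 40% default buffer are illustrative assumptions.

```python
def development_targets(bank_counts, per_form_quota, n_forms, buffer=0.4):
    """Return how many new items each blueprint category still needs."""
    targets = {}
    for category, quota in per_form_quota.items():
        # Unique items needed across all forms, inflated by the buffer
        needed = round(quota * n_forms * (1 + buffer))
        have = bank_counts.get(category, 0)
        # Never report a negative target for categories with a surplus
        targets[category] = max(0, needed - have)
    return targets
```

For the running example (3 forms, category quotas from the blueprint), a category holding 100 items against a 30-per-form quota needs 30 x 3 x 1.4 = 126 items, so 26 more must be written; a category already in surplus comes back as zero, signaling that development there can stop.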


3. Validation and feedback of draft exam forms


Regardless of how the exam forms are assembled, the next step after preliminary assembly is for the Examination Policy Group and SME/Examination Development Group to review the first draft of the exam form(s). The draft form(s) undergoes thorough review to ensure:

- Representation from the blueprint is appropriate and accurate
- No enemy items appear together on the same form (enemy items are those that should not appear together because they may give away the answers to one another)
- The flow of the exam is expected and appropriate
- The exam form(s) are balanced in terms of question types, images, keyed correct answers, etc.
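Two of the review checks above lend themselves to automation before the human review even begins: flagging enemy items that landed on the same form, and tallying the keyed correct answers to spot imbalance. A minimal sketch, with illustrative data shapes (item IDs as integers, keys as option letters):

```python
from collections import Counter

def find_enemy_violations(form_item_ids, enemy_pairs):
    """Return the enemy pairs where both items appear on the form."""
    on_form = set(form_item_ids)
    return [pair for pair in enemy_pairs
            if pair[0] in on_form and pair[1] in on_form]

def key_balance(keys):
    """Count keyed-correct answers by option (e.g., A/B/C/D)."""
    return Counter(keys)
```

A reviewer would still judge exam flow and blueprint representation by eye, but running checks like these first lets the committee spend its meeting time on the judgment calls.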

4. Reassembling of forms based on feedback


Based on the feedback obtained from the form review process, a new draft version of the exam form(s) may be required. The Examination Policy Group and SME/Examination Development Group then review the form(s) again until all stakeholders are satisfied with the quality of the form(s). It is crucial that all stakeholders are satisfied with the form(s) prior to operational release. From time to time, additional items are required to fill blueprint categories or to augment other areas to satisfy constraints.


[Figure: review decision flow — "Exam flow good", "Form constraints not met", "New items required"]

5. Setting new item development scope and targets


Depending on how the review process progresses, new item development targets and scope may need to be added. For example, if the exam form review process finds dramatic under-representation in one or more categories or aspects of professional practice, and the item bank has no more questions in those categories, then more item development will be required to bolster the number of items in those areas. In cases like this, having a loyal, well-trained, readily available group of SME item developers on standby is crucial. The item bank manager can go to this pool of item developers with specific requirements for new items, which can be fast-tracked through the item development and review process and included on new drafts of the examination forms.


Next stage: Test administration


The next stage in the Assessment Life Cycle involves administering the exam form(s). All the work that has gone into building the blueprint, creating the question content, establishing the item bank, and assembling the test forms is about to go to the next level. Stage 4 is where the rubber meets the road: candidates have the opportunity to demonstrate what they know and can do. The central concern here is to create administration conditions and test and data management protocols that are standardized and secure, consistent with the seriousness of the exam. This stage will be discussed in more detail in the next white paper in this series.

6. Closing
This white paper focuses on the fundamentals of Stage Three in the Assessment Life Cycle (Assembling Your Test), detailing:

- Using the test specifications from Stage One to build constraints
- Assembling first drafts of the exam form(s)
- Validation and feedback of draft exam forms
- Reassembling of forms based on feedback
- Setting new item development scope and targets

In general, the Assessment Life Cycle is a way of organizing the processes involved in creating sound assessments into a series of easy-to-understand stages. Following the Assessment Life Cycle will help ensure that your examination program is defensible and of the most benefit to candidates and your organization.

Let Yardstick help you apply the Assessment Life Cycle to your organization's assessment program. Yardstick offers a full range of products and services for every step and process in the Assessment Life Cycle. Our clients agree: we know testing, and we work hard to make your testing program the best that it can be. Contact us at psychometrics@GetYardstick.com for more information; we would love to hear from you!


Information about the authors


Greg Sadesky joined Yardstick Software as a Senior Psychometrician in fall 2008. Bringing together his experience in research design, examination program quality control, and technology commercialization, Greg is now focused on helping clients and partners make better certification, evaluation, and educational decisions, specifically within the licensure/certification context. His current preoccupation is the optimal design, implementation, and analysis of assessment systems tailored to the specific needs of clients. In addition to his technically oriented psychometric work, Greg is passionate about educating the public about testing and holds it as a personal mission to demystify psychometrics for clients and stakeholders alike.

Greg Pope is a self-declared psychometric enthusiast who draws his strength from talking about testing. He held the position of Psychometrician with Alberta Education, helping to oversee the defensibility of the high-stakes diploma exam and achievement test programs. Greg then worked for many years as a psychometrician and software product manager, designing and implementing cheater detection software, computer adaptive testing (CAT), and next-generation psychometric reporting software. Greg is now at home in his role as VP Operations, helping to create customer delight for Yardstick clients.

Greg Sadesky, PhD, VP Examinations

Greg Pope, MSc, VP Operations

