
iTrainData.doc

Data Analysis for Production Word Tests 2009 & 2013

iTrain Team: Setsuko & Tatiana CSUMB: IST622 MIST Cohort 8: Summer 2013

CONTENTS

Introduction
Measurements
Meaning of Scores
Purpose of Data
Implication of Analysis
Evaluation of Measurements
Appendix A: Syllabus for Word 100A
Appendix B: Sample Production Test
Appendix C: Sample Grade Sheet
Appendix D: Sample Rubric Sheet
Appendix E: Sample Scoring Sheet
Appendix F: Charts for Original Data and Altered Data
Appendix G: Statistical Reference Items

INTRODUCTION

The Business Skills Center of the Business Technology Department at Monterey Peninsula College has accumulated test score data from performance tests that assess student abilities in business-related word processing skills. The iTrain team is conducting the present comparison of test scores between MPC's Word Class of 2009 and Word Class of 2013. The expectation is that scores will be higher in the more recent class than in the older class, owing to improved teaching skills and the general expansion of student knowledge of technology. The official name of the course involved is Word Processing: Microsoft Word for Windows I. The coursework includes a Theory Test and a Production Test, with the Production Test carrying greater weight in determining student grades (see the syllabus in Appendix A). The two data blocks measured pertain to the same Word course but to distinct years. The data items analyzed are the scores on the Production Tests for the Word 100A class. The Production Test evaluates students by their actual performance in creating documents rather than by assessing their acquisition of theoretical concepts from class. A sample copy of a Production Test is included in Appendix B.

MEASUREMENTS

The test in question is specifically titled BUSC 100A Word 2007 Production Test #1, Version A, Unit 1, Chapters 1-5. The Production Test (Appendix B) purportedly evaluates the individual's ability to demonstrate basic word processing skills. This evaluation comprises 40% of the student's overall grade in the course; other assessments include theory tests and daily production work. Appendix C provides a Sample Grade Sheet, with the Production Test boxed in red; other evaluative items are visible on the sheet as well.

When enrolled in the course, students must first complete all Daily Production Work and the Theory Test before they can take the Production Test. Failing the Theory Test does not imply that a student will fail the course; a student can still receive a high grade even after having failed the Theory Test. If students withdraw from the class or do not complete the Daily Production Work, however, they receive a zero for the course. Once the Daily Production Work is completed, students must request to take the timed Production Test.

The Production Test is delivered in a list format comprising fifteen editing tasks of varying complexity. After downloading the document, students must edit it according to the assigned tasks. The tasks include file name changes, spell checking, synonym replacements, font style adjustments, tab setting alterations, and other basic formatting essentials.

The evaluation process for the Production Test begins with the distribution of the task sheet (Appendix B). Students can read through the sheet and ask any questions prior to beginning the test. Students receive a thumb drive or diskette with the document supplied. Once a student feels ready, the clock begins for that individual, allowing a total of forty-five minutes. The student then takes the test individually, navigating the disk to find the correct document for editing. When students finish making edits, or when time runs out, they must print the original document and their revised document to submit with the thumb drive for evaluation. The faculty then grades the submitted items according to the test criteria established in the guidebook binder. The binder contains the evaluation answer files, which clearly indicate how many points are deducted for each required item.
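The deduction-based scoring described above can be sketched in a few lines; note that the task names and point values below are hypothetical illustrations, not the Center's actual rubric values (those live in the guidebook binder):

```python
# Hypothetical per-task deductions; the real values are in the rubric binder.
DEDUCTIONS = {
    "file rename": 5,
    "spell check": 5,
    "synonym replacement": 5,
    "font style": 5,
    "tab settings": 10,
}

def production_score(missed_tasks):
    """Start the student at 100% and deduct points for each missed item."""
    return 100 - sum(DEDUCTIONS[task] for task in missed_tasks)

print(production_score([]))                               # 100
print(production_score(["spell check", "tab settings"]))  # 85
```

Each evaluator applying the same deduction table is what makes the scores comparable across graders.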
A Sample Rubric Sheet is provided in Appendix D. Depending on the appearance of the student's revised document, faculty can easily evaluate the completion of tasks. If there is some discrepancy or question regarding a student's work, the faculty refers to the document saved on the thumb drive. For example, font size, style, or spacing might not print accurately, so the faculty will inspect the formatting in the saved file to verify that the student completed the task correctly.

MEANING OF SCORES

When scoring student work, the faculty assumes each student starts at 100% and deducts points for items missed according to the rubrics (Appendix D). The evaluators interpret the Production Test score as a percentage out of 100. Students need at least 90% for an A, 80% for a B, 70% for a C, and 60% for a D; below 60% is an F. The passing grade is a C or above. The test items are based on the syllabus and the student learning objectives, and the tasks evaluate student performance of these objectives. So long as faculty evaluators follow the rubrics provided, the interpretation should be deemed trustworthy. Test evaluators collaborate in determining the validity of their scores and consult one another as needed to resolve minor errors in calculating scores. During the evaluation of student work, the faculty completes a scoring sheet; this blue scoring sheet is provided in Appendix E. The calculated score is then recorded on the Grade Sheet (Appendix C). Faculty incorporates this score as 40% of the final evaluation grade. The faculty has used this system for numerous years and continues to employ it as a valid form of student assessment.
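The fixed percentage cutoffs above amount to a small lookup; a minimal sketch (the function names are our own, not part of the Center's grading materials):

```python
def letter_grade(score: float) -> str:
    """Map a Production Test percentage (0-100) to a letter grade.

    Course scale: 90 = A, 80 = B, 70 = C, 60 = D, below 60 = F.
    """
    for cutoff, grade in ((90, "A"), (80, "B"), (70, "C"), (60, "D")):
        if score >= cutoff:
            return grade
    return "F"

def is_passing(score: float) -> bool:
    # The passing grade is a C or above, i.e. at least 70%.
    return score >= 70

print(letter_grade(92), is_passing(92))  # A True
```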

PURPOSE OF DATA

The scores in the class are used to determine how students perform at the Business Skills Center. Based on these scores, instructors can prepare new classes and eliminate classes that do not demonstrate productive results. Typically, many advanced students take the business skills classes simply to prove with a letter grade that they can use these skills properly. There are also some students who have hardly ever used computers before, for whom every task is a challenge; it is important to help these students succeed. Other students take the class primarily to fill a credit requirement or to receive certain compensatory rewards; these students typically have low motivation and little intention of making much effort to learn skills in the class.

The iTrain team compares here the Production Test scores of MPC's Word Class of 2009 and Word Class of 2013. A new version, Word 2013, now needs to be added to the curriculum, so comparing the two data sets can help the instructors estimate how much time and effort they will need, and how much change may be necessary, to create the answer keys for the newer version. If the course of four years ago produced better results than today's version, then the older method of developing the course should be preserved rather than the current one. The instructors should then know how much improvement they have made in recent years compared to four years ago. The team expects the scores to be higher in the more recent class than in the older class, due to improvements in faculty practices and general student knowledge.

IMPLICATION OF ANALYSIS

Because students receive a zero if they withdraw from the course or don't complete the Daily Production Work, the iTrain team chose to alter the data supplied. This alteration dropped the number of students from 34 in each original class to 25 (in the 2013 class) and 27 (in the 2009 class). Too many zeros were present, so the team wanted to focus on those students who actually completed the Daily Production Work and thus the Production Test. The team then further altered the data to see whether dropping the lowest three grades from each course would significantly skew the results. Graphs comparing these files are included in Appendix F. The graphs visually suggested that such alterations could have raised the level of the 2013 scores as a whole. Because extremes in the score ranges strongly affect the summary statistics, disregarding the lowest scores could actually support the alternative hypothesis that the course is improving. In other words, extremely low scores could be attributed to individual student limitations and not necessarily reflect faculty instruction. In all three data files (Original, Removal of Zero Grades, and Removal of Lowest Three), however, the data indicate that students are performing worse in the 2013 class than in the 2009 class; today's class is less successful than the course of four years ago. The results indicate that instruction needs improvement and that student knowledge levels are not necessarily higher; prior instructional practices may be deemed more practical than those employed today.

The iTrain team conducted an initial analysis to assess the quality of the data. This initial analysis compiled the following results:

Frequency counts, descriptive statistics (mean, standard deviation, median), normality (skewness, kurtosis, frequency histograms), and scattergrams.

Frequency counts show that both data sets have Grade A as the most frequent grade and Grade F as the second most frequent. These Grade Fs were later deleted from the team's data analysis.
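The letter-grade frequencies can be reproduced from the Appendix G score columns with a counter; a sketch using the original 2009 scores and the course's grading cutoffs:

```python
from collections import Counter

# Original 34 scores for Word Class 2009, from Appendix G.
scores_2009 = [0, 100, 94, 71, 95, 0, 0, 78, 85, 0, 92, 96, 97, 93, 95, 0,
               100, 95, 95, 0, 95, 0, 97, 83, 100, 90, 100, 97, 79, 64, 90,
               78, 97, 100]

def grade(s):
    # Course scale: 90 = A, 80 = B, 70 = C, 60 = D, below 60 = F.
    return "A" if s >= 90 else "B" if s >= 80 else "C" if s >= 70 else "D" if s >= 60 else "F"

freq = Counter(grade(s) for s in scores_2009)
print(freq.most_common())  # [('A', 20), ('F', 7), ('C', 4), ('B', 2), ('D', 1)]
```

A is the most frequent grade and F the second most frequent, as the text states.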


Descriptive statistics show that the mean and median of the Word 2009 data are greater than those of the Word 2013 data; the standard deviation of the 2009 data, however, is smaller (Appendix G).
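These summary statistics can be recomputed from the Appendix G score lists with the standard library alone; a sketch using the cleaned (zeros removed) 2013 scores:

```python
import statistics as st

# Word Class 2013 scores after removal of zero grades (Appendix G), n = 25.
scores_2013 = [98, 94, 94, 93, 67, 97, 100, 94, 91, 100, 100, 49, 94, 60,
               92, 88, 88, 90, 99, 92, 82, 77, 84, 86, 100]

mean = st.mean(scores_2013)      # 88.36
median = st.median(scores_2013)  # 92
stdev = st.stdev(scores_2013)    # sample standard deviation, about 12.93

print(mean, median, round(stdev, 2))
```

Running the same three calls on the cleaned 2009 list reproduces the other column of the Appendix G descriptive statistics table.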

A correlation coefficient was computed to see whether the two data sets are closely related; the results indicate that they are independent, because the correlation is weak (0.174).
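The weak Pearson coefficient can be recomputed by pairing the two 34-score columns from Appendix G row by row, as the team's spreadsheet does; a standard-library sketch (note that pairing students across different years is arbitrary, which is itself a reason to expect a near-zero r):

```python
import math
import statistics as st

# Original 34-score columns from Appendix G, paired by roster row.
scores_2013 = [98, 94, 94, 93, 0, 67, 97, 0, 0, 100, 94, 0, 91, 100, 100,
               49, 94, 0, 0, 60, 0, 92, 88, 88, 90, 99, 0, 92, 82, 77, 84,
               0, 86, 100]
scores_2009 = [0, 100, 94, 71, 95, 0, 0, 78, 85, 0, 92, 96, 97, 93, 95, 0,
               100, 95, 95, 0, 95, 0, 97, 83, 100, 90, 100, 97, 79, 64, 90,
               78, 97, 100]

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length columns."""
    mx, my = st.mean(x), st.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    ss_x = sum((a - mx) ** 2 for a in x)
    ss_y = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(ss_x * ss_y)

r = pearson_r(scores_2013, scores_2009)
print(round(r, 3))  # a weak value, consistent with independent samples
```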

The main analysis conducted by iTrain tested whether the following hypotheses could be rejected:

H0: 2013 <= 2009 [Word Class of 2013 performs the same as or worse than Word Class of 2009]

H1: 2013 > 2009 [Word Class of 2013 performs better than Word Class of 2009]

Because the initial analysis showed a difference in variance between the two data sets, the t-Test: Two-Sample Assuming Unequal Variances was used to test the hypotheses. The one-tail critical value is greater than the t statistic (1.668 > -0.754) and the one-tail p-value is greater than 0.05 (0.227 > 0.05); therefore, we fail to reject the null hypothesis. In other words, the Word Class of 2009 did better than the Word Class of 2013 (2013 < 2009). Even after discarding the zero grades and then the lowest three grades, the t statistic for every analysis conducted was lower than the critical value and the p-value was higher than 0.05; so in all cases we fail to reject the null hypothesis that the Word Class of 2009 performs at least as well as (here, in fact, better than) the Word Class of 2013. We cannot reject the null hypothesis. Data items with descriptive statistics and statistical analysis are included in Appendix G.
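The Welch (unequal-variance) t statistics reported above can be reproduced from the Appendix G score columns, along with the two data alterations; a standard-library sketch:

```python
import math
import statistics as st

# Original 34-score columns from Appendix G.
scores_2013 = [98, 94, 94, 93, 0, 67, 97, 0, 0, 100, 94, 0, 91, 100, 100,
               49, 94, 0, 0, 60, 0, 92, 88, 88, 90, 99, 0, 92, 82, 77, 84,
               0, 86, 100]
scores_2009 = [0, 100, 94, 71, 95, 0, 0, 78, 85, 0, 92, 96, 97, 93, 95, 0,
               100, 95, 95, 0, 95, 0, 97, 83, 100, 90, 100, 97, 79, 64, 90,
               78, 97, 100]

def welch_t(a, b):
    """t statistic for a two-sample t-test assuming unequal variances."""
    va, vb = st.variance(a), st.variance(b)  # sample variances
    return (st.mean(a) - st.mean(b)) / math.sqrt(va / len(a) + vb / len(b))

# Original data: reproduces the Appendix G t Stat of about -0.754.
t_original = welch_t(scores_2013, scores_2009)

# Removal of zero grades (25 vs. 27 students): t Stat of about -0.820.
nz_2013 = [s for s in scores_2013 if s > 0]
nz_2009 = [s for s in scores_2009 if s > 0]
t_no_zeros = welch_t(nz_2013, nz_2009)

# Additionally dropping the three lowest remaining grades from each class.
t_trimmed = welch_t(sorted(nz_2013)[3:], sorted(nz_2009)[3:])

print(round(t_original, 5), round(t_no_zeros, 5), round(t_trimmed, 5))
```

In every variant the statistic stays negative (the 2013 mean sits below the 2009 mean) and below the one-tail critical value, so H0 cannot be rejected.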

EVALUATION OF MEASUREMENTS

The process of evaluating student performance on the Production Test does appear consistent. However, it would not necessarily be valid to draw from those scores a conclusion regarding faculty performance. Numerous factors can contribute to the success or failure of a course, and scores from one test should not be the only determining factor. Would it be fair to generalize such a perspective? Although accumulating this data is quite cost-effective (mainly tapping into the time it takes faculty to score the Production Tests) and efficient (previously recorded grades can be viewed at once), postulating that teachers are at fault is not a legitimate conclusion. Undoubtedly, the results indicate that the class of 2013 is not performing better than the class of four years ago. However, several issues call the validity of the results into question. First, there are fewer staff in the Business Skills Center than four years ago; two people passed away and one person retired. There is also only one student worker now, whereas four years ago two TAs supported the teachers. There is less space in the classroom due to remodeling, and there are fewer hours of operation than four years ago. MPC also forced a reduction in classes two years ago, when the Skills Center eliminated two courses. Another factor influencing student motivation is that no new computers have been available since 2009, due to budget cuts and minor technical problems (typically repaired by the staff members themselves). All these elements influence the educational environment and thus determine to some degree the results of student grades. Therefore, it would not be appropriate to conclude that instruction needs improvement or that general student knowledge is worse than in prior years.

Understandably, the rate of success has other factors to consider. Further analysis could include faculty evaluations conducted by supervisors as well as by students, indicating supporting or detrimental factors for learning. Such analysis would, of course, require sufficient funds as well as time, which MPC does not currently have at its disposal. There may be some improvement in class quality next term because of a change in the textbook edition. Similarly, a new policy making the class non-repeatable next term may significantly alter student motivation; students will have only one opportunity to take the course. Hopefully this data analysis can contribute to the improvement of the program and serve as instructional guidance for future best practices.

The scope of the present study led the iTrain team to forgo further analysis of data pertaining to the same courses investigated above. Of particular interest would have been to include the data indicating how many hours students dedicated to coursework and to evaluate those hours against the individual grades acquired. Such cross-analysis might prove revelatory, possibly indicating a correlation between dedicated hours and student performance. The iTrain team leaves such analysis for future statistical studies.

APPENDIX A: Syllabus for Word 100A




BUSC 100 Microsoft Word 2007


Monterey Peninsula College Business Skills Center
A course on the basic features of Microsoft Word 2007 for Windows. This course provides students with the opportunity to learn word processing on IBM-compatible computers.

Prerequisite: No prerequisites are required, but it is recommended that students have basic keyboarding skills and familiarity with the Windows operating system.

REQUIRED TEXTBOOK: Microsoft Word 2007 Windows Vista Edition (Signature Series) by Nita Rutkosky, Paradigm Publishing, 2008, ISBN 978-0-76383-099-1. CD-ROM included.

MATERIALS: One ream of 20 lb. 8½" x 11" white paper, which can be purchased very reasonably at Office Depot or a similar store. One blank 3½" student data disk, which may be purchased at the counter. Optionally, you may use a USB drive.

STUDENT LEARNING OUTCOMES: Upon completion of this course, the student should be able to:
- Create, print, manage, and edit word processing documents
- Manage character, paragraph, and document-level formatting
- Use images and visual elements to enhance documents
- Create and insert tables and charts within documents

COURSE EVALUATION:
Daily Production Work   50%
Theory Tests            10%
Production Tests        40%

UNITS:
                             100A     100B     100C
Lessons & Unit Assessments   1 - 5    6 - 10   11 - 15
Theory Tests                 1        2        3
Production Tests             1        2        3

All assignments and tests must be completed to receive a passing grade.

Getting Started

Obtain a login code at the timeclock computer. You will use this code all semester for your course. Log in on the timeclock computer. Ask an instructor if you need help getting started.



You will find that reading the textbook lessons before class will be helpful. You will then be able to review each section when you are in the lab and do the associated exercises at the computer. In addition to the instructions in the textbook, check daily for our revisions to the text, explained in the final section at the end of these instructions.

Submitting Work for Grading

The attached Record of Assignments indicates which assignments are to be printed and submitted for grading. Step-by-step Exercises in the textbook and Assessments are required to be printed and turned in for grading. Use this activity record sheet to note your completion dates for each assignment. Be careful to do every assignment completely and accurately, because even if a document isn't to be printed, it may be retrieved later and used as the starting point for a future exercise in the next step.

Because you will be sharing a printer, it is important that you identify your work. Place your initials and the file name a double space below the last line of each document. For example, the first printed exercise in Chapter 1 is printed in step 2 on page 9, and it would be identified as yi:p9, C01Ex01, where yi is your initials. The page number and problem number must also be written on the Lesson Check List form that you will attach to printed documents to be turned in.

Before turning in your work for grading, be sure to proofread it carefully and compare it against the Student Self-Check book at the counter. If you find any errors, correct them on the computer and reprint the assignment before submitting it.

Lesson Check List Form

Staple a Lesson Check List form to your work and list on the form the page numbers and file names of the assignments you are submitting. For example, the first assignment you will turn in for BUSC 100A would be recorded as: p9, C01Ex01. Place your work in the IN Basket on the counter.
We will circle any errors found on your assignments, and the graded set will be returned to you in the student bin. You will then need to correct the errors, reprint the assignment, and resubmit it with the original checklist and work. Your grade is lowered each time you need to redo an assignment.

Daily Production Work Grading

Some of your assignments will be evaluated with a letter grade and some will be checked for completion. Make sure everything you submit is letter perfect, and ask for help if instructions are confusing. For assignments that are evaluated with a letter grade, your grade will be lowered for each error we find, according to the scale below. You must reprint and resubmit all problems with found errors.

Daily Production Work Grading Scale
✓ = Exercise completed, or formatting correct. Identified errors corrected and resubmitted.
A = Perfect. No typos and formatted correctly.
B = One typographical or formatting error corrected and resubmitted. (1st Redo)
C = Two or more errors corrected and resubmitted. (1st Redo)
D = Student chooses not to redo, or 1st Redo is not acceptable. (2nd Redo or more)



F = Assignment is not turned in.

Returning of Graded Work

Your graded work will be returned to the alphabetic bins by the time clock. Please remove graded papers and save them for future reference in case a question arises about our record of your completed assignments.

Attendance

Some chapters will go more quickly than others, but we recommend that you try to stick to a schedule of 3 hours a week per unit. If you take a short break, it is not necessary to log out and log back in again when you return. Be sure to log out on the timeclock computer before you leave the lab for the day.

Startup Instructions

1. At the Windows desktop, click the Start button (the round Windows logo icon) in the lower left.
2. Position the arrow pointer on All Programs, which will cause another menu to display. Move the arrow pointer to Office 2007 and then Microsoft Word 2007, and click the left mouse button to start the MS Word program.

Using the CD Which Accompanies Your Textbook

Your textbook includes a CD which contains the necessary files for completion of this course, or a CD may be borrowed in class. Follow the instructions on the inside back cover of your text for copying files in subfolders by chapter from the CD to your USB flash drive (or alternately to your floppy diskette) when it is called for. Ask an instructor for help with this step if you need it.

Deleting a Folder (necessary only when working with files on the A: diskette drive)

Delete the previous chapter folder before copying a new chapter folder onto your disk. Before you do this, make sure the submitted lessons from that chapter have been graded and recorded on your grade card at the counter, and that you do not have any redos. Follow the instructions on the back cover of your textbook for Deleting a Folder.

To Exit Word
When you are finished working with Word and have saved all the necessary information, exit Word by clicking the Office Button and then Exit, or click the Close button at the upper right of the taskbar.

APPENDIX B: Sample Production Test


Production Test


Time limit: 45 minutes

This is an open book, open note test. You have 45 minutes to complete the test, but be sure that we have started timing you before beginning. If you complete the test before your time is called, take the extra time to proofread before turning in your completed test. When time is called, you must stop keyboarding, but you may then print. Staple these instructions behind your completed printed test and submit them with a blue testing form. Return the Testing Diskette with your test.

1. Open 100A Start2007 VerA.docx from the Testing Diskette. Save the document to this diskette with the new name Your Name 100A (where Your Name is replaced with your own name).
2. Change the document style to use the built-in Style Set Modern.
3. Change the style of the title DESKTOP PUBLISHING DESIGN to use Title.
4. Change the topics Designing a Document and Creating Focus to Heading 1.
5. Proof the document to catch the spelling and grammar errors (a passive voice warning is OK). Read the first paragraph carefully to find a word that was not automatically flagged as a potential error (Word 2007 missed it), then correct this word manually.
6. Replace the word relevant in the first paragraph of the report with an appropriate synonym.
7. Change the formatting of the 10 text paragraphs to indent the first line 0.6".
8. Change the first bulleted list to a numbered list and change the line spacing to 1.5 lines.
9. At the bottom of your document, add the words Tabs Demo, and change the style to Subtitle.
10. Under the subtitle Tabs Demo, on a blank line using paragraph spacing Before and After 10 pts, set up center tabs at 1.5" and 3", and a right tab at 5.5".
11. Tab to the first stop, use bold font for the word Chapter, then tab and enter Title, and again for Number of Pages.
12. On the next line, set up tabs with dot leaders where appropriate, using a left tab at 1", a center tab at 3", and a right tab at 5.5".
13. Create the remainder of the text for the three chapters as shown below:
    Chapter Title Number of Pages
    Chapter 1 ....................... Creating ................................................ 23
    Chapter 2 ..................... Formatting .............................................. 22
    Chapter 3 ..................... Paragraphs .............................................. 30
    (the 3 lines above will have paragraph spacing Before and After 10 pts)
14. On a line in a paragraph spaced 24 pts below the end of the last line above, type your name, using font Cambria, 14 points, aligned at the right margin.
15. Proofread carefully, print the report, save it on the testing diskette with the same name, and close MS Word.

APPENDIX C: Sample Grade Sheet




APPENDIX D: Sample Rubric Sheet






APPENDIX E: Sample Scoring Sheet



APPENDIX F: Chart Illustrating Original Data (with consequential removal of failing grades) & Chart Suggesting Removal of Three Lowest Scores




Bubble Graph Emphasizing Too Many Failing Grades:

Line Graph Suggesting Removal of Three Lowest Grades:

APPENDIX G: Statistical Reference Items


Original Data

Original Data Analysis of Word Processing Class

Word Class 2013 scores (n = 34):
98, 94, 94, 93, 0, 67, 97, 0, 0, 100, 94, 0, 91, 100, 100, 49, 94, 0, 0, 60, 0, 92, 88, 88, 90, 99, 0, 92, 82, 77, 84, 0, 86, 100

Word Class 2009 scores (n = 34):
0, 100, 94, 71, 95, 0, 0, 78, 85, 0, 92, 96, 97, 93, 95, 0, 100, 95, 95, 0, 95, 0, 97, 83, 100, 90, 100, 97, 79, 64, 90, 78, 97, 100

t-Test: Two-Sample Assuming Unequal Variances

                               Word Class 2013   Word Class 2009
Mean                           64.97059          72.23529
Variance                       1687.242          1465.943
Observations                   34                34
Hypothesized Mean Difference   0
df                             66
t Stat                         -0.75437
P(T<=t) one-tail               0.226656
t Critical one-tail            1.668271
P(T<=t) two-tail               0.453312
t Critical two-tail            1.996564

Data After Removal of Grade Fs

Word Class 2013 scores (n = 25):
98, 94, 94, 93, 67, 97, 100, 94, 91, 100, 100, 49, 94, 60, 92, 88, 88, 90, 99, 92, 82, 77, 84, 86, 100

Word Class 2009 scores (n = 27):
100, 94, 71, 95, 78, 85, 92, 96, 97, 93, 95, 100, 95, 95, 95, 97, 83, 100, 90, 100, 97, 79, 64, 90, 78, 97, 100

Descriptive Statistics

                     Word Class 2013   Word Class 2009
Mean                 88.36             90.96296
Standard Error       2.585781636       1.841653
Median               92                95
Mode                 94                100
Standard Deviation   12.92890818       9.569509
Sample Variance      167.1566667       91.5755
Kurtosis             3.05807183        1.302731
Skewness             -1.78885          -1.3879
Range                51                36
Minimum              49                64
Maximum              100               100
Sum                  2209              2456
Count                25                27

t-Test: Two-Sample Assuming Unequal Variances

                               Word Class 2013   Word Class 2009
Mean                           88.36             90.96296
Variance                       167.1567          91.5755
Observations                   25                27
Hypothesized Mean Difference   0
df                             44
t Stat                         -0.81994
P(T<=t) one-tail               0.208335
t Critical one-tail            1.68023
P(T<=t) two-tail               0.41667
t Critical two-tail            2.015368