IMPLEMENTING A WEB-BASED
MEASUREMENT OF 3D UNDERSTANDING
Ken Sutton
The University of Newcastle
Ken.Sutton@newcastle.edu.au
Andrew Heathcote
The University of Newcastle
Andrew.Heathcote@newcastle.edu.au
Miles Bore
The University of Newcastle
Miles.Bore@newcastle.edu.au
ABSTRACT
This paper outlines the conversion of a psychometric test to a web-based study. The test measures understanding of
three-dimensional (3D) concepts as they apply to technical drawing. We describe the instrument in terms of its
subtests and examine implementation and data collection issues. Comments are provided about technical,
experimental design, participation and recruitment matters. We report comparisons of reliability and validity
measures against a parallel laboratory-based study. Advantages and disadvantages of web-based studies are
summarized and the relevance of the instrument to industry is addressed along with considerations for future
developments.
1. INTRODUCTION
2. MEASUREMENT INSTRUMENT
Understanding 3D concepts is an important aspect of technical drawing. Teaching these concepts is often
problematic. The overall aim of this project is to develop a measurement instrument that will underpin
learning tasks aimed at improving 3D understanding. The measurement instrument consists of 89 items
divided into six subtests that measure both accuracy and response time (RT). Five subtests are based on
previous psychological research while the sixth subtest assesses understanding of true length, an important
concept in technical drawing. We contend that understanding 3D concepts is best measured across a range
of tasks requiring different types of spatial reasoning (Blasko, Holliday-Darr, Mace & Blasko-Drabik,
2004). Descriptions of the subtests follow. Examples are shown in Figure 1, in order of description from
left to right, and from top to bottom.
[Figure 1. Example items from the subtests: 2D-3D Recognition, B; Correct Fold A; Correct Fold B; Mental Rotation; Possible-Impossible Structures; Dot Coordinate.]
Participants were presented with onscreen written instructions and explanations. They carried out practice
trials to become familiar with the procedure. No feedback in terms of accuracy and RT was given during
practice or testing. Recruitment occurred through the web site to which the study was linked (Psychological
Research on the Net, http://psych.hanover.edu/Research/exponnet.html) and through special psychological interest
groups. Participants were excluded from the analysis if they indicated having technical drawing
experience or being under 18 years of age. As reimbursement, participants could choose to enter a prize
draw. Results from 30 web participants and 41 laboratory participants (tested using a parallel
implementation in the Superlab software) are reported.
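The screening rule above can be sketched as a simple filter over participant records. This is an illustrative sketch only; the field names are hypothetical, not the study's actual schema, and the study itself ran on a ColdFusion server rather than Python.

```python
def eligible(record: dict) -> bool:
    """True if the respondent may enter the analysis: 18 or older
    and no self-reported technical drawing experience."""
    return record["age"] >= 18 and not record["technical_drawing_experience"]

def screen(records: list[dict]) -> list[dict]:
    """Keep only records that satisfy both inclusion criteria."""
    return [r for r in records if eligible(r)]
```

For example, a pool containing an under-18 respondent and a respondent with drawing experience would be reduced to only the remaining eligible records.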
3. IMPLEMENTATION AND DATA COLLECTION
An existing laboratory study was adapted to run on a ColdFusion Server platform. As a measure to protect
against poor web experimental design, implementation was checked against 16 standards suggested by
Reips (2002). To test its reliability and validity, the web-based version needed to be consistent with the
laboratory version in terms of design and structure. This imposed restrictions on the programmer. Further,
because web-participants would work independently (laboratory participants' questions were answered by
the experimenter), additional instructions and explanations were necessary. Participation in the web study
was more demanding in terms of reading and comprehending the test instructions than for the laboratory
study.
The programmer reported several issues. The need to mirror the laboratory study sometimes prevented the
use of best practices and options suited to ColdFusion technology. A better
approach in the future may be to develop both laboratory and web studies in unison with consideration to
both environments. Part of this would include choosing better interface options for participant responses.
Another alternative is to run the web study in both environments. Other issues included how to
accommodate a variety of end-user connections and making allowances for timing differences due to
network bandwidth.
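The bandwidth concern can be made concrete: if response times are stamped on the server, each measurement includes a network round trip, so observed RT equals true RT plus latency, and the bias varies with the participant's connection. The simulation below is purely illustrative (invented latency profiles, not study data); one common mitigation is to record timestamps client-side and submit them with the response.

```python
import random

def server_measured_rt(true_rt_ms: float, latency_ms: float) -> float:
    """RT as stamped by the server: true response time plus the round trip."""
    return true_rt_ms + latency_ms

def mean(xs):
    return sum(xs) / len(xs)

random.seed(1)
true_rts = [random.gauss(2000, 300) for _ in range(1000)]  # ~2 s true responses

# Two hypothetical connection profiles: fast broadband vs. a slow link.
fast = [server_measured_rt(rt, abs(random.gauss(40, 10))) for rt in true_rts]
slow = [server_measured_rt(rt, abs(random.gauss(400, 150))) for rt in true_rts]

bias_fast = mean(fast) - mean(true_rts)  # extra ms attributable to latency
bias_slow = mean(slow) - mean(true_rts)
```

Under these assumed profiles, the slow-connection sample carries roughly ten times the timing bias of the fast one, which is why uncorrected server-side RTs from heterogeneous connections are hard to compare.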
The main technical difficulties with the web implementation were use of the back button in the
web browser and problems compiling some participants' data. The experimental design resulted in a
complex study that might not normally be regarded as suitable for online research. It required dedication
from participants who needed to work through detailed instructions and compulsorily complete practice
trials before testing began. In contrast to the laboratory study, participants worked in isolation with no
supervisory support. Collectively, these factors may have discouraged many potential participants. About
260 entered the demographics section of the study, but only about 80 actually began the testing phase,
and only 30 of those provided complete data. Recruitment was another
issue since the measures taken did not attract the sample size hoped for. A review of procedures indicated
that promotion was not widespread enough to attract the numbers anticipated. Future studies will
investigate other avenues.
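The attrition figures above can be expressed as retention rates, which is useful when planning how many participants future recruitment drives must attract. The counts are those reported in the text; the helper function is a minimal sketch.

```python
def funnel(stages):
    """Overall and stage-to-stage retention (%) for an attrition funnel.

    stages: list of (label, n) pairs in chronological order.
    Returns (label, n, pct_of_initial, pct_of_previous_stage) tuples.
    """
    initial = stages[0][1]
    out, prev = [], initial
    for label, n in stages:
        out.append((label, n, 100 * n / initial, 100 * n / prev))
        prev = n
    return out

# Participant counts as reported for the web study.
STAGES = [("entered demographics", 260),
          ("began testing", 80),
          ("provided complete data", 30)]
```

On these figures, roughly 31% of entrants began testing and only about 12% completed the study, so attracting a usable sample of a given size requires recruiting nearly ten times that many visitors.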
Despite some difficulties, the study is still active and attracting new participants without any changes to
the experimental design. It offers advantages reported earlier and even with low numbers, it is likely to
eventually produce a sample size superior to that possible for internal laboratory-based studies.
4. RESULTS
We tested reliability by comparing Cronbach alpha coefficients and validity by comparing mean accuracy
and RT between laboratory and web-based samples. Results are reported in Table 1. Generally, both
laboratory and web-based subtest scores produced acceptable alpha reliability coefficients. Reliability
coefficients for the combined subtests indicate high reliability in both laboratory and web-based studies.
Significant differences in mean correct answers were found between laboratory and web samples on four
of the nine subtests. Participants in the web-based study produced significantly lower total scores than
laboratory-based participants.
          Web-Based Study           Laboratory Study
Subtest   Accuracy    RT            Accuracy    RT
1         75 (.68)    22 (.76)      82 (.09)    21 (.69)
2         79* (.79)   18 (.74)      92* (.48)   17 (.82)
3         51* (.42)   21 (.69)      72* (.38)   25 (.56)
4         67 (.57)    16 (.78)      75 (-.02)   23 (.51)
5         74 (.89)    12 (.76)      85 (.80)    11 (.68)
6         70 (.80)    19 (.70)      81 (.54)    20 (.50)
7         81* (.62)   7* (.83)      90* (.61)   5* (.60)
8         78* (.83)   7* (.88)      88* (.74)   3* (.84)
9         40 (.92)    22 (.94)      53 (.82)    29 (.74)
Total     70* (.96)   13 (.95)      80* (.90)   17 (.87)

Table 1. Accuracy (% correct) and RT (sec) for correct responses, with Cronbach alpha reliability coefficients in
brackets. A * indicates a significant (p < .05) difference between web and laboratory results.
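The alpha coefficients in Table 1 follow the standard internal-consistency formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), for k items. A minimal sketch of the computation follows; it is illustrative only (the study's own analysis pipeline is not described), using invented item scores rather than study data.

```python
def cronbach_alpha(items: list[list[float]]) -> float:
    """Cronbach's alpha for item score columns (one list per item,
    participants in the same order in each column).

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)).
    Population variance is used throughout; the choice of population vs.
    sample variance cancels in the ratio."""
    k, n = len(items), len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))
```

Perfectly consistent items yield an alpha of 1, and uncorrelated items yield an alpha near 0, which is why the combined-score alphas of .96 (web) and .90 (laboratory) indicate high internal consistency.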
5. CONCLUSION
The web study reported in this paper is complex in design and demands more from its participants than
either a laboratory version or a typical online study. It was developed as a near duplicate
of the laboratory study with testing of reliability and validity in mind. The results are encouraging and
provide some confidence in the web as a reliable and valid data collection tool. Improvement in design
and structure is possible by reducing the number of items, simplifying instructions and changing the
interface layout to take better advantage of the tools within ColdFusion. Some attention to improving the
robustness of the experiment will help eliminate some unpredictable data collection anomalies. Along with
improved and more widespread promotion strategies, these measures may help achieve a higher
participant rate.
The data collected from this psychometric instrument underpin present development of a set of learning
tasks to improve understanding of 3D concepts. Modern computer software and web technology provide
innovative opportunities to achieve this.
6. REFERENCES
Bertoline, G. R., & Miller, D. C. (1990). A visualization and orthographic drawing test using the Macintosh
computer. Engineering Design and Graphics Journal, Vol. 54, No. 1, 1-7.
Birnbaum, M. (2004). Human research and data collection via the internet. Annual Review of Psychology. First
published online as a Review in Advance October 6, 2003.
Blasko, D. G., Holliday-Darr, K., Mace, D., & Blasko-Drabik, H. (2004). VIZ: The visualization assessment and
training Web site. Behavior Research Methods, Instruments & Computers, 36 (2), 256-260.
Bore, M., & Munro, D. (2002). Mental agility test. Personal Qualities Assessment. Newcastle: TUNRA Ltd.
Cooper, L. A. (1990). Mental representation of three-dimensional objects in visual problem solving and recognition.
Journal of Experimental Psychology: Learning, Memory and Cognition, Vol. 16, No. 6, 1097-1106.
Duesbury, R. T., & O'Neil, H. F. (1996). Effect of type of practice in a computer-aided design environment in
visualizing three-dimensional objects from two-dimensional orthographic projections. Journal of Applied
Psychology, Vol. 81, No. 3, 249-260.
Metzler, D., & Shepard, S. (1988). Mental rotation: Effects of dimensionality of objects and type of task. Journal of
Experimental Psychology: Human Perception and Performance, Vol. 14, No. 1, 3-11.
Reips, U.-D. (2002). Standards for internet-based experimenting. Experimental Psychology, Vol. 49 (4), 243-256.
Schacter, D. L., & Cooper, L. A. (1990). Implicit memory for unfamiliar objects depends on access to structural
descriptions. Journal of Experimental Psychology: General, Vol. 119, No. 1, 5-24.
Steyvers, M., & Malmberg, K. J. (2003). The effect of normative context variability on recognition memory. Journal
of Experimental Psychology: Learning, Memory and Cognition, Vol. 29, No. 5, 760-766.