Rosane Pagano
Manchester Metropolitan University, UK
r.pagano@mmu.ac.uk
Abstract: The dramatic expansion of the higher education sector in the UK has contributed to a significant
increase in competition among organizations within the sector. Based in the North of England, Salford University
is one of the largest universities in the UK with regard to student numbers and programs of study. The Student
Information System supports the management of student information throughout key business activities, that is,
recruitment, admission, registration, invoicing, accommodation, assessment, progression, graduation and
careers. The purpose of this study is to perform a user-centered post-implementation evaluation of this
business-critical IT system at Salford University.
profile. The sample included a wide variety of users across the University, who were then grouped into types. Gap analysis was performed on the whole sample as well as on each of the groupings, yielding interesting comparisons. The evaluation project was conducted over a period of six months during the year 2002.

One of the most compelling implications of the case study findings concerns the overall impact on IT project management in general within the organization. How well the users' business needs are met by the Product and the Service attributes is dependent upon the Process. The research outcome suggests that the organization would greatly benefit from moving to a continuous participative evaluation activity integrated into the project management process.

2. User evaluation of an information system – Conceptual framework

Projects are more likely to be successful where users do not become disillusioned from having overly high expectations of a system that cannot be met. In order for users to perceive an Information System as a success, it is important for their expectations and perceptions to be managed effectively (Clegg et al, 1997; Marcella & Middleton, 1996; Lim & Tang, 2000). Whyte & Bytheway (1996) proposed a holistic approach to IS evaluation by specifying three core elements to a system: the Product, that is, hardware, software, and training provided to users; the Service, that is, how users are responded to; and the Process by which the Product and Service are provided. How well the business needs of the user are met by product and service attributes of the system is dependent upon the Process, i.e. the Management of the Project.

2.1 The product

Aspects of the product which users value are the quality of the information held, the design of the 'front-end', the level of functionality, the quality of training and the quality of documentation.

2.1.1 The quality of the information held

The information held in an information system needs to be reliable and accurate if the users are to have any faith in the system. When users cease to trust the data held in a system, they either stop using the system and create their own smaller systems, or they spend time checking the information outputs of the system against other sources. Miller & Doyle (1984) identified completeness, accuracy, flexibility and relevance of information outputs as major success factors for users.

2.1.2 The design of the front-end

The 'front-end' of the system consists of the elements of the system that the users see and interact with, i.e. the forms, screens and reports. A well-designed 'front-end' is likely to result in successful use of an information system. Users evaluate the design of the 'front-end' according to its affect, efficiency, 'learnability', helpfulness and levels of control (Oulanov & Pajarillo, 2001).

2.1.3 The level of functionality

The functionality of a system relates to how it performs the various processes concerned with the business needs of the users. Clegg et al (1997) identified functionality, and its consequential impact on business processes, as one of the principal variables that influence users' reactions to a new Information System.

2.1.4 The quality of training

The quality of training is vital to project success. Training is the main way users learn about the system, and its quality will affect not only their successful use of the system but also their attitudes towards it. Riley & Smith (1997) observed how insufficient training could contribute to users' 'resistance to change', and Clegg et al (1997) credited training as core to successful technological change.

2.1.5 The quality of documentation

User documentation of an information system usually takes the form of training manuals and user instructions. To be of high quality and of use to users, documentation should take a user perspective, satisfying the varying skill levels among users and orientating and reassuring them (Nahl, 1999).

2.2 The service

Elements of the service provided which are important to users concern user involvement, communication and response to their needs.

2.2.1 User involvement

Oulanov & Pajarillo (2001) emphasised the importance of user participation in system planning and design as a significant factor in users' perception of project success. Cicmil (1999)
each system attribute on a five-point scale. The scales are described in Table 2.
Table 2: Scale for evaluating the system's attributes

Importance
  Scale-Point:  1           2              3           4          5
  Description:  Irrelevant  Not Important  Don't Know  Important  Critical

Performance
  Scale-Point:  1          2     3        4     5
  Description:  Very Poor  Poor  Average  Good  Excellent
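For concreteness, the two five-point scales in Table 2 can be encoded programmatically. The sketch below is purely illustrative; the names `IMPORTANCE`, `PERFORMANCE` and `encode` are ours, not part of the study's instrument.

```python
# Five-point scales from Table 2, mapping verbal anchors to scale points.
IMPORTANCE = {"Irrelevant": 1, "Not Important": 2, "Don't Know": 3,
              "Important": 4, "Critical": 5}
PERFORMANCE = {"Very Poor": 1, "Poor": 2, "Average": 3,
               "Good": 4, "Excellent": 5}

def encode(responses, scale):
    """Translate a list of verbal responses into numeric scores."""
    return [scale[r] for r in responses]

# Example: three hypothetical responses for one attribute.
scores = encode(["Poor", "Average", "Poor"], PERFORMANCE)
mean = sum(scores) / len(scores)  # mean rating for the attribute
```

A mean of 2.3, as reported below, would then sit between the 'Poor' and 'Average' anchors of the Performance scale.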
was 2.3, which is interpreted as between 'Poor' and 'Average' on the Performance scale. This evaluation did not meet the level of consensus among users that was expected, but was relatively convergent in the light of users' profile. Given that the standard deviation ranged between 1.3 and 1.8 on Importance scores, and between 1.3 and 1.6 (with one exception computed at 5) on Performance scores, it indicates a variation of two points on the five-point scale used to evaluate the system's attributes. A very high standard deviation of 5 was found for the 'User Involvement' attribute score. This lack of consensus, when analyzed in the light of the rich data (open questions), pointed to the need to manage users' expectations during the project.
Table 5: Attribute Importance Mean and Attribute Performance Mean for all respondents

Attribute        Importance                  Performance             Gap
Number      Rank    Mean      SD        Rank    Mean      SD
1 16 3.6512 1.6768 15 2.2674 1.3573 -1.3837
2 8 3.7558 1.7253 13 2.2941 1.4024 -1.4617
3 21 3.2558 1.5069 22 1.9294 1.4559 -1.3264
4 22 3.1860 1.5158 24 1.7765 1.4155 -1.4096
5 24 2.7791 1.3921 23 1.7976 1.4364 -0.9815
6 1 4.0233 1.7213 18 2.1412 1.3152 -1.8821
7 2 3.9884 1.6988 21 1.9412 1.2361 -2.0472
8 6 3.8372 1.8151 19 2.0706 1.2223 -1.7666
9 12 3.6977 1.6760 20 2.0465 1.2342 -1.6512
10 15 3.6628 1.6640 17 2.1512 1.3344 -1.5116
11 23 2.9767 1.6387 12 2.3095 1.6527 -0.6672
12 19 3.5349 1.6445 2 2.8488 1.4267 -0.6860
13 9 3.7442 1.7552 14 2.2857 1.4405 -1.4585
14 17 3.6163 1.7221 15 2.2674 1.4028 -1.3488
15 3 3.8837 1.7333 5 2.7558 1.5460 -1.1279
16 14 3.6744 1.6669 7 2.6941 1.5417 -0.9803
17 18 3.5698 1.6591 4 2.7857 5.0003 -0.7841
18 3 3.8837 1.7102 9 2.5238 1.5169 -1.3599
19 5 3.8488 1.7375 10 2.5176 1.4844 -1.3312
20 13 3.6860 1.7042 1 3.1395 1.5135 -0.5465
21 20 3.4651 1.6138 8 2.6667 1.6190 -0.7984
22 11 3.7326 1.6777 6 2.7326 1.4737 -1.0000
23 9 3.7442 1.6431 3 2.8235 1.5515 -0.9207
24 7 3.7907 1.7400 11 2.3488 1.4006 -1.4419
The mean Gap between the Importance scores and the Performance scores for all attributes was a negative one of –1.2. This indicates that users have an overall negative perception towards the implementation of the Student Information System. When asked to rate the system's performance overall (no attribute discrimination), users' mean score was 2.2, which is 'Poor' on the Performance scale. This result corroborates the mean Performance score mentioned above, which was obtained by discriminating the attributes.

Graph 1 shows the gap between importance and performance for each attribute, as per Table 5. It can be seen that, overall, users have negative perceptions about the implementation of the Student Information System.
[Graph 1 ('Results Overall'): chart of mean ratings (y-axis 'Rating', 0.0–4.5) for each of the 24 attributes (x-axis).]
Graph 1: Gap between Attribute Importance Mean and Attribute Performance Mean
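The gap computation behind Table 5 is simple arithmetic: for each attribute, the gap is the mean Performance score minus the mean Importance score, so a negative value signals performance falling short of perceived importance. A minimal sketch, using four attribute rows taken from Table 5 (the dictionary and variable names are ours):

```python
# Gap analysis as in Table 5: gap = Performance mean - Importance mean.
# Keys are attribute numbers; values are (Importance mean, Performance mean)
# copied from four rows of Table 5.
means = {
    6:  (4.0233, 2.1412),
    7:  (3.9884, 1.9412),
    20: (3.6860, 3.1395),
    22: (3.7326, 2.7326),
}

gaps = {a: round(perf - imp, 4) for a, (imp, perf) in means.items()}
mean_gap = sum(gaps.values()) / len(gaps)

# Attributes ordered from the largest shortfall (most negative gap) up.
worst_first = sorted(gaps, key=gaps.get)
```

Run over all 24 attributes, the same computation yields the overall mean gap of –1.2 reported above.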
The range of mean Importance scores was between 2.8 and 4. The highest scores – between 3.8 and 4.2, and interpreted as 'Important' on the Importance scale – were for the attributes 'Accuracy', 'Constraint Controls', 'Effectiveness', 'Project Management', 'Skilled Project Staff', and 'Training'. The lowest scores – between 2.8 and 3.2, and interpreted as 'Don't Know' on the Importance scale – were for the attributes 'University Strategy (2, 3)' and 'Transparency'.

The range of mean Performance scores was between 1.8 and 3.1. The highest scores – between 2.8 and 3.2, and interpreted as 'Average' on the Performance scale – were for the attributes 'Training Manuals', 'Communication' and 'Response'. The lowest scores – between 1.8 and 2.2, and interpreted as 'Poor' on the Performance scale – were for the attributes 'Accuracy', 'Constraint Controls', 'Effectiveness', 'University Strategy (1-3)', 'Navigation' and 'Ease of Use'. This places one third of the attributes at the lowest end of the performance evaluation.

The gap between Importance and Performance scores ranged from –2.0 to –0.5. The smallest gaps (in absolute value), between –0.8 and –0.4, were for the attributes 'Focus', 'Transparency', 'Training Manuals', 'Communication' and 'User Involvement'. This seems to indicate only marginal dissatisfaction regarding these attributes (negative values). The notable exception is the 'User Involvement' attribute, for which free answers to the questionnaire suggested otherwise. The largest gaps (in absolute value), between –2.0 and –1.6, were for the attributes 'Accuracy', 'Constraint Controls', 'Effectiveness' and 'Navigation', suggesting that these are the areas of the system that users are most dissatisfied with.

An analysis similar to the one performed above for the whole sample was also performed for all the user groupings (see Table 3), yielding interesting comparisons. Due to the word limit, these were not included in this text. An extensive research report, including the detailed questionnaire, can be obtained from the first author on request.

Another way of viewing and exploring the data set was to focus on a particular attribute and examine its Importance and Performance ratings across the various user groupings. Take for example the attribute 'Accuracy', which relates to the quality of the information held on the system. For this attribute, regarding perceived Performance, Graph 2 shows at a glance what can be interpreted as:

- the range of user perceptions in each grouping, indicated by the maximum and minimum scores linked by a vertical line;
- the point of convergence of perceptions in each grouping, indicated by the mean score (black dot on the range line).

Table 6 displays the data values that underlie Graph 2. It also includes additional data helpful to understand this scenario, that is, the standard deviation (SD) of the performance scores in each grouping. The concentration of scores around the mean can be interpreted as indicative of the level of consensus among users of the same grouping regarding perceived performance. By inspecting the 'SD' column of Table 6 in conjunction with Graph 2, it can be concluded that, in the majority of cases, 'Accuracy' performance scores are between 'Poor' and 'Average'. The extreme values (5 and 1) observed in the case of the 'Clerical', 'Daily' and 'School' groupings seem to be a few isolated responses.

Table 6: Performance ratings per user groupings for the 'Accuracy' attribute

  User groupings    Max   Min   Mean   SD
1 Clerical          5.0   1.0   2.8    1.1
2 Intermediate      4.0   1.0   2.4    1.2
3 Managers          4.0   1.0   2.3    0.9
4 Daily             5.0   1.0   2.5    1.1
5 Weekly            4.0   1.0   2.1    1.1
6 Monthly           4.0   1.0   2.6    1.3
7 Central           4.0   1.0   2.2    1.0
8 Faculty           4.0   1.0   2.5    1.2
9 School            5.0   1.0   2.6    1.3
[Graph 2 ('Accuracy'): for each of the nine user groupings (x-axis), the range of Performance Ratings (y-axis, 0.0–6.0) shown as a vertical line with the grouping mean marked.]
Graph 2: Performance ratings per user groupings for the 'Accuracy' attribute
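The per-grouping summaries in Table 6 (maximum, minimum, mean and standard deviation of the Performance ratings) are straightforward to reproduce. In the sketch below the raw ratings are hypothetical, invented only so that the summary matches the 'Clerical' row of Table 6; the study's raw data are not published here, and the choice of sample (rather than population) standard deviation is our assumption.

```python
import statistics

def summarize(ratings):
    """Summary statistics as reported per user grouping in Table 6."""
    return {
        "max": max(ratings),
        "min": min(ratings),
        "mean": round(statistics.mean(ratings), 1),
        # Sample standard deviation; the paper does not state which
        # variant was used, so this is an assumption.
        "sd": round(statistics.stdev(ratings), 1),
    }

# Hypothetical raw 'Accuracy' Performance ratings for one grouping,
# chosen so the summary reproduces the 'Clerical' row of Table 6.
clerical = [5, 1, 3, 3, 2, 3, 2, 4, 2, 3]
summary = summarize(clerical)
```

A low SD, as for the 'Managers' grouping (0.9), indicates ratings concentrated around the mean, i.e. higher consensus.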
[Graph 3 ('Managers'): Performance Rating (y-axis, 0.0–5.0) plotted across each attribute (x-axis, 1–24) for the 'Managers' user group.]
Graph 3: Performance ratings per attribute for the 'Managers' user group
Using the same method as above to summarize the data set, the focus was then placed on a particular user grouping, and its Importance and Performance ratings across all attributes (see Table 1 for attribute descriptions) were plotted for comparison. Consider for example the user group 'Managers'. Graph 3 shows a remarkably similar pattern of perceptions in this group across attributes. The range of scores is wide and falls within the same boundaries for all attributes but three. More importantly, the standard deviation around the mean score for the vast majority of attributes was smaller (less than 1) in this group than in any other user group. This indicates a higher level of consensus on the performance score of each particular attribute (mean score). Attribute mean scores vary between 'Poor' (score 2) and 'Average' (score 3), as seen in Graph 3. The data collected suggests that there is an agreed perception within the 'Managers' grouping that attributes more directly dependent upon project management are below average standards, for example 'User Involvement' (Mean 2.2, SD 0.9) and 'Understanding User Needs' (Mean 2.4, SD 0.9).

The low performance ratings of the Navigation and Ease of Use attributes of the system show that many users find the system difficult to use. The more practiced, and therefore more skilled, regular users rated the performance of these attributes higher than non-regular users, indicating that the less skilled the user, the less likely they are to give system usability a high evaluation. Marcella & Middleton (1996) observed that users will only access available training and education on a 'need to know' basis; this study likewise revealed a predominance of infrequent consultation of Training Manuals and Help Screens amongst the users. Table 7 illustrates this point.
Table 7: User consultation of information on the system provided by the project team
Daily Weekly Monthly Rarely Never
Training Manuals 12% 22% 26% 36% 4%
Reporting Help Screen 1% 3% 6% 46% 44%
System’s Intranet Site 0 1% 3% 27% 69%
E-mail Correspondence 10% 12% 9% 49% 20%
System’s News Letter 0 1% 9% 53% 37%
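The frequencies in Table 7 can be queried directly, for example to compute the share of users who consult each information source rarely or never. The sketch below simply re-encodes the percentages reported in Table 7 (the variable names are ours):

```python
# Frequency distribution from Table 7 (percentages of respondents).
consultation = {
    "Training Manuals":       {"Daily": 12, "Weekly": 22, "Monthly": 26, "Rarely": 36, "Never": 4},
    "Reporting Help Screen":  {"Daily": 1,  "Weekly": 3,  "Monthly": 6,  "Rarely": 46, "Never": 44},
    "System's Intranet Site": {"Daily": 0,  "Weekly": 1,  "Monthly": 3,  "Rarely": 27, "Never": 69},
    "E-mail Correspondence":  {"Daily": 10, "Weekly": 12, "Monthly": 9,  "Rarely": 49, "Never": 20},
    "System's News Letter":   {"Daily": 0,  "Weekly": 1,  "Monthly": 9,  "Rarely": 53, "Never": 37},
}

# Share of users who rarely or never consult each source (in percent).
infrequent = {src: d["Rarely"] + d["Never"] for src, d in consultation.items()}
```

On this view, 90% or more of users rarely or never consult the Reporting Help Screen, the Intranet Site or the Newsletter, underlining the 'need to know' pattern noted above.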
The medium-high importance ratings of Navigation and Ease of Use amongst users, particularly non-regular users, support the finding of Kebede (2002) that users wish to access information within the constraints of their skills. The difficulties that users had in operating the system seemed to be the cause of many of the negative feelings they had about the Student Information System. As observed by Riley & Smith (1997), users may see a new system as threatening if it requires new skills from them. The findings support Kebede's (2002) view that user skills should be taken into consideration when implementing new information technology.

The strong references to the continued use of internal departmental systems in the rich data (answers to open questions) indicate that 'resistance to change' is present amongst users, given their reluctance to replace these internal systems with the Student Information System. As already identified, mistrust of the data quality within the system is the primary reason for this resistance to change. This continued mistrust and dependency on internal systems, however, means the Student Information System is under-utilised: users do not run the system to its full capacity, as they do not see the value in the system being able to support Business Processes and University Strategy. As Riley & Smith (1997) pointed out, a lack of 'felt need' for change can result in many of a system's attributes being undervalued and thereby affect the success of the project. It is vital, therefore, that the issue of data quality is resolved, so that user trust in and utilisation of the system can be won.

Marcella & Middleton (1996), Clegg et al (1997) and Lim & Tang (2000) all commented upon the importance of managing user expectations, as reflected in the rich data of this study, with emerging themes of users being 'promised great things' but actually finding the system problematic, inconvenient and a general disappointment. There was also significant reference made to the legacy system being superior, despite it having less functionality, indicating that users have preconceived expectations of how a system should look and function, based upon their experiences of the previous system. However, there were some optimistic statements made about the future of the new system. It is important that these higher expectations are built upon and managed effectively.