
Computers & Education 55 (2010) 554–565

Contents lists available at ScienceDirect

Computers & Education


journal homepage: www.elsevier.com/locate/compedu

e-Learning process management and the e-learning performance: Results of a European empirical study

Maja Ćukušić a,*, Nikša Alfirević a, Andrina Granić b, Željko Garača a

a Faculty of Economics, Matice hrvatske 31, Split 21000, Croatia
b Faculty of Science, Nikole Tesle 12, Split 21000, Croatia

Article info

Article history:
Received 25 November 2009
Received in revised form 10 February 2010
Accepted 11 February 2010

Keywords:
e-Learning
e-Learning process management
e-Learning process evaluation

Abstract

The aim of this paper was to design and assess a comprehensive model for managing the e-learning process and to define the relationship between systematic implementation of the model, outcomes of certain e-learning aspects and the subject of e-learning. The validation of the model was performed by using two questionnaires sent via e-mail to teachers and field experts from the chosen sample of 14 European schools participating in an EU-funded project. Research results imply the existence of a clear link between planning and controlling of the e-learning process and its learning outcomes. On the other hand, no empirical relationship between the e-learning outcomes and the subject of learning has been established. It is believed that the model and its practical implications can be used by institutions engaged in e-learning, or as a process model for introducing e-learning related activities.

© 2010 Elsevier Ltd. All rights reserved.

1. Introduction

Knowledge management, innovation and lifelong learning are at the forefront of global economic development, while investments in education have become a key indicator of understanding the dynamics of contemporary business and society (European Council, 2009). In today's
economy, companies face increasing pressure to cut costs and to provide customers with the high levels of quality they expect. In this context,
e-learning represents an almost ideal approach for a flexible and cost-effective competence development since it can be used without
restrictions related to physical location and time of usage (Horton & Horton, 2003). However, it is not enough to develop an appropriate
curriculum and implement it through an e-learning system, given that e-learning needs to be appropriately integrated into the overall
system of human resource management (Kuechler & Schmidt, 2007). The same issue is relevant for educational institutions, as well:
a fundamental problem of many different approaches to e-learning implementation is the assumption that the use of modern technology
per se means automatic and significant progress towards achieving the learning goals (Sanchez & Salinas, 2008). Often enough, it is considered sufficient to "integrate" a technology solution into the process of teaching and learning. However, much more is needed: the whole process should be appropriately (re)designed.
Many authors provide their own definitions of (business) processes. They are all similar and imply that a set of inter-related activities is
conducted with the purpose of creating results of added value, congruent with the previously defined goals of the business system (cf. Garača, 2009). With the increasing requirements to provide evidence of effectiveness for all kinds of processes, including education/training, it is
essential to design and empirically investigate the e-learning process and the factors which influence its performance.

1.1. e-Learning process


Bahtijarević-Šiber (1999) presents a model of an educational process, starting from determining educational needs, which is the basis for
performing the other steps. Those include: selecting those who should attend the education (learners); determining goals of the educational
program; selecting criteria and designing assessment techniques; selecting the content and educational programs; conducting a program;
selecting teaching methods; program monitoring; and evaluation of learning effects. However, learning excellence can be achieved only by

* Corresponding author. Tel.: +385 21 430 758; fax: +385 21 430 701.
E-mail address: maja.cukusic@efst.hr (M. Ćukušić).

0360-1315/$ – see front matter © 2010 Elsevier Ltd. All rights reserved.
doi:10.1016/j.compedu.2010.02.017


defining competencies, i.e. learning objectives expressing what learners should learn and criteria for their evaluation, which set performance standards for learners and educators (Angelo, 1996).
An attempt to design the e-learning experience should primarily start by defining methodological strategies, setting objectives/learning
outcomes and designing learning activities which include appropriate e-learning content (cf. Phipps & Kelly, 2006). A technological platform
is then selected by examining the methodological options/requirements in order to achieve the full potential of the learning process and
successfully develop the educational materials. In addition, identification of learners' educational needs should be performed at the
beginning of an e-learning course, including necessary skills to access the educational program (Lewis & Merton, 1996). Following the results
of their research and similar examples, Conole and Oliver (1998) suggest that the process of e-learning should advance through the
following steps:

1. Review current course/program;


2. Set clear goals and expected outcomes of the course/program;
3. Set learning methods and activities (the e-learning scenario) necessary to achieve goals and learning outcomes;
4. Define organization and presentation of activities; identify the best method based on prior knowledge and skills of learners with regard
to suitability of learning media and flexibility of the learning sequence;
5. Define learning material for every activity; determine whether the material will be created, purchased, or the existing material will be
adapted;
6. Select appropriate assessment models;
7. Relate individual scenarios into a single program;
8. Identify skills and other requirements for access;
9. Set requirements related to resources and infrastructure (e.g. need for additional training of staff), if necessary.

None of the described process models encompasses all stages of an e-learning process, starting from preparation and planning, advancing to organization and implementation of e-learning activities, and finally controlling the process flow and its outcomes. This
paper strove to develop such a process model and empirically validate its relationships with the e-learning performance, which is interpreted in terms of successful interaction of elements contained within the e-learning scenario. Further on, the term e-learning performance
is used within the context of the e-learning process management and it should be distinguished from the pedagogical interpretation of the
learning performance, which can be defined as the individual process of interpreting and applying information and/or previous experience,
transforming current knowledge and exploring new intellectual pathways (cf. Hayes, 2010, p. 237).

1.2. e-Learning outcomes

Controlling the outcomes of an e-learning process enables continuous measurement against quality and quantity targets, which identifies areas with potential for improvement. One of the effective ways to understand, describe and assess aspects of e-learning design and
implementation is Reeves' (1994) dimensional model. Reeves introduces the concept of a pedagogical dimension, i.e. the ability of an e-learning system to initiate a strong interaction between teacher and student, track the progress of students, support teachers, accommodate individual learners' differences and the like. As such, the dimensions can be used as a criterion for understanding and comparing the results of
e-learning. The 14 pedagogical dimensions of computer-based education (CBE) proposed by Reeves are concerned with those aspects of the
design and implementation of the system under study that affect directly the learning process. The dimensions are illustrated in Fig. 2
further below.
In collaboration with experienced teachers, Earle (2002) has concluded that the pedagogical dimension model allows simple conceptualization of possible educational program outcomes. Explaining the need for evaluation of learning outcomes, Reeves et al. (2002) claim that the results of educational program evaluations are often inadequate, as they are in the form of descriptive statements. Moreover,
evaluators often consider most educational programs incomparable. Reeves' methodology has been used on several occasions (e.g. Boer &
Collis, 2002; Leonard, 2003; Quinton, Dreher, Fisher, & Houghton, 2005), with dimensions representing a range of pedagogical criteria by
which various forms of a computer-supported educational process were assessed and compared. For this purpose, the original conceptualization of 14 pedagogical dimensions can be either expanded or reduced. However, for the purpose of this study, the original methodology has been used, since it is believed that it fully covers all aspects of an "average" e-learning system.

2. Designing the e-learning process management model

Three stages of the e-learning process management model can be identified as planning, organizing/implementing and controlling the e-learning process. Such a generic model could, like most management processes, be aligned with Deming's iterative Plan-Do-Check-Act (PDCA) or Plan-Do-Study-Act (PDSA) cycle (Moen & Norman, 2006). Just as in the PDSA cycle, the e-learning process progresses through the following stages (Fig. 1 details every stage):

1. planning (development of operating plans and e-learning scenarios),


2. organization/implementation of e-learning (implementation of e-learning scenarios in realistic settings),
3. controlling (evaluation of various aspects of the process and its performance), and
4. improving the process and the platform.

The basic principle of this process is iteration: once initial assumptions are validated (or rejected) in the controlling stage, the cycle is repeated and improvements applied.

Fig. 1. e-Learning process management.

2.1. Planning the e-learning process

The methodological foundation of the future e-learning experience starts from existing models that adequately use the potential of new
technology in education. Such models can be compared with the list of user requirements related to the project's/program's methodological foundations (see, e.g. Nikolova & Tzanavari, 2006). If the system is to be used in a broader context, following the analysis of contemporary
educational trends, it is also necessary to explore the local context of educational institutions in which the system will be used. The purpose

of this exercise is to examine specific characteristics of educational processes and practices (cf. Ćukušić et al., 2007, for a relevant analysis of European secondary schools). Any planning process includes situational analysis (environmental monitoring), goal/objective setting,
identification and selection of the most appropriate action plan to realize the goal (Goodstein, Nolan, & Pfeiffer, 1993). These generic
planning activities are mapped to e-learning planning in Table 1.
The operational plan, referred to as an e-learning scenario, developed during this stage is a significant mechanism for managing the learning experience. The quality of the plan directly affects the e-learning performance, which can be manifested in the feedback of the participants, reusability, flexibility of implementation, etc. The activities required to develop a scenario plan are (Ćukušić, Granić, Mifsud, & Zammit, 2008):

1. Preparatory activities. After a topic or unit to be presented within an e-learning system has been chosen, the group or groups of learners should be identified, along with the learning topic/area, learning context/level, domain of the scenario, number of participants, necessary background knowledge and pedagogical approach.
2. Setting goals/expected outcomes. Objectives of each learning activity should be defined, including the expected outcomes of the entire
e-learning scenario. Learning objectives and teachers' expectations should be clearly expressed and communicated to students from the
beginning.
3. Describing learning activities. In relation to objectives, it is possible to describe the sequence of learning activities that the scenario will
follow.

Table 1
e-Learning planning.

Planning phase in e-learning

1. Analysis of contemporary trends in education; review of existing models that exploit the potential of new technologies.
2. Collection of user requirements (related to technology and pedagogy) (Nikolova & Tzanavari, 2006); setting target outcomes of e-learning (Granić, Ćukušić, Tzanavari, & Papadopoulos, 2009).
3. Investigation of local contexts (school, university, organization), specific educational processes and existing educational practices (Ćukušić, Granić, Pagden, Walker, & Zammit, 2006; Popova et al., 2008; Ćukušić et al., 2007).
4. Devising a learning environment that matches the content, target audience and technical platform (Granić, Mifsud, & Ćukušić, 2009).
5. Formulation of a methodological framework by which educational decisions would be made later (Ćukušić et al., 2006).
6. Selection of learning styles and techniques of assessment (Ćukušić et al., 2008).
7. Development of operational plans, i.e. e-learning scenarios with detailed learning activities, expected outcomes, resources, assessment strategies and anticipated time (Zoakou et al., 2007).
8. Identification of training gaps, planning procurement/production of learning materials; finalization of educational program budget (Limanauskiene, Stuikys, & Sitikovs, 2008).


Table 2
Organization/implementation of e-learning.

Organization phase in e-learning

1. Defining the structure of the course and collecting relevant materials; selection of authoring tools.
2. Development, testing and evaluation of the course: release of learning material according to the scenario; setting learner tasks/assignments (Ćukušić et al., 2008).
3. Establishing rules for project work, research or independent tasks (as suggested in the scenario); agreement on etiquette (Granić, Mifsud, & Ćukušić, 2009).
4. Maintenance of virtual consultations; encouraging discourse; maintenance of learning materials (Tzanavari, 2007).

4. Listing tools and resources. Technical equipment to be used (e.g. an e-learning platform, PDAs, laptops, other hardware or software), as well as other resources (tutorials, digital boards, maps, links and any other material), should be listed next to every learning activity.
5. Determining assessment strategy.
6. Allocation of time. It is necessary to allocate a specific time interval to each activity, which is related to the number of lessons per week,
the duration of independent projects, field work, etc.
7. Planning of learning content structure and type. Appropriate tools/resources/assessment techniques require appropriate content. It is
necessary to list all online or offline content, type of media, file formats, etc.
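
For illustration, the scenario plan just described can be captured in a simple data structure. The following is a minimal sketch in Python with hypothetical field names, not the actual UNITE scenario template; it only mirrors the seven planning activities listed above.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LearningActivity:
    description: str                   # activity 3: one step of the learning sequence
    tools_and_resources: List[str]     # activity 4: platform, PDAs, tutorials, links, ...
    assessment: str                    # activity 5: assessment strategy for this step
    time_allocated_hours: float        # activity 6: time allocated to the activity
    content_items: List[str]           # activity 7: online/offline content, media, formats

@dataclass
class ELearningScenario:
    topic: str                         # activity 1: preparatory activities
    learners: str                      #   target group, context/level, background knowledge
    pedagogical_approach: str
    objectives: List[str]              # activity 2: goals/expected outcomes
    activities: List[LearningActivity] = field(default_factory=list)
```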

After a thorough literature review and analysis of contemporary trends, Zoakou, Tzanavari, Papadopoulos, and Sotiriou (2007) identified,
integrated and adapted two scenario templates, initially developed by Kynigos (1995) and Beetham (2004). This new scenario template was
used within the framework of an EU-funded project where the process model, which we present in this paper, was introduced and evaluated
as well.

2.2. Organizing/implementing the e-learning process

After planning, the e-learning scenario should be implemented. Learning content is created and published on the platform; students are
presented with the system and instructed how to use it (see Table 2). Gradually, with the progress of the learning process, content is
expanded by introducing new written materials, photographs, video material, news, articles, quizzes, PowerPoint presentations, links, etc.
It is also recommended that online or offline counseling be provided to students so as to assist them in e-learning activities.
In this stage, the role of materials to support teachers is extremely important. In addition to templates and examples of learning plans for creating one's own scenarios (cf. Zoakou et al., 2008), it is necessary to select and prepare appropriate guidelines for teachers, related both to the methodological issues and the learning content, if they are expected to develop the content independently (cf. Tzanavari, 2007). Creating
content is not a trivial task since it requires specific skills, knowledge and time. It includes analysis of the individual learning objectives and
target audience, defining the structure and gathering relevant material, choice of tools and formats, development, testing and evaluation
and maintenance of learning materials.

2.3. Controlling the e-learning process

The purpose of control can be best expressed in terms of monitoring/evaluation of the entire process and its performance in relation to
objectives and/or standards (see Table 3). This stage should comprise platform testing along with recording platform errors, if any, as well as
recommending process improvements, both from the technical and the methodological point of view. Scenario evaluation is also extremely important: it should involve constant monitoring of learners' interest and review of planned activities, critical reflection and self-reflection of all participants after each block of activities. Evaluation should be continuous, both informal and formal. In addition to scenario evaluation by the participants in the process (teachers, students), it would be desirable to involve other experts and, before the implementation, identify possible improvements. In addition to qualitative feedback in the form of comments, the expert evaluation of the pedagogical dimensions/aspects of the scenarios can use the already mentioned Reeves (1994) methodology.
Controlling the e-learning process should include inspection of at least three different components: the e-learning platform, the scenario
and user feedback (Kellner et al., 2008). Techniques of platform performance evaluation should be based on standard assessments of
technical systems, while respecting the specifics of the platform as a means of learning. Quality control of a scenario should start during the planning phase, where an expert can be included to comment on the scenario from a methodological and technical viewpoint. Upon
completion of the scenario, relevant user feedback is to be summarized, e.g. by using the case record forms (Kellner et al., 2008). The evaluation records should contain a rich selection of material and remain open to various interpretations. Such a qualitative evaluation is arguably the best way to show how technology (i.e. the e-learning platform) can support learning in a particular context.
However, it is also necessary to evaluate outcomes and pedagogical dimensions of scenarios in order to compare them with target variables.

Table 3
Controlling phase in e-learning.

Controlling phase in e-learning


1. Defining methods and techniques for evaluation of e-learning platform, processes and scenarios; setting goals/standards in: efficiency, effectiveness and
memorability of the platform, user satisfaction, outcomes of specific e-learning aspects, and quality of scenarios.
2. Establishing measuring instruments and mechanisms; conducting tests with users (teachers, students, experts); checking site logs; reporting on usability issues.
3. Analysis of collected data; comparison of educational programs, scenarios; comparison of platform performance with its potential (Kellner et al., 2008).
4. Making judgements; suggesting improvements; changing the platform and/or organizational improvements, practice improvements; identifying opportunities
for wider use of e-learning scenarios and content.

3. Empirical validation of the e-learning model relationship with the e-learning performance

The focus of this study is establishing the link between the systematic management of e-learning (i.e. planning, organization and control of the e-learning process) and the outcomes of the process (i.e. e-learning performance) as measured by Reeves' dimensions on the one hand, and the dependence of these outcomes on the learning subject/content on the other. The previously published research, which includes Anderson's conversational model of e-learning (Anderson, 2007), a decision-making model for the introduction of e-learning in Higher Education (Divjak & Begičević, 2006), the confirmatory factor model for e-learning critical success factors (Selim, 2007) and similar models, does not focus
on the entire process of e-learning but on a particular dimension or purpose of the process and/or the e-learning platform. Moreover, in our overall research, special focus was put on the platform performance, since it may influence the systematic implementation of the e-learning model. Results of this part of the study are available in Ćukušić (2009) and Granić and Ćukušić (submitted for publication) and therefore are not addressed in this paper.

3.1. Context of the empirical research

Empirical research was conducted in July 2008 on a population of participants (teachers and experts) of the UNITE (Unified e-Learning Environment for School) project (www.unite-ist.org). UNITE was a 30-month project, partially supported by the European Community under the Information Society Technologies (IST) priority of the 6th Framework Programme for R&D. The main objective of the project was to provide innovative approaches to the education of young Europeans by integrating different state-of-the-art e-learning technologies into the curricula of a number of schools in a Europe-wide network, also taking into consideration innovation in technology and pedagogy.
The project was successfully completed in 2008. It has been evaluated as successful, although the sustainability of the project's results should have been discussed throughout the development and implementation of the project (in particular at the planning stage), rather than close to its end, as is the case with most EU-funded projects/initiatives (Stansfield & Connolly, 2009). Schönwald (2003) states that the sustainable implementation of e-learning requires good change management that takes into account "the strategic, didactic, organizational, economic and cultural dimension". The UNITE consortium did try to address these dimensions in the Exploitation plan report (see Downie, Wunner, & MacRae, 2008), but possibly too late. The report outlines how the platform can be realistically sustained and commercially exploited. The exploitable results of the project are clearly divided into two groups: (i) technical outcomes (the platform, which combined and enhanced three existing technologies) and (ii) pedagogical results (many valuable outcomes that are ready for non-commercial exploitation, some of which are: pedagogical framework design and implementation facilitating the learning scenario design, the methodology of scenario design, a general scenario template, over 40 scenarios implemented and translated into English, all content that comes with the scenarios, the validation framework, numerous published papers, etc.). It is important to note that all these "pedagogical results" were developed similarly to Brox's (2009) business model for the exchange of e-learning courses, i.e. scenarios, content, concepts and training materials were significantly reused and shared in this international network, formally called the Network of Schools. Metadata and learning object standardization were crucial in order to accomplish this level of reusability.
A decision was made to keep UNITE available for both existing and new users until December 2009. Unfortunately, both the technical platform and the UNITE website have since become unavailable, due to insufficient funds and/or interest. But the sustainability of the pedagogical results is nevertheless assured, because all these results are fed into new work/projects that partners or users (primarily teachers) become involved with, since they are not tied to a particular platform. For example, the scenario developed for UNITE in a Croatian school (presented in Granić, Mifsud, & Ćukušić, 2009) has been implemented repeatedly, but using another technical platform, Moodle. Besides, every partner will continue to use their UNITE experiences and contacts, while teachers who participated in the project are able to influence other teachers. Some schools did consider options for purchasing the platform or its elements, but it is hard to compete with Moodle or Ilias, which are free of charge. Given that UNITE as a whole was a prototype and that no further (international) funding was acquired, it is quite unlikely that a UNITE spin-off company will be formed or that a commercial product will be offered. This is in line with Salajan's (2007) claims that projects receiving EU funding often fail to produce commercially viable applications that could boost institutional revenues. He sees the benefits mostly as individuals' (p. 374), "in the way of developing their own networks of professional expertise, learning about practices pursued in other corners of Europe, satisfying a personal or professional mission, but they amount to small 'personal victories' that are not integrated in long-term institutional strategies".

3.2. Research methodology

The empirical study of the model is based on variables that could be “captured” by a survey, i.e. variables describing stages of model
implementation in terms of objective indicators (time, presence, existence of some components and the like), but also of subjective ones
(estimated dimension outcomes). The subjectivity of outcome assessments was checked by comparing them with estimates made by expert pedagogical evaluators who observed firsthand the implementation of e-learning scenarios in schools. The experts were well acquainted with the e-learning scenarios being evaluated, since they were involved in two iterations of the scenario quality control prior to their implementation and were involved in the project from the beginning. They all have significant expertise in e-learning, having participated in various activities related to e-learning development through several e-learning initiatives prior to the UNITE project.
The empirical research was based on two hypotheses: (i) a relationship between the systematic implementation of all e-learning process management stages and the targeted outcomes/performance of e-learning, and (ii) the independence of e-learning performance from the subject of learning (i.e. the subject of the e-learning scenario).
The systematic implementation of processes related to planning, organization and control of one specific e-learning platform, named UNITE, has been evaluated. A list of participating schools is presented in Table 4. The total population is made up of 15 teachers from schools that use(d) this platform and were involved in the project (out of 46 teachers overall). Profiles of schools/teachers and specific characteristics of the school environment(s), including features of national education systems, have been discussed in two comprehensive studies (Ćukušić et al., 2007; Limanauskiene, Romanova, Sitikovs, & Stuikys, 2008).


Table 4
List of participating schools.

School Country
1. 134. Hebrew and English Language School “Dimcho Debelainov”, Sofia Bulgaria
2. High School of Mathematics and Science “Acad. L. Tchakalov”, Sofia Bulgaria
3. The English School, Nicosia Cyprus
4. Ellinogermaniki Agogi School, Athens Greece
5. Elementary school “Spinut”, Split Croatia
6. 3. secondary school, Riga Latvia
7. Kaunas University of Technology Gymnasium, Kaunas Lithuania
8. Stella Maris College, Gzira Malta
9. Margaret Mortimer Girls' Junior Lyceum, Sta. Lucia Malta
10. Berufliche Schule Elektrotechnik/Elektronik, Rostock Germany
11. Erasmus-Gymnasium, Rostock Germany
12. Gimnazija in ekonomska srednja šola, Trbovlje Slovenia
13. King Edward VI School, United Kingdom
14. VA High School, Lynn Grove United Kingdom

To put the results and analysis presented further on in a more focused perspective, a short teacher profile is presented here as well, based on the report from Popova, Vehovar, Pušnik, Ćukušić, and Kellner (2008), who collected information on user profiles, ICT literacy, UNITE provision and the like (N = 31). Of the 40% male and 60% female teachers, most have several years of professional experience (6 years or more) and are between 31 and 40 years old. Almost a quarter teach ICT-related subjects, followed by English language, Modern and Ancient Greek, Biology, Mathematics and Geography, respectively. One third previously used another e-learning platform (either Moodle, WebCT or another platform provided by the school) and about half of them find UNITE better in terms of capability, but lacking user-friendliness. Almost 70% of teachers would nevertheless recommend UNITE to their schools. The same percentage agrees that pedagogical preparation for using the system required more time compared to their usual preparation.
In this paper, an e-learning process management model has been verified on a sample of fifteen teachers who were either finalizing the implementation of their e-learning scenario or had just finished one at the time of the research. Since the sample has been limited by the requirement of e-learning scenario completion, potential violations of statistical assumptions for parametric inference have been considered. If the classical assumptions were not met by the sample, non-parametric statistical procedures were used, as they rely on less strict methodological presumptions (Sprent & Smeeton, 2001).
A questionnaire for the collection of primary data was developed and sent by electronic mail to teachers who planned, organized and controlled the e-learning process in their schools, within the UNITE project. The questionnaire contains 56 questions, grouped into four logical units: planning the e-learning scenario; implementation of the e-learning scenario; controlling the e-learning scenario; and assessment of e-learning scenario outcomes (see Table 5 for characteristic questionnaire items). All 15 eligible teachers filled out the questionnaire.

Table 5
Questionnaire excerpt.

A - Planning the scenario (9 questions)
The scenario is a customized version of a UNITE template scenario? YES / NO
I have consulted some or all of the supporting material provided by UNITE. YES / NO
If YES, to what extent? Namely, I have used / worked with:
a. UNITE scenario template
b. Teachers' handbook
c. Quick guide to pedagogy in UNITE
d. Portal http://pedagogy.unite-ist.org
e. Content development handbook
f. Quick guide to content development in UNITE

B - Implementing the scenario (9 questions)
The scenario as a whole is performed through several learning activities within a time span of -------------.
Approximately ------------- students took part in the scenario.
The scenario encompasses the use of UNITE platform and mobile devices. YES / NO

C - Validating the scenario (9 questions)
I have participated in the platform evaluation (scenario-based user testing). YES / NO
I would change something in the scenario (planning and/or implementation). YES / NO
I received positive feedback from students while / after using the UNITE platform. YES / NO

D - Evaluation of the pedagogical dimensions of the scenario (28 questions)
Please evaluate the scenario: rate (move the slider) and comment on its pedagogical dimensions.
User Activity: Some learning environments are primarily intended to enable learners to "access various representations of content". These are labeled as "mathemagenic" environments. Other learning environments, called "generative", engage learners in the process of creating, elaborating or representing knowledge. Generative learning environments are aligned most closely with constructivist pedagogy while "mathemagenic" environments are often based upon instructivist pedagogy, but this is not necessarily always obvious. The figure below illustrates this continuum. Please move the slider to rate the scenario:
User Activity: Mathemagenic ---------------------------------------------- Generative
Short comment on your rating.

Fig. 2. Pedagogical dimensions used to evaluate scenario outcomes.

Data were collected electronically, using a form created in MS Word, and processed with SPSS v.16, MS Excel and Orange Canvas.

3.3. Constructing indicators of systematicity

Before analyzing the data, an expert who also participated in the UNITE project was consulted in order to construct indicators of systematic e-learning management. Three indicators show whether the teachers were following the suggested process model while planning, implementing and controlling their particular e-learning scenario (see Table 6 for the results per individual and aggregate indicator; a computational sketch of the aggregation is given after the list):

1. An indicator of the systematic e-learning planning process was obtained as the sum of four separate sub-indicators (each measured by
five-point scales): (i) number of methodological principles underlying the e-learning scenario, (ii) the degree of instructor manuals/
documentation usage, (iii) degree of support usage and organized training participation, and (iv) time spent for scenario planning.
A constructed summary indicator of systematic e-learning planning could range from 4 to 20, with an average value of 12.60 for the total of 15 scenarios. Actual results ranged from 9 to 17.


Table 6
Individual and aggregate systematicity indicators, N = 15.

                                                                     Avg    St. Dev.  Kolmogorov-   Asymp. Sig.  Exact Sig.
                                                                                      Smirnov Z     (2-tailed)   (2-tailed)
Planning
  Number of methodological principles underlying the
  e-learning scenario                                                3.47   1.246
  Degree of instructor manuals/documentation usage                   3.00   1.254
  Degree of support usage and organized trainings attendance         3.33   .816
  Time spent for the scenario planning                               2.80   .862
  Summary indicator of planning systematicity                        12.60  2.444     .505          .961         .932

Implementation
  Time spent on platform preparation                                 2.53   .915      .981          .291         .246
  Time spent on learning content creation/authoring                  2.87   1.060     .748          .630         .565
  Subjective satisfaction with platform                              3.47   .516      1.357         .050*        .038*
  Subjective rating of own e-learning implementation                 3.67   .488      1.624         .010**       .007**
  Scenario complexity                                                4.04   .834      .654          .787         .726

Control
  Summary indicator of control systematicity                         4.33   1.113     1.261         .083         .065

* Significant at the .05 level (2-tailed).
** Significant at the .01 level (2-tailed).

2. Different indicators of systematic e-learning implementation were taken into account (on a scale comparable to the planning indicator):
(i) time spent on platform preparation, (ii) time spent on learning content creation/authoring, (iii) subjective satisfaction with platform
use and (iv) subjective rating of one's own e-learning scenario implementation.
3. A summary indicator of systematic e-learning control/evaluation is based on five index variables related to the feedback loop and two
possible improvement activities. Overall, this indicator variable is measured on a scale from 1 to 5, with an average value of 4.33.
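
The aggregation behind these indicators can be illustrated with a minimal sketch in Python (pandas). The column names and example responses below are hypothetical, since the paper's exact item coding is not reproduced here; only the sum-of-sub-indicators logic for the planning indicator (theoretical range 4 to 20) mirrors the text.

```python
import pandas as pd

# Hypothetical per-teacher responses, already coded to five-point scales.
responses = pd.DataFrame({
    "methodological_principles": [4, 3, 5, 2],  # sub-indicator (i)
    "manuals_usage":             [3, 2, 4, 3],  # sub-indicator (ii)
    "support_and_training":      [4, 3, 3, 4],  # sub-indicator (iii)
    "planning_time":             [3, 2, 4, 3],  # sub-indicator (iv)
})

# Summary indicator of planning systematicity: the sum of the four
# five-point sub-indicators, hence a theoretical range of 4 to 20.
responses["planning_systematicity"] = responses.sum(axis=1)
print(responses["planning_systematicity"].agg(["mean", "std", "min", "max"]))
```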

More specifically, where the number of methodological principles underlying the scenarios is concerned, teachers mostly (more than 66.6%) based them on more than three methodological principles (e.g. active learning, cooperative learning, research-based learning, etc.). This is in line with the objectives promoted by the project. Regarding the use of instructor manuals and associated documentation designed to help the planning process (i.e. use of a scenario template, the methodological guide and its accompanying brief version, pedagogical portal e-material, the content authoring manual and its accompanying brief version), they were used with medium intensity, with 10 out of 15 teachers either "reading" and/or "frequently using" the documents. Teachers were supported by local partners, usually university researchers from the field of educational sciences, and were invited to local and international seminar(s). Support and available additional education were well used, considering that only one teacher did not attend any organized seminars and was "very little" in contact with the local partner. Time spent on the planning process was expressed on the following scale: less than 2 h, from 2 to 10 h, from 10 to 20 h, from 20 to 40 h, and more than 40 h. The planning process was supposed to encompass activities like the formulation of learning objectives, definition of the methodological foundations, selection of resources and setting the time frame for the e-learning process. The authors of the paper supported teachers during planning, implementation and control activities, so it was possible to estimate the duration of related activities based on their own experience. It is surprising that none of the teachers spent more than 40 h on scenario planning and development (although some scenarios lasted over a month and covered the whole semester, while the scenario template requires every learning activity to be detailed). However, teachers understand the importance of planning, since more than half (eight teachers) spent between 10 and 20 h in this stage.
The same scale was used to determine the time spent on platform preparation, i.e. the time required to create a working space, add users, tasks, resources and the like. As expected, the majority of the teachers (86.7%) spent between 2 and 20 h to prepare the platform. Time spent on content development included preparation of materials for the learning repository, creating lessons, quizzes and the like. The majority of the teachers (60%) spent more than 10 h on these kinds of activities. One third, i.e. five teachers, spent more than 20 h on content creation. This period would have been even greater had the teachers not been encouraged to share content with each other and reuse existing material from the common repository. The average subjective rating of one's own implementation of e-learning (3.67) is slightly higher than the subjective evaluation of satisfaction with the platform (3.47). This is consistent with estimates obtained when measuring platform performance, where the subjective satisfaction of a larger population of teachers (23 in total) was estimated on a scale from 1 to 5, with an average rating of 3.39. The complexity of scenarios was based on the type of scenario (own or reused), the type of content used (own or reused) and the duration period (from a week to a couple of months). Almost a third of the scenarios (26.7%) are the most complex ones (factor 5.00), with 10 scenarios (66.7%) lasting more than a month.
Two-thirds of the teachers (66.7%) systematically controlled the process, i.e. participated in the platform evaluation, produced case record forms, and improved the scenario in the planning phase with activities suggested by experts and/or students.
To inspect whether or not the variables and their aggregates are normally distributed, the Kolmogorov–Smirnov test was used (see Table 6). Results indicate that the assumption of normality can be used for further analysis (Sheskin, 2007). Variables related to satisfaction are not normally distributed, since the teachers had a tendency to give mostly positive ratings to their implementation practice and the platform. Thus, the teachers either overestimated their satisfaction or did not provide completely reliable answers. Given that the sample was N = 15, it is not possible to use common statistical tests, which required the application of non-parametric statistical tests.
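
As an illustration, such a normality check can be reproduced with a one-sample Kolmogorov–Smirnov test in Python (scipy). The scores below are hypothetical values within the reported 9–17 range, not the study data, and SPSS's exact-significance computation may differ from scipy's asymptotic one.

```python
import numpy as np
from scipy import stats

# Hypothetical planning-systematicity scores for 15 scenarios (illustrative).
planning = np.array([9, 10, 11, 11, 12, 12, 12, 13, 13, 13, 14, 14, 15, 16, 17])

# One-sample Kolmogorov-Smirnov test against a normal distribution whose
# mean and standard deviation are estimated from the sample, as in SPSS's
# one-sample K-S procedure.
stat, p = stats.kstest(planning, "norm",
                       args=(planning.mean(), planning.std(ddof=1)))
print(f"K-S statistic = {stat:.3f}, p = {p:.3f}")  # p > .05: normality not rejected
```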

3.4. Measuring learning scenario outcomes

Data for the evaluation of the outcomes were collected using the same questionnaire, which included a section on outcome evaluation based on Reeves' (1994) methodology. Before estimating the outcomes of e-learning scenarios, teachers were asked to assess whether, in their own opinion, they achieved what they intended.
Table 7
Results of teachers' (T) and experts' (E) evaluations of pedagogical dimensions per scenario, N = 15.

Scenarios: 1. Nutrition and digestion; 2. Ecosystem; 3. Investigating Global Warming; 4. Historical heritage of Trbovlje; 5. Traffic Survey; 6. Environmental Studies fieldwork; 7. Wied Ghollieqa; 8. Ancient Agora of Athens; 9. Creating databases; 10. Visiting BAIT Expo 2007; 11. Guide of Trbovlje; 12. Youth crime; 13. Teenage Well-being; 14. Wonderful World of Inventions; 15. Travelling.

Dimension                       1     2     3     4     5     6     7     8     9     10    11    12    13    14    15    AVERAGE
1. Epistemology            T    8     5.5   2.6   12.8  9     8.7   10.8  10    2.7   2     11.7  6     10.5  10    7     7.82
                           E    6     9.7   8.6   8.2   8.8   9.7   10.5  13    3     7.8   8.4   11.4  12.8  13.4  10.1  9.43
2. Pedagogical philosophy  T    8.5   11.6  8.4   12.5  7     7.5   11.4  12.3  2.5   12.7  11.7  9     11.2  10.5  10.1  9.79
                           E    11    10    7.3   8.8   7.5   10.2  6.7   13    3     7.6   8.2   11.4  13    13.2  8.9   9.32
3. Underlying psychology   T    8.7   13    8.4   1.2   9     8.4   11.2  12.5  12.3  13.5  3     8.3   13    12.7  10    9.68
                           E    11    9.7   8.7   8.7   9.1   8.7   9     12.9  3.8   7.7   8.4   11.4  12    13.2  4     9.22
4. Goal orientation        T    8.2   8     2     12.5  3     3.3   5     7     .7    5.2   12.5  7     11    10.4  7.1   6.86
                           E    10    10.8  8.2   9.5   8.4   10.3  5.6   12.6  4     8.5   8.2   10    13    12.8  4.1   9.07
5. Experiential value      T    6     8.7   7     13.8  10    9.5   12    12.8  13.5  5.7   13.8  6     8.5   12.5  10.5  10
                           E    4.8   11.6  8.2   8.6   8     11.6  10.5  12.5  11.2  12    7.8   5     13    13.2  10.8  9.92
6. Teacher role            T    8.7   7.7   10.5  4.7   12.5  12    11.8  13.2  2     7.7   7     8.5   10.7  13.5  12.8  9.55
                           E    11    11.2  7.6   8     7.7   10.5  6.6   12.5  11.2  7.8   8.3   11.4  12.7  13    8.9   9.89
7. Program flexibility     T    7.3   4.5   12    13.8  4     4.4   11.2  6.4   7.2   7.5   13.8  3.7   8.5   11.5  13.5  8.62
                           E    6.8   9.8   9.6   6.3   9.7   9     7.9   10.6  7.2   7.8   6.8   7     12    11.6  6.8   8.59
8. Value of errors         T    6     10.2  12.3  13.8  12    11.5  11.7  12.5  10    6.3   13.8  8.5   9     10.5  13    10.7
                           E    11    10.2  9.2   8     10.3  11.6  9.7   12.5  3     10.8  8     3     12.4  13    11.3  9.6
9. Motivation              T    6     6.2   12.3  13.8  2.5   3     3     7     13.5  7.7   13.8  4.5   8.5   10    13    8.32
                           E    6.5   10.2  7.8   8.3   7.7   9     12    9.7   7     8     8.8   7     12    12.5  7.1   8.91
10. Accommodation of       T    5.5   4.7   12.3  13.8  3     3.3   9     12.2  10    5.4   13.8  4     8.5   10.7  7     8.21
    individual differences E    2.5   9.2   7     9.3   7     7     7.6   10.8  2.4   9     7     2.4   12.2  12    2.1   7.17
11. Learner control        T    8.2   7.3   12.6  12.3  2.7   2.7   8.5   10.8  9.5   9.5   11.5  8     8     10    7     8.57
                           E    4.1   8.7   7     8     6.8   9.6   6.8   8.5   4.5   7     8     4.5   12    13.3  3.7   7.5
12. User activity          T    8.2   8.7   13.1  12.8  7     7.2   9.8   2     2.7   8     13.8  8.5   12.5  10.7  5     8.67
                           E    3.9   12.4  10    8.7   10.2  11.2  9.6   8.2   11.3  13    8.4   4.2   12.8  12.6  3.6   9.34
13. Cooperative learning   T    9     10    1.5   13.8  10.2  9.8   9.8   12.7  7.3   10.5  13.8  8.5   10    12.5  11.5  10.1
                           E    3.6   10.5  9.5   12    9.6   9     8.2   11.2  3.6   10.2  12.4  3.8   12.2  12.6  4.1   8.83
14. Cultural sensitivity   T    3     13.8  7     1.3   10    10.4  7     7     .7    11    .5    .5    9     10    11    6.81
                           E    6.5   9.3   6.7   1     7     7     7     1     7.2   7.2   1     7     12.6  8.6   6.7   6.39

Scenario rating - teachers      7.24  8.56  8.71  10.9  7.28  7.26  9.44  9.89  6.76  8.05  11    6.5   9.92  11.1  9.89  8.84
Scenario rating - experts       7.05  10.2  8.24  8.1   8.41  9.6   8.41  10.6  5.89  8.89  7.84  7.11  12.5  12.5  6.59  8.8


Table 8
Results of the non-parametric Mann–Whitney and Wilcoxon tests, N = 15.

                        D01     D02     D03     D04     D05     D06     D07     D08     D09     D10     D11     D12     D13     D14
Mann–Whitney U          84.50   98.00   99.00   67.00   105.50  111.00  112.00  87.50   99.50   93.00   80.50   97.50   90.50   97.00
Wilcoxon W              204.50  218.00  219.00  187.00  225.50  231.00  232.00  207.50  219.50  213.00  200.50  217.50  210.50  217.00
Z                       1.162   .602    .561    1.889   .291    .062    .021    1.038   .540    .811    1.330   .622    .913    .648
Asymp. sig. (2-tailed)  .245    .547    .575    .059    .771    .950    .983    .299    .589    .417    .183    .534    .361    .517
Exact sig. (2-tailed)   .250    .567    .595    .061    .775    .967    1.000   .305    .595    .436    .187    .539    .367    .539

Out of 15 teachers, only one responded negatively, believing that his/her own vision had not been realized. Impact estimations were collected per pedagogical dimension, where the outcome of every dimension was qualitatively estimated by a slider provided as a part of the electronic questionnaire. When analyzing the results, qualitative estimates were converted to numerical values by measuring the distance of the marked positions on a scale from 0 to 14. To ensure the objectivity of the evaluations, they were also performed by four independent experts who used the same methodological approach. It is important to note that the experts were well acquainted with the e-learning scenarios being evaluated, since they were involved in two iterations of the scenario quality control prior to their implementation. In this way, comparable evaluation data have been produced for both populations. The observed dimensions or aspects are illustrated in Fig. 2, while the results of teacher and expert evaluations per scenario are given in Table 7.
A first glance reveals no significant differences between the assessments of teachers and experts, so it can be claimed that experts agree with the teachers, whose assessments of the e-learning outcomes were mainly positioned on the right side of the measurement scale and in accordance with project requirements and expectations. Although average values per scenario are similar, there are notable differences in the ratings of some scenarios' dimensions. While teachers had a tendency towards "extreme" ratings, experts were more "conservative"; thus the average ratings of scenarios made by experts are closer to the "middle" of the continuum, i.e. are neutral.
On average, in terms of epistemology, experts believe that the scenarios tend towards the constructivist theory of knowledge (9.43). From the perspective of pedagogical principles, the scenarios tend towards constructivist theory as well (9.32), and from the underlying psychological point of view, towards cognitive psychology (9.22). Regarding goal orientation, experts believe, contrary to teachers, that scenarios tend towards less focused goals and objectives (9.07). From the perspective of the value of practical experience, experts believe that scenarios result in specific, practical experience (9.92). Regarding the role of the teacher, the accent is on mentoring (9.89). From the perspective of flexibility, scenarios are flexible in the opinion of experts (8.59), while in terms of the value of errors, they tend towards experimental, i.e. research-based, learning (9.60). With regard to sources of motivation, scenarios encourage intrinsic motivation (8.91), while they are relatively neutral in fostering individuality (7.17). As far as the level of student control is concerned, experts believe that scenarios presume a certain level of control for students (7.50) and encourage students to generate their own learning materials (9.34). Scenarios also encourage cooperative learning (8.83), while in terms of cultural sensitivity, scenarios have no effect on increasing cultural sensitivity (6.39).
In order to analyze the differences between teachers' and experts' ratings more precisely, the variables were recoded to a relative scale from 1 to 5. The Kolmogorov–Smirnov test then confirmed that the distribution is normal. In order to examine the existence of significant differences between the same variables for the two groups of respondents, the non-parametric Mann–Whitney and Wilcoxon tests were used. Since all the values for exact significance (see Table 8) are greater than .05, it can be concluded that experts and teachers, in general, do not differ in their assessments of scenario outcomes for any dimension.
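
This comparison can be sketched in Python (scipy) using the epistemology ratings from Table 7 (teachers' row T, experts' row E). The equal-width recoding of the 0–14 continuum to a 1–5 scale is an assumption, since the paper does not spell out its exact recoding rule.

```python
import numpy as np
from scipy import stats

# Epistemology ratings per scenario, taken from Table 7.
teachers = np.array([8, 5.5, 2.6, 12.8, 9, 8.7, 10.8, 10, 2.7, 2, 11.7, 6, 10.5, 10, 7])
experts = np.array([6, 9.7, 8.6, 8.2, 8.8, 9.7, 10.5, 13, 3, 7.8, 8.4, 11.4, 12.8, 13.4, 10.1])

def recode(x):
    """Recode 0..14 slider distances to a relative 1..5 scale (equal-width bins)."""
    return np.clip(np.ceil(x / 14 * 5), 1, 5)

# Two-sided Mann-Whitney U test on the recoded ratings; a p-value above .05
# means the teacher and expert rating distributions do not differ significantly.
u, p = stats.mannwhitneyu(recode(teachers), recode(experts), alternative="two-sided")
print(f"Mann-Whitney U = {u:.1f}, p = {p:.3f}")
```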

4. Discussion: does systematic e-learning management lead to the expected e-learning performance?

Before examining the interdependence of e-learning scenario outcomes and systematic implementation of the model, the relationships between the various indicators describing the three stages of e-learning management have been analyzed. Their presence would imply that improving one aspect of process management helps to improve the others, thus leading to a "virtuous circle".

Table 9
Interdependence of systematic e-learning management indicators and scenario outcomes, N = 15.

Variables: (1) Summary indicator of planning systematicity; (2) Time spent on platform preparation; (3) Time spent on learning content creation/authoring; (4) Scenario time span; (5) Summary indicator of control systematicity; (6) Average scenario e-learning outcome score, teachers (T); (7) Average scenario e-learning outcome score, experts (E).

                        (1)     (2)     (3)       (4)       (5)     (6)       (7)
(1) Kendall tau_b       1.000   .328    -.469(*)  .051      .407    .394(*)   .315
    Sig. (2-tailed)     .       .137    .028      .818      .066    .046      .110
(2) Kendall tau_b               1.000   .612(**)  .667(**)  .303    .317      .341
    Sig. (2-tailed)             .       .009      .006      .212    .141      .113
(3) Kendall tau_b                       1.000     .473(*)   .431    -.496(*)  -.474(*)
    Sig. (2-tailed)                     .         .043      .065    .017      .022
(4) Kendall tau_b                                 1.000     .159    .135      .357
    Sig. (2-tailed)                               .         .514    .531      .098
(5) Kendall tau_b                                           1.000   .504(*)   .135
    Sig. (2-tailed)                                         .       .019      .531
(6) Kendall tau_b                                                   1.000     .276
    Sig. (2-tailed)                                                 .         .151
(7) Kendall tau_b                                                             1.000
    Sig. (2-tailed)                                                           .

* Correlation is significant at the .05 level (2-tailed).
** Correlation is significant at the .01 level (2-tailed).

Table 10
Relationship between the experts' e-learning scenario outcome evaluation and scenario subject, N = 15.

                              Value   df   Asymp. Sig.  Exact Sig.  Exact Sig.  Point
                                           (2-sided)    (2-sided)   (1-sided)   Probability
Pearson Chi-Square            7.050   6    .316         .343
Likelihood Ratio              8.685   6    .192         .366
Fisher's Exact Test           6.326                     .377
Linear-by-Linear Association  .035a   1    .852         1.000       .504        .147

a The standardized statistic is .187.

Spearman's rho correlation coefficient points to a high correlation of the planning summary indicator with its components: the summary indicator is affected the most by the number of methodological principles underlying the scenario (rho = .762). There is also a high and significant interdependence of the time invested in content development with the platform preparation period (rho = .716), as well as with the time span of the scenarios (rho = .735). Scenario complexity, on the other hand, is surprisingly not correlated with the time invested in its implementation, nor with the time span.
In order to analyze the first hypothesis, indicators of e-learning management were tested against the average scenario outcomes estimated by experts (E) and teachers (T), as shown in Table 9.
As a non-parametric measure of the correlation of ordinal variables, Kendall's tau-b indicator was used. Table 9 indicates a clear link between e-learning planning and its learning outcomes as perceived by teachers (tau-b = .394), but not experts. The same applies to e-learning process control (tau-b = .504). With regard to the organization/implementation of e-learning, there is a significant negative correlation between the time required to create the e-learning content and achieving the outcomes (tau-b = -.496). This can be explained by the fact that teachers are not yet skilled in e-learning content creation, where design and production take longer than authoring content for traditional forms of learning. Those teachers who reused content from the repository, or the whole scenario, achieved better outcomes, as opposed to those who spent a lot of time on new content development. Therefore, the hypothesized relationship between the systematic implementation of the e-learning management model's components and the achievement of targeted e-learning performance is considered to be empirically validated.
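
A minimal sketch of this correlation analysis in Python follows; scipy's kendalltau computes the tau-b variant by default, which corrects for the ties that coarse ordinal indicators inevitably produce. The planning scores below are hypothetical (only the teachers' average outcome scores are taken from Table 7), so the resulting coefficient is illustrative, not the reported tau-b = .394.

```python
import numpy as np
from scipy import stats

# Hypothetical planning-systematicity scores paired with the teachers'
# average scenario outcome scores from Table 7 (the pairing is illustrative).
planning = np.array([12, 9, 14, 16, 11, 10, 13, 15, 9, 12, 17, 10, 14, 16, 13])
outcomes = np.array([7.24, 8.56, 8.71, 10.9, 7.28, 7.26, 9.44, 9.89,
                     6.76, 8.05, 11, 6.5, 9.92, 11.1, 9.89])

# Kendall's tau-b with its two-sided significance.
tau, p = stats.kendalltau(planning, outcomes)
print(f"Kendall tau-b = {tau:.3f}, p = {p:.3f}")
```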
In order to analyze the hypothesized relationship between e-learning outcomes and scenario categories – (i) Biology and Physics, (ii) History and Geography, (iii) ICT, and (iv) Social sciences – a cross-tabulation of the two variables, followed by the Chi-square test, has been performed (see Table 10 for the relationship with expert evaluations).
At the 95% reliability level, it can be concluded that there is no relationship between e-learning outcomes and scenario subject, as per the expert evaluation. The same applies to the teachers' assessments of e-learning outcomes, which provides empirical evidence for the hypothesized independence of the e-learning performance from the subject of learning. As previously determined, the e-learning performance seems to depend primarily on the quality and systematic implementation of the e-learning process management model.
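
The independence check can be sketched as a cross-tabulation followed by a chi-square test. The contingency counts below are hypothetical (only the four subject categories and the procedure follow the paper), and with N = 15 the exact tests reported in Table 10 are more trustworthy than the asymptotic chi-square.

```python
import pandas as pd
from scipy import stats

# Hypothetical cross-tabulation of scenario subject category against a
# binned outcome level; counts are illustrative and sum to N = 15.
table = pd.DataFrame(
    {"low": [1, 1, 0, 1], "medium": [2, 1, 2, 1], "high": [2, 1, 1, 2]},
    index=["Biology and Physics", "History and Geography", "ICT", "Social sciences"],
)

# Pearson chi-square test of independence; df = (4-1)*(3-1) = 6, matching
# Table 10. Small expected counts are why the paper also reports exact tests.
chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"Chi-square = {chi2:.3f}, df = {dof}, p = {p:.3f}")
```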

5. Conclusion

E-learning is a highly multidisciplinary field, but it should be noted that research in education science, on the one hand, and in specific information technology applications, on the other, still dominates this area. The technological challenges of the e-learning process are basically much easier to solve than the organizational and methodological ones. Educational institutions must adapt themselves by designing and managing e-learning processes that provide quick, targeted, inexpensive and highly flexible information delivery to their users, and thus help in developing their skills and competencies.
Despite the large nominal popularity of and support for introducing e-learning in different contexts, and numerous implementation experiences, there is still very little empirical research related to managing the entire process of e-learning so as to ensure the use of its full technological potential. Therefore, this study concentrated primarily on the formulation of such a model, developing measures and ways of model implementation and assessment, and additionally testing, empirically, the relationships of both the systematic implementation of the e-learning management model and the e-learning subject with the e-learning performance.
It is hoped that the initial empirical evidence, demonstrating both the existence of a link between systematic e-learning management and e-learning performance, as well as the independence of e-learning performance from the subject of e-learning, provides educational institutions and other actors in the e-learning arena with an incentive to address more thoroughly the methodological and organizational issues of their e-learning initiatives. The message is clear: whatever one tries to teach, the success of e-learning seems to be determined by the systematic management of the process, rather than by the price tag and sophistication of the technology used.

References

Anderson, T. (2007). Towards online learning theory. Croatian Academic and Research Network Magazine – Edupoint, 7(51), 16–31 (in Croatian).
Angelo, T. A. (1996). Seven shifts and seven levers: developing more productive learning communities. The National Teaching and Learning Forum, 6(1), 1–4.
Bahtijarević-Šiber, F. (1999). Human resource management (pp. 717–772). Zagreb: Golden Marketing (in Croatian).
Beetham, H. (2004). Draft template for describing a unit of (e)learning. Designing for learning theme of the e-learning and pedagogy programme of the Joint Information Systems Committee (JISC).
Boer, W., & Collis, B. (2002). A changing pedagogy in e-learning: from acquisition to contribution. Journal of Computing in Higher Education, 13(2), 87–101. doi:10.1007/BF02940967.
Brox, C. (2009). A business model for the exchange of e-learning courses in an international network. In M. Stansfield & T. Connolly (Eds.), Institutional transformation through best practices in virtual campus development: Advancing e-learning policies (pp. 269–288). Hershey, PA: Information Science Reference.
Conole, G., & Oliver, M. (1998). A pedagogical framework for embedding C&IT into the curriculum. ALT-J, 6(2), 4–16. doi:10.1080/0968776980060202.
Ćukušić, M. (2009). Managing the e-learning process in elementary education. Master's thesis (in Croatian).



Cuku si
c, M., Granic, A., Mifsud, C., Pagden, A., Walker, R., & Zammit, M. (2007). National and school specifics as a prerequisite for the successful design of an e-learning system:
the UNITE approach. In B. Aurer, & M. Ba ca (Eds.), Proceedings of the 18th IIS 2007 (pp. 85e92). Vara zdin: FOI.

Ćukušić, M., Granić, A., Mifsud, C., & Zammit, M. (2008). Deliverable D4.3: pedagogical framework implementation report (final). UNITE report. Retrieved 16.10.08 from www.unite-ist.org.
Ćukušić, M., Granić, A., Pagden, A., Walker, R., & Zammit, M. (2006). UNITE deliverable D4.1: UNITE pedagogical framework design.
Divjak, B., & Begičević, N. (2006). Imaginative acquisition of knowledge – strategic planning of e-learning. In Proceedings of the 28th international conference on information technology interfaces. Dubrovnik, Croatia: University Computing Centre SRCE, University of Zagreb.
Downie, S., Wunner, R., & MacRae, N. (2008). Deliverable D10.2: exploitation plan: final version. UNITE report.
Earle, A. (2002). Designing for pedagogical flexibility – experiences from the CANDLE project. Journal of Interactive Media in Education, 4.
European Council. (2009). Council conclusions on a strategic framework for European cooperation in education and training (‘ET 2020’). Official Journal of the European Union, 2009/C 119/02. Retrieved 16.11.09 from http://eur-lex.europa.eu.
Garača, Ž. (2009). ERP systems. Split: University of Split. (in Croatian).
Goodstein, L., Nolan, T., & Pfeiffer, J. W. (1993). Applied strategic planning: How to develop a plan that really works. McGraw-Hill Professional Publishing.
Granić, A., & Ćukušić, M. Complementing usability and educational evaluation: a case study of an e-learning platform. Educational Technology & Society, submitted for publication.
Granić, A., Ćukušić, M., Tzanavari, A., & Papadopoulos, G. A. (2009). Employing innovative learning strategies using an e-learning platform. In C. Mourlas, N. Tsianos, & P. Germanakos (Eds.), Cognitive and emotional processes in web-based education: Integrating human factors and personalization (pp. 414–436). Hershey, PA, USA: Information Science Reference.
Granić, A., Mifsud, C., & Ćukušić, M. (2009). Design, implementation and validation of a Europe-wide pedagogical framework for e-learning. Computers & Education, 53(4), 1052–1081. doi:10.1016/j.compedu.2009.05.018.
Hayes, D. (2010). Encyclopedia of primary education. Abingdon: Routledge.
Horton, W., & Horton, K. (2003). E-learning tools and technologies: A consumer's guide for trainers, teachers, educators and instructional designers. Wiley & Sons.
Kellner, A., Teichert, V., Hagemann, M., Ćukušić, M., Granić, A., Pagden, A., et al. (2008). Deliverable D7.2: report of validation results. UNITE report. Retrieved 11.05.09 from www.unite-ist.org.
Kuechler, T., & Schmidt, K. (2007). From e-learning to integrated learning architectures: a novel approach to learning management in corporate and higher education contexts. Management & Marketing, 2(4), 27–34.
Kynigos, C. (1995). We should not miss the chance: educational technology as a means of expression and observation in general education. In A. Kazamias & M. Kasotakis (Eds.), Greek education, perspectives of reformulation and modernization (pp. 396–416).
Leonard, C. (2003). Pedagogical principles: paradigms, or platitudes. In D. Lassner & C. McNaught (Eds.), Proceedings of world conference on educational multimedia, hypermedia and telecommunications (pp. 2142–2143). Chesapeake, VA: AACE.
Lewis, R., & Merton, B. (1996). Technology for learning: Where are we going? Independent learning unit position paper. University of Lincoln and Humberside.
Limanauskiene, V., Romanova, G., Sitikovs, V., & Štuikys, V. (2008). Network of schools as a general framework for validation of the UNITE project outcomes. In ICALT'08, eighth IEEE international conference on advanced learning technologies, July 1–5, 2008, Santander, Cantabria, Spain (pp. 669–670).
Limanauskiene, V., Štuikys, V., & Sitikovs, V. (2008). Deliverable D6.2: content-enriched network of schools installed. UNITE report. Retrieved 11.05.09 from www.unite-ist.org.
Moen, R., & Norman, C. (2006). Evolution of the PDSA cycle. Retrieved 11.05.09 from http://deming.ces.clemson.edu/pub/den/deming_pdsa.htm.
Nikolova, I., & Tzanavari, A. (2006). Deliverable 1: state-of-the-art in e-learning and user requirements. UNITE report. Retrieved 11.05.09 from www.unite-ist.org.
Phipps, L., & Kelly, B. (2006). Holistic approaches to e-learning accessibility. ALT-J, 14(1), 69–78.
Popova, A., Vehovar, V., Pusnik, V., Ćukušić, M., & Kellner, A. (2008). Deliverable D8.2: Socio-economic evaluation methodology and final report for UNITE prototype. UNITE report. Retrieved 03.10.08 from www.unite-ist.org.
Quinton, S., Dreher, H., Fisher, D., & Houghton, P. (2005). Harnessing technology to empower mature age learners. In D. Fisher, D. Zandvliet, I. Gaynor, & R. Koul (Eds.), Fourth international conference on science, mathematics and technology education (pp. 490–499). Victoria, British Columbia, Canada: Curtin University of Technology.
Reeves, T. C. (1994). Evaluating what really matters in computer-based education. In M. Wild & D. Kirkpatrick (Eds.), Computer education: New perspectives (pp. 219–246). Perth, Australia: MASTEC.
Reeves, T. C., Benson, L., Elliot, D., Grant, M., Holschuh, D., Kim, B., et al. (2002). Usability and instructional design heuristics for e-learning evaluation. Paper presented at the annual conference on educational multimedia, hypermedia & telecommunications, ED-MEDIA 2002, June 24–29, Denver, USA.
Salajan, F. D. (2007). The European elearning programme(s): between rhetoric and reality. European Educational Research Journal, 6(4), 364–381.
Sanchez, J., & Salinas, A. (2008). ICT and learning in Chilean schools: lessons learned. Computers & Education, 51(4), 1621–1633. doi:10.1016/j.compedu.2008.04.001.
Schönwald, I. (2003). Sustainable implementation of e-learning as a change process at universities. Presented at Online Educa Berlin 2003, Berlin. In P. Boezerooy & P. Gorissen (Eds.) (2004), Dutch e-learning in Europe: ICT and education (Report). Utrecht, The Netherlands: SURF Foundation. Retrieved 08.02.10 from http://www.surffoundation.nl/SFDocuments/e-learning.pdf.
Selim, H. M. (2007). Critical success factors for e-learning acceptance: confirmatory factor models. Computers & Education, 49(2), 396–413. doi:10.1016/j.compedu.2005.09.004.
Sheskin, D. (2007). Handbook of parametric and nonparametric statistical procedures (4th ed.). Boca Raton, FL: Chapman & Hall/CRC.
Sprent, P., & Smeeton, N. C. (2001). Applied nonparametric statistical methods (3rd ed.). Boca Raton, FL: Chapman & Hall/CRC.
Stansfield, M., & Connolly, T. (2009). Institutional transformation through best practices in virtual campus development: Advancing e-learning policies. Hershey, PA: Information
Science Reference.
Tzanavari, A. (2007). Deliverable D5.2: 1st version of specific UNITE eLearning scenarios; PART II: Handbook for content development v.2. UNITE report. Retrieved 16.10.08 from www.unite-ist.org.
Zoakou, A., Ćukušić, M., Dechau, J., Kellner, A., Limanauskiene, V., Nikolova, N., et al. (2008). Deliverable D5.3: Final version of the specific UNITE eLearning scenarios (part II). UNITE report. Retrieved 16.10.08 from www.unite-ist.org.
Zoakou, A., Tzanavari, A., Papadopoulos, G. A., & Sotiriou, S. (2007). A methodology for eLearning scenario development: the UNITE approach. In Proceedings of the ECEL 2007 European conference on e-learning (pp. 683–692). Copenhagen, Denmark: ACL Publications.
