1 INTRODUCTION
Nowadays it is generally accepted that the use of computers and information technology can greatly improve the learning process. Adaptive Educational Hypermedia Systems (AEHS) are being developed more and more [1], trying to help the student-user obtain the needed knowledge in the most efficient way. Studies have shown [2],[3] that the emotional state of the user while attending a course plays a significant role in the effectiveness of the learning process. For example, if the presentation of a lesson causes a sense of boredom, the student will not be in a condition to attend the course and the time passes by ineffectively. Other relevant emotions are confusion, frustration and fatigue [4]. Consequently, the ability of the system to detect the student's condition is very significant, as the system can then adjust properly and provide stimuli that change the student's disposition and rekindle his interest. In this work we propose a method for detecting the sense of boredom based on the hand motions of the student and, by extension, the mouse movements. Previous works [5],[6],[7] attempt to detect the emotional state through several devices which monitor the student, such as cameras, movement detectors and other special computer peripherals (pointing devices). The disadvantage of such devices is that they are not available to every user who attends a distance-learning course from a home computer. The mouse, on the other hand, is a widespread device and its observation is relatively easy.
G. Tsoulouhas, D. Georgiou and A. Karakos are with the Democritus University of Thrace, Xanthi, GR 67100.
JOURNAL OF COMPUTING, VOLUME 3, ISSUE 11, NOVEMBER 2011, ISSN 2151-9617 HTTPS://SITES.GOOGLE.COM/SITE/JOURNALOFCOMPUTING WWW.JOURNALOFCOMPUTING.ORG
cameras, movement and distance sensors. The use of these devices is effective, but they come at some cost and cannot be expected to exist in every computer or e-learning environment. There is therefore a need for ways to monitor the emotional condition of a student via devices which are widespread and of low cost, such as the keyboard and the mouse, two devices that exist on every computer.
In order to show that the movements of the mouse can be used to detect the boredom effect, we collected data from an experiment with 136 students. For this experiment the students had to attend a 45-minute session of a concrete course, whose subject was Computer Programming Techniques and which consisted of 7 different learning objects. During the session the movements of the mouse were recorded, along with the intervals of mouse immobility. If the mouse stopped moving for a certain time interval, the students were asked to answer the question "Are you bored?" with a single Yes or No. This time span was determined by a constant "b" which we named the boredom threshold. We separated the 136 students into 4 smaller groups, and for each group the parameter "b" had a different value: 10 sec, 20 sec, 30 sec or 40 sec. Our purpose was to see whether the movements of the mouse and the intervals of its immobility are related to the boredom effect. The experiment was performed on computers with the same hardware and on screens with the same resolution for every student.
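The pause-detection step can be sketched as follows; this is our own hypothetical helper for illustration, not code from the paper's system, and the threshold handling is an assumption based on the description above.

```python
# Hypothetical sketch: finding mouse pauses that reach the boredom
# threshold "b" in a stream of movement timestamps.

def pauses_exceeding_threshold(timestamps_ms, b_sec):
    """Return the lengths (in seconds) of mouse pauses lasting at least b_sec.

    timestamps_ms: times (in milliseconds) at which the mouse was seen moving.
    """
    b_ms = b_sec * 1000
    pauses = []
    for prev, curr in zip(timestamps_ms, timestamps_ms[1:]):
        gap = curr - prev
        if gap >= b_ms:          # the mouse stood still for at least b seconds
            pauses.append(gap / 1000.0)
    return pauses

# A user moves at t = 0 s and 2 s, pauses 25 s, then moves at 27 s and 28 s.
moves = [0, 2000, 27000, 28000]
print(pauses_exceeding_threshold(moves, 20))  # -> [25.0]
```

In the experiment, each detected pause of at least b seconds would trigger the "Are you bored?" prompt once the mouse moves again.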
pre-process and store the data into a MySQL database [40]. Fig. 3 shows the model of the architecture. Generally, the monitoring of the mouse pointer movements on a page can be done with code written in JavaScript [41]. JavaScript provides events for every action of the mouse, such as onmouseover, onmousemove, onmouseup, onmousedown and onmouseout. With the onmousemove event (Fig. 4) we can detect any movement of the mouse; this event fires every time the mouse moves, even by a single pixel. We wrote a function which is called every time the onmousemove event occurs. This function records the time at which the movement took place and the coordinates of the mouse pointer, and stores them in an array. For as long as the mouse keeps moving (continuous movement), its position is sampled every 8 msec and also stored in the same array. The data of the array are sent to the server periodically (for example every 5 sec), and every time the data are sent the array is emptied. In this way we avoid accumulating large quantities of data on the user's computer, which could slow down the system. The collected data are stored, as previously mentioned, in a MySQL database. The database contains, in separate relational tables, the following data (as seen in the database schema of Fig. 5): the users of the application; the learning objects of the lesson; the movements of the mouse (movement times and x, y coordinates), connected with the corresponding learning object and user; and the answers of the users to the question "Are you bored?", connected with the corresponding last movement of the mouse. Every time the user keeps the mouse immobilized for a period equal to or longer than the value of parameter "b", then the next time the mouse moves (that is, after b+x sec) a pop-up dialog is activated which asks the user to answer
Fig.3. The Ajax model of communication between the client and the server [39].
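The record-and-flush buffering scheme described above (store every move event locally, send and empty the array periodically) can be sketched language-independently. The real implementation is JavaScript on the client with Ajax calls to a PHP/MySQL server; the class and method names below are our own hypothetical Python illustration of the same logic.

```python
# Illustrative sketch of the client-side buffering scheme; names are ours.

class MouseBuffer:
    def __init__(self):
        self.samples = []          # (time_ms, x, y) tuples

    def on_mouse_move(self, time_ms, x, y):
        # Called on every mouse-move event; during continuous movement
        # the position is sampled roughly every 8 ms.
        self.samples.append((time_ms, x, y))

    def flush(self):
        # Called periodically (e.g. every 5 s): hand over the batch and
        # empty the local array so data never accumulates on the client.
        batch, self.samples = self.samples, []
        return batch               # in the real system: POST to the server

buf = MouseBuffer()
buf.on_mouse_move(0, 10, 10)
buf.on_mouse_move(8, 12, 11)
batch = buf.flush()
print(len(batch), len(buf.samples))  # -> 2 0
```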
with a single "Yes" or a single "No" to the question "Are you bored?". This means that the pop-up dialog is not activated every b seconds, but every b+x seconds, where x is apparently not constant. We do this so as not to bother the user during the pauses of the mouse and not to influence the duration of the pause. Each row contains the data corresponding to a specific position of the mouse pointer. The difference between end_time and start_time is equivalent to the inactivity time of the mouse; a difference of less than 500 ms indicates continuous movement (no inactivity). Each of the 8 movement directions represents mouse movements performed within a 45-degree area. For instance, direction number 1 represents all actions performed with angles between 0 and 45 degrees, whereas direction number 2 covers all actions performed between 45 and 90 degrees. Table 1 shows an example of the raw data recorded in the database. Besides the timestamps, the coordinates and the directions, the table contains the id of the session-user to whom the concrete record corresponds, as well as the learning object id, which identifies the learning object upon which the mouse was moving. Concerning the learning objects, what interests us is the type description of each learning object. For our application, as previously mentioned, we used 7 different learning objects, to which, depending on their content, we gave the following descriptions (Learning Object Types - LOT):
Fig.6. Mouse movement directions. For instance, direction number 1 represents all actions performed with angles between 0 and 45 degrees.
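Assuming standard mathematical angle conventions (the paper does not specify the orientation of its y-axis), the mapping from a movement vector to one of the 8 sectors of Fig. 6 can be sketched as:

```python
import math

# Sketch (our own helper, not from the paper): map a movement vector to one
# of the 8 direction sectors, each covering 45 degrees.
def movement_direction(dx, dy):
    angle = math.degrees(math.atan2(dy, dx)) % 360   # normalized to 0..360
    return int(angle // 45) + 1                      # sectors numbered 1..8

print(movement_direction(10, 3))   # about 16.7 degrees -> direction 1
print(movement_direction(3, 10))   # about 73.3 degrees -> direction 2
```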
1. medium text with images (mti)
2. short text (st)
3. short text with images (sti)
4. long text with images (lti)
5. video (vd)
6. multiple choice questions (mcq)
7. exercise (exr)

From the 136 participants we collected about 1,800,000 records of raw data.
In the following, we use diagrams to show the relation between these metrics and the appearance of the boredom effect in the users-learners.
4 PROPOSED METRICS
4.1 Metrics
Fig.7. Percentage of users who claimed boredom per learning object type. The difference between different learning objects is obvious.
4.2 Learning Object Type (LOT)
Total Average Movement Speed (TMS)
Latest Average Speed Before Asked (LMS)
Mouse Inactivity Occurrences Before Asked (MIN)
Average Duration of Mouse Inactivity Before Asked (DMIN)
Horizontal Movements to Total Movements Ratio (HRZ)
Vertical Movements to Total Movements Ratio (VRT)
Diagonal Movements to Total Movements Ratio (DGNL)
Average Movement Speed per Movement Direction (MDA)
we have a small percentage of boredom, while for the learning objects with longer text (mti, lti) the percentages are quite high.
4.3. Total Average Movement Speed and Latest Average Movement Speed
The speed of the mouse movement is a metric which can help in the detection of boredom. The Total Average Movement Speed (TMS) corresponds to the average speed of movement over the whole duration of monitoring the learning object, while the Latest Average Speed (LMS)
Fig.8. Users 1 and 2 claimed boredom. There is a significant difference between TMS and LMS for users 1 and 2, but not for users 3 and 4.
is the average speed of the movement in the last 60 seconds before the user is asked if bored. We noticed that TMS does not differ much from LMS for users who did not report boredom, whereas for users who reported boredom the two values differ significantly. In the diagram of Fig. 8 we see this difference in the two speeds for 4 different users: 2 who reported boredom (user 1 and user 2) and 2 who reported no boredom (user 3 and user 4). In the classification process we use the ratio TMS/LMS, since it expresses the proportion between these values.
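A sketch of how TMS, LMS and their ratio could be computed from the recorded samples; the function names and the exact speed definition (distance travelled divided by elapsed time, in pixels per second) are our assumptions, not the paper's code.

```python
# Sketch of the TMS / LMS computation from (time_ms, x, y) samples.

def average_speed(samples):
    """samples: list of (time_ms, x, y), ordered by time."""
    dist = sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
               for (t1, x1, y1), (t2, x2, y2) in zip(samples, samples[1:]))
    elapsed = (samples[-1][0] - samples[0][0]) / 1000.0
    return dist / elapsed if elapsed > 0 else 0.0

def tms_lms_ratio(samples, asked_at_ms, window_ms=60000):
    tms = average_speed(samples)                    # whole learning object
    recent = [s for s in samples if s[0] >= asked_at_ms - window_ms]
    lms = average_speed(recent)                     # window before being asked
    return tms / lms if lms else None

samples = [(0, 0, 0), (1000, 300, 0), (2000, 400, 0)]
print(tms_lms_ratio(samples, 2000, window_ms=1000))  # -> 2.0
```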
Fig.10. How distributions of eye fixation time and clickthrough relate to distribution of mouse hovering time, for regions like text regions or image regions [45].
Fig.11. Distribution of the 8 mouse movement directions from all the 136 participants.
Fig.9. Average values of the MIN and DMIN metrics, for bored and non-bored users.
In the diagram of Fig. 9 we see the averages of the metrics MIN and DMIN for all the users who answered "Yes" to the question "Are you bored?" and for all those who answered "No".
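A minimal sketch of the MIN and DMIN computation, assuming (as the text on inactivity suggests) that gaps shorter than 500 ms count as continuous movement; the helper name is ours.

```python
# Sketch: MIN counts the inactivity occurrences before the user is asked,
# DMIN averages their durations (in ms). Gaps under 500 ms are treated as
# continuous movement, per the text.
def min_dmin(timestamps_ms):
    gaps = [t2 - t1 for t1, t2 in zip(timestamps_ms, timestamps_ms[1:])
            if t2 - t1 >= 500]
    if not gaps:
        return 0, 0.0
    return len(gaps), sum(gaps) / len(gaps)

# Two pauses (1.5 s and 2.5 s) among otherwise continuous movement:
ts = [0, 8, 16, 1516, 1524, 4024]
print(min_dmin(ts))  # -> (2, 2000.0)
```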
reported boredom. To be more specific, an increase of the vertical and diagonal movements with respect to the horizontal movements was observed. Fig. 12 shows the histograms of movement directions for 6 different users: users 4, 5 and 6 in Fig. 12b claimed boredom, while users 1, 2 and 3 in Fig. 12a did not. For the users of Fig. 12b we observe an increase of directions 1 and 5 with respect to directions 3 and 7. So for the metrics HRZ, VRT and DGNL we can say that they are connected with the user's behavior.
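The three ratios can be sketched as follows. Which sectors count as horizontal, vertical or diagonal is a modelling choice the paper leaves implicit, so the default sets below are our assumptions for illustration only.

```python
from collections import Counter

# Sketch: percentage of movements per direction class (HRZ, VRT, DGNL).
# The sector sets are hypothetical; adjust them to the paper's convention.
def direction_ratios(directions,
                     horizontal={1, 5}, vertical={3, 7},
                     diagonal={2, 4, 6, 8}):
    counts = Counter(directions)
    total = len(directions)
    ratio = lambda s: 100.0 * sum(counts[d] for d in s) / total
    return ratio(horizontal), ratio(vertical), ratio(diagonal)

hrz, vrt, dgnl = direction_ratios([1, 1, 3, 2, 4, 5, 7, 8, 6, 1])
print(round(hrz), round(vrt), round(dgnl))  # -> 40 20 40
```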
Each row contains values of the proposed metrics from 2 different users (the user with session id = 102 and the user with session id = 103). For example, user 102 declares boredom (bored = yes) on the learning object lti. A description of the metrics appears in Table 3. We imported the data (for each group separately) into the Weka software and applied the J48 classifier, with the following properties: confidenceFactor: 0.9, minNumObj: 1, numFolds: 3. In order to evaluate the results of the classifier, we separated the data into 2 sets in a 60:40 proportion. We used the small set to train the classifier and the big set to evaluate it. For all 4 groups the classifier managed to classify correctly more than 90% of the data. More specifically, the results of each group are:
Fig.12. (a) Histograms of the directions of movement for 3 users who did not claim boredom. (b) Histograms of the directions of movement for 3 users who claimed boredom. The increase of the values of directions 1 and 5 is obvious.
Fig.13. Distribution of the MDA for each mouse movement direction. The values are the average values of MDA from all 136 participants.
1. For b = 10:
Correctly Classified Instances: 423 (97.2414 %)
Incorrectly Classified Instances: 12 (2.7586 %)
Kappa statistic: 0.9266
Mean absolute error: 0.0436
Root mean squared error: 0.1647
Relative absolute error: 11.6371 %
Root relative squared error: 37.3886 %
Total Number of Instances: 435 (60% of 725)
Confusion Matrix:
   a    b   <-- classified as
 103   11 | a = yes (bored)
   1  320 | b = no (non-bored)

2. For b = 20:
Correctly Classified Instances: 246 (95.3488 %)
Incorrectly Classified Instances: 12 (4.6512 %)
Kappa statistic: 0.873
Mean absolute error: 0.0717
Root mean squared error: 0.2101
Relative absolute error: 19.1182 %
Root relative squared error: 47.6024 %
Total Number of Instances: 258 (60% of 430)
Confusion Matrix:
   a    b   <-- classified as
  56   12 | a = yes (bored)
   0  190 | b = no (non-bored)

3. For b = 30:
Correctly Classified Instances: 178 (93.1937 %)
Incorrectly Classified Instances: 13 (6.8063 %)
Kappa statistic: 0.8033
Mean absolute error: 0.0752
Root mean squared error: 0.2596
Relative absolute error: 21.5768 %
Root relative squared error: 62.1531 %
Total Number of Instances: 191 (60% of 318)
Confusion Matrix:
   a    b   <-- classified as
 629 | a = yes (bored)
 1246 | b = no (non-bored)

4. For b = 40:
Correctly Classified Instances: 104 (88.1356 %)
Incorrectly Classified Instances: 14 (11.8644 %)
Kappa statistic: 0.6449
Mean absolute error: 0.1186
Root mean squared error: 0.3444
Relative absolute error: 30.5677 %
Root relative squared error: 81.6108 %
Total Number of Instances: 118 (60% of 196)
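As a sanity check, accuracy and the Kappa statistic can be recomputed from a 2x2 confusion matrix; reading the b = 10 matrix as [[103, 11], [1, 320]] (our reconstruction of the extraction-damaged rows) reproduces the reported 97.2414 % and 0.9266.

```python
# Recomputing accuracy and Cohen's kappa from a 2x2 confusion matrix.
# m[i][j] = number of instances of true class i classified as class j.
def accuracy_and_kappa(m):
    total = sum(sum(row) for row in m)
    po = (m[0][0] + m[1][1]) / total                     # observed agreement
    pe = ((m[0][0] + m[0][1]) * (m[0][0] + m[1][0]) +    # chance agreement:
          (m[1][0] + m[1][1]) * (m[0][1] + m[1][1])) / total ** 2
    return po, (po - pe) / (1 - pe)

acc, kappa = accuracy_and_kappa([[103, 11], [1, 320]])   # b = 10 results
print(round(acc * 100, 4), round(kappa, 4))  # -> 97.2414 0.9266
```

The same check on the b = 20 matrix [[56, 12], [0, 190]] reproduces the reported Kappa of 0.873.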
Fig. 14 and Fig. 15 show an example of one of the classifier trees which were created; Fig. 14 shows a visual representation and Fig. 15 a text representation of the tree. The classifier gives concrete values of the metrics from which to conclude whether the user displays boredom or not. An example of a condition which leads to boredom, according to Fig. 14 and Fig. 15, is the following: IF MDA2 <= 16.96 AND MIN <= 7 AND VRT > 14.93 AND MDA7 <= 8.91 THEN bored = yes. In order to get a better view of the values of the metrics which can help us detect the emotion of boredom in the student-user, we performed a clustering of the data. The algorithm we used for the clustering is SimpleKMeans, with Euclidean distance and 2 clusters, since our goal was to separate the instances into 2 clusters, one for bored users and one for non-bored users. Fig. 16 shows the results of the clustering and Fig. 17 shows a visualization of the results. Except for the metric LOT, whose values are discrete and cannot be numerical, the values of the other metrics which arise from the clustering agree almost completely with the findings mentioned in paragraph 4. For example, the value of the metric TMS/LMS for the 1st cluster (bored = yes) is 56.78%, which shows a decrease of TMS with respect to LMS, while for the 2nd cluster (bored = no) it is 78%, which indicates that TMS and LMS do not differ much. If, in the properties of SimpleKMeans, we change the number of produced clusters from 2 to some greater number (for example 7), we will be able to create clusters for each value of the metric LOT.
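The tree-derived condition quoted above can be written directly as a predicate over the metrics; the thresholds are those of Fig. 14 and Fig. 15, while the function wrapper itself is our illustration.

```python
# The boredom rule extracted from the J48 tree (Fig. 14 / Fig. 15):
# IF MDA2 <= 16.96 AND MIN <= 7 AND VRT > 14.93 AND MDA7 <= 8.91
# THEN bored = yes
def bored_by_rule(mda2, min_count, vrt, mda7):
    return mda2 <= 16.96 and min_count <= 7 and vrt > 14.93 and mda7 <= 8.91

print(bored_by_rule(mda2=12.0, min_count=5, vrt=20.0, mda7=4.0))  # -> True
print(bored_by_rule(mda2=25.0, min_count=5, vrt=20.0, mda7=4.0))  # -> False
```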
to detect if a certain student shows boredom. We showed that, for a particular group of users and with specific computer hardware, the C4.5 classifier can classify the metrics relevant to boredom with a success rate above 90%. The advantage of this methodology is that, without the use of special devices and only by the analysis of the mouse movements, we can monitor the affective state of the learner. As future work we plan to automate the whole procedure; that is, we are developing a plug-in tool to automate the data pre-processing and classification steps. This tool, according to the classification results, will be able to predict the emotional state of the learner and then inform the AEHS so that it redesigns the flow of the lesson if necessary.
ACKNOWLEDGMENT
The authors wish to thank the students of Democritus University of Thrace who were kind enough to participate in the experiments of this work.
REFERENCES
[1] S. Chatzisavva, G. Tsoulouhas, A. Georgiadou, A. Karakos, "A dynamic environment for distance learning," Proc. 2nd International Conference on Computer Supported Education, pp. 398-401, 2010.
[2] M. Spering, D. Wagener, J. Funke, "The role of emotions in complex problem solving," Cognition and Emotion, vol. 19, 2005.
[3] B. Kort, R. Reilly, R. W. Picard, "An affective model of interplay between emotions and learning: Reengineering educational pedagogy-building a learning companion," Proc. IEEE International Conference on Advanced Learning Technologies, pp. 43-46, 2001.
[4] R. Reilly, B. Kort, "The Science Behind The Art of Teaching Science: Emotional State and Learning," Proc. Society for Information Technology & Teacher Education International Conference, pp. 3021-3026, 2004.
[5] R. W. Picard, "Toward computers that recognize and respond to user emotion," IBM Systems Journal, vol. 39, no. 3.4, pp. 705-719, 2000.
[6] M. Pantic, L. J. M. Rothkrantz, "Toward an Affect-Sensitive Multimodal Human-Computer Interaction," Proceedings of the IEEE, vol. 91, no. 9, pp. 1370-1390, Sept. 2003.
[7] S. D'Mello, T. Jackson, S. Craig, B. Morgan, P. Chipman, H. White, N. Person, B. Kort, R. el Kaliouby, R. Picard, A. Graesser, "AutoTutor detects and responds to learners' affective and cognitive states," Proc. Workshop on Emotional and Cognitive Issues in ITS, held in conjunction with the Ninth International Conference on Intelligent Tutoring Systems, pp. 31-43, 2008.
[8] A. Damasio, Descartes' Error: Emotion, Reason and the Human Brain, New York: Penguin Books, pp. 165-201, 2005.
[9] C. Frasson, P. Chalfoun, "Managing Learners' Affective States in Intelligent Tutoring Systems," Advances in Intelligent Tutoring Systems, vol. 308, pp. 339-358, 2010.
[10] S. Asteriadis, P. Tzouveli, K. Karpouzis, S. Kollias, "Estimation of behavioral user state based on eye gaze and head pose - application in an e-learning environment," Multimedia Tools and Applications, vol. 41, no. 3, pp. 469-493, 2008.
[11] T. Dragon, I. Arroyo, B. P. Woolf, W. Burleson, R. el Kaliouby, H. Eydgahi, "Viewing Student Affect and Learning through Classroom Observation and Physical Sensors," Lecture Notes in Computer Science, vol. 5091, pp. 29-39, 2008.
[12] C. Conati, R. Chabbal, H. Maclaren, "A study on using biometric sensors for monitoring user emotions in educational games," User Modeling Workshop on Assessing and Adapting to User Attitudes and Affect: Why, When, and How?, 2003.
[13] S. D'Mello, R. W. Picard, A. Graesser, "Toward an Affect-Sensitive AutoTutor," IEEE Intelligent Systems, vol. 22, no. 4, pp. 53-61, 2007.
[14] R. Abrams, D. Balota, "Mental chronometry: Beyond reaction time," Psychological Science, vol. 2, pp. 153-157, 1991.
[15] J. I. Gold, M. N. Shadlen, "Neural computations that underlie decisions about sensory stimuli," Trends in Cognitive Sciences, vol. 5, pp. 10-16, 2001.
[16] J. H. Song, K. Nakayama, "Target selection in visual search as revealed by movement trajectories," Vision Research, vol. 48, no. 7, pp. 853-861, 2008.
[17] M. A. Goodale, D. Pelisson, C. Prablanc, "Large adjustments in visually guided reaching do not depend on vision of the hand or perception of target displacement," Nature, vol. 320, pp. 748-750, 1986.
[18] M. Finkbeiner, J. H. Song, K. Nakayama, A. Caramazza, "Engaging the motor system with masked orthographic primes: A kinematic analysis," Visual Cognition, vol. 16, pp. 11-22, 2008.
[19] T. Schmidt, "The finger in flight: Real-time motor control by visually masked color stimuli," Psychological Science, vol. 13, pp. 112-117, 2002.
[20] M. J. Spivey, M. Grosjean, G. Knoblich, "Continuous attraction toward phonological competitors," Proceedings of the National Academy of Sciences, vol. 102, pp. 10393-10398, 2005.
[21] R. Dale, C. Kehoe, M. J. Spivey, "Graded motor responses in the time course of categorizing atypical exemplars," Memory & Cognition, vol. 35, pp. 15-28, 2007.
[22] T. A. Farmer, S. E. Anderson, M. J. Spivey, "Gradiency and visual context in syntactic garden-paths," Journal of Memory & Language, vol. 57, pp. 570-595, 2007.
[23] J. B. Freeman, N. Ambady, N. O. Rule, K. L. Johnson, "Will a category cue attract you? Motor output reveals dynamic competition
across person construal," Journal of Experimental Psychology: General, vol. 137, pp. 673-690, 2008.
[24] J. B. Freeman, N. Ambady, "Motions of the hand expose the partial and parallel activation of stereotypes," Psychological Science, vol. 20, pp. 1183-1188, 2009.
[25] A. A. E. Ahmed, I. Traore, "A New Biometric Technology Based on Mouse Dynamics," IEEE Transactions on Dependable and Secure Computing, vol. 4, no. 3, pp. 165-179, 2007.
[26] S. Benson, A. Thomson, "A Behavioral Biometric Approach Based on Standardized Resolution in Mouse Dynamics," International Journal of Computer Science and Network Security, vol. 9, no. 4, pp. 370-377, 2009.
[27] A. A. E. Ahmed, I. Traore, "Detecting Computer Intrusions Using Behavioral Biometrics," Proc. Privacy, Security and Trust (PST), 2005.
[28] J. B. Freeman, N. Ambady, "Motions of the hand expose the partial and parallel activation of stereotypes," Psychological Science, vol. 20, pp. 1183-1188, 2009.
[29] Q. Guo, E. Agichtein, "Towards Predicting Web Searcher Gaze Position from Mouse Movements," Proc. 28th International Conference Extended Abstracts on Human Factors in Computing Systems, 2010.
[30] K. Rodden, X. Fu, "Exploring how mouse movements relate to eye movements on Web search results pages," Web Information Seeking and Interaction (WISI) Workshop at the 30th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, pp. 29-32, 2007.
[31] M. C. Chen, J. R. Anderson, M. H. Sohn, "What can a mouse cursor tell us more? Correlation of eye/mouse movements on web browsing," CHI EA '01 Extended Abstracts on Human Factors in Computing Systems, 2001.
[32] Q. Guo, E. Agichtein, "Exploring mouse movements for inferring query intent," Proc. 31st Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, pp. 707-708, 2008.
[33] A. L. Cox, M. M. Silva, "The role of mouse movements in interactive search," Proc. 28th Annual Meeting of the Cognitive Science Society, pp. 1156-1161, 2006.
[34] E. Frank, M. Hall, G. Holmes, R. Kirkby, B. Pfahringer, I. H. Witten, L. Trigg, "Weka - A Machine Learning Workbench for Data Mining," Data Mining and Knowledge Discovery Handbook, pp. 1269-1277, 2010.
[35] J. R. Quinlan, C4.5: Programs for Machine Learning, San Mateo: Morgan Kaufmann Publishers, 1993.
[36] J. R. Quinlan, "Improved Use of Continuous Attributes in C4.5," Journal of Artificial Intelligence Research, vol. 4, pp. 77-90, 1996.
[37] T. Kanungo, D. M. Mount, N. S. Netanyahu, C. Piatko, R. Silverman, A. Y. Wu, "The analysis of a simple k-means clustering algorithm," Proc. Sixteenth Annual Symposium on Computational Geometry, pp. 100-109, 2000.
[38] J. J. Garrett, "Ajax: A New Approach to Web Applications," http://adaptivepath.com/ideas/ajax-new-approach-webapplications, 2005.
[39] http://www.php.net
[40] http://www.mysql.com/
[41] http://en.wikipedia.org/wiki/JavaScript
[42] S. Nash, "Learning Objects, Learning Object Repositories, and Learning Theory: Preliminary Best Practices for Online Courses," Interdisciplinary Journal of Knowledge and Learning Objects, vol. 1, pp. 217-228, 2005.
[43] M. Martinez, "Designing learning objects to personalize learning," The Instructional Use of Learning Objects, pp. 151-173, 2002.
[44] O. Conlan, D. Dagger, V. Wade, "Towards a standards-based approach to e-Learning personalization using reusable learning objects," Proc. World Conference on E-Learning (E-Learn 2002), pp. 210-217, 2002.
[45] F. Mueller, A. Lockerd, "Cheese: Tracking Mouse Movement Activity on Websites, a Tool for User Modeling," CHI '01 Extended Abstracts on Human Factors in Computing Systems, pp. 279-280, 2001.
George Tsoulouhas Born in 1979. BSc, Electrical and Computer Engineering Department, Polytechnic School, Democritus University of Thrace (DUTh), Greece. PhD candidate at the same department. Teaching assistant as a PhD candidate (2007-2011) in Programming (FORTRAN, C and Internet Programming). He programs for both Windows and UNIX environments in C++, Perl, Visual Basic, .NET, PHP and anything that comes handy. Research interests: 1. Data Mining, 2. Intelligent Tutoring Systems, 3. Metadata, 4. Software Agents, 5. Fuzzy Systems.

Dimitrios Georgiou Born in 1948. BSc, AUTh Mathematics Department; PhD, DUTh; Postdoc, UC Berkeley. Visiting Scholar at UC Davis (1980-1982). Visiting Professor, URI (1989-1992). Associate Professor, School of Engineering, DUTh since 1991. Instructor of several undergraduate and postgraduate courses, he also teaches vocational courses at the Hellenic Power Corporation, the Center for Productivity and Development, the Training School for High School Teachers, the Hellenic Air Force, et al. He has published six textbooks. Research interests: 1) Qualitative behaviour of solutions of ODEs, Difference Equations and PDEs, 2) Numerical Methods for Boundary Value Problems, 3) Intelligent Tutoring Systems, 4) Computer Networks. Research papers published in several scientific journals and conferences. Member of the IEEE Computer Society, AACE, AMS, ECMI, et al. Referee for the Journal of Mathematical Analysis and its Applications and other journals in mathematics and educational technology.

A. Karakos received the Degree of Mathematician from the Department of Mathematics of the Aristotle University of Thessaloniki, Greece, and the Maitrise d'Informatique from the University Pierre et Marie Curie, Paris, where he also completed his PhD studies. He is an Assistant Professor at the Dept. of Electrical and Computer Engineering, Democritus University of Thrace, Greece. His research interests are in the areas of learning systems, data analysis and programming languages.