"How do you know that I don’t understand?" A look at the future of intelligent tutoring systems

Sarrafzadeh, Abdolhossein, Alexander, Samuel, Dadgostar, Farhad, Fan, Chao and Bigdeli, Abbas (2008) "How do you know that I don’t understand?" A look at the future of intelligent tutoring systems. Computers in Human Behavior, 24 4: 1342-1363. doi:10.1016/j.chb.2007.07.008

Author Sarrafzadeh, Abdolhossein
Alexander, Samuel
Dadgostar, Farhad
Fan, Chao
Bigdeli, Abbas
Title "How do you know that I don’t understand?" A look at the future of intelligent tutoring systems
Journal name Computers in Human Behavior
ISSN 0747-5632
Publication date 2008-07-01
Year available 2007
Sub-type Article (original research)
DOI 10.1016/j.chb.2007.07.008
Open Access Status Not yet assessed
Volume 24
Issue 4
Start page 1342
End page 1363
Total pages 22
Place of publication Kidlington, United Kingdom
Publisher Pergamon
Language eng
Subject 280203 Image Processing
280207 Pattern Recognition
280213 Other Artificial Intelligence
970108 Expanding Knowledge in the Information and Computing Sciences
Formatted abstract
Many software systems would perform significantly better if they could adapt to the emotional state of the user. For example, if Intelligent Tutoring Systems (ITSs), ATMs, and ticketing machines could recognise when users were confused, frustrated, or angry, they could guide users to remedial help systems and so improve the service. Many researchers now feel strongly that ITSs would be significantly enhanced if computers could adapt to the emotions of students. This idea has spawned the developing field of affective tutoring systems (ATSs): ATSs are ITSs that are able to adapt to the affective state of students. The term “affective tutoring system” can be traced back as far as Rosalind Picard’s book Affective Computing in 1997.

This paper presents research leading to the development of Easy with Eve, an ATS for primary school mathematics. The system utilises a network of computer systems, mainly embedded devices, to detect student emotion and other significant bio-signals. It then adapts to students and displays emotion via a lifelike agent called Eve. Eve’s tutoring adaptations are guided by a case-based method for adapting to student states; this method uses data generated by an observational study of human tutors. This paper presents the observational study, the case-based method, the ATS itself and its implementation on a distributed computer system for real-time performance, and finally the implications of the findings for Human Computer Interaction in general and e-learning in particular. Web-based applications of the technology developed in this research are discussed throughout the paper.
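The case-based adaptation the abstract describes can be sketched as nearest-neighbour retrieval over recorded tutor responses: given an observed student state, the system retrieves the stored case most similar to it and reuses that tutor's action. The feature names, case data, and actions below are illustrative assumptions for the sketch, not the study's actual data.

```python
import math

# Hypothetical case base: each case pairs an observed student state with the
# action a human tutor took in that state. Features and actions are invented
# for illustration; the paper's real cases come from its observational study.
CASE_BASE = [
    ({"confusion": 0.9, "frustration": 0.2, "recent_errors": 3},
     "re-explain with simpler example"),
    ({"confusion": 0.1, "frustration": 0.1, "recent_errors": 0},
     "offer harder problem"),
    ({"confusion": 0.4, "frustration": 0.8, "recent_errors": 2},
     "give encouragement, then a hint"),
]

def distance(a, b):
    """Euclidean distance between two student-state feature dicts."""
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))

def select_action(state):
    """Retrieve the tutor action from the most similar stored case."""
    best_case = min(CASE_BASE, key=lambda case: distance(case[0], state))
    return best_case[1]

# A highly confused student with several recent errors retrieves the
# "re-explain" case.
print(select_action({"confusion": 0.85, "frustration": 0.25, "recent_errors": 3}))
```

In practice such a system would normalise features to comparable scales and weight them by importance before measuring similarity; this sketch only shows the retrieve-and-reuse shape of case-based reasoning.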
Keyword Affective tutoring systems
Lifelike agents
Emotion detection
Facial expressions
Q-Index Code C1
Q-Index Status Provisional Code
Institutional Status Non-UQ
Additional Notes Published online 14 September 2007

Citation counts: Cited 48 times in Thomson Reuters Web of Science
Cited 62 times in Scopus
Created: Wed, 06 May 2009, 21:44:05 EST by Dr Ildiko Horvath on behalf of Faculty Of Engineering, Architecture & Info Tech