Special Issue on Emotional Intelligence for Online Learning
(http://maiga.athabascau.ca/editors/JDET-Special_Issue-2011-Emotional_Intelligence.htm)

in

International Journal of Distance Education Technologies

(http://www.igi-global.com/Bookstore/TitleDetails.aspx?TitleId=1078&DetailsType=Description)

Many intelligent tutoring systems have been developed and used to support students' learning. Good online learning systems require emotional communication between tutors and learners. Emotions play an important role when users engage in career or drama training, or in language learning via role-play. Interpreting player experience and affect from open-ended verbal communication and diverse body language has been a challenging research task. Such studies become even more important when diverse game-play modalities (e.g., mobile interfaces and augmented reality platforms) are employed, and when affect and mood must be recognized from metaphorical affective expressions, dialogue context, emotionally ambiguous or non-prototypical spoken utterances, facial expressions and gestures. Such research significantly benefits the development of intelligent virtual agents capable of interpreting social relationships, context, and general mood and emotion; sensing or reasonably predicting others' behavior in conversation; and identifying their own roles and participating intelligently in open-ended interaction. How to develop a system that can identify a user's emotions and/or present its own feelings to the user, based on the user's words, behaviors, and performance in online learning, is an important, helpful, and interesting research topic. The purpose of this special issue is to explore how models, theories, and solutions of emotional intelligence can be used in online learning, and what benefits users can receive from systems and agents embedded with such emotional intelligence.

 

Guest Editors (in alphabetical order): Dr. Maiga Chang, Dr. Rita Kuo and Dr. Li Zhang

 

Suggested topics:

We cordially invite authors to submit high-quality manuscripts from any application domain, as long as the core of the manuscript falls within one of the following areas:

 

-      Affect sensing from text, speech, facial expressions and gestures

-      Affect-inspired interactive intelligent games (e.g., mobile interfaces, augmented reality platforms, natural user interfaces - non-touch-based user interfaces)

-      Emotion modelling and generation

-      Human-computer interaction issues and challenges that affective computing/emotional intelligence solutions for online learning may face

-      Emotional tutoring agents/learning companions/interactive robots

-      Affective computing tools, systems and applications for online learning

-      Emotionally intelligent agents for assisting teaching or learning

-      Multi-agent-based affective computing systems and applications

-      Practical experiences in using and deploying emotional intelligence for online learning

-      Successful cases of applying emotional intelligence to online learning

-      Not-so-successful cases and the lessons learned

-      Evaluation models for emotionally intelligent agents/systems

 

 

Important dates and manuscript guidelines:

All submissions must follow the IGI Global journal manuscript guidelines and should be sent directly to the guest editors via email (maiga.chang@gmail.com, rita.mcsl@gmail.com and L.Zhang@cs.bham.ac.uk) by February 28, 2012. All submissions will be reviewed by at least three reviewers. Revised manuscripts must be prepared by the author(s) according to the reviewer comments and sent to the guest editors by June 1, 2012. The other important dates are:

 

February 28, 2012: Submission deadline

May 1, 2012: Review result notification

June 1, 2012: Revised manuscript submission deadline

June 15, 2012: Acceptance notification

June 30, 2012: Final camera-ready manuscript submission deadline

 

IGI Global manuscript guidelines at: http://www.igi-global.com/Files/AuthorEditor/journaltemplate.doc  

 

For queries, please contact Dr. Maiga Chang (maiga.chang@gmail.com)