W3C is pleased to announce the relaunch of the Multimodal Interaction Working Group to develop technology that enables users to use their preferred modes of interaction with the Web. Deborah Dahl (Invited Expert) chairs the group which is chartered to develop open standards to adapt to device, user and environmental conditions, and to allow multiple modes of Web interaction including GUI, speech, vision, pen, gestures, haptic interfaces, sensor data, etc. W3C Members may use this form to join the Working Group. Read about the Multimodal Interaction Activity.
Very interesting, as this WG aims to bring the Emotion Markup Language to Candidate Recommendation. So far, the Elements of an EmotionML 1.0 have been identified in a W3C Incubator Group Report. Interestingly, ISO/IEC MPEG also has emotions within the scope of its MPEG-V work item. I wonder whether the two groups could benefit from each other, in one way or another ... let me know what your opinion is.