Tuesday, February 10, 2009

W3C Multimodal Standard Brings Web to More People, More Ways

As part of ensuring the Web is available to all people on any device, W3C published a new standard today to enable interactions beyond the familiar keyboard and mouse. EMMA, the Extensible MultiModal Annotation markup language, promotes the development of rich Web applications that can be adapted to more input modes (such as handwriting, natural language, and gestures) and output modes (such as synthesized speech) at lower cost. The document, published by the Multimodal Interaction Working Group, is part of a set of specifications for multimodal systems, and provides details of an XML markup language for containing and annotating the interpretation of user input. Read the press release and testimonials, and learn more about the Multimodal Interaction Activity.
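To give a sense of what this looks like in practice, here is a minimal sketch of an EMMA document annotating the interpretation of a spoken user input. The `emma:` elements and annotation attributes follow the EMMA 1.0 specification; the application-specific payload (the `origin`/`destination` elements and their namespace) is an illustrative assumption, not part of the standard:

```xml
<emma:emma version="1.0"
    xmlns:emma="http://www.w3.org/2003/04/emma"
    xmlns="http://www.example.com/travel">
  <!-- One candidate interpretation of the user's spoken input,
       annotated with the medium, mode, recognizer confidence,
       and the recognized token string. -->
  <emma:interpretation id="int1"
      emma:medium="acoustic"
      emma:mode="voice"
      emma:confidence="0.75"
      emma:tokens="flights from boston to denver">
    <origin>Boston</origin>
    <destination>Denver</destination>
  </emma:interpretation>
</emma:emma>
```

The point of the container is that the same application payload could equally be produced by a handwriting or gesture recognizer, with only the `emma:medium` and `emma:mode` annotations changing, which is what lets one Web application back several input modalities.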

Wow, this development really took a long time, but it is interesting stuff nonetheless. The first Working Draft dates back to August 2003, which makes me wonder whether anybody will actually use or implement it. Still, it can serve universal multimedia access by supporting multimodal interaction with the content. Other deliverables of this working group include Multimodal Architecture and Interfaces and the Ink Markup Language (InkML), both still at Working Draft stage. Let's hope their development does not last forever...
