Thursday, May 20, 2010

11th Multimedia Metadata Community Workshop on Interoperable Social Multimedia Applications (WISMA 2010)

On May 19-20, 2010 I attended the 11th Multimedia Metadata Community Workshop on Interoperable Social Multimedia Applications (WISMA 2010) in Barcelona, Spain. The proceedings are available online at CEUR-WS.org, and there is also a Twitter stream in case you'd like to review them.

I had two presentations which I'd like to provide here. The first one was on A Metadata Model for Peer-to-Peer Media Distribution:
Abstract: In this paper we describe a metadata solution for a Peer-to-Peer (P2P) content distribution system termed NextShare. We outline the key motivating factors for our approach, detail the overall generic architecture we have developed and present the workflow for delivering metadata through Peer-to-Peer based content distribution. The paper also presents the metadata model we have developed and we describe in detail how all the content can be packetized and distributed using NextShare. Finally, a description of the core and optional metadata attributes which may be utilized within the system is provided.
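To give a feel for what such a metadata model might look like, here is a minimal TypeScript sketch of a media item record with core and optional attributes. The field names and structure are my own illustrative assumptions and do not reflect the actual NextShare schema described in the paper.

```typescript
// Hypothetical sketch of core and optional metadata attributes for a
// P2P-distributed media item; field names are illustrative assumptions,
// not the actual NextShare schema.

interface CoreMetadata {
  identifier: string;       // globally unique content identifier
  title: string;            // human-readable title
  mimeType: string;         // e.g. "video/mp4"
  durationSeconds: number;  // playback length
  swarmUri: string;         // locator for the P2P swarm carrying the content
}

interface OptionalMetadata {
  description?: string;
  language?: string;        // e.g. "en"
  genre?: string[];
  publisher?: string;
  publicationDate?: string; // ISO 8601
}

type MediaItemMetadata = CoreMetadata & OptionalMetadata;

// Example record as it might travel alongside the packetized content:
const example: MediaItemMetadata = {
  identifier: "urn:example:nextshare:1234",
  title: "Sample Clip",
  mimeType: "video/mp4",
  durationSeconds: 120,
  swarmUri: "magnet:?xt=urn:btih:0000000000000000000000000000000000000000",
  language: "en",
};
```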
The second presentation provided an answer to the following question: Are Sensory Effects ready for the World Wide Web?
Abstract: The World Wide Web (WWW) is one of the main entry points to access and consume Internet content in various forms. In particular, the Web browser is used to access different types of media (i.e., text, image, audio, and video) and on some platforms is the only way to access the vast amount of information on the Web. Recently, it has been proposed to stimulate also other senses than vision or audition while consuming multimedia content through so-called sensory effects, with the aim to increase the user’s Quality of Experience (QoE). The effects are represented as Sensory Effects Metadata (SEM) which is associated to traditional multimedia content and is rendered (synchronized with the media) on sensory devices like fans, vibration chairs, lamps, etc. In this paper we provide a principal investigation of whether the sensory effects are ready for the WWW and, in anticipation of the result, we propose how to embed sensory effect metadata within Web content and the synchronized rendering thereof.
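As a rough illustration of synchronized rendering in a browser, the TypeScript sketch below triggers sensory effect cues from the timeline of an HTML5 video element. The cue format and the renderSensoryEffect() hook are assumptions made for this sketch only, not the embedding mechanism proposed in the paper.

```typescript
// Hypothetical sketch: drive sensory effect cues from an HTML5 video timeline.
// Cue format and renderSensoryEffect() are illustrative assumptions.

interface SensoryEffectCue {
  startTime: number;  // seconds into the media timeline
  duration: number;   // seconds the effect stays active
  effect: "wind" | "vibration" | "light";
  intensity: number;  // 0.0 .. 1.0
}

const cues: SensoryEffectCue[] = [
  { startTime: 5, duration: 3, effect: "wind", intensity: 0.6 },
  { startTime: 12, duration: 1, effect: "vibration", intensity: 0.9 },
];

// Placeholder for whatever device API actually controls fans, lamps, chairs, etc.
function renderSensoryEffect(cue: SensoryEffectCue, active: boolean): void {
  console.log(`${cue.effect} ${active ? "on" : "off"} (intensity ${cue.intensity})`);
}

const video = document.querySelector("video");
const activeCues = new Set<SensoryEffectCue>();

video?.addEventListener("timeupdate", () => {
  const t = video.currentTime;
  for (const cue of cues) {
    const shouldBeActive = t >= cue.startTime && t < cue.startTime + cue.duration;
    if (shouldBeActive && !activeCues.has(cue)) {
      activeCues.add(cue);
      renderSensoryEffect(cue, true);   // effect enters its active window
    } else if (!shouldBeActive && activeCues.has(cue)) {
      activeCues.delete(cue);
      renderSensoryEffect(cue, false);  // effect window has ended
    }
  }
});
```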
