Saturday, July 6, 2019

DASH-IF sponsored ice cream social event at ACM MMSys 2019

The DASH-IF sponsored the ice cream social event at ACM MMSys 2019, which provided a great opportunity for networking and discussions. Here are some impressions from this unique event...






Friday, July 5, 2019

Workshop on Coding Technologies for Immersive Audio/Visual Experiences

This is an invitation to attend the Workshop on Coding Technologies for Immersive Audio/Visual Experiences. The workshop is not limited to MPEG members but publicly accessible, with free-of-charge preregistration. Non-MPEG members may send their registration information (name, affiliation, and email address) to Lu Yu and Silke Kenzler.

This workshop will cover the MPEG-I immersive audio and visual activities (past, present, and future), demonstrate systems and technologies, and discuss future standardization requirements for providing immersive audio/visual experiences.

Time/Date: 13:00-18:00, 10 July, 2019

Address: 
Room: Drottingporten 3
Clarion Post Hotel
Drottningtorget 10
411 03 Gothenburg, Sweden

Program:

13:00-13:15  Introduction (Lu Yu, Zhejiang University)
13:15-13:45  Use cases and challenges of user immersive experiences (Valerie Allie, InterDigital)
13:45-14:15  Overview of technologies for immersive visual experiences: capture, processing, compression, standardization and display (Marek Domanski, Poznan University of Technology)
14:15-14:45  MPEG-I Immersive Audio (Schuyler Quackenbush, Audio Research Labs)
14:45-14:55  Brief introduction of the demos:
               • Integral photography display (NHK)
               • Real-time interactive demo with 3DoF+ content (InterDigital)
               • Plenoptic 2.0 video camera (Tsinghua University)
               • A simple free-viewpoint television system (Poznan University of Technology)
14:55-15:30  Demos and coffee break
15:30-16:00  360° and 3DoF+ video (Bart Kroon, Philips)
16:00-16:30  Point cloud compression (Marius Preda, Telecom SudParis, CNRS Samovar)
16:30-17:00  How can we achieve 6DoF video compression? (Joel Jung, Orange)
17:00-17:30  How can we achieve lenslet video compression? (Xin Jin, Tsinghua University; Mehrdad Teratani, Nagoya University)
17:30-18:00  Discussion

Thursday, July 4, 2019

ACMMM'19 Tutorial: A Journey towards Fully Immersive Media Access

ACM Multimedia 2019
October 21-25, 2019, Nice, France

Note: the exact date/time slot of this tutorial will be provided at a later stage.

Lecturers

Christian Timmerer, Alpen-Adria-Universität Klagenfurt & Bitmovin, Inc.
Ali C. Begen, Ozyegin University and Networked Media

Abstract

Universal media access, as proposed in the late 1990s and early 2000s, is now a reality: we can generate, distribute, share, and consume any media content, anywhere, anytime, and with/on any device. A major technical breakthrough was adaptive streaming over HTTP, which resulted in the standardization of MPEG-DASH, now successfully deployed in HTML5 environments thanks to the corresponding media source extensions (MSE). The next big thing in adaptive media streaming is virtual reality applications and, specifically, omnidirectional (360°) media streaming, which is currently built on top of the existing adaptive streaming ecosystems. This tutorial provides a detailed overview of adaptive streaming of both traditional and omnidirectional media within HTML5 environments. It focuses on the basic principles and paradigms for adaptive streaming of both traditional and omnidirectional media, as well as on already deployed content generation, distribution, and consumption workflows. Additionally, the tutorial provides insights into standards and emerging technologies in the adaptive streaming space. Finally, it covers the latest approaches for immersive media streaming enabling 6DoF DASH through Point Cloud Compression (PCC) and concludes with open research issues and industry efforts in this domain.
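To give a rough idea of how media source extensions enable adaptive streaming within HTML5, here is a minimal TypeScript sketch that appends DASH media segments to a video element via an MSE SourceBuffer. The segment URLs and codec string are hypothetical placeholders; an actual player such as dash.js parses the MPD and switches between representations based on measured throughput and buffer level.

    // Minimal MSE sketch: append DASH segments to an HTML5 <video> element.
    // Segment URLs and the codec string below are hypothetical placeholders;
    // a real player derives them from the MPD.
    const video = document.querySelector('video') as HTMLVideoElement;
    const mediaSource = new MediaSource();
    video.src = URL.createObjectURL(mediaSource);

    mediaSource.addEventListener('sourceopen', async () => {
      const mime = 'video/mp4; codecs="avc1.64001f"'; // assumed codec string
      const sourceBuffer = mediaSource.addSourceBuffer(mime);

      // Hypothetical segment list; normally obtained by parsing the DASH MPD.
      const segments = ['init.mp4', 'seg-1.m4s', 'seg-2.m4s', 'seg-3.m4s'];
      for (const url of segments) {
        const data = await (await fetch(url)).arrayBuffer();
        await appendToBuffer(sourceBuffer, data); // append and wait for 'updateend'
      }
      mediaSource.endOfStream();
    });

    // appendBuffer() is asynchronous; wait for 'updateend' before the next append.
    function appendToBuffer(sb: SourceBuffer, data: ArrayBuffer): Promise<void> {
      return new Promise((resolve) => {
        sb.addEventListener('updateend', () => resolve(), { once: true });
        sb.appendBuffer(data);
      });
    }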

Keywords: Omnidirectional media, HTTP adaptive streaming, over-the-top video, 360 video, virtual reality, immersive media access.

Learning Objectives

This tutorial consists of two main parts. In the first part, we provide a detailed overview of the HTML5 standard and show how it can be used for adaptive streaming deployments. In particular, we focus on the HTML5 video, media extensions, and multi-bitrate encoding, encapsulation and encryption workflows, and survey well-established streaming solutions. Furthermore, we present experiences from the existing deployments and the relevant de jure and de facto standards (DASH, HLS, CMAF) in this space. In the second part, we focus on omnidirectional (360) media from creation to consumption as well as first thoughts on dynamic adaptive point cloud streaming. We survey means for the acquisition, projection, coding and packaging of omnidirectional media as well as delivery, decoding and rendering methods. Emerging standards and industry practices are covered as well (OMAF, VR-IF). Both parts present some of the current research trends, open issues that need further exploration and investigation, and various efforts that are underway in the streaming industry.
Upon attending this tutorial, the participants will have an overview and understanding of the following topics:
  • Principles of HTTP adaptive streaming for the Web/HTML5
  • Principles of omnidirectional (360-degree) media delivery
  • Content generation, distribution and consumption workflows for traditional and omnidirectional media
  • Standards and emerging technologies in the adaptive streaming space
  • Current and future research on traditional and omnidirectional media delivery, specifically enabling 6DoF adaptive streaming through point cloud compression
ACM Multimedia attracts attendees who are quite knowledgeable in specific areas. However, not all are experts across multiple disciplines (such as the subject matter here), and only a few are familiar with what is happening in the field and in standardization. Thus, we believe the proposed tutorial will be of as much interest to this year’s attendees as it has been in the past.

Table of Contents

Part I: The HTML5 Standard and Adaptive Streaming
  • HTML5 video and media extensions
  • Survey of well-established streaming solutions
  • Multi-bitrate encoding, and encapsulation and encryption workflows
  • The MPEG-DASH standard, Apple HLS and the developing CMAF standard
  • Common issues in scaling and improving quality, multi-screen/hybrid delivery
Part II: Omnidirectional (360-degree) Media
  • Acquisition, projection, coding and packaging of 360-degree video
  • Delivery, decoding and rendering methods
  • The developing MPEG-OMAF and MPEG-I standards
  • Ongoing industry efforts, specifically towards 6DoF adaptive streaming

Speakers

Christian Timmerer received his M.Sc. (Dipl.-Ing.) in January 2003 and his Ph.D. (Dr.techn.) in June 2006 (for research on the adaptation of scalable multimedia content in streaming and constrained environments), both from the Alpen-Adria-Universität (AAU) Klagenfurt. He joined the AAU in 1999 (as a system administrator) and is currently an Associate Professor at the Institute of Information Technology (ITEC) within the Multimedia Communication Group. His research interests include immersive multimedia communication, streaming, adaptation, Quality of Experience, and Sensory Experience. He was the general chair of WIAMIS 2008, QoMEX 2013, MMSys 2016, and PV 2018 and has participated in several EC-funded projects, notably DANAE, ENTHRONE, P2P-Next, ALICANTE, SocialSensor, COST IC1003 QUALINET, and ICoSOLE. He also participated in ISO/MPEG work for several years, notably in the areas of MPEG-21, MPEG-M, MPEG-V, and MPEG-DASH, where he also served as standard editor. In 2013, he co-founded Bitmovin (http://www.bitmovin.com/) to provide professional services around MPEG-DASH, where he holds the position of Chief Innovation Officer (CIO) – Head of Research and Standardization. He is a senior member of IEEE and a member of ACM, specifically the IEEE Computer Society, the IEEE Communications Society, and ACM SIGMM. Dr. Timmerer was a guest editor of three special issues of the IEEE Journal on Selected Areas in Communications (JSAC) and currently serves as an associate editor for IEEE Transactions on Multimedia. Further information is available at http://blog.timmerer.com.

Ali C. Begen is the co-founder of Networked Media, a technology company that offers consulting services to industrial, legal and academic institutions in the IP video space. He has been a research and development engineer since 2001 and has broad experience in mathematical modeling, performance analysis, optimization, standards development, intellectual property and innovation. Between 2007 and 2015, he was with the Video and Content Platforms Research and Advanced Development Group at Cisco, where he designed and developed algorithms, protocols, products and solutions in the service provider and enterprise video domains. Currently, he is also affiliated with Ozyegin University, where he teaches and advises students in the computer science department. Ali has a PhD in electrical and computer engineering from Georgia Tech. To date, he has received a number of academic and industry awards and has been granted 30+ US patents. He has held editorial positions in leading magazines and journals, and has served on the organizing committees of several international conferences and workshops in the field. He is a senior member of both the IEEE and ACM. In 2016, he was elected distinguished lecturer by the IEEE Communications Society, and in 2018, he was re-elected for another two-year term. More details are at http://ali.begen.net.

Monday, July 1, 2019

DASH-IF Interoperability Documents for Community Review

Community Review documents are published on the DASH-IF website in order to get feedback from the industry on tools and features that are documented for improved interoperability. For each of the documents, comments may be submitted on the technologies themselves, on specific features, etc. These documents are only published temporarily for community review and will be replaced by a full version after the commenting period has closed and the comments have been addressed.

LOW-LATENCY DASH

The change request against IOP v4.3 for Community Review is accessible here. This change provides a new clause for live services that addresses specification updates as well as implementation guidelines to support Low-Latency DASH services addressing the requirements above. Community review is open until July 31st, 2019. Addition to IOP is expected by Q3/2019. Comments may be submitted through GitHub or the public bug tracker.
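As a rough client-side illustration of the technique behind low-latency DASH (CMAF chunked encoding combined with HTTP chunked transfer), the TypeScript sketch below reads an in-progress segment with the Fetch API's streaming body and appends each chunk to an MSE SourceBuffer as soon as it arrives, instead of waiting for the complete segment. The URL and buffer handling are hypothetical; the normative details are in the change request itself.

    // Sketch: consume a CMAF segment chunk by chunk while it is still being
    // produced by the origin (assumes HTTP chunked transfer encoding).
    async function appendSegmentChunked(url: string, sb: SourceBuffer): Promise<void> {
      const response = await fetch(url);
      const reader = response.body!.getReader();
      while (true) {
        const { done, value } = await reader.read();
        if (done) break;              // segment fully received
        await waitForIdle(sb);        // previous append must have finished
        sb.appendBuffer(value);       // append the CMAF chunk immediately
      }
    }

    // Resolve once the SourceBuffer is no longer updating.
    function waitForIdle(sb: SourceBuffer): Promise<void> {
      return new Promise((resolve) => {
        if (!sb.updating) { resolve(); return; }
        sb.addEventListener('updateend', () => resolve(), { once: true });
      });
    }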

LIVE MEDIA INGEST

The new draft specification is accessible here, and a PDF version is available as well. This document specifies protocol interfaces for live ingest/egress of media content. It can be used between live ABR encoders, streaming origins, packagers, and content delivery networks. It supports redundant workflows with failover as well as timed metadata. Community review is open until July 31st, 2019. Publication is expected by Q3/2019. Comments may be submitted through GitHub or the public bug tracker.
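For illustration only, here is a TypeScript (Node.js) sketch of a push-based ingest loop in which an encoder/packager POSTs an initialization segment followed by CMAF media segments to an ingest endpoint over HTTP. The endpoint, paths, and header are hypothetical placeholders; the actual interfaces, including failover and timed-metadata handling, are those defined in the draft specification.

    // Hypothetical push-based live ingest loop (Node.js 18+, global fetch).
    import { readFile } from 'node:fs/promises';

    const INGEST_BASE = 'https://ingest.example.com/live/channel1'; // hypothetical endpoint

    async function postSegment(path: string, data: Uint8Array): Promise<void> {
      const res = await fetch(`${INGEST_BASE}/${path}`, {
        method: 'POST',
        headers: { 'Content-Type': 'video/mp4' },
        body: data,
      });
      if (!res.ok) {
        // A redundant workflow would retry or fail over to a backup ingest point here.
        throw new Error(`Ingest failed for ${path}: HTTP ${res.status}`);
      }
    }

    async function main(): Promise<void> {
      await postSegment('init.mp4', await readFile('out/init.mp4'));
      for (let i = 1; i <= 3; i++) {
        // In a real encoder, segments are produced and pushed continuously.
        await postSegment(`seg-${i}.m4s`, await readFile(`out/seg-${i}.m4s`));
      }
    }

    main().catch(console.error);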

Monday, June 24, 2019

IEEE JSAC: Multimedia Economics for Future Networks: Theory, Methods, and Applications


Guest Editorial
Multimedia Economics for Future Networks: Theory, Methods, and Applications

Authors: Wen Ji, Zhu Li, H. Vincent Poor, Christian Timmerer, and Wenwu Zhu

Abstract: With the growing integration of telecommunication networks, the Internet of Things (IoT), and 5G networks, there is a tremendous demand for multimedia services over heterogeneous networks. According to recent survey reports, mobile video traffic accounted for 60 percent of total mobile data traffic in 2016, and it will reach up to 78 percent by the end of 2021. Users’ daily lives are inundated with multimedia services, such as online video streaming (e.g., YouTube and Netflix), social networks (e.g., Facebook, Instagram, and Twitter), IoT and machine-generated video (e.g., surveillance cameras), and multimedia service providers (e.g., Over-the-Top (OTT) services). Multimedia data is thus expected to become the dominant traffic for both wired and wireless networks in the near future.

W. Ji, Z. Li, H. V. Poor, C. Timmerer and W. Zhu, "Guest Editorial Multimedia Economics for Future Networks: Theory, Methods, and Applications," in IEEE Journal on Selected Areas in Communications, vol. 37, no. 7, pp. 1473-1477, July 2019.
doi: 10.1109/JSAC.2019.2918962

Link: https://doi.org/10.1109/JSAC.2019.2918962

Friday, June 21, 2019

DASH-IF awarded Best PhD Dissertation on Algorithms and Protocols for Adaptive Content Delivery over the Internet

For the first time, the DASH Industry Forum (DASH-IF) awarded the best PhD dissertation on algorithms and protocols for adaptive content delivery over the Internet at ACM MMSys 2019. After a public call, the DASH-IF received a good number of nominations from all around the world, from which one winner was to be picked. Due to the high number and quality of the nominations (and since it was the first edition of the award), the DASH-IF decided to also award the second- and third-best dissertations, increasing its financial prize as follows: first place – $750; second place – $500; and third place – $250. The winners were chosen by a committee appointed by the DASH Industry Forum, and the results are final.

And the winner is...

First place
Dr. Abdelhak Bentaleb (National University of Singapore)
for the dissertation
Enabling Optimizations of Video Delivery in HTTP Adaptive Streaming
(From left to right): R. Zimmermann (supervisor), A. Bentaleb, A. Giladi (DASH-IF); (c) Ishita Dasgupta

Second place
Dr. Jeroen van der Hooft (Ghent University), Low-Latency Delivery of Adaptive Video Streaming Services
(From left to right): Jeroen van der Hooft, A. Giladi (DASH-IF); (c) Ishita Dasgupta

Third place
Dr. Jonathan Kua (Swinburne University of Technology), Achieving High Performance Content Streaming with Adaptive Chunklets and Active Queue Management

The DASH-IF would like to congratulate all winners and hopes to see you next year at ACM MMSys 2020.

DASH-IF awarded Excellence in DASH award at ACM MMSys 2019

The DASH Industry Forum Excellence in DASH Award at ACM MMSys 2019 acknowledges papers that substantially address MPEG-DASH as the presentation format and are selected for presentation at ACM MMSys 2019. Preference is given to practical enhancements and developments that can sustain the future commercial usefulness of DASH. The DASH format used should conform to the DASH-IF Interoperability Points as defined at http://dashif.org/guidelines/. The financial prizes are as follows: first place – €1000; second place – €500; and third place – €250. The winners are chosen by a committee appointed by the DASH Industry Forum, and the results are final.

This year's award goes to the following papers (two first places and one third place):

1. Stefan Pham, Patrick Heeren, Daniel Silhavy, Stefan Arbanowski, Evaluation of shared resource allocation using SAND for ABR streaming
(From left to right): S. Pham, A. Giladi (DASH-IF); (c) Ishita Dasgupta


1. Abdelhak Bentaleb, Christian Timmerer, Ali C. Begen, Roger Zimmermann, Bandwidth prediction in low-latency chunked streaming
(From left to right): A. Bentaleb, A. Giladi (DASH-IF); (c) Ishita Dasgupta


3. Xing Liu, Bo Han, Feng Qian, Matteo Varvello, LIME: understanding commercial 360° live video streaming services
(From left to right): X. Liu, A. Giladi (DASH-IF); (c) Ishita Dasgupta

The DASH-IF would like to congratulate all winners and hopes to see you next year at ACM MMSys 2020.