
Wednesday, March 31, 2021

MPEG news: a report from the 133rd meeting (virtual)

The original blog post can be found at the Bitmovin Techblog and has been modified/updated to focus on and highlight research aspects. Additionally, this version of the blog post will also be posted at ACM SIGMM Records.

MPEG Systems File Format Subgroup wins Technology & Engineering Emmy® Award

The 133rd MPEG meeting was once again held as an online meeting and kicked off with great news: MPEG is among the organizations honored as 72nd Annual Technology & Engineering Emmy® Awards recipients, specifically the MPEG Systems File Format Subgroup for its ISO Base Media File Format (ISOBMFF) and related specifications.

The official press release can be found here and comprises the following items:
  • 6th Emmy® Award for MPEG Technology: MPEG Systems File Format Subgroup wins Technology & Engineering Emmy® Award
  • Essential Video Coding (EVC) verification test finalized
  • MPEG issues a Call for Evidence on Video Coding for Machines
  • Neural Network Compression for Multimedia Applications – MPEG calls for technologies for incremental coding of neural networks
  • MPEG Systems reaches the first milestone for supporting Versatile Video Coding (VVC) and Essential Video Coding (EVC) in the Common Media Application Format (CMAF)
  • MPEG Systems continuously enhances Dynamic Adaptive Streaming over HTTP (DASH)
  • MPEG Systems reached the first milestone to carry event messages in tracks of the ISO Base Media File Format
In this report, I’d like to focus on ISOBMFF, EVC, CMAF, and DASH.

MPEG Systems File Format Subgroup wins Technology & Engineering Emmy® Award

MPEG is pleased to report that the File Format subgroup of MPEG Systems is being recognized this year by the National Academy for Television Arts and Sciences (NATAS) with a Technology & Engineering Emmy® for their 20 years of work on the ISO Base Media File Format (ISOBMFF). This format was first standardized in 1999 as part of the MPEG-4 Systems specification and is now in its 6th edition as ISO/IEC 14496-12. It has been used and adopted by many other specifications, e.g.:
  • MP4 and 3GP file formats;
  • Carriage of NAL unit structured video in the ISO Base Media File Format, which provides support for AVC, HEVC, VVC, EVC, and probably soon LCEVC;
  • MPEG-21 file format;
  • Dynamic Adaptive Streaming over HTTP (DASH) and Common Media Application Format (CMAF);
  • High-Efficiency Image Format (HEIF);
  • Timed text and other visual overlays in ISOBMFF;
  • Common encryption format;
  • Carriage of timed metadata metrics of media;
  • Derived visual tracks;
  • Event message track format;
  • Carriage of uncompressed video;
  • Omnidirectional Media Format (OMAF);
  • Carriage of visual volumetric video-based coding data;
  • Carriage of geometry-based point cloud compression data;
  • … to be continued!
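The reason ISOBMFF could be reused by so many specifications is its simple, extensible structure: a file is a sequence of (possibly nested) "boxes," each starting with a 32-bit big-endian size and a four-character type (per ISO/IEC 14496-12; a size of 1 signals a 64-bit "largesize," a size of 0 means the box extends to the end of the file). A minimal sketch of walking top-level boxes — the sample bytes and function name are illustrative only:

```python
import io
import struct

def iter_boxes(f):
    """Yield (box_type, payload_offset, payload_size) for top-level ISOBMFF boxes."""
    while True:
        header = f.read(8)
        if len(header) < 8:
            return
        size, box_type = struct.unpack(">I4s", header)  # 32-bit size + 4-char type
        header_len = 8
        if size == 1:  # 64-bit "largesize" follows the type field
            size = struct.unpack(">Q", f.read(8))[0]
            header_len = 16
        payload_start = f.tell()
        if size == 0:  # box extends to the end of the file
            f.seek(0, io.SEEK_END)
            size = f.tell() - payload_start + header_len
        yield box_type.decode("ascii"), payload_start, size - header_len
        f.seek(payload_start + size - header_len)  # skip to the next box

# Tiny in-memory example: an 'ftyp' box (major brand 'isom') followed by an empty 'free' box.
data = struct.pack(">I4s4sI", 16, b"ftyp", b"isom", 0) + struct.pack(">I4s", 8, b"free")
boxes = [(t, n) for t, _, n in iter_boxes(io.BytesIO(data))]
print(boxes)  # [('ftyp', 8), ('free', 0)]
```

Parsers for MP4, HEIF, CMAF segments, etc. all start from exactly this loop and then recurse into container boxes such as 'moov' or 'trak'.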
This is MPEG’s fourth Technology & Engineering Emmy® Award (after MPEG-1 and MPEG-2 together with JPEG in 1996, Advanced Video Coding (AVC) in 2008, and MPEG-2 Transport Stream in 2013) and sixth overall Emmy® Award, including the Primetime Engineering Emmy® Awards for Advanced Video Coding (AVC) High Profile in 2008 and High-Efficiency Video Coding (HEVC) in 2017.

Essential Video Coding (EVC) verification test finalized

At the 133rd MPEG meeting, the verification test of the Essential Video Coding (EVC) standard was completed. The first part of the EVC verification test, using high dynamic range (HDR) and wide color gamut (WCG) content, had been completed at the 132nd MPEG meeting. A subjective quality evaluation was conducted comparing the EVC Main profile to the HEVC Main 10 profile and the EVC Baseline profile to the AVC High 10 profile:
  • Analysis of the subjective test results showed that the average bitrate savings for EVC Main profile are approximately 40% compared to HEVC Main 10 profile, using UHD and HD SDR content encoded in both random access and low delay configurations.
  • The average bitrate savings for the EVC Baseline profile compared to the AVC High 10 profile is approximately 40% using UHD SDR content encoded in the random-access configuration and approximately 35% using HD SDR content encoded in the low delay configuration.
  • Verification test results using HDR content had shown average bitrate savings for EVC Main profile of approximately 35% compared to HEVC Main 10 profile.
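To make the reported percentages concrete: a bitrate saving relates the bitrate the anchor codec needs to the bitrate the tested codec needs at (assumed) equal subjective quality. A hypothetical illustration — the numbers below are made up, and real verification tests average such savings over many sequences and rate points:

```python
def bitrate_savings(anchor_kbps, test_kbps):
    """Percentage bitrate reduction of the tested codec relative to the anchor
    at (assumed) equal subjective quality."""
    return 100.0 * (anchor_kbps - test_kbps) / anchor_kbps

# Hypothetical: if HEVC Main 10 needs 10 Mbps for a given quality level,
# a ~40% saving means EVC Main profile reaches that quality at about 6 Mbps.
print(bitrate_savings(10000, 6000))  # 40.0
```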
By providing significantly improved compression efficiency compared to HEVC and earlier video coding standards while encouraging the timely publication of licensing terms, the MPEG-5 EVC standard is expected to meet the market needs of emerging delivery protocols and networks, such as 5G, enabling the delivery of high-quality video services to an ever-growing audience.

In addition to the verification tests, systems-level support for EVC, along with VVC, was further improved, notably within CMAF.

Research aspects: as for every new video codec, compression efficiency and computational complexity are important performance metrics. Additionally, the availability of (efficient) open-source implementations (e.g., x264, x265, soon x266, VVenC, aomenc) is vital for adoption in the (academic) research community.

MPEG Systems reaches the first milestone for supporting Versatile Video Coding (VVC) and Essential Video Coding (EVC) in the Common Media Application Format (CMAF)

At the 133rd MPEG meeting, MPEG Systems promoted Amendment 2 of the Common Media Application Format (CMAF) to Committee Draft Amendment (CDAM) status, the first major milestone in the ISO/IEC approval process. This amendment defines:
  • constraints to (i) Versatile Video Coding (VVC) and (ii) Essential Video Coding (EVC) video elementary streams when carried in a CMAF video track;
  • codec parameters to be used for CMAF switching sets with VVC and EVC tracks; and
  • support of the newly introduced MPEG-H 3D Audio profile.
It is expected to reach its final milestone in early 2022. For research aspects related to CMAF, the reader is referred to the next section about DASH.

MPEG Systems continuously enhances Dynamic Adaptive Streaming over HTTP (DASH)

At the 133rd MPEG meeting, MPEG Systems promoted Part 8 of Dynamic Adaptive Streaming over HTTP (DASH), also referred to as “Session-based DASH,” to its final stage of standardization (i.e., Final Draft International Standard (FDIS)).

Historically, in DASH, every client uses the same Media Presentation Description (MPD), as it best serves the service's scalability. However, there have been increasing requests from the industry to enable customized manifests for enabling personalized services. MPEG Systems has standardized a solution to this problem without sacrificing scalability. Session-based DASH adds a mechanism to the MPD to refer to another document, called Session-based Description (SBD), allowing per-session information. The DASH client can use this information (i.e., variables and their values) provided in the SBD to derive the URLs for HTTP GET requests.
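The mechanism above can be sketched in a few lines: the client fetches the SBD document referenced from the MPD, and substitutes the per-session variables it contains into the segment URL template before issuing HTTP GET requests. The `$name$` template syntax, variable names, and URLs below are purely illustrative; the actual substitution rules are defined in the Session-based DASH specification:

```python
# Hypothetical per-session variables, as a client might obtain them from the
# Session-based Description (SBD) document referenced by the MPD.
sbd_vars = {"sessionID": "abc123", "quality": "hd"}

def resolve_segment_url(template, variables):
    """Substitute SBD-provided variables into a segment URL template.

    '$name$' placeholders are illustrative only, not the normative syntax.
    """
    url = template
    for name, value in variables.items():
        url = url.replace(f"${name}$", str(value))
    return url

template = "https://cdn.example.com/video/$quality$/seg-42.m4s?sid=$sessionID$"
print(resolve_segment_url(template, sbd_vars))
# https://cdn.example.com/video/hd/seg-42.m4s?sid=abc123
```

The key design point is that the MPD itself stays identical for all clients (and thus cacheable at scale), while all per-session customization lives in the separate, lightweight SBD.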

An updated overview of DASH standards/features can be found in the figure below.
[Figure: MPEG DASH status as of January 2021.]

Research aspects: CMAF is most likely becoming the main segment format used in the context of HTTP adaptive streaming (HAS) and, thus, also DASH (hence the name common media application format). Supporting a plethora of media coding formats will inevitably result in a multi-codec dilemma that needs to be addressed soon, as there will be no flag day on which everyone switches to a new coding format. Thus, designing efficient bitrate ladders for multi-codec delivery will be an interesting research aspect, which needs to take into account device/player support (i.e., some devices/players will support only a subset of the available codecs), storage capacity/costs within the cloud as well as within the delivery network, and network distribution capacity/costs (i.e., CDN costs).
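The core of the multi-codec dilemma can be sketched as a small assignment problem: each device class receives the most efficient codec it supports, while the provider must still encode and store every codec that some device needs. All codec names, device classes, and efficiency numbers below are made up for illustration; a real formulation would weigh storage and CDN costs against the bandwidth saved:

```python
# Relative compression efficiency (hypothetical, AVC normalized to 1.0).
CODEC_EFFICIENCY = {"avc": 1.0, "hevc": 1.5, "vvc": 2.0}

def best_codec(supported):
    """Pick the most efficient codec among those a device supports."""
    candidates = [c for c in supported if c in CODEC_EFFICIENCY]
    return max(candidates, key=lambda c: CODEC_EFFICIENCY[c])

# Hypothetical device classes and their codec support.
devices = {
    "legacy-tv": ["avc"],
    "phone-2019": ["avc", "hevc"],
    "phone-2022": ["avc", "hevc", "vvc"],
}
selection = {d: best_codec(s) for d, s in devices.items()}
print(selection)   # {'legacy-tv': 'avc', 'phone-2019': 'hevc', 'phone-2022': 'vvc'}

# No flag day: every codec selected by some device must be encoded and stored.
codecs_to_store = sorted(set(selection.values()))
print(codecs_to_store)  # ['avc', 'hevc', 'vvc']
```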

The 134th MPEG meeting will be again an online meeting in April 2021. Click here for more information about MPEG meetings and their developments.

Monday, October 14, 2019

Happy World Standards Day 2019 - Video Standards Create a Global Stage

Today, on October 14, we celebrate World Standards Day, "the day honors the efforts of the thousands of experts who develop voluntary standards within standards development organizations" (SDOs). Many SDOs such as W3C, IETF, ITU, and ISO (incl. JPEG and MPEG) celebrate this with individual statements, highlighting the importance of standards and interoperability in today's information and communication technology landscape. Interestingly, this year's topic for World Standards Day within ISO is about video standards creating a global stage. Similarly, national bodies of ISO provide such statements within their own countries, e.g., the A.S.I. statement can be found here (note: in German). I have also blogged about World Standards Day in 2017.

[Photo: HEVC Emmy located at ITU-T, Geneva, CH (Oct'19).]
The amount of video content created, distributed (incl. delivery, streaming, ...), processed, and consumed increases tremendously; in fact, more than 60 percent of today's worldwide internet traffic is attributed to video streaming. For example, in 2019, almost 700,000 hours of video were watched on Netflix and 4.5 million videos were viewed on YouTube within a single internet minute. Videos are typically compressed (or encoded) prior to distribution and decompressed (or decoded) before rendering on a potentially heterogeneous plethora of devices. Such codecs (a portmanteau of coder-decoder) are subject to standardization, and with AVC and HEVC (jointly developed by ISO/IEC MPEG and ITU-T VCEG) we have two successful standards that have even been honored with Primetime Engineering Emmy® Awards (one of them is shown in the picture).

Within Austria, Bitmovin was awarded the Living Standards Award in 2017 for its contribution to the MPEG-DASH standard, which enables dynamic adaptive streaming over HTTP. This standard -- the 4th edition is becoming available very soon -- is now heavily deployed and has been adopted within products and services such as Netflix, Amazon Prime Video, and YouTube.

Standardization can be both source for and sink of research activities, i.e., development of efficient algorithms conforming to existing standards or research efforts leading to new standards. One example of such research efforts just recently started at the Institute of Information Technology (ITEC) at Alpen-Adria-Universität Klagenfurt (AAU) as part of the ATHENA (AdapTive Streaming over HTTP and Emerging Networked MultimediA Services) project. The aim of this project is to research and develop novel paradigms, approaches, (prototype) tools and evaluation results for the phases (i) multimedia content provisioning (video coding), (ii) content delivery (video networking), (iii) content consumption (player) in the media delivery chain, and (iv) end-to-end aspects, with a focus on, but not being limited to, HTTP Adaptive Streaming (HAS).

The SDO behind these standards is MPEG (officially ISO/IEC JTC 1/SC 29/WG 11), which has a proven track record of producing very successful standards (not only those mentioned as examples above), and its future is currently being discussed within its parent body (SC 29). A possible MPEG future is described here, which suggests upgrading the current SC 29 working groups to sub-committees (SCs), specifically to spin off a new SC that basically covers MPEG while the remaining WG (JPEG) stays within SC 29. This proposal of MPEG and JPEG as SCs is partially motivated by the fact that both WGs work on a large set of standardization projects, actually developed by their subgroups. Thus, elevating both WGs (JPEG & MPEG) to SC level would not only reflect the current status quo but would also preserve two important brands for both academia and industry. Further details can be found at http://mpegfuture.org/.

Friday, February 6, 2009

MPEG news: a report from the 87th meeting in Lausanne, Switzerland

MPEG’s high-performance video coding (HVC) standard is evolving and currently targets mobile devices, IPTV, and Ultra-HD. However, the trade-off between coding efficiency and codec complexity is still driving the thresholds that have been set at this meeting, well, at least initially. For low-complexity HVC, the target is a 25% gain in coding efficiency; for full-/increased-complexity HVC, the threshold is set to a 50% gain (cf. the goal for AVC standardization). The application scenarios range from shared Ultra-HD to personalized experiences. The latter targets a viewing distance of 0.5*h (i.e., 50cm) and personal use only. A vision document (N10361) has been issued at this meeting, and interested parties are requested to join the Ad-hoc Group (AhG) for further discussions (N10371). One open issue with HVC is whether and how audio will adapt to these new developments: perhaps with high-performance audio coding (HAC), or can HE-AAC already complement HVC? Anyway, a short viewing distance ultimately leads to short listening distances and a wide angle for the sound stage...

3D video coding (3DVC) already has a history within MPEG. A vision document has been produced here, stating that MPEG’s version of 3D video shall be compatible with existing standards, mono/stereo devices, and existing/planned infrastructure. While the former seems very clear, i.e., backwards compatibility to AVC is desirable in the same way as it was for SVC, the latter needs some more discussion. How can one be compatible with planned infrastructure?

In the area of the advanced IPTV terminal (AIT), we had a productive meeting with ITU-T SG16 Q.13. The objectives look very promising and aim to improve the user experience through the use of the latest media coding, transport, and processing technologies. An AhG has been set up to discuss the details. Again, AhGs are open to the public, and everyone can join and contribute.

The information exchange with virtual worlds (MPEG-V) has been merged with the representation of sensory effects (RoSE) and will now be jointly developed under the umbrella of MPEG-V. However, it was felt that ‘information exchange with virtual worlds’ is no longer an appropriate name, and a new one is being sought. Any inputs are welcome and are discussed within the AhG, which is open to everybody. Thus, join this exciting activity now! Its current working draft defines the following sensory effects, which shall create an enhanced, immersive user experience: various light effects, temperature, wind, vibration, water sprayer, perfumer/scent, fog, window blind/shadow, sound, and color correction.

Finally, MPEG got another Emmy, the "Technology and Engineering Emmy Award 2007-2008" for MPEG-4 AVC from the National Academy of Television Arts & Sciences (NATAS). It seems that success within MPEG is from now on measured in terms of number of awards received ;-) And last but not least, if you’ve been ever interested in the MPEG vision, you may find it here.

Friday, October 17, 2008

MPEG got an Emmy!

At this week's MPEG meeting, MPEG celebrated the JVT's receipt of the 2008 Primetime Emmy Engineering Award, received for its work on the Advanced Video Coding (AVC) High Profile. The actual awards -- three of them -- are now hosted at the three main standardization bodies that worked on AVC, namely IEC, ISO, and ITU. However, Gary Sullivan somehow managed to get an additional Emmy statue for MPEG and presented it to Leonardo Chiariglione (MPEG convenor). The MPEG Emmy statue will travel from national body to national body based on MPEG's meeting schedule; that is, the next chance to see the Emmy statue live will be in Lausanne, CH, at the 87th MPEG meeting.

During the Friday plenary I also got the chance to take a picture of/with the Emmy statue ;-) Another picture can be found here.