What makes an A/V container format streaming-capable?

What exactly distinguishes streamable container file formats such as Matroska and MPEG-4 Part 14 from formats that are supposedly not streamable, such as AVI?

Do they merely prescribe an ordering of metadata and data that lets decoding software get by without random access to the media file or large buffers, or do they also include synchronization headers that let a client tolerate some packet loss (with degraded quality)?

Or is this something that should be provided by the audio and video codecs rather than by the container? (I think not, because MPEG-4 (A)SP appears to be streamable (whatever that means) inside MPEG-4 Part 14 containers, but not inside AVI.)

I am wondering whether it is possible to send a "streaming-ready" file over a lossy connection (plain UDP, with no additional synchronization or metadata layer on top) and reasonably expect the client to tolerate moderate corruption and/or packet loss without permanently losing sync, or whether some intermediate layer would be required.

Update: I found that MPEG transport streams seem to provide features such as periodic synchronization and repeated metadata frames that let a client recover from data loss or corruption, whereas MP4 and Matroska seem to put this information in the file header.
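
To make "periodic synchronization" concrete: every MPEG-TS packet is 188 bytes and starts with the sync byte 0x47, and each PID carries a 4-bit continuity counter. A minimal, illustrative sketch (Python; the function names are mine, the field layout is the standard 4-byte TS packet header) of how a receiver could regain sync after corruption and detect lost packets:

    TS_PACKET_SIZE = 188
    SYNC_BYTE = 0x47

    def parse_ts_header(pkt: bytes):
        """Decode the fixed 4-byte MPEG-TS packet header."""
        if len(pkt) != TS_PACKET_SIZE or pkt[0] != SYNC_BYTE:
            return None  # out of sync; the caller should rescan for the next 0x47
        return {
            "payload_unit_start": bool(pkt[1] & 0x40),  # a new PES packet/section begins here
            "pid": ((pkt[1] & 0x1F) << 8) | pkt[2],     # 13-bit packet identifier
            "continuity_counter": pkt[3] & 0x0F,        # increments per PID; a gap means lost packets
        }

    def resync(buf: bytes, offset: int) -> int:
        """After corruption, scan forward until sync bytes reappear at 188-byte strides."""
        while offset + 2 * TS_PACKET_SIZE <= len(buf):
            if buf[offset] == SYNC_BYTE and buf[offset + TS_PACKET_SIZE] == SYNC_BYTE:
                return offset
            offset += 1
        return len(buf)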

Are there really two kinds of "streaming" formats, then: formats like MP4 and Matroska that still require a reliable transport protocol (such as HTTP) underneath, and "true" streaming formats such as MPEG-TS that can tolerate data loss mid-stream and allow clients to tune in at any point, because headers and metadata are repeated periodically?

How does RTP fit in here? It seems to provide many features (frame numbering, format description headers, hints to the codec about how to interpret a frame (B- or I-frame)) that are also present in MPEG-TS but absent from MP4 and Matroska.
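
For comparison, the fixed 12-byte RTP header from RFC 3550 already carries the sequence number and timestamp that MP4 and Matroska leave to the transport. A minimal sketch (Python; the function name is mine, the layout follows the RFC; payload-specific details such as B-/I-frame hints live in the payload format, not in this header):

    import struct

    def parse_rtp_header(datagram: bytes):
        """Decode the fixed 12-byte RTP header (RFC 3550)."""
        if len(datagram) < 12:
            raise ValueError("too short for an RTP header")
        b0, b1, seq, ts, ssrc = struct.unpack("!BBHII", datagram[:12])
        return {
            "version": b0 >> 6,          # always 2 for RTP
            "csrc_count": b0 & 0x0F,
            "marker": bool(b1 & 0x80),   # for video payloads, often flags the last packet of a frame
            "payload_type": b1 & 0x7F,   # maps to a codec/clock rate negotiated out of band (e.g. SDP)
            "sequence": seq,             # lets the receiver detect loss and reordering
            "timestamp": ts,             # media clock, e.g. 90 kHz for video payload formats
            "ssrc": ssrc,                # identifies the synchronization source
        }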

1 answer

A file format such as AVI collects the offsets and lengths of all media chunks in a special index, and this index is placed at the end of the file. So to play an AVI, the player must first seek to the end of the file to fetch this index before any playback can happen. This is what makes AVI unsuitable for streaming.
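
A minimal sketch (Python, illustrative only; it ignores the newer OpenDML 'indx' indexes) that walks the top-level RIFF chunks of an AVI file shows the point: the 'idx1' index chunk only appears after the 'movi' chunk that holds the actual media data, i.e. near the end of the file.

    import struct

    def find_avi_index(path: str):
        """Walk the top-level RIFF chunks of an AVI file and report where 'idx1' lives."""
        with open(path, "rb") as f:
            riff, size, form = struct.unpack("<4sI4s", f.read(12))
            if riff != b"RIFF" or form != b"AVI ":
                raise ValueError("not an AVI file")
            end = 8 + size
            while f.tell() < end:
                chunk_id, chunk_size = struct.unpack("<4sI", f.read(8))
                body = f.tell()
                if chunk_id == b"idx1":
                    # Each index entry is 16 bytes: chunk id, flags, offset, length.
                    return body - 8, chunk_size // 16
                # Skip the chunk body (chunks are padded to an even length).
                f.seek(body + chunk_size + (chunk_size & 1))
        return None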

In streamable file formats, the metadata (media type, time position, encoding, and length) is instead interleaved with the data, mostly in the form of segment headers.
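
Fragmented MP4 is a good illustration: instead of one big up-front 'moov' followed by a single 'mdat', a streaming-friendly file repeats 'moof'+'mdat' pairs, each 'moof' carrying the timing and sizes for the 'mdat' right after it. A rough sketch (Python; it skips 64-bit box sizes and does not descend into container boxes) that just lists the top-level boxes in file order:

    import struct

    def list_top_level_boxes(path: str):
        """List the top-level ISO BMFF (MP4) boxes in file order."""
        boxes = []
        with open(path, "rb") as f:
            while True:
                header = f.read(8)
                if len(header) < 8:
                    break
                size, box_type = struct.unpack(">I4s", header)
                boxes.append(box_type.decode("latin-1"))
                if size < 8:          # size 0 (runs to end of file) or 1 (64-bit size) not handled here
                    break
                f.seek(size - 8, 1)   # skip the box body
        return boxes

    # Progressive file:  ['ftyp', 'moov', 'mdat']
    # Fragmented file:   ['ftyp', 'moov', 'moof', 'mdat', 'moof', 'mdat', ...]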

The second important aspect of streamable content is the presence of timestamps. Each media segment must carry an accurate timestamp so that a 5-hour playback session does not end in a gradual loss of lip sync from relative drift between the audio and video playback rates. AVI typically assumes a fixed rate (say, 25 frames per second) for displaying video and playing audio and leaves it to the device to keep that timing, which makes it terrible for streaming!
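
A back-of-the-envelope sketch (Python; the 24.99 fps figure is just an arbitrary example of a clock mismatch) of why per-segment timestamps matter:

    def drift_after(hours: float, assumed_fps: float = 25.0, actual_fps: float = 24.99) -> float:
        """Lip-sync error that accumulates when a player assumes a fixed frame rate but the
        decoder/display actually runs slightly slower, with no timestamps to correct it."""
        frames = hours * 3600 * assumed_fps
        return frames * (1 / actual_fps - 1 / assumed_fps)   # seconds of accumulated error

    def presentation_time(pts: int, clock_hz: int = 90_000) -> float:
        """With per-segment timestamps (e.g. the 90 kHz PTS of MPEG systems) every sample is
        scheduled against a shared clock, so the error cannot accumulate."""
        return pts / clock_hz

    # drift_after(5) -> roughly 7.2 seconds of lip-sync error over a 5-hour session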


Source: https://habr.com/ru/post/1391535/

