How to implement the Adobe HTTP Streaming specification without using their streaming server

Starting with Flash 10.1, Adobe added the ability to push bytes into a NetStream object using the appendBytes method (described here: http://www.bytearray.org/?p=1689 ). The main reason for this addition is that Adobe finally supports HTTP streaming of video. This is great, but it appears that you need to use the Adobe Flash Media Server ( http://www.adobe.com/products/httpdynamicstreaming/ ) to create the correctly segmented video fragments from your existing videos to ensure smooth playback.

I have tried the hacked version of HTTP streaming in the past, where I swap NetStream objects (similar to http://video.leizhu.com/video.html ), but there is always a pause between the segments. With the new appendBytes, I tried a quick test of appending the two sections of the video from that site, but the skip between segments is still there.

Does anyone know how to format two consecutive .flv files so that the appendBytes method on the NetStream object plays them as one smooth video, without a noticeable gap between the segments?

+4
source
6 answers

I managed to get this working using the Adobe File Packager tool that Samuel mentioned. I did not use the NetStream object directly, but the OSMF sample player, which I assume uses it internally. Here's how to do it without using FMS:

  • Get the Adobe File Packager for HTTP Dynamic Streaming from http://www.adobe.com/products/httpdynamicstreaming/
  • Run the File Packager on an existing MP4 file containing H.264/AAC, like so: C:\Program Files\Adobe\Flash Media Server 4\tools\f4fpackager>f4fpackager.exe --input-file="MyFile.mp4" --segment-duration=30

This will produce a series of 30-second F4F files, along with an F4X index file and an F4M manifest. The F4F files are properly segmented (and fragmented) MP4 files that should play back correctly. To test this, you can load the generated F4M manifest in the OSMF player.
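The packager invocation in the step above can be scripted. The sketch below (Python rather than ActionScript, since this is a build-side task) builds the same command line and only runs it if the tool is actually installed; the function name and the assumption that `f4fpackager` is on the PATH are mine, not from the answer.

```python
import shutil
import subprocess

def package_for_hds(input_file, segment_duration=30, packager="f4fpackager"):
    """Build the f4fpackager command line from the steps above.

    `packager` is assumed to be on PATH (on Windows it lives under
    rootinstall\\tools\\f4fpackager). Returns the argument list so it
    can be inspected or handed to subprocess.run().
    """
    return [
        packager,
        "--input-file={}".format(input_file),
        "--segment-duration={}".format(segment_duration),
    ]

cmd = package_for_hds("MyFile.mp4")
# Only invoke the tool when it is actually installed.
if shutil.which(cmd[0]):
    subprocess.run(cmd, check=True)
```

Running it against `MyFile.mp4` should drop the F4F/F4X/F4M outputs next to the input file, exactly as the manual invocation does.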

So, to answer the original question: the Adobe File Packager is the splitter to use, you do not need to buy FMS to use it, and it works on FLV and MP4/F4V files.

+9
source

You do not need to use their server. Wowza supports Adobe's version of HTTP streaming, and you can also implement it yourself by segmenting the video correctly and serving all of the segments from a standard HTTP server.

Links to all of the specifications for Adobe HTTP streaming:

http://help.adobe.com/en_US/HTTPStreaming/1.0/Using/WS9463dbe8dbe45c4c-1ae425bf126054c4d3f-7fff.html

Trying to hack the client side into doing a custom style of HTTP streaming will be much more frustrating.

Note that HTTP streaming does not stream several different videos; it serves a single file that has been split into separate segments.

File Packager

A command-line tool that packages on-demand media files into fragments and writes the fragments to F4F files. The File Packager is a standalone tool. You can also use the File Packager to encrypt files for use with Flash Access. For more information, see Packaging on-demand media.

The File Packager is available on adobe.com and is also installed with Adobe® Flash® Media Server in the rootinstall/tools/f4fpackager folder.

The download link for the Packager is here: Download File Packager for HTTP Dynamic Streaming

http://www.adobe.com/products/httpdynamicstreaming/

+1
source

You can use F4Pack, a graphical interface around Adobe's command-line tool, which lets you process your FLV/F4V files so they can be used for HTTP dynamic streaming.

+1
source

The place in the OSMF code where this happens is a timer-driven state machine inside the implementation of the HTTPNetStream class... it may be informative. I think I even left some useful comments in there when I wrote it.

Regarding the general question:

If you read an entire FLV file into a ByteArray and pass it to appendBytes, it will play. If you split that FLV file in half and pass the first half as a ByteArray and then the second half as a ByteArray, that will also play.

If you want to be able to switch between bitrates without gaps, you need to split your FLV files at matching keyframe points... and remember that only the first call to appendBytes gets the initial FLV file header ('F', 'L', 'V', flags, offset)... the rest just expect the FLV byte sequence to continue.
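The header rule in the answer above is the crux. Here is a minimal sketch of the chunk-preparation side in Python (the ActionScript side is just `ns.appendBytes(chunk)` per chunk); the function and constant names are mine, and it assumes segments were already cut at keyframe tag boundaries.

```python
# Only the FIRST appendBytes() call may contain the 'FLV' file header;
# every later call must continue the raw tag stream.

FLV_HEADER_SIZE = 9    # 'F','L','V', version, flags, 32-bit header size
PREV_TAG_SIZE_0 = 4    # 32-bit PreviousTagSize0 field after the header

def strip_flv_header(segment: bytes) -> bytes:
    """Drop the 13-byte FLV file header from a follow-on segment.

    Segments that already start at a tag boundary are returned
    unchanged, so this is safe to apply to every chunk but the first.
    """
    if segment[:3] == b"FLV":
        return segment[FLV_HEADER_SIZE + PREV_TAG_SIZE_0:]
    return segment
```

With segments prepared this way, the player appends the first segment as-is and runs every subsequent segment through the strip step before appending.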

+1
source

I recently found a similar node.js project for m3u8 transcoding (https://github.com/andrewschaaf/media-server), but I haven't heard of another one yet, apart from what Wowza did, outside of the Origin module for Apache. Since the payload is almost identical, you're better off looking for a good MP4 segmentation solution (there are plenty out there) than searching for F4M segmentation. The problem is that moov atoms, especially on large MP4 videos, are hard to manage and need to sit in their optimal location (near the beginning of the file). Even with the best ffmpeg and qtfaststart settings, you get noticeably slow seeking, inefficient bandwidth usage (usually greedy), and a few minor scrubbing/timing headaches that you don't get with FLV/F4V playback.
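The moov-placement issue the answer describes is easy to check for: an MP4 is "fast start" only if its moov box precedes the mdat box. A minimal sketch of that check (assuming ordinary 32-bit box sizes, which is enough for a quick inspection; the function names are mine):

```python
import struct

def top_level_atoms(data: bytes):
    """Yield (atom_type, size) for each top-level MP4 box.

    Assumes 32-bit box sizes (no size == 1 / 64-bit extended sizes),
    which keeps the sketch short; real files over 4 GB need more care.
    """
    offset = 0
    while offset + 8 <= len(data):
        size, kind = struct.unpack(">I4s", data[offset:offset + 8])
        if size < 8:
            break  # malformed or extended size; stop the sketch here
        yield kind.decode("latin-1"), size
        offset += size

def is_fast_start(data: bytes) -> bool:
    """True if moov precedes mdat, i.e. playback/seeking can start
    before the whole file has downloaded."""
    order = [kind for kind, _ in top_level_atoms(data)]
    return ("moov" in order and "mdat" in order
            and order.index("moov") < order.index("mdat"))
```

Files that fail this check are exactly the ones tools like qtfaststart (or `MP4Box -inter`) exist to rewrite.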

In my player, I intend to switch between HTTP Dynamic Streaming (HDS) and plain MP4, based on Apache log and real-time session analysis using awk/cron, instead of licensing the Adobe Access product to protect the stream. Both have unique 'onmetadata' handlers... but in the end I get nearly equivalent sequential/byte hashes; MP4 is just slower. So mod_origin is really just a sync/request router for Flash clients (via HTTP). I'm still looking for ways to speed up playback of an MP4 container. One incredible solution I read about recently, and was very surprised by, is http://zehfernando.com/2011/flash-video-frame-time-woes/ where a video editor and Flash developer came up with their own MP4 timecoding: they literally added (via an Adobe Premiere script) about 50 pixels to the bottom of each video frame carrying a visual "binary" mark, like a per-frame barcode... and those binary values are converted to high-precision timecode values. That way, Flash could analyze the video frames as they were painted (in real time) and determine exactly where the playhead was and which bytes it needed from any type of web server that supports MP4 byte-range serving. The thing is (and maybe I'm off here) that Flash seems unpredictable about when it receives the moov data, especially in large video files (0.5-1.5 GB). Even when you make sure to run the MP4 through MP4Box (i.e. MP4Box -frag 10000 -inter 0 movie.mp4), I think it was still a problem. OSMF and HDS have worked well for me so far, although it is annoying, IMO, that you need Apache and a proprietary closed-source module to use it. It's probably just a matter of time before an open-source implementation appears, since HDS is only 1-2 years old and just needs a little reverse engineering, like this guy Andrew Schaaf did with node.js + mpegts (live or not). In the end, I may just use OSMF exclusively under my UI, since it seems to have advantages similar to HDS, if not more — i.e. Strobe, if you need a seriously extensible HD player, or an MP4 platform to hack on to build your own custom player.

+1
source

The Adobe F4F format is based on MP4 files; can you use F4V or MP4 instead of FLV files? There are many MP4 file splitters out there, but you will need to make sure that the timestamps in the files are continuous; perhaps the pause occurs when the player sees a zero timestamp in the audio or video stream inside a file.
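For the FLV case, keeping timestamps continuous means rebasing the per-tag timestamps of each later segment by the duration already played. A sketch under stated assumptions (a headerless tag stream that starts exactly at a tag boundary; the function name is mine), using the FLV tag layout of an 11-byte header — type (1), data size (3), timestamp (3) plus an extended high byte (1), stream ID (3) — followed by the payload and a 4-byte PreviousTagSize:

```python
def rebase_flv_timestamps(tags: bytes, offset_ms: int) -> bytes:
    """Shift every tag timestamp in a headerless FLV tag stream by
    offset_ms so a second segment continues where the first ended.

    `tags` must begin at a tag boundary (i.e. after the 13-byte file
    header and PreviousTagSize0, if the segment had them).
    """
    out = bytearray(tags)
    pos = 0
    while pos + 11 <= len(out):
        data_size = int.from_bytes(out[pos + 1:pos + 4], "big")
        # Timestamp: lower 24 bits in bytes 4-6, upper 8 bits in byte 7.
        ts = int.from_bytes(out[pos + 4:pos + 7], "big") | (out[pos + 7] << 24)
        ts += offset_ms
        out[pos + 4:pos + 7] = (ts & 0xFFFFFF).to_bytes(3, "big")
        out[pos + 7] = (ts >> 24) & 0xFF
        pos += 11 + data_size + 4  # tag header + payload + PreviousTagSize
    return bytes(out)
```

A segment whose tags start at 0 can then be appended after, say, 30 seconds of prior video via `rebase_flv_timestamps(tags, 30000)`, so the player never sees the timestamps jump back to zero.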

0
source

Source: https://habr.com/ru/post/1332134/

