FFmpeg: output each audio stream of a multi-stream input as a separately accessible stream

I ran this command to relay a live UDP stream in real time so it can be played by the mobile application we are building. The input carries audio streams only.

ffmpeg -i udp://@localhost:1111 -map 0:a http://localhost:8090/feed1.ffm

Input #0, mpegts, from 'udp://@localhost:1111':
  Duration: N/A, start: 54616.264622, bitrate: 768 kb/s
  Program 1
    Metadata:
      service_name    : Service 1
      service_provider: TLK
    Stream #0:0[0x101]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, s16p, 96 kb/s
  Program 2
    Metadata:
      service_name    : Service 2
      service_provider: TLK
    Stream #0:1[0x111]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, s16p, 96 kb/s
  Program 3
    Metadata:
      service_name    : Service 3
      service_provider: TLK
    Stream #0:2[0x121]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, s16p, 96 kb/s
  Program 4
    Metadata:
      service_name    : Service 4
      service_provider: TLK
    Stream #0:3[0x131]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, s16p, 96 kb/s
  Program 5
    Metadata:
      service_name    : Service 5
      service_provider: TLK
    Stream #0:4[0x141]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, s16p, 96 kb/s
  Program 6
    Metadata:
      service_name    : Service 6
      service_provider: TLK
    Stream #0:5[0x151]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, s16p, 96 kb/s
  Program 7
    Metadata:
      service_name    : Service 7
      service_provider: TLK
    Stream #0:6[0x161]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, s16p, 96 kb/s
  Program 8
    Metadata:
      service_name    : Service 1
      service_provider: TLK
    Stream #0:7[0x171]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, s16p, 96 kb/s

This is the output I get:

Stream mapping:
  Stream #0:0 -> #0:0 (mp2 (native) -> mp2 (native))
  Stream #0:0 -> #0:1 (mp2 (native) -> mp2 (native))
Press [q] to stop, [?] for help
size=     100kB time=00:00:07.19 bitrate= 113.8kbits/s speed=3.01x
video:0kB audio:84kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 18.358242%

What I really need is a way to map each input stream to its own output that can be accessed separately from the others. For example, if I want the first stream, I should be able to run just

 ffplay http://localhost:8090/feed1.ffm 

and if I want the 7th stream, just

 ffplay http://localhost:8090/feed7.ffm 

Can someone please help me solve this? The FFmpeg documentation has no example that covers my situation.
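As far as I understand, every feedN.ffm URL also has to be declared in ffserver.conf with its own <Feed> section (plus a <Stream> section so it can actually be played). Roughly like the sketch below for one feed, repeated for feed2 through feed8; the file path, bitrate and stream name here are only placeholders, not my real config:

 # One <Feed>/<Stream> pair per output URL (sketch only; values are placeholders)
 <Feed feed1.ffm>
 # on-disk buffer ffserver keeps for this feed
 File /tmp/feed1.ffm
 FileMaxSize 5M
 </Feed>

 <Stream stream1.mp2>
 # pull data from feed1.ffm and serve it as MP2 audio
 Feed feed1.ffm
 Format mp2
 AudioCodec mp2
 AudioBitRate 96
 AudioChannels 2
 AudioSampleRate 48000
 NoVideo
 </Stream>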

Here is my current command and the header info:

ubuntu@ip-localhost:~$ ffmpeg -i udp://@localhost:4000 \
  -map 0:a:0 http://localhost:8090/feed1.ffm \
  -map 0:a:1 http://localhost:8090/feed2.ffm \
  -map 0:a:2 http://localhost:8090/feed3.ffm \
  -map 0:a:3 http://localhost:8090/feed4.ffm \
  -map 0:a:4 http://localhost:8090/feed5.ffm \
  -map 0:a:5 http://localhost:8090/feed6.ffm \
  -map 0:a:6 http://localhost:8090/feed7.ffm \
  -map 0:a:7 http://localhost:8090/feed8.ffm

ffmpeg version 3.2.4-1~16.04.york0 Copyright (c) 2000-2017 the FFmpeg developers
  built with gcc 5.4.1 (Ubuntu 5.4.1-5ubuntu2~16.04.york1) 20170210
  configuration: --prefix=/usr --extra-version='1~16.04.york0' --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libebur128 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
  libavutil      55. 34.101 / 55. 34.101
  libavcodec     57. 64.101 / 57. 64.101
  libavformat    57. 56.101 / 57. 56.101
  libavdevice    57.  1.100 / 57.  1.100
  libavfilter     6. 65.100 /  6. 65.100
  libavresample   3.  1.  0 /  3.  1.  0
  libswscale      4.  2.100 /  4.  2.100
  libswresample   2.  3.100 /  2.  3.100
  libpostproc    54.  1.100 / 54.  1.100

Input #0, mpegts, from 'udp://@localhost:1111':
  Duration: N/A, start: 60047.944622, bitrate: 768 kb/s
  Program 1
    Metadata:
      service_name    : Service 1
      service_provider: TKL
    Stream #0:0[0x101]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, s16p, 96 kb/s
  Program 2
    Metadata:
      service_name    : Service 2
      service_provider: TKL
    Stream #0:1[0x111]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, s16p, 96 kb/s
  Program 3
    Metadata:
      service_name    : Service 3
      service_provider: TKL
    Stream #0:2[0x121]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, s16p, 96 kb/s
  Program 4
    Metadata:
      service_name    : Service 4
      service_provider: TKL
    Stream #0:3[0x131]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, s16p, 96 kb/s
  Program 5
    Metadata:
      service_name    : Service 5
      service_provider: TKL
    Stream #0:4[0x141]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, s16p, 96 kb/s
  Program 6
    Metadata:
      service_name    : Service 6
      service_provider: TKL
    Stream #0:5[0x151]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, s16p, 96 kb/s
  Program 7
    Metadata:
      service_name    : Service 7
      service_provider: TKL
    Stream #0:6[0x161]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, s16p, 96 kb/s
  Program 8
    Metadata:
      service_name    : Service 8
      service_provider: TKL
    Stream #0:7[0x171]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, s16p, 96 kb/s
Output #0, ffm, to 'http://localhost:8090/feed1.ffm':
  Metadata:
    creation_time   : now
    encoder         : Lavf57.56.101
  Stream #0:0: Audio: mp2, 44100 Hz, mono, s16, 32 kb/s
    Metadata:
      encoder         : Lavc57.64.101 mp2
  Stream #0:1: Audio: mp2, 44100 Hz, stereo, s16, 64 kb/s
    Metadata:
      encoder         : Lavc57.64.101 mp2
Output #1, ffm, to 'http://localhost:8090/feed2.ffm':
  Metadata:
    creation_time   : now
    encoder         : Lavf57.56.101
  Stream #1:0: Audio: mp2, 44100 Hz, stereo, s16, 64 kb/s
    Metadata:
      encoder         : Lavc57.64.101 mp2
Output #2, ffm, to 'http://localhost:8090/feed3.ffm':
  Metadata:
    creation_time   : now
    encoder         : Lavf57.56.101
  Stream #2:0: Audio: mp2, 44100 Hz, stereo, s16, 64 kb/s
    Metadata:
      encoder         : Lavc57.64.101 mp2
Output #3, ffm, to 'http://localhost:8090/feed4.ffm':
  Metadata:
    creation_time   : now
    encoder         : Lavf57.56.101
  Stream #3:0: Audio: mp2, 44100 Hz, stereo, s16, 64 kb/s
    Metadata:
      encoder         : Lavc57.64.101 mp2
Output #4, ffm, to 'http://localhost:8090/feed5.ffm':
  Metadata:
    creation_time   : now
    encoder         : Lavf57.56.101
  Stream #4:0: Audio: mp2, 44100 Hz, stereo, s16, 64 kb/s
    Metadata:
      encoder         : Lavc57.64.101 mp2
Output #5, ffm, to 'http://localhost:8090/feed6.ffm':
  Metadata:
    creation_time   : now
    encoder         : Lavf57.56.101
  Stream #5:0: Audio: mp2, 44100 Hz, stereo, s16, 64 kb/s
    Metadata:
      encoder         : Lavc57.64.101 mp2
Output #6, ffm, to 'http://localhost:8090/feed7.ffm':
  Metadata:
    creation_time   : now
    encoder         : Lavf57.56.101
  Stream #6:0: Audio: mp2, 44100 Hz, stereo, s16, 64 kb/s
    Metadata:
      encoder         : Lavc57.64.101 mp2
Output #7, ffm, to 'http://localhost:8090/feed8.ffm':
  Metadata:
    creation_time   : now
    encoder         : Lavf57.56.101
  Stream #7:0: Audio: mp2, 44100 Hz, stereo, s16, 64 kb/s
    Metadata:
      encoder         : Lavc57.64.101 mp2
Stream mapping:
  Stream #0:0 -> #0:0 (mp2 (native) -> mp2 (native))
  Stream #0:0 -> #0:1 (mp2 (native) -> mp2 (native))
  Stream #0:0 -> #1:0 (mp2 (native) -> mp2 (native))
  Stream #0:0 -> #2:0 (mp2 (native) -> mp2 (native))
  Stream #0:0 -> #3:0 (mp2 (native) -> mp2 (native))
  Stream #0:0 -> #4:0 (mp2 (native) -> mp2 (native))
  Stream #0:0 -> #5:0 (mp2 (native) -> mp2 (native))
  Stream #0:0 -> #6:0 (mp2 (native) -> mp2 (native))
  Stream #0:0 -> #7:0 (mp2 (native) -> mp2 (native))
Press [q] to stop, [?] for help

That's my full console output.

2 answers

You can do this by running a separate ffmpeg instance for each stream as parallel processes, if you are open to that.

Let each ffmpeg instance demultiplex one audio stream and push it to the corresponding feed.

It looks like this:

(ffmpeg -i udp://@localhost:1111 -map 0:a:0 http://localhost:8090/feed1.ffm) &
(ffmpeg -i udp://@localhost:1111 -map 0:a:1 http://localhost:8090/feed2.ffm) &
(ffmpeg -i udp://@localhost:1111 -map 0:a:2 http://localhost:8090/feed3.ffm) &

and so on.
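If you don't want to type all eight commands, roughly the same thing can be written as a small shell loop. This is just a sketch; it assumes the feeds are named feed1.ffm through feed8.ffm as in your question:

 #!/bin/sh
 # Start one ffmpeg per audio stream: input stream 0:a:$i is pushed to feed$((i+1)).ffm
 for i in 0 1 2 3 4 5 6 7; do
   ffmpeg -i udp://@localhost:1111 -map 0:a:$i "http://localhost:8090/feed$((i+1)).ffm" &
 done
 # keep the script alive while the background ffmpeg processes run
 wait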

I would not recommend this, though, as it will consume a lot of CPU.


The basic syntax is ffmpeg -i input1 -i input2 {switches} output1 {switches} output2 ..., so:

ffmpeg -i udp://@localhost:1111 \
  -map 0:a:0 http://localhost:8090/feed1.ffm \
  -map 0:a:1 http://localhost:8090/feed2.ffm \
  -map 0:a:2 http://localhost:8090/feed3.ffm \
  ...
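If it is more convenient to select by program (Service 1 through Service 8) rather than by audio stream index, -map also accepts a program stream specifier. A sketch, assuming the program IDs are 1 through 8 exactly as printed in your probe output:

 ffmpeg -i udp://@localhost:1111 \
   -map 0:p:1:a http://localhost:8090/feed1.ffm \
   -map 0:p:2:a http://localhost:8090/feed2.ffm \
   ...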
