I have a Raspberry Pi with a new camera module connected to (in this case) Bambuser. You can see the stream here; it is from a windmill in the Netherlands (the camera position will be better in a few weeks).
I have the stream working successfully, but now I want to overlay an image (an alpha-transparent PNG) on top of the input stream, which is piped to ffmpeg for streaming to Bambuser.
Currently, I am using the following command (user details redacted) to successfully send the stream from the Raspberry Pi camera module (this is great: HD and all, hardware encoding) to Bambuser, following an excellent tutorial from Slickstreamer:
raspivid -t 999999999 -w 960 -h 540 -fps 25 -b 500000 -o - | ffmpeg -i - -vcodec copy -an -metadata title="STREAM NAME" -f flv rtmp:
I went through the ffmpeg documentation, and it seems that I should use the -vf option to apply the movie filter, for example:
raspivid -t 999999999 -w 960 -h 540 -fps 25 -b 500000 -o - | ffmpeg -i - -vf "movie='/home/USER/watermark.png' [logo]; [in][logo] overlay=main_w-overlay_w-10:10 [out]" -vcodec copy -an -metadata title="STREAM NAME" -f flv rtmp://USER_X.fme.bambuser.com/b-fme/USER_STREAM_KEY_X
I also tried various other -vf filters, such as '-vf vflip' and '-vf mandelbrot'. But none of them seem to work: the stream simply shows the direct input from the Raspberry Pi's camera.
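To isolate the problem, one way to check whether the filter chain itself works is to run the same -vf against a short local recording and write to a file, taking the pipe and the RTMP endpoint out of the equation. This is just a sketch; `test.h264` and the watermark path are placeholders:

```shell
# Hypothetical sanity check: apply the overlay to a local H.264 capture
# and write to a local file instead of streaming to Bambuser.
# Note: an actual encoder (libx264) is specified here instead of "copy",
# since filters require the video to be decoded and re-encoded.
ffmpeg -i test.h264 \
  -vf "movie=/home/USER/watermark.png [logo]; [in][logo] overlay=main_w-overlay_w-10:10 [out]" \
  -vcodec libx264 -an out.flv
```

If the watermark appears in out.flv, the filter syntax is fine and the issue lies in how the streaming command is put together.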
This is the output when the following command is run:
raspivid -t 999999999 -w 960 -h 540 -fps 25 -b 500000 -o - | ffmpeg -i - -vcodec copy -vf 'movie=0:png:/home/USER/watermark.png [watermark];[in] [watermark]overlay=0:0:1[out]' -an -metadata title="STREAM NAME" -f flv rtmp:
ffmpeg version N-54036-g6c4516d Copyright (c) 2000-2013 the FFmpeg developers
  built on Jun 15 2013 XX:XX with gcc 4.6 (Debian 4.6.3-14+rpi1)
  configuration:
  libavutil      52. 35.101 / 52. 35.101
  libavcodec     55. 16.100 / 55. 16.100
  libavformat    55.  8.102 / 55.  8.102
  libavdevice    55.  2.100 / 55.  2.100
  libavfilter     3. 77.101 /  3. 77.101
  libswscale      2.  3.100 /  2.  3.100
  libswresample   0. 17.102 /  0. 17.102
[h264 @ 0x1917cc0] max_analyze_duration 5000000 reached at 5000000 microseconds
Input #0, h264, from 'pipe:':
  Duration: N/A, bitrate: N/A
    Stream #0:0: Video: h264 (High), yuv420p, 960x540, 25 fps, 25 tbr, 1200k tbn, 50 tbc
Output #0, flv, to 'rtmp://USER_X.fme.bambuser.com/b-fme/USER_STREAM_KEY_X':
  Metadata:
    title           : STREAM NAME
    encoder         : Lavf55.8.102
    Stream #0:0: Video: h264 ([7][0][0][0] / 0x0007), yuv420p, 960x540, q=2-31, 25 fps, 1k tbn, 1200k tbc
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
frame= 2344 fps= 27 q=-1.0 size=    4827kB time=00:01:33.72 bitrate= 421.9kbits/s
As mentioned above, the other -vf filters also don't seem to be applied to the output stream on Bambuser, so I think I'm basically doing something wrong.
Should I overlay the image "watermark.png" on top of the raspivid stream some other way? Would that be a solution? Has anyone come across this?
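For what it's worth, the "(copy)" in the stream mapping above suggests the video is being passed through untouched, which would explain why no filter takes effect. A variant that forces a re-encode would look something like the sketch below (the stream key, paths, and the assumption that the Pi's CPU can keep up with libx264 at this resolution are all placeholders/guesses):

```shell
# Sketch: replace "-vcodec copy" with a real encoder so the -vf filter
# graph is actually applied before the stream goes out.
# Assumption: software x264 encoding at 960x540/25fps is feasible on the Pi
# only with the fastest preset; resolution may need lowering.
raspivid -t 999999999 -w 960 -h 540 -fps 25 -b 500000 -o - | \
ffmpeg -i - \
  -vf "movie=/home/USER/watermark.png [logo]; [in][logo] overlay=main_w-overlay_w-10:10 [out]" \
  -vcodec libx264 -preset ultrafast -b:v 500k \
  -an -metadata title="STREAM NAME" \
  -f flv rtmp://USER_X.fme.bambuser.com/b-fme/USER_STREAM_KEY_X
```

The trade-off is losing the hardware-encoded H.264 from the camera, which is exactly what made the original copy-based pipeline so cheap.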
Thanks in advance for your thoughts.