How to capture video stream using OpenCV (Python)

I want to handle an mms video stream in OpenCV using Python. The stream comes from an IP camera that I do not control (a traffic monitor). The stream is available over the mms and mmst protocols:

mms://194.90.203.111/cam2 

is playable in both VLC and Windows Media Player.

 mmst://194.90.203.111/cam2 

only works in VLC. I tried changing the scheme to HTTP by re-streaming with FFmpeg and VLC, but that did not work.

As I understand it, mms streams are encoded with Windows Media Video. Appending '.mjpeg' to the end of the URI did not help either. I have not yet found which stream types OpenCV accepts.
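For reference, the simplest way to probe whether cv2.VideoCapture (which typically relies on its FFmpeg backend for network streams) accepts a given URI is to open it and check isOpened(); something like this, using the camera URL above:

    import cv2

    uri = "mms://194.90.203.111/cam2"  # same camera URL as above
    cap = cv2.VideoCapture(uri)

    # isOpened() reports whether the backend managed to open the source at all
    if cap.isOpened():
        print("VideoCapture opened the stream")
    else:
        print("VideoCapture could not open the stream")

    cap.release()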

Here is my code -

    import cv2, platform
    #import numpy as np

    cam = "mms://194.90.203.111/cam2"
    #cam = 0  # Use local webcam.

    cap = cv2.VideoCapture(cam)
    if not cap:
        print("!!! Failed VideoCapture: invalid parameter!")

    while(True):
        # Capture frame-by-frame
        ret, current_frame = cap.read()
        if type(current_frame) == type(None):
            print("!!! Couldn't read frame!")
            break

        # Display the resulting frame
        cv2.imshow('frame', current_frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    # release the capture
    cap.release()
    cv2.destroyAllWindows()

What am I missing? What types of video streams can OpenCV capture? Is there an elegant solution that does not require changing the scheme or transcoding?

Thanks!

Python 2.7.8, OpenCV 2.4.9, both x86, on Windows 7 x64.

1 answer

I solved this by using FFmpeg and FFserver. Note: FFserver runs only on Linux. The solution uses the Python code from here, as suggested by Ryan.

The stream is as follows:

  • Launch the FFserver background process with the desired configuration (mjpeg in this case).
  • FFmpeg takes the mmst stream as input and outputs it to localhost.
  • Run the Python script to open the localhost stream and decode it frame by frame.

Run ffserver

 ffserver -d -f /etc/ffserver.conf 
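To check that FFserver started correctly, the status page defined in the configuration below (stat.html) can be fetched from localhost; for example, something like:

    import urllib

    # stat.html is defined as a status stream in ffserver.conf below
    status = urllib.urlopen("http://localhost:8090/stat.html").read()
    # should print True if the cam2.mjpeg stream is registered
    print("cam2.mjpeg" in status)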

In a second terminal, run FFmpeg:

 ffmpeg -i mmst://194.90.203.111/cam2 http://localhost:8090/cam2.ffm 
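If you would rather launch the relay from Python than keep a second terminal open, the same FFmpeg command can be started with subprocess (assuming ffmpeg is on the PATH); a rough sketch:

    import subprocess

    # Same command as above, launched as a background process:
    # ffmpeg reads the mmst stream and pushes it into the FFserver feed.
    relay = subprocess.Popen([
        "ffmpeg",
        "-i", "mmst://194.90.203.111/cam2",
        "http://localhost:8090/cam2.ffm",
    ])

    # ... run the capture loop below, then stop the relay:
    # relay.terminate()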

The Python code below opens a window showing the video stream.

    import cv2, platform
    import numpy as np
    import urllib
    import os

    cam2 = "http://localhost:8090/cam2.mjpeg"
    stream = urllib.urlopen(cam2)

    bytes = ''
    while True:
        # to read mjpeg frame -
        bytes += stream.read(1024)
        a = bytes.find('\xff\xd8')
        b = bytes.find('\xff\xd9')
        if a != -1 and b != -1:
            jpg = bytes[a:b+2]
            bytes = bytes[b+2:]
            frame = cv2.imdecode(np.fromstring(jpg, dtype=np.uint8), cv2.CV_LOAD_IMAGE_COLOR)
            # we now have frame stored in frame.
            cv2.imshow('cam2', frame)

        # Press 'q' to quit
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    cv2.destroyAllWindows()
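As a possible simplification (untested with this camera): once the stream is re-served as MJPEG over HTTP, an OpenCV build with the FFmpeg backend may be able to open it directly, replacing the manual JPEG parsing above with something like:

    import cv2

    # May work instead of the manual byte parsing, depending on the OpenCV build
    cap = cv2.VideoCapture("http://localhost:8090/cam2.mjpeg")
    while cap.isOpened():
        ret, frame = cap.read()
        if not ret:
            break
        cv2.imshow('cam2', frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    cap.release()
    cv2.destroyAllWindows()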

ffserver.conf -

    Port 8090
    BindAddress 0.0.0.0
    MaxClients 10
    MaxBandWidth 50000
    CustomLog -
    #NoDaemon

    <Feed cam2.ffm>
        File /tmp/cam2.ffm
        FileMaxSize 1G
        ACL allow 127.0.0.1
        ACL allow localhost
    </Feed>

    <Stream cam2.mjpeg>
        Feed cam2.ffm
        Format mpjpeg
        VideoFrameRate 25
        VideoBitRate 10240
        VideoBufferSize 20480
        VideoSize 320x240
        VideoQMin 3
        VideoQMax 31
        NoAudio
        Strict -1
    </Stream>

    <Stream stat.html>
        Format status
        # Only allow local people to get the status
        ACL allow localhost
        ACL allow 192.168.0.0 192.168.255.255
    </Stream>

    <Redirect index.html>
        URL http://www.ffmpeg.org/
    </Redirect>

Note that ffserver.conf could use more fine-tuning, but these settings work quite well and produce frames very close to the original stream, with only occasional frame freezes.
