I'm using OpenCV 1.1pre1 under Windows. I have a network camera and I need to grab frames from it in OpenCV. The camera can stream standard MPEG-4 over RTSP or MJPEG over HTTP. I've seen many threads about using FFMPEG with OpenCV, but I cannot make it work.
How can I grab frames from an IP camera with OpenCV?
Thanks
Andrea
I just do it like this:
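Roughly like this (a minimal sketch; the RTSP URL is a placeholder, since the actual stream path depends on the camera):

    #include <cv.h>
    #include <highgui.h>
    #include <stdio.h>

    int main(void)
    {
        /* Placeholder URL -- substitute your camera's actual RTSP/MJPEG path. */
        CvCapture *capture = cvCreateFileCapture("rtsp://192.168.0.20/mjpeg");

        /* NULL means the stream could not be opened, e.g. because the FFMPEG
           wrapper DLL shipped with the OpenCV build is not next to the exe. */
        if (!capture) {
            fprintf(stderr, "Could not open the stream\n");
            return 1;
        }

        cvNamedWindow("ip-camera", CV_WINDOW_AUTOSIZE);
        for (;;) {
            /* The returned frame is owned by the capture; do not release it. */
            IplImage *frame = cvQueryFrame(capture);
            if (!frame)
                break;
            cvShowImage("ip-camera", frame);
            if (cvWaitKey(10) == 27)   /* ESC quits */
                break;
        }

        cvReleaseCapture(&capture);
        return 0;
    }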
Also make sure this DLL is available at runtime, else cvCreateFileCapture will return NULL.
The camera needs to allow unauthenticated access too, usually set via its web interface. MJPEG worked over RTSP for me, but MPEG-4 didn't.
hth
Si
The RTSP protocol did not work for me; MJPEG worked on the first try. I assume it is built into my camera (D-Link DCS-900).
Syntax found here: http://answers.opencv.org/question/133/how-do-i-access-an-ip-camera/
I did not need to compile OpenCV with FFMPEG support.
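For reference, the kind of call that page shows, as a sketch (the address and path are placeholders and differ per camera; some cameras also want user:pass@ embedded in the URL):

    #include <opencv2/opencv.hpp>
    #include <iostream>

    int main()
    {
        // Placeholder MJPEG-over-HTTP URL -- the path and port vary by camera model.
        cv::VideoCapture cap("http://192.168.0.20/mjpg/video.mjpg");

        cv::Mat frame;
        if (!cap.isOpened() || !cap.read(frame)) {
            std::cerr << "Could not read from the camera" << std::endl;
            return 1;
        }
        std::cout << "Got a " << frame.cols << "x" << frame.rows << " frame" << std::endl;
        return 0;
    }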
OpenCV can be compiled with FFMPEG support (see ./configure --help).
You can then use cvCreateFileCapture_FFMPEG to create a CvCapture with e.g. the URL of the camera's MJPG stream.
I use this to grab frames from an AXIS camera:
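Roughly the following (a sketch rather than my exact code: the MJPG CGI path and parameters are placeholders to check against your camera's documentation, and the prototype for cvCreateFileCapture_FFMPEG is assumed here because it is not in every build's public headers):

    #include <cv.h>
    #include <highgui.h>
    #include <stdio.h>

    /* Assumed prototype -- check your build's headers; if the _FFMPEG variant
       is not exported, plain cvCreateFileCapture goes through the same backend. */
    CvCapture* cvCreateFileCapture_FFMPEG(const char* filename);

    int main(void)
    {
        /* Placeholder Axis MJPG CGI URL -- path and parameters depend on the camera. */
        CvCapture *capture = cvCreateFileCapture_FFMPEG(
            "http://axis-camera/axis-cgi/mjpg/video.cgi?resolution=640x480");
        if (!capture) {
            fprintf(stderr, "Could not open the stream\n");
            return 1;
        }

        IplImage *frame = cvQueryFrame(capture);   /* owned by the capture */
        if (frame)
            printf("Got a %dx%d frame\n", frame->width, frame->height);

        cvReleaseCapture(&capture);
        return 0;
    }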
Use ffmpeglib to connect to the stream.
These functions may be useful, but take a look at the docs.
You would need a little algorithm to assemble a complete frame, which is available here.
Once you get a frame, you can copy the video data (for each plane, if needed) into an IplImage, which is OpenCV's image object.
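For illustration, the decode pipeline looks roughly like this with the old (circa-2009) ffmpeg API; several of these functions have since been renamed (avformat_open_input, avcodec_decode_video2, ...), so treat it as a sketch and check the docs for your ffmpeg version. The URL is whatever your camera exposes:

    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>
    #include <libswscale/swscale.h>

    /* Sketch: open the stream, find the video stream, decode packets and
       convert each frame to packed BGR24 so it can be handed to OpenCV. */
    int grab_frames(const char *url)
    {
        AVFormatContext *fmt_ctx = NULL;
        int video_idx = -1, got_frame = 0;
        unsigned int i;

        av_register_all();
        if (av_open_input_file(&fmt_ctx, url, NULL, 0, NULL) != 0)
            return -1;
        if (av_find_stream_info(fmt_ctx) < 0)
            return -1;

        /* Locate the first video stream. */
        for (i = 0; i < fmt_ctx->nb_streams; i++)
            if (fmt_ctx->streams[i]->codec->codec_type == CODEC_TYPE_VIDEO) {
                video_idx = (int)i;
                break;
            }
        if (video_idx < 0)
            return -1;

        AVCodecContext *codec_ctx = fmt_ctx->streams[video_idx]->codec;
        AVCodec *codec = avcodec_find_decoder(codec_ctx->codec_id);
        if (!codec || avcodec_open(codec_ctx, codec) < 0)
            return -1;

        AVFrame *frame     = avcodec_alloc_frame();
        AVFrame *frame_bgr = avcodec_alloc_frame();
        uint8_t *bgr_buf   = (uint8_t*)av_malloc(codec_ctx->width * codec_ctx->height * 3);
        avpicture_fill((AVPicture*)frame_bgr, bgr_buf, PIX_FMT_BGR24,
                       codec_ctx->width, codec_ctx->height);

        struct SwsContext *sws = sws_getContext(
            codec_ctx->width, codec_ctx->height, codec_ctx->pix_fmt,
            codec_ctx->width, codec_ctx->height, PIX_FMT_BGR24,
            SWS_BILINEAR, NULL, NULL, NULL);

        AVPacket pkt;
        while (av_read_frame(fmt_ctx, &pkt) >= 0) {
            if (pkt.stream_index == video_idx) {
                avcodec_decode_video(codec_ctx, frame, &got_frame, pkt.data, pkt.size);
                if (got_frame) {
                    sws_scale(sws, frame->data, frame->linesize, 0, codec_ctx->height,
                              frame_bgr->data, frame_bgr->linesize);
                    /* frame_bgr->data[0] now holds one packed BGR image --
                       this is the buffer to wrap in an IplImage (see below). */
                }
            }
            av_free_packet(&pkt);
        }

        /* Cleanup (avcodec_close, av_close_input_file, sws_freeContext, av_free)
           omitted for brevity. */
        return 0;
    }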
You can create an IplImage using something like...
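For example (a sketch: bgr_buf, width and height stand for the decoded BGR buffer and frame dimensions from the ffmpeg side):

    #include <cv.h>

    /* Wrap an already-decoded packed BGR24 buffer in an IplImage header
       without copying the pixel data. */
    IplImage* wrap_bgr_frame(unsigned char *bgr_buf, int width, int height)
    {
        IplImage *img = cvCreateImageHeader(cvSize(width, height), IPL_DEPTH_8U, 3);
        cvSetData(img, bgr_buf, width * 3);   /* step = bytes per row */
        return img;   /* free with cvReleaseImageHeader(); the pixel buffer stays yours */
    }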
Once you have an IplImage, you can perform all sorts of image operations available in the OpenCV library.
I've enclosed C++ code for grabbing frames. It requires OpenCV version 2.0 or higher. The code uses the cv::Mat structure, which is preferred to the old IplImage structure.
Update: You can grab frames from H.264 RTSP streams. Look up your camera's API for the exact URL format. For example, for an Axis network camera the URL might look like this:
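Along these lines, as a sketch rather than the original listing (the RTSP URL is only a placeholder; the real path and parameters come from the camera's API, e.g. VAPIX on Axis models):

    #include <opencv2/opencv.hpp>
    #include <iostream>

    int main()
    {
        // Placeholder H.264 RTSP URL -- take the real path from the camera's API docs.
        cv::VideoCapture cap("rtsp://camera-address/placeholder-h264-stream");
        if (!cap.isOpened()) {
            std::cerr << "Cannot open the stream" << std::endl;
            return 1;
        }

        cv::Mat frame;
        for (;;) {
            if (!cap.read(frame))        // blocking read of the next decoded frame
                break;
            cv::imshow("ip-camera", frame);
            if (cv::waitKey(1) == 27)    // ESC quits
                break;
        }
        return 0;
    }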