What I am trying to achieve: I have a Raspberry Pi 3 with a Pi Camera v2 connected to my local Wi-Fi, and I want to transmit live video from the Raspberry Pi to a computer running Ubuntu, where I process the video with OpenCV in real time. The code below is just a sample to test the video coming from the Raspberry Pi on my Ubuntu machine. I am using netcat to stream the video, and I have listed the shell commands below the code.
#include <opencv2/highgui/highgui.hpp>
#include <opencv2/imgproc/imgproc.hpp>
#include <iostream>

using namespace std;
using namespace cv;

int main(int argc, char** argv)
{
    VideoCapture cap;
    cap.open("/dev/stdin");      // read the piped stream from standard input
    if (!cap.isOpened())
        return -1;

    namedWindow("stream", 1);
    for (;;) {
        Mat frame;
        cap >> frame;
        if (frame.empty())       // stream ended or a frame failed to decode
            break;
        imshow("stream", frame);
        if (waitKey(30) >= 0)
            break;
    }
    return 0;
}
This is the code to play the stream in OpenCV.
- First I redirect the stream to my OpenCV app using: nc -l -p 5001 | ./app
- Then I run the Raspberry Pi camera and stream it with netcat: raspivid -t 999999 -o - | nc x.x.x.x 5001 (x.x.x.x being the client PC's IP address; raspivid needs "-o -" to write the video to stdout so netcat can pick it up).
This doesn't work for me, but when I try it with mplayer by running nc -l -p 5001 | mplayer -fps 31 -cache 1024 - it works perfectly.
I think the problem is that I am not capturing the stream properly in my OpenCV application. I need help, please.
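For reference, one approach I have seen suggested (a sketch, not a tested solution) is to let a GStreamer pipeline receive, parse, and decode the raw H.264 stream and hand finished frames to OpenCV, instead of pointing VideoCapture at /dev/stdin. This assumes OpenCV was built with GStreamer support; the port 5001 matches my netcat setup, and tcpserversrc plays the listening role that nc -l -p 5001 plays above, so the raspivid | nc command on the Pi stays unchanged:

```cpp
#include <opencv2/highgui/highgui.hpp>
#include <iostream>

using namespace cv;

int main()
{
    // Listen on port 5001 (as nc -l -p 5001 did), parse and decode the raw
    // H.264 elementary stream, and convert it to a format OpenCV can use.
    // Requires an OpenCV build with the GStreamer backend enabled.
    VideoCapture cap("tcpserversrc host=0.0.0.0 port=5001 "
                     "! h264parse ! avdec_h264 ! videoconvert ! appsink");
    if (!cap.isOpened()) {
        std::cerr << "Failed to open GStreamer pipeline" << std::endl;
        return -1;
    }

    namedWindow("stream", 1);
    for (;;) {
        Mat frame;
        cap >> frame;
        if (frame.empty())       // stream ended or a frame failed to decode
            break;
        imshow("stream", frame);
        if (waitKey(30) >= 0)
            break;
    }
    return 0;
}
```

With this, the app itself does the listening, so it is started on its own (./app) rather than behind nc.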