I'm trying to pipe raw audio and video data to ffmpeg and push a realtime stream over the RTSP protocol on Android. The command line looks like this:
"ffmpeg -re -f image2pipe -vcodec mjpeg -i "+vpipepath
+ " -f s16le -acodec pcm_s16le -ar 8000 -ac 1 -i - "
+ " -vcodec libx264 "
+ " -preset slow -pix_fmt yuv420p -crf 30 -s 160x120 -r 6 -tune film "
+ " -g 6 -keyint_min 6 -bf 16 -b_strategy 1 "
+ " -acodec libopus -ac 1 -ar 48000 -b:a 80k -vbr on -frame_duration 20 "
+ " -compression_level 10 -application voip -packet_loss 20 "
+ " -f rtsp rtsp://remote-rtsp-server/live.sdp";
I'm using libx264 as the video codec and libopus as the audio codec. The YUV frames are fed through a named pipe created with mkfifo, and the PCM frames are fed through stdin.
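For context, the feeding code is roughly structured like the sketch below (this is only an outline of my setup; names such as FfmpegFeeder, onPreviewJpeg and onPcmChunk are placeholders, and mkfifo is invoked via Runtime.exec here, which may need a JNI mkfifo call instead depending on the device):

    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.OutputStream;

    public class FfmpegFeeder {
        private Process ffmpeg;
        private OutputStream pcmToStdin;   // audio path: stdin of the ffmpeg process ("-i -")
        private OutputStream jpegToFifo;   // video path: named pipe given as the image2pipe input

        public void start(String vpipepath, String ffmpegCmd) throws IOException, InterruptedException {
            // Create the named pipe before ffmpeg tries to open it as the image2pipe input.
            Runtime.getRuntime().exec(new String[]{"mkfifo", vpipepath}).waitFor();

            // Launch ffmpeg with the command string shown above.
            ffmpeg = Runtime.getRuntime().exec(ffmpegCmd);
            pcmToStdin = ffmpeg.getOutputStream();

            // Opening the fifo for writing blocks until ffmpeg opens it for reading.
            jpegToFifo = new FileOutputStream(vpipepath);
        }

        // Called from the camera capture thread with one JPEG-encoded frame.
        public void onPreviewJpeg(byte[] jpegFrame) throws IOException {
            jpegToFifo.write(jpegFrame);
            jpegToFifo.flush();
        }

        // Called from the audio record thread with one chunk of 8 kHz s16le PCM.
        public void onPcmChunk(byte[] pcm, int len) throws IOException {
            pcmToStdin.write(pcm, 0, len);
            pcmToStdin.flush();
        }
    }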
It works, and I can fetch and play the stream with ffplay. But there is a severe audio/video sync issue: the audio is 5~10 seconds behind the video. I guess the problem is that neither the YUV frames nor the PCM frames carry any timestamp; ffmpeg assigns timestamps when it is fed the data, but the audio and video capture threads cannot possibly run at exactly the same rate. Is there a way to add a timestamp to each raw data frame (something like PTS/DTS)?
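To make clear what I mean by a timestamp: what is missing is a per-frame presentation time recorded at capture, something like the hypothetical structure below, since the raw s16le and MJPEG pipes have no place to carry it:

    // Illustration only, not a working solution: the per-frame PTS that the
    // raw pipes currently cannot convey to ffmpeg. All names are hypothetical.
    class TimestampedFrame {
        final byte[] data;
        final long captureTimeUs;   // recorded at capture, e.g. System.nanoTime() / 1000

        TimestampedFrame(byte[] data, long captureTimeUs) {
            this.data = data;
            this.captureTimeUs = captureTimeUs;
        }
    }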
The approach I used comes from this thread: Android Camera Capture using FFmpeg.