I would like to wrap real-time encoded data in WebM or Ogg and send it to an HTML5 browser.
Can WebM or Ogg do this? MP4 cannot, because of its MDAT atom: one cannot wrap H.264 and MP3 in real time and send the result to the client. Say I am feeding the video from my webcam and the audio from my built-in mic. (Fragmented MP4 can handle this, but it's a hassle to find libraries that do it.)
I need to do this because I do not want to send audio and video separately.
If I did send them separately (audio over the audio tag and video over the video tag, i.e. demuxed and sent as two streams), could I sync them in the client browser with JavaScript? I saw some examples, but I'm not sure yet.
I did this with ffmpeg/ffserver running on Ubuntu as follows for WebM (MP4 and Ogg are a bit easier, and should work in a similar manner from the same server, but you should serve all three formats for compatibility across browsers).
First, build ffmpeg from source to include the libvpx drivers (even if you're using a version that has them, you need the newest ones (as of this month) to stream WebM, because they only recently added the ability to include global headers). I did this on an Ubuntu server and desktop, and this guide showed me how; instructions for other OSes can be found here.
Once you've gotten the appropriate version of ffmpeg/ffserver you can set them up for streaming, in my case this was done as follows.
On the video capture device:
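The original command excerpt was lost; the following is a hedged sketch of what the capture-side ffmpeg invocation might look like. The device paths (/dev/video0, hw:0), capture size, frame rate, and <server_ip>/feed name are illustrative placeholders, not the author's exact values.

```shell
# Sketch: grab webcam video (video4linux2) and mic audio (ALSA)
# and push the raw feed to the ffserver instance on <server_ip>.
ffmpeg -f video4linux2 -s 320x240 -r 15 -i /dev/video0 \
       -f alsa -i hw:0 \
       http://<server_ip>:8090/feed1.ffm
```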
Relevant ffserver.conf excerpt:
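The excerpt itself did not survive; below is a hypothetical reconstruction of the shape such an ffserver.conf takes. The port, feed names, stream name (0.webm), sizes, and bitrates are illustrative assumptions, not the author's actual configuration.

```
# Hypothetical ffserver.conf fragment -- values are placeholders.
Port 8090
BindAddress 0.0.0.0

<Feed feed1.ffm>              # raw feed pushed by the capture device
File /tmp/feed1.ffm
FileMaxSize 1G
</Feed>

<Feed feed2.ffm>              # re-encoded WebM feed (see the second ffmpeg command)
File /tmp/feed2.ffm
FileMaxSize 1G
</Feed>

<Stream 0.webm>               # what browsers request
Feed feed2.ffm
Format webm
VideoCodec libvpx
VideoSize 320x240
VideoFrameRate 15
AudioCodec libvorbis
AudioSampleRate 44100
NoDefaults
</Stream>

<Stream stat.html>            # status page
Format status
</Stream>
```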
This ffmpeg command is executed on the machine previously referred to as server_ip (it handles the actual mpeg --> webm conversion and feeds it back into the ffserver on a different feed):
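That command was also lost in extraction; here is a hedged sketch of a server-side transcode of the kind described: pull the raw feed from ffserver, re-encode to VP8/Vorbis, and push the WebM result back into the second feed. The feed names and bitrates are assumptions matching the hypothetical config above, not the author's exact command.

```shell
# Sketch: read feed1 from the local ffserver, transcode to WebM,
# and feed the result back in as feed2 (which backs the 0.webm stream).
ffmpeg -i http://localhost:8090/feed1.ffm \
       -c:v libvpx -b:v 500k \
       -c:a libvorbis -b:a 128k \
       http://localhost:8090/feed2.ffm
```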
Once these have all been started up (first the ffserver, then the feeder_ip ffmpeg process, then the server_ip ffmpeg process) you should be able to access the live stream at http://<server_ip>:8090/0.webm and check the status at http://<server_ip>:8090/
Hope this helps.
Evren,
Since you asked this question initially, the Media Source Extensions https://www.w3.org/TR/media-source/ have matured enough to play very short (30 ms) ISO-BMFF video/mp4 segments with only a little buffering.
Refer to HTML5 live streaming
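To illustrate the approach, here is a minimal browser-side sketch of feeding live fragmented-MP4 segments into a video element via Media Source Extensions. It is browser-only code; the WebSocket URL (ws://example.com/live) and the codec string are assumptions for illustration, and the server is assumed to push one ISO-BMFF fragment (moof + mdat) per message, starting with an initialization segment.

```javascript
// Minimal MSE sketch: append incoming fMP4 fragments to a SourceBuffer.
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  // Codec string must match what the encoder produces (H.264 + AAC here).
  const sb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');
  const queue = [];

  // SourceBuffer accepts one append at a time, so queue while it is busy.
  sb.addEventListener('updateend', () => {
    if (queue.length && !sb.updating) sb.appendBuffer(queue.shift());
  });

  const ws = new WebSocket('ws://example.com/live'); // hypothetical endpoint
  ws.binaryType = 'arraybuffer';
  ws.onmessage = (e) => {
    if (sb.updating || queue.length) queue.push(e.data);
    else sb.appendBuffer(e.data);
  };
});
```

With only a few fragments of buffer, this keeps latency low enough to be comparable to plugin-based players.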
So your statement that "one can not wrap h264 and mp3 in real time and wrap it and send it to the client"
is out of date now. Yes, you can do it with H.264 + AAC.
There are several implementations out there; take a look at Unreal Media Server. From the Unreal Media Server FAQ: http://umediaserver.net/umediaserver/faq.html
Their demos webpage has a live HTML5 feed from an RTSP camera: http://umediaserver.net/umediaserver/demos.html Notice that the latency in the HTML5 player is comparable to that of the Flash player.
Not 100% sure you can do this. HTML5 has not ratified any live streaming mechanism. You could use WebSockets and send data in real time to the browser, but you would have to write the parsing logic yourself, and I do not know how you would feed the data to the player as it arrives.
As for the video and audio tags: the video tag can play container files that hold both audio and video, so wrap your content in a compatible container. If you modify your server to keep writing the live stream into this video file as content comes in, and to stream out data for every byte the browser requests, this could be done. But it is definitely non-trivial.