
Chrome native messaging: can I stream a MediaStream to a native application?

Posted 2019-05-20 06:12

Question:

I am writing a web application which needs to show a native window in the host window system. That window must display a video which is being streamed to the web application.

I have written a native program for OS X which displays a video in the way I need, and in the web application I have a MediaStream being sent via WebRTC. I need to connect these together.

I would like to use Chrome's native messaging, which lets me stream JSON objects to a native program. If I can access the raw data from the MediaStream, I should be able to transform it into JSON objects and stream those to the native application, where I can reconstruct the raw video stream.
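Roughly, I picture the extension side looking like this (the host name com.example.video_host is just a placeholder for whatever native messaging host I register):

```js
// Sketch of the extension side: open a long-lived channel to the
// native program and stream JSON objects over it.
// Requires the "nativeMessaging" permission in the extension manifest.
const port = chrome.runtime.connectNative('com.example.video_host');

port.onMessage.addListener((msg) => {
  console.log('from native app:', msg);
});

// Somewhere, the raw MediaStream data would be serialized and sent:
function sendChunk(serializedChunk) {
  port.postMessage({ type: 'video-chunk', data: serializedChunk });
}
```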

Is something like this possible?

Answer 1:

If possible, I strongly recommend implementing a WebRTC media server in your native application and communicating directly between the browser's WebRTC APIs and your server. Anything else adds far more overhead.

For example, to go from a MediaStream to native messaging, you need a way to serialize the audio and video tracks of the MediaStream into a sequence of bytes, and then send them over the native messaging channel (where each message is JSON-encoded by the browser and JSON-decoded by your native app).
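To make the channel concrete: Chrome frames each message as a 32-bit length prefix in native byte order followed by UTF-8 JSON on the host's stdin. Here is a minimal sketch of the receiving side, written in Node.js purely for illustration (your real host is a native OS X program, but the wire format is the same; `handleMessage` is a hypothetical handler):

```js
// Drain length-prefixed JSON messages from Chrome on stdin.
let pending = Buffer.alloc(0);

process.stdin.on('data', (chunk) => {
  pending = Buffer.concat([pending, chunk]);
  // Process as many complete messages as the buffer currently holds.
  while (pending.length >= 4) {
    const len = pending.readUInt32LE(0);  // length prefix; LE assumes a little-endian host
    if (pending.length < 4 + len) break;  // wait for the rest of the message
    const msg = JSON.parse(pending.slice(4, 4 + len).toString('utf8'));
    pending = pending.slice(4 + len);
    handleMessage(msg);                   // one JSON object from the extension
  }
});

function handleMessage(msg) {
  // Hypothetical: feed the decoded frame/sample data to the native video window.
}
```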

  • For audio, you could use audioContext.createMediaStreamSource to bridge from a MediaStream (from WebRTC) to an audio node in the Web Audio API, and then use offlineAudioCtx.startRendering to render the audio graph to raw sample data (see the audio sketch after this list).
  • For video, you could paint the video onto a canvas and then continuously call toDataURL or toBlob to get the underlying frame data and send it over the wire (see the video sketch after this list). (See "Taking still photos with WebRTC" on MDN for a tutorial on capturing a single picture; this generalizes to a sequence of frames.)
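A minimal sketch of the audio tap. Note that this pulls raw samples with a ScriptProcessorNode instead of offline rendering, since createMediaStreamSource is only available on a live AudioContext; ScriptProcessorNode is deprecated in favor of AudioWorklet, but it is the shortest way to illustrate the idea. `stream` is the incoming MediaStream and `port` is a chrome.runtime.connectNative() port:

```js
// Tap raw PCM samples from a live WebRTC MediaStream.
const audioCtx = new AudioContext();
const source = audioCtx.createMediaStreamSource(stream);

// Deprecated but simple; an AudioWorklet is the modern equivalent.
const tap = audioCtx.createScriptProcessor(4096, 1, 1);
tap.onaudioprocess = (e) => {
  const samples = e.inputBuffer.getChannelData(0); // Float32Array of PCM samples
  // Native messaging carries JSON, so binary data must be text-encoded,
  // e.g. as base64.
  const bytes = new Uint8Array(samples.buffer.slice(0));
  let binary = '';
  for (let i = 0; i < bytes.length; i++) binary += String.fromCharCode(bytes[i]);
  port.postMessage({ type: 'audio', sampleRate: audioCtx.sampleRate, data: btoa(binary) });
};
source.connect(tap);
tap.connect(audioCtx.destination); // needed in some browsers for onaudioprocess to fire
```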
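And a minimal sketch of the frame grabber the video bullet describes, again assuming `stream` and `port` from above:

```js
// Serialize video frames by painting them onto a canvas.
const video = document.createElement('video');
video.srcObject = stream; // the incoming WebRTC MediaStream
video.play();

const canvas = document.createElement('canvas');
const ctx = canvas.getContext('2d');

function sendFrame() {
  if (video.videoWidth > 0) {
    canvas.width = video.videoWidth;
    canvas.height = video.videoHeight;
    ctx.drawImage(video, 0, 0);
    // toDataURL yields a base64-encoded JPEG, which fits directly into
    // the JSON messages that native messaging requires.
    port.postMessage({ type: 'frame', data: canvas.toDataURL('image/jpeg', 0.7) });
  }
  requestAnimationFrame(sendFrame);
}
sendFrame();
```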

This sounds very inefficient, and it probably is, so you'd be better off implementing a WebRTC media server in your native app to get reasonable performance.