Question:
PROBLEM:
WebRTC gives us peer-to-peer video/audio connections. It is perfect for p2p calls and hangouts. But what about broadcasting (one-to-many, for example 1-to-10,000)?
Let's say we have a broadcaster "B" and two attendees "A1" and "A2". Of course this seems solvable: we simply connect B with A1 and then B with A2, so B sends one video/audio stream directly to A1 and another stream to A2. In other words, B sends the stream twice.
Now let's imagine there are 10,000 attendees: A1, A2, ..., A10000. That means B must send 10,000 streams. Each stream is ~40 KB/s, so B needs ~400 MB/s of outgoing bandwidth to maintain this broadcast. Unacceptable.
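A quick sanity check of that figure in code (the 10,000 attendees and ~40 KB/s per stream are the question's assumptions, not measured values):

```js
// Back-of-the-envelope uplink estimate for purely peer-to-peer broadcasting.
// Both numbers below are the question's assumptions, not measurements.
const attendees = 10000;
const perStreamKBps = 40;                        // ~320 kbit/s of audio+video
const uplinkMBps = (attendees * perStreamKBps) / 1000;
console.log(`Required broadcaster uplink: ${uplinkMBps} MB/s`); // 400 MB/s
```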
ORIGINAL QUESTION (OBSOLETE)
Is it possible to solve this so that B sends only one stream to some server, and attendees simply pull that stream from the server? Yes, this means the outgoing bandwidth on that server must be high, but I can maintain that.
Or does this go against the whole idea of WebRTC?
NOTES
Flash does not work for my needs because of its poor UX for end customers.
SOLUTION (NOT REALLY)
26.05.2015 - At the moment there is no solution for scalable WebRTC broadcasting that avoids media servers entirely. There are server-side solutions as well as hybrid ones (p2p + server-side, depending on conditions) on the market.
There are some promising projects, such as https://github.com/muaz-khan/WebRTC-Scalable-Broadcast, but they still need to address issues like latency, overall connection stability, and their scaling limits (they are probably not infinitely scalable).
SUGGESTIONS
- Decrease CPU/Bandwidth by tweaking both audio and video codecs;
- Get a media server.
Answer 1:
As has pretty much been covered here already, what you are trying to do is not possible with plain, old-fashioned WebRTC (strictly peer-to-peer). As said earlier, each WebRTC session negotiates its own encryption keys to encrypt its data, so your broadcaster (B) will indeed need to upload its stream as many times as there are attendees.
However, there is a fairly simple solution which works very well, and I have tested it: a WebRTC gateway. Janus is a good example. It is completely open source (github repo here).
This works as follows: your broadcaster contacts the gateway (Janus), which speaks WebRTC. A key negotiation takes place and B transmits its streams securely (encrypted) to Janus.
When the attendees connect, they also connect to Janus: again a WebRTC negotiation, secured keys, etc. From then on, Janus emits the streams back to each attendee.
This works well because the broadcaster (B) uploads its stream only once, to Janus. Janus decrypts the data using its own keys, has access to the raw data (that is, RTP packets), and can emit those packets back to each attendee (Janus takes care of re-encryption for you). And since you put Janus on a server, it has great upload bandwidth, so you will be able to stream to many peers.
So yes, it does involve a server, but that server speaks WebRTC and you "own" it: you implement the Janus part, so you don't have to worry about data corruption or a man in the middle (unless your server is compromised, of course). And beyond that, there is so much you can do.
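To make the receiving side concrete, here is a minimal sketch of an attendee pulling the broadcast from such a gateway with a plain RTCPeerConnection. The signalToGateway() helper and the offer/answer exchange it hides are hypothetical placeholders for whatever signaling your gateway exposes (Janus, for instance, has its own JSON API over HTTP/WebSockets), and the STUN server is only an example.

```js
// Minimal sketch of an attendee pulling the broadcast from a WebRTC gateway.
// signalToGateway() is a hypothetical helper standing in for the gateway's
// real signaling API; the STUN server below is only an example.
async function watchBroadcast(videoElement) {
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
  });

  // The gateway sends the broadcaster's audio/video; we only receive.
  pc.addTransceiver("video", { direction: "recvonly" });
  pc.addTransceiver("audio", { direction: "recvonly" });

  // Collect the incoming tracks and show them in a <video> element.
  const remote = new MediaStream();
  pc.ontrack = (event) => {
    remote.addTrack(event.track);
    videoElement.srcObject = remote;
  };

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);

  // Hypothetical: hand our offer to the gateway and receive its SDP answer.
  const answer = await signalToGateway(pc.localDescription);
  await pc.setRemoteDescription(answer);
}
```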
To show you how easy it is to use: in Janus you have functions called incoming_rtp() (and incoming_rtcp()) that are called with a pointer to the RT(C)P packets. You can then send those packets on to each attendee (attendees are stored in sessions, which Janus makes very easy to use). Look here for one implementation of the incoming_rtp() function; a couple of lines below you can see how to transmit the packets to all attendees, and here you can see the actual function that relays an RTP packet.
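For intuition only, here is that relay pattern in miniature, written as a tiny Node.js UDP forwarder rather than actual Janus plugin code (real Janus plugins are written in C, and Janus also handles SRTP/DTLS, sessions, and RTCP, none of which is shown). The port numbers and attendee list are invented for the example.

```js
// Conceptual sketch of the "incoming packet -> relay to every attendee" loop.
// This is NOT Janus code: Janus plugins are C and deal with encryption,
// sessions, RTCP, etc. Ports and addresses here are invented.
const dgram = require("dgram");

const attendees = [
  { address: "127.0.0.1", port: 6001 },
  { address: "127.0.0.1", port: 6002 },
];

const sock = dgram.createSocket("udp4");

sock.on("message", (packet) => {
  // One packet in from the broadcaster...
  for (const a of attendees) {
    // ...fanned out once per attendee, using the server's bandwidth.
    sock.send(packet, a.port, a.address);
  }
});

sock.bind(5000); // the broadcaster's (decrypted) RTP arrives here
```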
It all works pretty well, and the documentation is fairly easy to read and understand. I suggest you start with the "echotest" example: it is the simplest, and from it you can understand the inner workings of Janus. I also suggest you edit the echo test file to build your own, because there is a lot of repetitive code to write, so you might as well start from a complete file.
Have fun! Hope I helped.
Answer 2:
As @MuazKhan noted above:
https://github.com/muaz-khan/WebRTC-Scalable-Broadcast
works in Chrome. There is no audio broadcast yet, but it seems to be a first solution.
A Scalable WebRTC peer-to-peer broadcasting demo.
This module simply initializes socket.io and configures it in a way
that single broadcast can be relayed over unlimited users without any
bandwidth/CPU usage issues. Everything happens peer-to-peer!
This should definitely be possible to complete.
Others are also able to achieve this: http://www.streamroot.io/
Answer 3:
"Scalable" broadcasting is not possible on the Internet, because the IP UDP multicasting is not allowed there. But in theory it's possible on a LAN.
The problem with WebSockets is that, by design, you don't have access to raw UDP, and it won't be allowed.
The problem with WebRTC is that its streams are encrypted with SRTP, where each session has its own encryption key. So unless somebody "invents" a way, or an API allows sharing one session key between all clients, multicast is of no use.
Answer 4:
AFAIK the only current implementation of this that is relevant and mature is Adobe Flash Player, which has supported p2p multicast for peer to peer video broadcasting since version 10.1.
http://tomkrcha.com/?p=1526.
Answer 5:
There is the solution of peer-assisted delivery, meaning the approach is hybrid: both the server and the peers help distribute the resource. That's the approach peer5.com and peercdn.com have taken.
If we're talking specifically about live broadcast, it will look something like this:
- The broadcaster sends the live video to a server.
- The server saves the video (and usually also transcodes it into all the relevant formats).
- Metadata about the live stream is created, compatible with HLS, HDS, or MPEG-DASH.
- Consumers browse to the relevant live stream; there the player gets the metadata and knows which chunks of the video to fetch next.
- At the same time, the consumer is connected to other consumers (via WebRTC).
- The player then downloads each chunk either directly from the server or from peers (see the sketch below).
Following such a model can save up to ~90% of the server's bandwidth, depending on the bitrate of the live stream and the combined uplink of the viewers.
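A minimal sketch of that last download step, with the peer lookup and segment naming entirely hypothetical: the player asks a peer for a segment over an RTCDataChannel and falls back to a plain HTTP fetch from the server if the peer cannot deliver it in time.

```js
// Sketch of peer-assisted chunk fetching: try a peer first, fall back to HTTP.
// getPeerChannel() and the segment URL scheme are hypothetical placeholders.
async function fetchSegment(segmentName) {
  const channel = getPeerChannel(); // hypothetical: an open RTCDataChannel
  if (channel && channel.readyState === "open") {
    try {
      return await requestFromPeer(channel, segmentName, 1500 /* ms timeout */);
    } catch (e) {
      // Peer was too slow or lacked the segment: fall through to the server.
    }
  }
  const res = await fetch(`/live/${segmentName}`); // normal HLS/DASH request
  return new Uint8Array(await res.arrayBuffer());
}

// One possible request/response exchange over the data channel.
function requestFromPeer(channel, segmentName, timeoutMs) {
  return new Promise((resolve, reject) => {
    const timer = setTimeout(() => reject(new Error("peer timeout")), timeoutMs);
    channel.onmessage = (event) => {
      clearTimeout(timer);
      resolve(new Uint8Array(event.data)); // peer replies with the raw bytes
    };
    channel.send(segmentName); // ask the peer for this segment by name
  });
}
```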
Disclaimer: the author works at Peer5.
Answer 6:
My master's thesis focuses on the development of a hybrid CDN/P2P live-streaming protocol using WebRTC. I've published my first results at http://bem.tv
Everything is open source and I'm looking for contributors! :-)
Answer 7:
The answer from Angel Genchev seems to be correct; however, there is a theoretical architecture that allows low-latency broadcasting via WebRTC. Imagine that B (the broadcaster) streams to A1 (attendee 1). Then A2 (attendee 2) connects. Instead of B streaming to A2, A1 starts relaying the video it receives from B on to A2. If A1 disconnects, A2 starts receiving from B again.
This architecture would only work if there were no latencies and no connection timeouts, so it is right in theory but not in practice.
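The core of that relay step is small: whatever A1 receives from B can be re-added, track by track, to a second RTCPeerConnection towards A2. The sketch below shows only that re-streaming step; sendToA2() and onAnswerFromA2() are hypothetical signaling helpers, and the failover back to B (plus the extra hop of latency) is not shown.

```js
// Sketch of the relay step on attendee A1: forward the tracks received from
// the broadcaster B to the next attendee A2. sendToA2() and onAnswerFromA2()
// are hypothetical signaling helpers; failover back to B is not shown.
const fromB = new RTCPeerConnection(); // receives the broadcast from B
const toA2 = new RTCPeerConnection();  // forwards it on to A2

fromB.ontrack = (event) => {
  // Re-add every incoming track (audio and video) to the connection to A2.
  toA2.addTrack(event.track, ...event.streams);
};

// Fires once tracks have been added and (re)negotiation with A2 is needed.
toA2.onnegotiationneeded = async () => {
  const offer = await toA2.createOffer();
  await toA2.setLocalDescription(offer);
  sendToA2(toA2.localDescription); // hypothetical: deliver the offer to A2
};

// Hypothetical: called when A2's answer arrives over signaling.
async function onAnswerFromA2(answer) {
  await toA2.setRemoteDescription(answer);
}
```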
At the moment I am using a server-side solution.
Answer 8:
There are some solutions available on the market for scalable WebRTC. They provide low-latency, scalable WebRTC streaming. Here are some examples:
Janus, Jitsi, Wowza, Red5pro, Ant Media Server
I am a developer of Ant Media Server; we provide both a community and an enterprise edition, including Android and iOS SDKs. Let us know if we can help you somehow.
Answer 9:
Peer1 is the only peer that invokes getUserMedia(), i.e. peer1 creates the room.
- So peer1 captures its media and starts the room.
- Peer2 joins the room, gets the stream (data) from peer1, and also opens a parallel connection named "peer2-connection".
- When peer3 joins the room, it gets the stream (data) from peer2 and also opens a parallel connection named "peer3-connection", and so on.
This process continues as more peers connect to each other.
Hence a single broadcast can be transferred to an unlimited number of users without any bandwidth/CPU usage problems (see the sketch below).
Finally, all of the above content is referenced from Link.
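Here is a sketch of the signaling side of such a chain, assuming socket.io is used for signaling: each joining peer is told to pull the stream from the most recently joined peer instead of from the broadcaster. The event names ("join", "pull-from"), the room bookkeeping, and the port are invented for illustration, and the actual WebRTC offer/answer/ICE relay between peers is omitted.

```js
// Sketch of chain-style signaling with socket.io (server side).
// Event names and bookkeeping are invented for illustration; the actual
// offer/answer/ICE relay between peers is omitted.
const { Server } = require("socket.io");
const io = new Server(3000);

// For each room, remember the last peer that joined: the next peer pulls from it.
const lastPeerInRoom = new Map();

io.on("connection", (socket) => {
  socket.on("join", (room) => {
    socket.join(room);
    const upstream = lastPeerInRoom.get(room); // broadcaster or previous attendee
    if (upstream) {
      // Tell the newcomer which peer to pull the stream from.
      socket.emit("pull-from", upstream);
    }
    // The newcomer becomes the upstream for whoever joins next.
    lastPeerInRoom.set(room, socket.id);
  });
});
```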
Answer 10:
You are describing using WebRTC with a one-to-many requirement. WebRTC is designed for peer-to-peer streaming, however there are configurations that will let you benefit from the low latency of WebRTC while delivering video to many viewers.
The trick is not to tax the streaming client with every viewer and, as you mentioned, to have a "relay" media server. You can build this yourself, but honestly the best solution is often to use something like Wowza's WebRTC Streaming product.
To stream efficiently from a phone, you can use Wowza's GoCoder SDK, but in my experience a more advanced SDK like StreamGears works best.