I would like to record the audio stream from my Angular web app and send it to my ASP.NET Core API.
I think using SignalR and its WebSockets is a good way to do that.
With this TypeScript code, I'm able to get a MediaStream:
import { HubConnection } from '@aspnet/signalr';
[...]
private stream: MediaStream;
private connection: RTCPeerConnection;
private _hubConnection: HubConnection;
@ViewChild('video') video;
[...]
navigator.mediaDevices.getUserMedia({ audio: true })
  .then(stream => {
    console.trace('Received local stream');
    // Show the local stream in the <video> element and keep a reference to it.
    this.video.srcObject = stream;
    this.stream = stream;
    // Open the SignalR connection and try to push the stream to the hub.
    this._hubConnection = new HubConnection('[MY_API_URL]/webrtc');
    this._hubConnection.send('SendStream', stream);
  })
  .catch(e => {
    console.error('getUserMedia() error: ' + e.message);
  });
And I handle the stream in the .NET Core API with:
public class MyHub : Hub
{
    public void SendStream(object o)
    {
    }
}
But when I cast o to System.IO.Stream, I get null.
When I read the WebRTC documentation, I saw information about RTCPeerConnection, IceConnection... Do I need those?
How can I stream audio from a web client to an ASP.NET Core API using SignalR? Is there any documentation or a GitHub example?
Thanks for your help
I found a way to get access to the microphone stream and transmit it to the server; the approach is outlined below.
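In short: capture the microphone with getUserMedia, feed it through a Web Audio ScriptProcessorNode, scale each Float32 buffer to 32-bit integer PCM, and push the chunks to the hub. Here is a minimal sketch of that idea, not the exact original snippet; the '[MY_API_URL]/webrtc' URL, the 'SendAudioBuffer' hub method name, and the use of HubConnectionBuilder (from a later @aspnet/signalr release) are assumptions:

import { HubConnectionBuilder } from '@aspnet/signalr';

// Assumed hub endpoint and method name; adjust to your own API.
const connection = new HubConnectionBuilder()
  .withUrl('[MY_API_URL]/webrtc')
  .build();

connection.start()
  .then(() => navigator.mediaDevices.getUserMedia({ audio: true }))
  .then(stream => {
    const audioContext = new AudioContext();
    const source = audioContext.createMediaStreamSource(stream);
    // 4096-sample buffer, 1 input channel, 1 output channel.
    const processor = audioContext.createScriptProcessor(4096, 1, 1);

    processor.onaudioprocess = event => {
      // Raw Float32 samples in [-1, 1] from the microphone.
      const float32 = event.inputBuffer.getChannelData(0);
      // Scale to 32-bit integer PCM so the server receives plain integers.
      const int32 = new Int32Array(float32.length);
      for (let i = 0; i < float32.length; i++) {
        int32[i] = Math.max(-1, Math.min(1, float32[i])) * 0x7FFFFFFF;
      }
      // 'SendAudioBuffer' is an assumed hub method taking an array of numbers.
      connection.send('SendAudioBuffer', Array.from(int32));
    };

    source.connect(processor);
    processor.connect(audioContext.destination);
  })
  .catch(e => console.error(e));

ScriptProcessorNode is deprecated in favour of AudioWorklet, but it keeps the sketch short and still works in current browsers.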
The next step will be to convert my int32Array to a WAV file.
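For that conversion, a WAV file is just a 44-byte RIFF header followed by the raw PCM samples, whether it is written on the client or the server. A client-side sketch for illustration, assuming mono 32-bit integer PCM and a known sample rate (the function name and signature are made up):

// Wrap mono 32-bit integer PCM samples in a minimal WAV (RIFF) container.
function int32ToWav(samples: Int32Array, sampleRate: number): Blob {
  const bytesPerSample = 4;
  const dataSize = samples.length * bytesPerSample;
  const buffer = new ArrayBuffer(44 + dataSize);
  const view = new DataView(buffer);

  const writeString = (offset: number, s: string) => {
    for (let i = 0; i < s.length; i++) {
      view.setUint8(offset + i, s.charCodeAt(i));
    }
  };

  writeString(0, 'RIFF');
  view.setUint32(4, 36 + dataSize, true);                // file size minus 8 bytes
  writeString(8, 'WAVE');
  writeString(12, 'fmt ');
  view.setUint32(16, 16, true);                          // fmt chunk size
  view.setUint16(20, 1, true);                           // PCM format
  view.setUint16(22, 1, true);                           // mono
  view.setUint32(24, sampleRate, true);                  // sample rate
  view.setUint32(28, sampleRate * bytesPerSample, true); // byte rate
  view.setUint16(32, bytesPerSample, true);              // block align
  view.setUint16(34, 32, true);                          // bits per sample
  writeString(36, 'data');
  view.setUint32(40, dataSize, true);

  // Write the samples little-endian after the header.
  for (let i = 0; i < samples.length; i++) {
    view.setInt32(44 + i * bytesPerSample, samples[i], true);
  }
  return new Blob([buffer], { type: 'audio/wav' });
}

Passing the resulting Blob to URL.createObjectURL gives a playable audio/wav URL for a quick sanity check.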
Sources which helped me:
Note: I didn't add the code on how to configure SignalR, as that was not the purpose here.