I'm trying to use the WebSocket implementation of IBM's Speech to Text service. Currently I can't figure out how to send a .wav file over the connection. I know I need to transform it into a blob, but I'm not sure how. Right now I'm getting one of two errors:
You must pass a Node Buffer object to WebSocketConnec
-or-
Could not read a WAV header from a stream of 0 bytes
...depending on what I try to pass to the service. Note that I am sending the start message correctly and am reaching the listening state.
Starting with v1.0 (still in beta), the watson-developer-cloud npm module supports WebSockets.
Recognize a wav file:
See more examples here.