SignalR
loses many messages when I transfer chunks of bytes from one client through the server to another client (or client to server, or server to client).
I read the file into a stream and send it over a hub or persistent connection to the other client. This runs very fast, but messages are always dropped or lost.
How can I transfer large files (in chunks or not) from client to client without losing messages?
As @dfowler points out, it's not the right technology for the job. What I would recommend is sending a message that there is a file to be downloaded, including a link, and then downloading that file with standard GET requests against either static files or some web service written with ASP.NET Web API.
SignalR isn't for file transfer, it's for sending messages.
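The pattern above can be sketched as follows. This is a minimal illustration, not the actual answer's code: the message shape, the `makeFileReadyMessage` helper, and the URL scheme are all hypothetical; the point is that SignalR carries only a small notification, while the file itself travels over plain HTTP.

```typescript
// Hypothetical message shape: the hub broadcasts a small notification,
// and the receiving client downloads the file over ordinary HTTP.
interface FileReadyMessage {
  fileName: string;
  url: string; // served by static files or an ASP.NET Web API endpoint
}

// Build the notification the hub would push (names are illustrative).
function makeFileReadyMessage(fileName: string, baseUrl: string): FileReadyMessage {
  return { fileName, url: `${baseUrl}/files/${encodeURIComponent(fileName)}` };
}

// On the receiving client, react to the message with a normal GET request;
// SignalR is only used to announce that the file exists.
async function onFileReady(msg: FileReadyMessage): Promise<Blob> {
  const response = await fetch(msg.url);
  if (!response.ok) throw new Error(`Download failed: ${response.status}`);
  return response.blob();
}
```

Because the download is a plain GET, you also get range requests, caching, and proxy support for free, none of which SignalR messages provide.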
Why isn't it the right technology? If a client needs to send some data to a SignalR hub, it should be able to do so over the SignalR connection without requiring additional infrastructure.
In fact, it works fine when sending a single byte array, at least for me; however, I encountered similar problems when transferring chunks.
Perhaps you can run some tests to check whether the chunks are received in the same order they are sent.
UPDATE
I ran a test myself, and in my case the order was indeed the problem. I modified the hub method receiving the chunks to accept an order parameter, which I then use to reconstruct the byte array at the end, and it works fine. Having said this, I now understand that this approach wouldn't scale well to large file transfers.
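The fix described above can be sketched like this. It's an illustration under assumptions, not the original code: the `Chunk` shape, the `total` field, and the `ChunkAssembler` class are all invented names; the technique is simply tagging each chunk with its index so the receiver can reassemble the byte array regardless of arrival order.

```typescript
// Each chunk carries its position and the expected chunk count, so the
// receiver can rebuild the original bytes even if chunks arrive out of order.
interface Chunk {
  index: number;    // position of this chunk in the original data
  total: number;    // total number of chunks expected
  data: Uint8Array; // the chunk payload
}

class ChunkAssembler {
  private received = new Map<number, Uint8Array>();
  private total = 0;

  // Called from the hub method that receives each chunk.
  add(chunk: Chunk): void {
    this.total = chunk.total;
    this.received.set(chunk.index, chunk.data);
  }

  get isComplete(): boolean {
    return this.total > 0 && this.received.size === this.total;
  }

  // Concatenate the chunks in index order once all have arrived.
  assemble(): Uint8Array {
    if (!this.isComplete) throw new Error("missing chunks");
    const parts = [...this.received.entries()]
      .sort(([a], [b]) => a - b)
      .map(([, data]) => data);
    const result = new Uint8Array(parts.reduce((n, p) => n + p.length, 0));
    let offset = 0;
    for (const p of parts) {
      result.set(p, offset);
      offset += p.length;
    }
    return result;
  }
}
```

Note this buffers the whole payload in memory, which is exactly why the approach suits small progress-reporting transfers but not large files.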
In my case I don't need to transfer very large amounts of data; I just wanted to give my UI an indication of progress, which requires the data to be sent in chunks.