I have a Delphi 10.1 Berlin DataSnap Server that can't return data packets (sent through a TStream) larger than around 260,000 bytes.
I have programmed it following the \Object Pascal\DataSnap\FireDAC sample that ships with Delphi, which also shows this problem.
The problem can be reproduced just by opening that sample, blanking the IndexFieldName property of the qOrders component in ServerMethodsUnit.pas, and changing its SQL property to:
select * from Orders
union
select * from Orders
Now the amount of data to be sent is beyond 260,000 bytes, which seems to be the point past which you can't retrieve it from the client: you get an EFDException, [FireDAC][Stan]-710. Invalid binary storage format.
The data is sent as a Stream that you get from an FDSchemaAdapter on the server and load into another FDSchemaAdapter on the client. The connection between Client and Server is also FireDAC.
This is how the Server returns that Stream:

function TServerMethods.StreamGet: TStream;
begin
  Result := TMemoryStream.Create;
  try
    qCustomers.Close;
    qCustomers.Open;
    qOrders.Close;
    qOrders.Open;
    FDSchemaAdapter.SaveToStream(Result, TFDStorageFormat.sfBinary);
    Result.Position := 0;
  except
    Result.Free; // avoid leaking the stream if serialization fails
    raise;
  end;
end;
And this is how the Client retrieves it:

procedure TClientForm.GetTables;
var
  LStringStream: TStringStream;
begin
  FDStoredProcGet.ExecProc;
  LStringStream := TStringStream.Create(FDStoredProcGet.Params[0].asBlob);
  try
    if LStringStream <> nil then
    begin
      LStringStream.Position := 0;
      DataModuleFDClient.FDSchemaAdapter.LoadFromStream(LStringStream, TFDStorageFormat.sfBinary);
    end;
  finally
    LStringStream.Free;
  end;
end;
The Client doesn't get all the data in the Blob parameter. I saved the content of the Stream on the Server and the content that arrives in the Blob parameter on the Client: they have the same size, but the Blob parameter's content is truncated and the last few KBytes are zeroes.
This is how I save on the Server the content that will go into the Stream:
FDSchemaAdapter.SaveToFile('C:\Temp\JSON_Server.json', TFDStorageFormat.sfJSON);
This is how I check what I get on the Client blob parameter:
TFile.WriteAllText('C:\Temp\JSON_Client.json', FDStoredProcGet.Params[0].asBlob);
I can see that the Client gets the data truncated.
Do you know how to fix it, or a workaround to retrieve the full Stream content from the DataSnap Server on my Client?
Update: I have updated to Delphi 10.1 Berlin Update 2, but the problem remains.
Thank you.
The problem seems to be neither the TStream class nor the underlying DataSnap communication infrastructure, but the fact that the TFDStoredProc component creates a return parameter of type ftBlob. First, change the output parameter type from ftBlob to ftStream. Then, change the GetTables procedure to:
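A sketch of how GetTables might look after that change (this assumes the FireDAC parameter exposes its value via AsStream once declared as ftStream; component names are taken from the question):

```delphi
procedure TClientForm.GetTables;
var
  LStream: TStream;
begin
  FDStoredProcGet.ExecProc;
  // With the output parameter declared as ftStream, the data can be read
  // directly as a stream, avoiding the blob/string round trip.
  LStream := FDStoredProcGet.Params[0].AsStream;
  LStream.Position := 0;
  DataModuleFDClient.FDSchemaAdapter.LoadFromStream(LStream, TFDStorageFormat.sfBinary);
end;
```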
Compress the stream on the server and uncompress it on the client. Delphi 10.1 provides the necessary classes (System.ZLib.TZCompressionStream and System.ZLib.TZDecompressionStream). The online documentation contains an example that shows how to use these classes to compress and uncompress data from and to a stream. Save the output to a ZIP file to check whether it is smaller than 260 KB.

A workaround: run an HTTP server which serves requests for the big files. The code generates and stores the file as shown in your question, and returns its URL to the client:
If you already use Apache as a reverse proxy, you can configure it to route HTTP GET requests for resources under /files/.
For more control (e.g. authentication), you can run an HTTP server (Indy based) on a different port which serves requests for these files. Apache can be configured to map HTTP requests to the correct destination, so the client will only see one HTTP port.
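The compression idea suggested above can be sketched as follows; the helper names CompressStream/DecompressStream are mine, not from the answer, and only the zlib wrapping is shown:

```delphi
uses
  System.Classes, System.ZLib;

// Server side: wrap the serialized schema data in a zlib stream.
function CompressStream(Source: TStream): TMemoryStream;
var
  Compressor: TZCompressionStream;
begin
  Result := TMemoryStream.Create;
  Compressor := TZCompressionStream.Create(Result);
  try
    Source.Position := 0;
    Compressor.CopyFrom(Source, Source.Size);
  finally
    Compressor.Free; // flushes the remaining compressed bytes
  end;
  Result.Position := 0;
end;

// Client side: decompress before handing the data to LoadFromStream.
function DecompressStream(Source: TStream): TMemoryStream;
var
  Decompressor: TZDecompressionStream;
  Buffer: array[0..32767] of Byte;
  Count: Integer;
begin
  Source.Position := 0;
  Result := TMemoryStream.Create;
  Decompressor := TZDecompressionStream.Create(Source);
  try
    // The decompressed size isn't known up front, so read in chunks.
    repeat
      Count := Decompressor.Read(Buffer, SizeOf(Buffer));
      if Count > 0 then
        Result.WriteBuffer(Buffer, Count);
    until Count = 0;
  finally
    Decompressor.Free;
  end;
  Result.Position := 0;
end;
```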
@Marc: I think Henrikki meant a single function, not a single function call...
I've modified your code so that a single function is enough and so that projects with different SchemaAdapters/StoredProcedures can be used.
The maximum stream size is declared as a constant (MaxDataSnapStreamSize) and is set to $F000, which is the MaxBufSize that TStream.CopyFrom handles (see System.Classes).
FComprStream is a private field of type TMemoryStream, created and freed in the constructor and destructor of the server module.
On the server side:
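The answer's server-side code did not survive here; the following is my reconstruction sketch of what such a single, chunk-returning function could look like (the AdapterName parameter and the FindComponent lookup are illustrative):

```delphi
const
  MaxDataSnapStreamSize = $F000; // largest buffer TStream.CopyFrom uses internally

// FComprStream: TMemoryStream is a private field of the server module,
// created in its constructor and freed in its destructor.
function TServerMethods.StreamGet(const AdapterName: string): TStream;
var
  Adapter: TFDSchemaAdapter;
  Count: Int64;
begin
  if FComprStream.Size = 0 then
  begin
    // First call for this transfer: serialize the requested adapter.
    Adapter := FindComponent(AdapterName) as TFDSchemaAdapter;
    Adapter.SaveToStream(FComprStream, TFDStorageFormat.sfBinary);
    FComprStream.Position := 0;
  end;
  Result := TMemoryStream.Create;
  Count := FComprStream.Size - FComprStream.Position;
  if Count > MaxDataSnapStreamSize then
    Count := MaxDataSnapStreamSize;
  if Count > 0 then
    Result.CopyFrom(FComprStream, Count);
  Result.Position := 0;
  if FComprStream.Position >= FComprStream.Size then
    FComprStream.Clear; // transfer complete; ready for the next one
end;
```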
On the client side:
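The client-side code was likewise lost; a sketch of the receiving loop, assuming a generated proxy named ServerMethodsClient and the adapter name used above (both illustrative):

```delphi
procedure TClientForm.GetTables;
var
  Data: TMemoryStream;
  Chunk: TStream;
  ChunkSize: Int64;
begin
  Data := TMemoryStream.Create;
  try
    // Keep requesting chunks; a chunk shorter than the maximum is the last one.
    repeat
      Chunk := ServerMethodsClient.StreamGet('FDSchemaAdapter');
      try
        ChunkSize := Chunk.Size;
        Chunk.Position := 0;
        if ChunkSize > 0 then
          Data.CopyFrom(Chunk, ChunkSize);
      finally
        Chunk.Free;
      end;
    until ChunkSize < MaxDataSnapStreamSize;
    Data.Position := 0;
    DataModuleFDClient.FDSchemaAdapter.LoadFromStream(Data, TFDStorageFormat.sfBinary);
  finally
    Data.Free;
  end;
end;
```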
I have coded a workaround. Since I can't pass data bigger than 255 KB, I split it into separate 255 KB packets and send them individually (I have also added compression to minimize the bandwidth and round trips).
On the server I have changed StreamGet into two different calls: StreamGet and StreamGetNextPacket.
CommStream: TStream is declared as a private field of TServerMethods.
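The actual server code from this edit is missing here; a reconstruction sketch of the two calls (without the compression step, for brevity; FreeAndNil comes from System.SysUtils):

```delphi
// First call: serialize everything into CommStream, then hand out the first packet.
function TServerMethods.StreamGet: TStream;
begin
  qCustomers.Close;
  qCustomers.Open;
  qOrders.Close;
  qOrders.Open;
  CommStream := TMemoryStream.Create;
  FDSchemaAdapter.SaveToStream(CommStream, TFDStorageFormat.sfBinary);
  CommStream.Position := 0;
  Result := StreamGetNextPacket;
end;

// Subsequent calls: up to 255 KB each; an empty packet signals completion.
function TServerMethods.StreamGetNextPacket: TStream;
const
  PacketSize = 255 * 1024;
var
  Count: Int64;
begin
  Result := TMemoryStream.Create;
  if CommStream = nil then
    Exit; // nothing pending; return an empty packet
  Count := CommStream.Size - CommStream.Position;
  if Count > PacketSize then
    Count := PacketSize;
  if Count > 0 then
    Result.CopyFrom(CommStream, Count);
  Result.Position := 0;
  if CommStream.Position >= CommStream.Size then
    FreeAndNil(CommStream); // done with this transfer
end;
```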
And the Client retrieves it this way:
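The client code from this edit is also missing; a sketch of the retrieval loop, assuming a generated proxy named ServerMethodsClient (illustrative):

```delphi
procedure TClientForm.GetTables;
var
  Data: TMemoryStream;
  Packet: TStream;
begin
  Data := TMemoryStream.Create;
  try
    Packet := ServerMethodsClient.StreamGet; // first packet
    while (Packet <> nil) and (Packet.Size > 0) do
    begin
      Packet.Position := 0;
      Data.CopyFrom(Packet, Packet.Size);
      Packet.Free;
      Packet := ServerMethodsClient.StreamGetNextPacket; // next 255 KB
    end;
    Packet.Free;
    Data.Position := 0;
    DataModuleFDClient.FDSchemaAdapter.LoadFromStream(Data, TFDStorageFormat.sfBinary);
  finally
    Data.Free;
  end;
end;
```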
It works fine now.
I get a similar problem with Seattle (I don't have Berlin installed) with a DataSnap server that doesn't involve FireDAC.
On my DataSnap server I have:
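The functions themselves were not reproduced here; this sketch shows the kind of pair described below (the fill pattern in BuildString is illustrative):

```delphi
// Builds a string of the requested size.
function TServerMethods1.BuildString(Size: Integer): string;
var
  I: Integer;
begin
  SetLength(Result, Size);
  for I := 1 to Size do
    Result[I] := Chr(Ord('A') + I mod 26);
end;

// Returns the built string wrapped in a stream.
function TServerMethods1.GetStream(Size: Integer): TStream;
begin
  Result := TStringStream.Create(BuildString(Size));
  Result.Position := 0;
end;

// Returns the built string directly.
function TServerMethods1.GetString(Size: Integer): string;
begin
  Result := BuildString(Size);
end;
```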
As you can see, both these functions build a string of the specified size using the same BuildString function and return it as a stream and a string respectively.

On two Win10 systems here, GetStream works fine for sizes up to 30716 bytes but above that, it returns an empty stream and a "size" of -1. Otoh, GetString works fine for all sizes I have tested, up to and including a size of 32000000. I have not yet managed to trace why GetStream fails. However, based on the observation that GetString does work, I tested the following work-around, which sends a stream as a string, and that works fine up to 32M as well:

I appreciate you might prefer your own work-around of sending the result in chunks.
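The work-around code itself is missing from this copy; a hypothetical sketch of sending a stream's content as a string (the method name GetStreamAsString is mine):

```delphi
// Server method: return the stream's content as a string result,
// since string results arrive intact where stream results do not.
function TServerMethods1.GetStreamAsString(Size: Integer): string;
var
  Stream: TStream;
  SS: TStringStream;
begin
  Stream := GetStream(Size);
  try
    SS := TStringStream.Create;
    try
      SS.CopyFrom(Stream, 0); // Count = 0 copies the whole source stream
      Result := SS.DataString;
    finally
      SS.Free;
    end;
  finally
    Stream.Free;
  end;
end;
```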
Btw, I have tried calling my GetStream on the server by creating an instance of TServerMethods in a method of the server's main form and calling GetStream directly from that, so that the server's TDSTCPServerTransport isn't involved. This correctly returns the stream, so the problem seems to be in the transport layer or the input and/or output interfaces to it.