Can't get TStreams larger than about 260,000 bytes from a DataSnap server

I have a Delphi 10.1 Berlin DataSnap server that cannot return data packets (via TStream) larger than about 260,000 bytes.

I modelled it on the Object Pascal\DataSnap\FireDAC sample that ships with Delphi, which also exhibits this problem.

You can easily reproduce the problem in that sample by clearing the IndexFieldNames value of the qOrders component in ServerMethodsUnit.pas and changing its SQL property to:

select * from Orders union select * from Orders 

Now the amount of data sent exceeds 260,000 bytes, and apparently that is the point at which the client can no longer receive it: you get EFDException [FireDAC][Stan]-710. Invalid binary storage format.

The data is sent as a stream obtained from the FDSchemaAdapter on the server and loaded into another FDSchemaAdapter on the client. The connection between client and server is also FireDAC.

So the server returns this stream:

function TServerMethods.StreamGet: TStream;
begin
  Result := TMemoryStream.Create;
  try
    qCustomers.Close;
    qCustomers.Open;
    qOrders.Close;
    qOrders.Open;
    FDSchemaAdapter.SaveToStream(Result, TFDStorageFormat.sfBinary);
    Result.Position := 0;
  except
    raise;
  end;
end;

And this is how the client retrieves it:

procedure TClientForm.GetTables;
var
  LStringStream: TStringStream;
begin
  FDStoredProcGet.ExecProc;
  LStringStream := TStringStream.Create(FDStoredProcGet.Params[0].AsBlob);
  try
    if LStringStream <> nil then
    begin
      LStringStream.Position := 0;
      DataModuleFDClient.FDSchemaAdapter.LoadFromStream(LStringStream, TFDStorageFormat.sfBinary);
    end;
  finally
    LStringStream.Free;
  end;
end;

The client does not receive all of the data in the Blob parameter. I save the stream content on the server and the content that arrives in the Blob parameter on the client; both are the same size, but the Blob parameter's content is truncated and its last few kilobytes are zeros.

This is how I save, on the server, the content that is about to be written to the stream:

 FDSchemaAdapter.SaveToFile('C:\Temp\JSON_Server.json', TFDStorageFormat.sfJSON); 

This is how I check what I get in the client blob parameter:

 TFile.WriteAllText('C:\Temp\JSON_Client.json', FDStoredProcGet.Params[0].asBlob); 

That is how I can see the data is being truncated on the client side.

Do you know how to fix this, or a workaround to get the full stream content from the DataSnap server to my client?

Update: I upgraded to Delphi 10.1 Berlin Update 2, but the problem remains.

Thanks.

5 answers

I coded a workaround. Since I cannot transfer data larger than about 255 KB at a time, I split it into separate 255 KB packets and send them one by one (I also added compression to minimize bandwidth and round trips).

On the server, I split StreamGet into two different calls: StreamGet and StreamGetNextPacket.

function TServerMethods.StreamGet(var Complete: boolean): TStream;
var
  Data: TMemoryStream;
  Compression: TZCompressionStream;
begin
  try
    // Opening Data
    qCustomers.Close;
    qCustomers.Open;
    qOrders.Close;
    qOrders.Open;
    // Compressing Data
    try
      if Assigned(CommStream) then
        FreeAndNil(CommStream);
      CommStream := TMemoryStream.Create;
      Data := TMemoryStream.Create;
      Compression := TZCompressionStream.Create(CommStream);
      FDSchemaAdapter.SaveToStream(Data, TFDStorageFormat.sfBinary);
      Data.Position := 0;
      Compression.CopyFrom(Data, Data.Size);
    finally
      Data.Free;
      Compression.Free;
    end;
    // Returning First 260000 bytes Packet
    CommStream.Position := 0;
    Result := TMemoryStream.Create;
    Result.CopyFrom(CommStream, Min(CommStream.Size, 260000));
    Result.Position := 0;
    // Freeing Memory if all sent
    Complete := (CommStream.Position = CommStream.Size);
    if Complete then
      FreeAndNil(CommStream);
  except
    raise;
  end;
end;

function TServerMethods.StreamGetNextPacket(var Complete: boolean): TStream;
begin
  // Returning the rest of the 260000 bytes Packets
  Result := TMemoryStream.Create;
  Result.CopyFrom(CommStream, Min(CommStream.Size - CommStream.Position, 260000));
  Result.Position := 0;
  // Freeing Memory if all sent
  Complete := (CommStream.Position = CommStream.Size);
  if Complete then
    FreeAndNil(CommStream);
end;

CommStream: TStream is declared as a private field of TServerMethods.

And the client retrieves it as follows:

procedure TClientForm.GetTables;
var
  Complete: boolean;
  Input: TStringStream;
  Data: TMemoryStream;
  Decompression: TZDecompressionStream;
begin
  Input := nil;
  Data := nil;
  Decompression := nil;
  try
    // Get the First 260000 bytes Packet
    spStreamGet.ExecProc;
    Input := TStringStream.Create(spStreamGet.ParamByName('ReturnValue').AsBlob);
    Complete := spStreamGet.ParamByName('Complete').AsBoolean;
    // Get the rest of the 260000 bytes Packets
    while not Complete do
    begin
      spStreamGetNextPacket.ExecProc;
      Input.Position := Input.Size;
      Input.WriteBuffer(TBytes(spStreamGetNextPacket.ParamByName('ReturnValue').AsBlob),
        Length(spStreamGetNextPacket.ParamByName('ReturnValue').AsBlob));
      Complete := spStreamGetNextPacket.ParamByName('Complete').AsBoolean;
    end;
    // Decompress Data
    Input.Position := 0;
    Data := TMemoryStream.Create;
    Decompression := TZDecompressionStream.Create(Input);
    Data.CopyFrom(Decompression, 0);
    Data.Position := 0;
    // Load Datasets
    DataModuleFDClient.FDSchemaAdapter.LoadFromStream(Data, TFDStorageFormat.sfBinary);
  finally
    if Assigned(Input) then
      FreeAndNil(Input);
    if Assigned(Data) then
      FreeAndNil(Data);
    if Assigned(Decompression) then
      FreeAndNil(Decompression);
  end;
end;

Now it works great.


I see a similar problem with Seattle (I don't have Berlin) on a DataSnap server that does not involve FireDAC.

On my DataSnap server, I have:

type
  TServerMethods1 = class(TDSServerModule)
  public
    function GetStream(Size: Integer): TStream;
    function GetString(Size: Integer): String;
  end;

[...]

uses
  System.StrUtils;

function BuildString(Size: Integer): String;
var
  S: String;
  Count, LeftToWrite: Integer;
const
  scBlock = '%8d bytes'#13#10;
begin
  LeftToWrite := Size;
  Count := 1;
  while Count <= Size do
  begin
    S := Format(scBlock, [Count]);
    if LeftToWrite < Length(S) then
      S := Copy(S, 1, LeftToWrite);
    Result := Result + S;
    Inc(Count, Length(S));
    Dec(LeftToWrite, Length(S));
  end;
  if Length(Result) > 0 then
    Result[Length(Result)] := '.';
end;

function TServerMethods1.GetStream(Size: Integer): TStream;
var
  SS: TStringStream;
begin
  SS := TStringStream.Create;
  SS.WriteString(BuildString(Size));
  SS.Position := 0;
  OutputDebugString('Quality Suite:TRACING:ON');
  Result := SS;
end;

function TServerMethods1.GetString(Size: Integer): String;
begin
  Result := BuildString(Size);
end;

As you can see, both of these functions build a string of the specified size using the same BuildString function and return it as a stream and a string respectively.

On two Win10 systems, GetStream works fine here for sizes up to 30716 bytes, but above that, it returns an empty stream and a "size" of -1.

On the other hand, GetString works fine for every size I tested, up to and including 32,000,000. I have not yet been able to track down why GetStream fails. However, based on the observation that GetString works, I tried the following workaround, which sends the stream as a string and works fine up to 32M:

function TServerMethods1.GetStreamAsString(Size: Integer): String;
var
  S: TStream;
  SS: TStringStream;
begin
  S := GetStream(Size);
  S.Position := 0;
  SS := TStringStream.Create;
  SS.CopyFrom(S, S.Size);
  SS.Position := 0;
  Result := SS.DataString;
  SS.Free;
  S.Free;
end;

I appreciate that you may prefer your own workaround of sending the result in pieces.

By the way, I tried calling my GetStream on the server side by creating an instance of TServerMethods1 in a method of the server's main form and calling GetStream directly on it, so that the server's TDSTCPServerTransport is not involved. This returns the stream correctly, so the problem seems to lie in the transport layer or in the input and/or output interfaces.
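For reference, a minimal sketch of that direct call (the form and button names are illustrative, and the 100,000-byte size is an arbitrary test value):

procedure TServerForm.ButtonDirectCallClick(Sender: TObject);
var
  SM: TServerMethods1;
  S: TStream;
begin
  // Instantiate the server module directly, so TDSTCPServerTransport is bypassed
  SM := TServerMethods1.Create(nil);
  try
    S := SM.GetStream(100000);
    try
      // Here S.Size reports the full requested size, so the loss occurs in transport
      ShowMessage(Format('Stream size: %d bytes', [S.Size]));  // ShowMessage: Vcl.Dialogs
    finally
      S.Free;
    end;
  finally
    SM.Free;
  end;
end;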


Compress the stream on the server and decompress it on the client. Delphi 10.1 provides the necessary classes (System.ZLib.TZCompressionStream and System.ZLib.TZDecompressionStream). The online documentation contains an example that shows how to use these classes to compress and decompress a stream. Save the output to a ZIP file to check whether it is smaller than 260 KB.
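A minimal sketch of what that could look like, reusing the components from the question (both routines need System.ZLib in the uses clause; the names StreamGetCompressed and GetTablesCompressed are illustrative and not part of this answer):

// Server: hypothetical compressed variant of StreamGet
function TServerMethods.StreamGetCompressed: TStream;
var
  Raw: TMemoryStream;
  Zip: TZCompressionStream;
begin
  Result := TMemoryStream.Create;
  Raw := TMemoryStream.Create;
  try
    qCustomers.Close;
    qCustomers.Open;
    qOrders.Close;
    qOrders.Open;
    FDSchemaAdapter.SaveToStream(Raw, TFDStorageFormat.sfBinary);
    Raw.Position := 0;
    Zip := TZCompressionStream.Create(Result);  // writes zlib-compressed data into Result
    try
      Zip.CopyFrom(Raw, Raw.Size);
    finally
      Zip.Free;  // freeing the compression stream flushes the compressed tail
    end;
    Result.Position := 0;
  finally
    Raw.Free;
  end;
end;

// Client: decompress before loading into the schema adapter
procedure TClientForm.GetTablesCompressed;
var
  Input: TStringStream;
  Plain: TMemoryStream;
  Unzip: TZDecompressionStream;
begin
  FDStoredProcGet.ExecProc;
  Input := TStringStream.Create(FDStoredProcGet.Params[0].AsBlob);
  Plain := TMemoryStream.Create;
  try
    Input.Position := 0;
    Unzip := TZDecompressionStream.Create(Input);
    try
      Plain.CopyFrom(Unzip, 0);  // 0 = copy the whole decompressed content
    finally
      Unzip.Free;
    end;
    Plain.Position := 0;
    DataModuleFDClient.FDSchemaAdapter.LoadFromStream(Plain, TFDStorageFormat.sfBinary);
  finally
    Plain.Free;
    Input.Free;
  end;
end;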


Workaround: run an HTTP server that serves requests for large files. The server code generates and saves the file, as shown in your question, and returns its URL to the client:

 https://example.com/ds/...    -> for the DataSnap service
 https://example.com/files/... -> for big files

If you are already using Apache as a reverse proxy, you can configure it to route HTTP GET requests for resources under /files/.

For more control (for example, authentication), you can run an HTTP server (based on Indy) on a different port that serves requests for these files. Apache can be configured to map HTTP requests to the correct destination, so the client only ever sees one HTTP port.
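A minimal sketch of such an Indy-based file server; the TFileServer class, the port, and the folder layout are illustrative assumptions, and any authentication check would be added in the GET handler:

uses
  System.SysUtils, System.Classes, System.IOUtils,
  IdContext, IdCustomHTTPServer, IdHTTPServer;

type
  TFileServer = class
  private
    FRootDir: string;
    FServer: TIdHTTPServer;
    procedure HandleGet(AContext: TIdContext; ARequestInfo: TIdHTTPRequestInfo;
      AResponseInfo: TIdHTTPResponseInfo);
  public
    constructor Create(APort: Word; const ARootDir: string);
    destructor Destroy; override;
  end;

constructor TFileServer.Create(APort: Word; const ARootDir: string);
begin
  FRootDir := ARootDir;                 // e.g. the folder where the DataSnap code saves the streams
  FServer := TIdHTTPServer.Create(nil);
  FServer.DefaultPort := APort;         // e.g. 8081; Apache would proxy /files/ to this port
  FServer.OnCommandGet := HandleGet;
  FServer.Active := True;
end;

destructor TFileServer.Destroy;
begin
  FServer.Active := False;
  FServer.Free;
  inherited;
end;

procedure TFileServer.HandleGet(AContext: TIdContext;
  ARequestInfo: TIdHTTPRequestInfo; AResponseInfo: TIdHTTPResponseInfo);
var
  FileName: string;
begin
  // Map /files/<name> onto the local folder; only the file name part of the URL is used
  FileName := TPath.Combine(FRootDir, TPath.GetFileName(ARequestInfo.Document));
  if TFile.Exists(FileName) then
  begin
    AResponseInfo.ContentType := 'application/octet-stream';
    // Indy frees the ContentStream after sending it (FreeContentStream defaults to True)
    AResponseInfo.ContentStream := TFileStream.Create(FileName, fmOpenRead or fmShareDenyWrite);
  end
  else
    AResponseInfo.ResponseNo := 404;
end;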


The problem lies neither in the TStream class nor in the underlying DataSnap communication infrastructure, but in the fact that the TFDStoredProc component creates the return parameter with the data type ftBlob. First, change the output parameter's data type from ftBlob to ftStream. Then change the GetTables procedure to:

procedure TClientForm.GetTables;
var
  LStringStream: TStream;
begin
  spStreamGet.ExecProc;
  LStringStream := spStreamGet.Params[0].AsStream;
  LStringStream.Position := 0;
  DataModuleFDClient.FDSchemaAdapter.LoadFromStream(LStringStream, TFDStorageFormat.sfBinary);
end;
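The parameter type change itself is presumably made at design time in the TFDStoredProc parameter editor; a hedged sketch of a runtime equivalent, assuming the parameter list is already populated, would be:

// Hypothetical runtime equivalent of the design-time change suggested above.
// ftStream is declared in Data.DB; normally the DataType is simply switched
// from ftBlob to ftStream in the TFDStoredProc parameter editor.
spStreamGet.Params[0].DataType := ftStream;
spStreamGet.ExecProc;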
