FStream reads a binary file written using a Delphi binary writer

I'm creating a DLL in MS Visual Studio 2010 Express that loads a binary data file (*.mgr extension, used exclusively in my company's applications) using the fstream library in C++. The file is created by an application developed by someone else in my company using Delphi. He says that the first 15 bytes should be characters indicating the date the file was created and a few other things, such as the version of the application:

"XXXX 2012."

The result after reading it with fstream (in binary mode) and writing it out to another file with fstream (in text mode) is as follows:

"[] XXXX 2 0 1 2"

The first character is an unknown character (a rectangle), and there is a space between each of the remaining characters. In total the output is 31 bytes wide: 15 actual characters + 15 spaces + 1 for the rectangle character = 31.

Other information: I use C++, the application developer uses Delphi. I'm using fstream; he uses the BW.Write() function (BW == BinaryWriter?). He uses Windows 7 while I use Windows XP Professional.

Can you diagnose the problem?

Thank you in advance

Edit 1: I am adding the C++ code that loads the first bytes.

First of all, he uses Delphi XE2 from Embarcadero RAD Studio XE2.

From what I know, PChar is a null-terminated string of wide characters (since Delphi 2009) that are 2 bytes each, as opposed to regular one-byte characters. So basically it writes 16-bit words instead of bytes.
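As a quick sanity check (my own sketch, not part of the original code), wchar_t under MSVC on Windows is also a 2-byte UTF-16 code unit, so it matches the size of Delphi's WideChar:

#include <iostream>

int main()
{
    // MSVC defines wchar_t as a 2-byte UTF-16 code unit,
    // the same size as Delphi's WideChar (Delphi 2009 and later).
    std::cout << "sizeof(wchar_t) = " << sizeof(wchar_t) << '\n'; // prints 2 on Windows
    return 0;
}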

Here is the code that loads the .mgr file:

#include <fstream>

using namespace std;

wchar_t header[15];

DXFLIBRARY_API void loadMGR(const char* szFileName, const char* szOutput)
{
    fstream file;
    file.open( szFileName, ios::binary | ios::in );
    if(file.is_open())
    {
        file.read(reinterpret_cast<char*>(header), sizeof(header));
    }
    file.close();

    // write it back out
    fstream saveFile;
    saveFile.open( szOutput, ios::out );
    if(saveFile.is_open())
    {
        saveFile.write(reinterpret_cast<const char*>(header), sizeof(header));
    }
    saveFile.close();
}

The header contains 15 wchar_t, so that gives 30 bytes. Even after investigating, I have no idea how to convert it.
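For what it's worth, here is a minimal sketch of one way to narrow the header to plain 8-bit characters, assuming the file really stores ASCII text as UTF-16LE with an optional BOM (the helper name readNarrowHeader and the reading strategy are mine, not from the original code):

#include <fstream>
#include <string>

// Hypothetical helper: read `count` UTF-16LE code units and keep only the
// low byte of each one, which is enough for plain ASCII header text.
std::string readNarrowHeader(const char* fileName, std::size_t count)
{
    std::ifstream file(fileName, std::ios::binary);
    std::string result;
    for (std::size_t i = 0; i < count && file; ++i)
    {
        char bytes[2] = {0, 0};
        file.read(bytes, 2);                          // one UTF-16LE code unit
        if (!file) break;
        unsigned char lo = static_cast<unsigned char>(bytes[0]);
        unsigned char hi = static_cast<unsigned char>(bytes[1]);
        if (lo == 0xFF && hi == 0xFE) continue;       // skip a leading BOM if present
        result.push_back(static_cast<char>(lo));      // drop the high (zero) byte
    }
    return result;
}

Calling readNarrowHeader(szFileName, 16) would then return the readable 15-character header with the BOM stripped, if the encoding assumption holds.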

1 answer

It seems pretty clear that somewhere along the way the data is being mangled between an 8-bit text encoding and a 16-bit one. The spurious first character is almost certainly a UTF-16 byte order mark (BOM).

One possible explanation is that the Delphi developer is writing UTF-16 encoded text to the file, while you presumably expect an 8-bit encoding.
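To illustrate (this layout is my own sketch of that hypothesis, not a dump of the actual file), the visible header text "XXXX 2012." stored as UTF-16LE with a BOM would look like this:

// "XXXX 2012." as UTF-16LE preceded by a byte order mark (hypothetical layout)
unsigned char utf16Header[] = {
    0xFF, 0xFE,                                     // BOM -> shows up as the "rectangle"
    0x58, 0x00, 0x58, 0x00, 0x58, 0x00, 0x58, 0x00, // "XXXX"
    0x20, 0x00,                                     // " "
    0x32, 0x00, 0x30, 0x00, 0x31, 0x00, 0x32, 0x00, // "2012"
    0x2E, 0x00                                      // "."
};

Read as 8-bit text, the leading FF FE becomes the unknown rectangle and each 0x00 high byte becomes the apparent "space" between characters, which matches the symptom described in the question.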

Another explanation is that the Delphi code correctly writes 8-bit text, but your code mangles it somewhere in your read/write round trip.

Open the file produced by the Delphi program in a hex editor to narrow down where the corruption occurs.
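If a hex editor is not handy, a small dump like the following would do the same job (a throwaway sketch; the file name is a placeholder):

#include <cstdio>
#include <fstream>

int main()
{
    // Print the first 32 bytes of the file in hex to see whether it starts
    // with FF FE (a UTF-16LE BOM) and whether every other byte is 00.
    std::ifstream file("data.mgr", std::ios::binary);
    char byte;
    for (int i = 0; i < 32 && file.get(byte); ++i)
        std::printf("%02X ", static_cast<unsigned char>(byte));
    std::printf("\n");
    return 0;
}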

In the absence of any code in the question, it is difficult to be more specific than that.


Source: https://habr.com/ru/post/1440564/

