Writing an unsigned int to a binary file

This is my first time working with binary files, and I'm tearing my hair out. Anyway, I defined the following:

unsigned int cols, rows; 

These variables can range from 1 to 500. When I write them to a binary file, I do this:

  myFile.write(reinterpret_cast<const char *>(&cols), sizeof(cols));
  myFile.write(reinterpret_cast<const char *>(&rows), sizeof(rows));

When I read the file back with cols = 300, I get this as a result:

 44 1 0 0 

Can someone explain why I get this result? I can't say that anything is actually wrong, because I honestly count myself among those who don't understand what's going on here. What I want is to save the value as-is to the file, so that when I read it back, I get the same value. Maybe that is already happening and I just don't realize it.

I would appreciate an explanation of how this works and how to correctly read the data back.

+4
4 answers

You haven't yet shown how you read the data back or how you printed the text you quoted. 44 1 0 0 looks like a decimal rendering of each individual byte of the data you wrote (decimal "300"), low byte first.

If you read the data back this way, it should give you the effect you want (assuming you are okay with the restriction that the computer writing the file has the same endianness as the one reading it):

  unsigned int colsReadFromFile = 0;
  myOtherFile.read(reinterpret_cast<char *>(&colsReadFromFile), sizeof(colsReadFromFile));
  if (!myOtherFile)
  {
      std::cerr << "Oh noes!" << std::endl;
  }
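
Not part of the original answer, but to make the full round trip concrete, here is a minimal self-contained sketch under those assumptions; the file name grid.bin and the sample values are made up for illustration:

  #include <fstream>
  #include <iostream>

  int main()
  {
      unsigned int cols = 300, rows = 500;

      // Write the raw bytes of both integers, as in the question.
      {
          std::ofstream out("grid.bin", std::ios::binary);
          out.write(reinterpret_cast<const char *>(&cols), sizeof(cols));
          out.write(reinterpret_cast<const char *>(&rows), sizeof(rows));
      } // out is closed (and flushed) here

      // Read them back into fresh variables.
      unsigned int colsRead = 0, rowsRead = 0;
      std::ifstream in("grid.bin", std::ios::binary);
      in.read(reinterpret_cast<char *>(&colsRead), sizeof(colsRead));
      in.read(reinterpret_cast<char *>(&rowsRead), sizeof(rowsRead));
      if (!in)
      {
          std::cerr << "Oh noes!" << std::endl;
          return 1;
      }

      std::cout << colsRead << " " << rowsRead << std::endl; // prints: 300 500
  }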
+2

You are just looking at the four bytes of a 32-bit integer as stored on a little-endian platform.

300 base 10 = 0x12C

So little-endian storage gives you 0x2C 0x01 0x00 0x00, and of course 0x2C = 44.
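
As a quick check (a sketch, assuming a little-endian machine; the variable names are illustrative), you can dump the bytes of the integer as they sit in memory:

  #include <cstdio>

  int main()
  {
      unsigned int value = 300; // 0x0000012C
      const unsigned char *bytes = reinterpret_cast<const unsigned char *>(&value);

      // Print each byte in memory order; a little-endian machine prints: 2c 01 00 00
      for (unsigned i = 0; i < sizeof(value); ++i)
          std::printf("%02x ", bytes[i]);
      std::printf("\n");
  }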

+2

Each byte in the file has 8 bits, so it can represent values from 0 to 255. The number is written in order, low byte first. So, starting from the other end, treat the bytes as digits of a base-256 number. The value is 0 * 256 ^ 3 + 0 * 256 ^ 2 + 1 * 256 ^ 1 + 44 * 256 ^ 0 = 300 (where ^ means exponentiation, not XOR).
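
To make that arithmetic concrete, here is a small sketch (not from the answer itself) that rebuilds the value from the four file bytes, treating them as base-256 digits:

  #include <iostream>

  int main()
  {
      // The four bytes as they appear in the file, low byte first.
      unsigned char bytes[4] = { 44, 1, 0, 0 };

      // Fold from the high end: ((0 * 256 + 0) * 256 + 1) * 256 + 44 = 300
      unsigned int value = 0;
      for (int i = 3; i >= 0; --i)
          value = value * 256 + bytes[i];

      std::cout << value << std::endl; // prints: 300
  }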

+2
300 in binary is 100101100, which is 9 bits long.

But when you look at the value through a char *, each char covers only one byte (8 bits),

so the first byte is 00101100, the low 8 bits of (1 00101100), which is 44.
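
A one-line check of that claim (a sketch; the variable names are illustrative): masking with 0xFF keeps exactly those low 8 bits:

  #include <iostream>

  int main()
  {
      unsigned int value = 300;        // binary: 1 00101100
      unsigned int low = value & 0xFF; // keep only the low 8 bits: 00101100

      std::cout << low << std::endl;   // prints: 44
  }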
+1

Source: https://habr.com/ru/post/1485731/

