Is there a performance reason for a binary file to be mixed-endian?

I am writing a parser for the most common geographic data format, a set of files called a shapefile. This is the first project in which I have had to think about endianness.

It turns out that the geometry storage is mixed-endian: some parts of the file are big-endian, but most of it is little-endian. The shapefile standard is described here.

Is there a plausible performance rationale for this, or is it purely a product of historical circumstances? If so, do you know what those circumstances were?

The integers and double-precision floating-point numbers that make up the data description fields in the file header (identified below) and the record contents in the main file are in little-endian (PC or Intel®) byte order. The integers and double-precision floating-point numbers that make up the rest of the file and the file management are in big-endian (Sun® or Motorola®) byte order.
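For illustration, here is a minimal Python sketch (using the standard `struct` module) of how a parser might handle the mixed byte order in the 100-byte main-file header quoted above. The byte offsets follow the ESRI spec; the function name and returned dictionary are just an example, not part of any particular library.

```python
import struct

def read_shp_header(path):
    """Read the mixed-endian 100-byte header of a .shp main file."""
    with open(path, "rb") as f:
        header = f.read(100)

    # Bytes 0-3 (file code, always 9994) and bytes 24-27 (file length,
    # in 16-bit words) are big-endian: struct's ">" prefix.
    file_code, = struct.unpack(">i", header[0:4])
    file_length, = struct.unpack(">i", header[24:28])

    # Bytes 28 onward (version, shape type, bounding box) are
    # little-endian: struct's "<" prefix.
    version, shape_type = struct.unpack("<2i", header[28:36])
    xmin, ymin, xmax, ymax = struct.unpack("<4d", header[36:68])

    return {
        "file_code": file_code,            # expected to be 9994
        "file_length_bytes": file_length * 2,
        "version": version,                # expected to be 1000
        "shape_type": shape_type,
        "bbox": (xmin, ymin, xmax, ymax),
    }
```

The same split continues through the rest of the file: each record header (record number and content length) is big-endian, while the record contents themselves are little-endian.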

1 answer

Although there is no definitive answer to this, what I have seen suggests it is a mix of "confusion while trying to create a format that would work across platforms" and "a lot of poorly designed formats were being created at the time." More details here: https://gis.stackexchange.com/questions/18969/oddities-in-the-shapefile-technical-specification

