Is it legal for a compiler to degrade the time complexity of a program? Is this observable behavior?

(Note: this is intended to be a language-lawyer question; I'm not referring to any specific existing compilers.)

When, if ever, is a compiler allowed to degrade the time complexity of a program?
Under what circumstances (if any) is this considered "observable behavior", and why? (For example, can a compiler legally "reduce" a polynomial-time program to an exponential-time one?)

If the answer differs between C and C++, or between different versions of either standard, then please explain the differences.

+42
c++ c language-lawyer
Oct 07 '14 at 7:32
4 answers

The C standard doesn't actually have a time complexity model, neither for its primitive operations nor for its library functions, so compilers are allowed to do almost anything that preserves the semantics of the program (its observable behavior).

The C++ standard gives complexity guarantees only for some of its library functions, and says (17.5.1.4 [structure.specifications]):

Complexity requirements specified in the library clauses are upper bounds, and implementations that provide better complexity guarantees satisfy the requirements.

The compiler had better preserve these bounds (and since many of the functions are templates and may be inlined, the compiler is involved), but the bounds are in terms of the number of elements in containers and restrict the number of calls to comparison operators and the like. Otherwise, the compiler is again free to do as it pleases.
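
To make this concrete, here is a minimal sketch (illustrative only, not from the standard): the guarantee for std::sort is stated in terms of comparator invocations, roughly O(N log N) of them, so the number of calls is the bounded quantity, which you can observe by counting them; the wall-clock cost of each call is not constrained.

    // Illustrative sketch: the C++ standard bounds std::sort in terms of
    // comparisons, not in seconds. Counting comparator calls exposes the
    // quantity that the complexity guarantee actually talks about.
    #include <algorithm>
    #include <cstdio>
    #include <vector>

    int main() {
        std::vector<int> v = {5, 3, 1, 4, 2, 9, 7, 8, 6, 0};
        long comparisons = 0;
        std::sort(v.begin(), v.end(), [&](int a, int b) {
            ++comparisons;  // the standard limits how often this runs
            return a < b;
        });
        std::printf("%zu elements, %ld comparisons\n", v.size(), comparisons);
        return 0;
    }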

+42
Oct 07 '14 at 7:46

Code performance is not considered observable behavior and could potentially be modified by the compiler in either direction. From a practical standpoint, for quality of implementation (QoI) reasons compilers don't degrade your programs, but there are cases where QoI is not about raw performance.

Given the appropriate flags, the compiler could add instrumentation to the program it builds for debugging purposes (this is often done in library implementations, for example with checked iterators).
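
As a concrete example of such instrumentation (the sketch below is illustrative; the flag itself is real): with libstdc++ you can compile with -D_GLIBCXX_DEBUG to enable debug-mode containers and checked iterators, and some of the added checks can change a call's complexity class, e.g. verifying that a range really is ordered before a binary search.

    // Illustrative sketch: g++ -D_GLIBCXX_DEBUG enables libstdc++ debug
    // mode. std::binary_search is O(log N) normally, but debug-mode checks
    // on the input range may scan it, making the call O(N).
    #include <algorithm>
    #include <vector>

    int main() {
        std::vector<int> v(1000);
        for (int i = 0; i < 1000; ++i) v[i] = i;
        bool found = std::binary_search(v.begin(), v.end(), 500);
        return found ? 0 : 1;
    }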

Note that the simple answer to when the compiler would degrade your program is twofold: when the client asks for it, or when the implementor doesn't want to have any users for the compiler.

+24
Oct 07 '14 at 7:52

Section 5.1.2.3 of the C standard says:

The semantic descriptions in this International Standard describe the behavior of an abstract machine in which issues of optimization are irrelevant.

The C++ standard has similar wording in 1.9 [intro.execution].

Both standards have the same definition of observable behavior:

The least requirements on a conforming implementation are:
- Accesses to volatile objects are evaluated strictly according to the rules of the abstract machine.
- At program termination, all data written into files shall be identical to the result that execution of the program according to the abstract semantics would have produced.
- The input and output dynamics of interactive devices shall take place as specified in 7.21.3. The intent of these requirements is that unbuffered or line-buffered output appear as soon as possible, to ensure that prompting messages actually appear prior to a program waiting for input.
This is the observable behavior of the program.

So anything else, e.g. the performance of a for loop or the number of reads/writes performed on non-volatile variables, is not considered observable, and consequently there are no corresponding performance requirements on the compiler.

If the compiler wanted to re-evaluate a block of code 100 times (assuming it has no observable side effects, only changes to the state of non-volatile variables) and check that the same results are obtained each time (not being affected by cosmic rays or faulty hardware), that would be allowed by the standard.
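
A sketch of the distinction (illustrative only): the first loop below has no observable side effects, so the compiler may legally replace it with the closed form n*(n-1)/2, an O(n)-to-O(1) improvement, or just as legally evaluate it 100 times; the volatile stores and the final output, by contrast, are observable behavior and must happen exactly as the abstract machine prescribes.

    #include <cstdio>

    volatile int sink;  // accesses to volatile objects are observable

    int main() {
        long long n = 1000000, total = 0;
        for (long long i = 0; i < n; ++i)
            total += i;  // non-volatile: the compiler may compute this
                         // any way it likes, fast or slow
        for (int i = 0; i < 10; ++i)
            sink = i;    // volatile: exactly 10 stores must be performed
        std::printf("%lld\n", total);  // data written to stdout is observable
        return 0;
    }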

+16
Oct 07 '14 at 11:12

Others have noted that the standard doesn't constrain how a C implementation works, only its observable behavior. There is no reason why you couldn't, for example, interpret or JIT-compile C.

Consider a C implementation where all memory cells are stored in a linked list on some underlying system. Pointers are then an index into this linked list. All pointer operations would function as normal, except that the runtime would have to traverse the linked list on every memory access. All sorts of common algorithms would suddenly gain an extra factor of N in their complexity, for example the usual null-terminated string operations.
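
A toy model of such an implementation (illustrative only; all names here are made up):

    // Illustrative sketch: "memory" is a linked list of cells and a
    // "pointer" is an index into it, so every dereference costs an O(N)
    // traversal. A strlen-style loop then does O(len) accesses at O(N)
    // each, i.e. O(len * N) overall.
    #include <cstddef>
    #include <cstdio>
    #include <list>

    std::list<unsigned char> memory(16, 0);  // the machine's entire memory

    unsigned char deref(std::size_t addr) {
        auto it = memory.begin();
        for (std::size_t i = 0; i < addr; ++i) ++it;  // walk per access
        return *it;
    }

    std::size_t toy_strlen(std::size_t addr) {
        std::size_t len = 0;
        while (deref(addr + len) != 0) ++len;
        return len;
    }

    int main() {
        auto it = memory.begin();   // store "hi\0" at address 0
        *it++ = 'h';
        *it++ = 'i';
        *it = 0;
        std::printf("%zu\n", toy_strlen(0));  // prints 2
        return 0;
    }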

+9
Oct 07 '14 at 15:59


