When the C Standard was written, if the effect of a particular action would vary across platforms, it would not always be possible for the Standard to guarantee any particular effect; and if there could be plausible implementations in which the action might cause a hardware trap whose behavior was outside the compiler's control, there was little perceived downside to having the Standard say nothing about the behavior. Even where a hardware trap was not particularly likely, the likelihood of "surprising" behavior was reason enough to brand the behavior Undefined.
Consider, for example, `unsigned long x,*p; ... *p=(x++);`. If `p==&x`, it is not just possible for `*p` to end up holding either the old value of `x` or the value one greater; if `x` was, say, 0x0000FFFF, it could also plausibly end up containing 0x00000000 or 0x0001FFFF. Even if no machine would raise a hardware trap, I don't think the authors of the Standard would have considered wording like "any lvalue that is modified more than once, or that is read in the same expression that writes it other than as permitted here, may yield an indeterminate value" to be more useful than simply declaring such actions Undefined Behavior. Further, from the point of view of the Standard's authors, the Standard's refusal to mandate specific behavior in cases where some platforms could provide it for free while others could not was never meant to pose an obstacle to such behavior being specified on the platforms that could provide it.
In practice, even very loosely specified behavior can often be very useful for programs that share the following two requirements with the vast majority of programs written today:
- Given valid input, produce the correct result.
- Given invalid input, don't launch nuclear missiles.
Unfortunately, someone came up with the idea that if the C Standard does not specify the effect of some action X in a particular situation Y, then even if most compilers behave in a way that would be adequate for programs aiming to satisfy the above requirements (for example, most compilers will generate code for the expression `p < q` that yields either 0 or 1 and has no other side effects, even when `p` and `q` identify unrelated objects), the presence of action X should be treated as an indication to the compiler that the program will never receive any input that could bring about situation Y.
The `(*p=*p) & (*q=*q)` shown earlier is intended to represent such a "promise." The logic is that since the Standard says nothing about what the compiler may do when `p==q`, the compiler should assume the programmer does not mind if the program launches nuclear missiles in response to any input that could cause that code to be executed with `p==q`.
This idea and its consequences fundamentally contradict the nature and design goals of C, and its use as a systems programming language. Almost all systems offer some features and guarantees beyond those mandated by the Standard, though the features vary from one system to another. I find it absurd to claim that the language is better served by redefining `x < y` from "I am willing to accept whatever pointer-comparison semantics the hardware this program actually runs on provides" to "I am so certain these two pointers are related that I would stake my life on it" than it would be by adding a new means of telling the compiler to assume that `x` and `y` are related pointers; but somehow that view seems to have been accepted.