Will the compiler optimize repetitive math?

Will the Java Compiler optimize simple repetitive math operations, for example:

    if (prevX / width != curX / width) {
        // Do something with prevX / width value
    } else {
        // Do something with curX / width value
    }

I know that I can simply assign the results to variables before the if statement and use those variables, but this is pretty cumbersome. If the compiler automatically recognizes that the same calculations are being performed and caches the results in temporary variables on its own, I would prefer to stick with the style above.

* Edit - I'm an idiot. I tried to simplify my question too much. It is not simply: if (x > y) ...

3 answers

The answer is yes. This is called Common Subexpression Elimination, and it is a standard (and powerful) compiler optimization used in Java, C/C++, and others ...

This page confirms that the HotSpot JVM will perform this optimization.


However, whether the compiler/runtime will actually be able to do this optimization when you expect it to is another story. Therefore, I usually prefer to do these optimizations myself if it also improves readability.

    double xw = x / width;
    double yw = y / width;
    if (xw > yw) {
        return xw;
    } else {
        return yw;
    }
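Wrapped into a complete, compilable method (the class and method names here are illustrative, not from the original answer), the manually hoisted version might look like this:

```java
// Illustrative sketch of manually hoisting a common subexpression.
class Hoisted {
    // x / width and y / width are each computed exactly once,
    // regardless of which branch is taken.
    static double maxRatio(double x, double y, double width) {
        double xw = x / width;
        double yw = y / width;
        return (xw > yw) ? xw : yw;
    }
}
```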

Whether the compiler can perform such optimizations depends on the answers to the following questions:

Is the compiler allowed to do this by the JLS?

In some cases it is not. For example, if prevX were a volatile instance variable, it would have to be fetched from memory each time the source code uses it. Another case is when the common subexpression involves calling a method with an observable side effect; that is, somewhere else in the program it could be detected whether the method was called once or twice.
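A minimal sketch of both cases (the class and field names are my own, for illustration): a volatile field must be re-read on every access, and two calls to a method with an observable side effect cannot be collapsed into one, because the program can observe the call count afterwards.

```java
// Illustrative sketch: two situations where common-subexpression
// elimination is NOT legal under the JLS.
class CseLimits {
    volatile int prevX;   // volatile: must be re-read on every access

    static int calls = 0; // observable state: counts invocations

    static int divide(int x, int width) {
        calls++;          // side effect makes the call count observable
        return x / width;
    }

    static int twice(int x, int width) {
        // The compiler may not collapse these two calls into one,
        // because code elsewhere can read 'calls' and tell the difference.
        return divide(x, width) + divide(x, width);
    }
}
```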

Is the compiler capable of doing this?

The compiler needs to analyze the code to detect common subexpressions that can legally be optimized. There are two issues here:

  • Is the compiler capable of the necessary reasoning? For example, one could hypothesize a compiler that can determine that a particular method call will be free of side effects and can therefore be optimized. However, building a compiler that is actually capable of doing this is ... an interesting problem.

  • Is the optimization worthwhile? There is a trade-off between the cost of performing the optimization and its benefit, and it is not a straightforward trade-off. The compiler must also account for the cost of searching for optimization opportunities ... even in cases where none can actually be applied. In other words, the effect on compilation time. (Remember that in Java, optimization is mostly done at runtime by the JIT compiler ... so this affects application performance.)

In a simple example like yours, the optimization is legal (modulo volatile ), and you should expect any half-decent JIT compiler to perform it.


A separate question is whether you should try to help the compiler by explicitly evaluating the common subexpressions in your code and assigning the results to temporary variables.

IMO, the answer is usually no.

  • A good compiler will probably do this just as well as you can. And if it doesn't, the next generation might.

  • The code probably does not need manual optimization. Unless you have profiled your code to determine where the bottlenecks are, your hand optimizations stand a good chance of being irrelevant to actual application performance ... and a waste of your time.

  • There is a chance that you will mess it up; for example, by forgetting that a method call has an important side effect, or that a variable is volatile for a good reason.

On the other hand, if the rewrite makes your code more readable, that is a good reason to do it.


In general, "yes" - the compiler will optimize the code where it can, and the HotSpot JVM will also progressively optimize repeatedly executed blocks of code.

In this case, however, you would be better off restructuring the code as follows:

    if (x > y)
        return x / width;
    return y / width;

which avoids one of the two division operations, whichever branch is taken.
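As a complete, compilable version of this restructuring (class and method names are illustrative):

```java
// Illustrative sketch: only one division executes on either branch,
// instead of precomputing both x / width and y / width.
class MaxScaled {
    static int maxScaled(int x, int y, int width) {
        if (x > y)
            return x / width;
        return y / width;
    }
}
```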


Source: https://habr.com/ru/post/906515/
