Java: Bitwise OR and AND FASTER than equivalent logical operators?

Cut and dried... except that I will never have enough logical operations for this to become a performance bottleneck. Still, I am curious: would I be better off using bitwise AND (&) and bitwise OR (|) instead of the short-circuit logical operators (&& and ||) where possible? Perhaps I should add that I don't know of any tools for inspecting the assembly the JVM produces from Java code, so I can't count the instructions myself.

+6
8 answers

Bitwise operators avoid branch instructions, even in compiled Java code. As a result, you don't pay for expensive branch mispredictions, and generally there are no jumps at all.

In my experience, they can be significantly faster in code that runs very often. Keep in mind, however, that bitwise operators do not short-circuit, which in some cases can itself hurt performance.
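The semantic difference the answer warns about can be made concrete: `&&` skips its right-hand side when the left side is already false, while `&` always evaluates both operands. A minimal sketch (the class and method names here are illustrative, not from the original discussion):

```java
public class ShortCircuitDemo {
    static int calls = 0;

    // A side effect lets us observe whether the operand was evaluated.
    static boolean expensiveCheck() {
        calls++;
        return true;
    }

    public static void main(String[] args) {
        boolean left = false;

        // Logical &&: the right-hand side is skipped when left is false.
        boolean a = left && expensiveCheck();
        System.out.println("calls after &&: " + calls);  // 0

        // Bitwise &: both sides are always evaluated (no branch, no skip).
        boolean b = left & expensiveCheck();
        System.out.println("calls after &: " + calls);   // 1
    }
}
```

This is also why `&` can be dangerous: a guard like `x != null && x.length() > 0` must not be rewritten with `&`, or the right side will run even when `x` is null.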

Nevertheless, such micro-optimizations should be a last resort, applied only after a profiler has told you to. Readability and maintainability come first.

+18

I suggest you watch Josh Bloch's talk on performance concerns at Parleys.com: http://www.parleys.com/#st=5&id=2103&sl=1

+5

In most cases, this will be optimized by the compiler. A quick Google search turns up this handy guide for viewing your Java as assembly. I have always thought that clear, readable code matters more than a few nanoseconds of processor time.

Java is not the best language for squeezing out the last bit of speed, because of the extra JVM layer. If you are interested in that level of optimization, consider switching to another language such as C/C++. This list shows which languages you might look at.

+4

Still, I am curious: would I be better off using bitwise AND (&) and bitwise OR (|) instead of the logical operators, where possible?

Strangely, you went from a performance-trivia question to asking whether you should actually do this in your code. That second question is easy: no. The cost to you, as a developer, of writing less readable code will overwhelm any nanosecond difference in processor time. If you really need that kind of optimization, use C or C++.

+4

The Java compiler only compiles down to bytecode, which is quite far from real machine code. Producing machine code is the JVM's job, and modern JVMs such as HotSpot do it very well. So write the simplest, clearest code that does what you need.

In short, you most likely will not be able to measure the difference.

To see the actual machine code, you need to ask the JVM to show it to you; how you do that depends on the vendor.
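On HotSpot, for example, diagnostic flags can dump the JIT-compiled machine code. A sketch, assuming a benchmark class named `MyBenchmark` (the class name is a placeholder):

```shell
# Print JIT-compiled machine code; requires the hsdis disassembler
# plugin to be available on the JVM's library path.
java -XX:+UnlockDiagnosticVMOptions -XX:+PrintAssembly MyBenchmark

# Without hsdis, you can at least see which methods get JIT-compiled:
java -XX:+PrintCompilation MyBenchmark
```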

+2

No.

First, using bitwise operators in place of logical ones is error-prone (for example, shifting right by one is NOT equivalent to dividing by two for negative numbers). Second, the performance advantage is negligible, if there is one at all.

Last but not least, logical operators express the intent much more clearly.

+2

You can try writing a small program that performs 100,000 bitwise operations and use a timer to measure the runtime. Then do the same with logical operations. Run both several times and compare the results.
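A naive timing loop along the lines this answer suggests might look like the sketch below. Be warned that measurements like this are easily distorted by JIT warm-up and dead-code elimination; a real microbenchmark should use a harness such as JMH. All names here are illustrative:

```java
public class NaiveBenchmark {
    public static void main(String[] args) {
        int n = 100_000;
        boolean sink = false;  // accumulate results so the JIT can't discard the loops

        long t0 = System.nanoTime();
        for (int i = 0; i < n; i++) {
            sink ^= (i % 3 == 0) & (i % 5 == 0);   // bitwise &
        }
        long bitwise = System.nanoTime() - t0;

        long t1 = System.nanoTime();
        for (int i = 0; i < n; i++) {
            sink ^= (i % 3 == 0) && (i % 5 == 0);  // logical &&
        }
        long logical = System.nanoTime() - t1;

        System.out.println("bitwise: " + bitwise + " ns");
        System.out.println("logical: " + logical + " ns");
        System.out.println(sink);  // keep the result live
    }
}
```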

+1

Like Schrödinger's cat: yes and no at the same time.

It depends on what you are actually doing! I once wrote a sudoku solver both with and without bitwise operations. Here is my benchmark:

  • With bitwise operations: 0.9 milliseconds
  • Without: 50 milliseconds

I used a backtracking algorithm, which largely explains why it was so much faster with bitwise operations, since sudoku is NP-complete (possibly NP-hard).

But, as the others have already told you, it is much harder to read and maintain (I will never go back to my sudoku solver to make changes; at some point I would no longer understand what I did).

In general, a bitwise operation is always at least as fast as its logical counterpart, but unless it sits in the bottleneck of performance-critical software, I would not recommend using it for that reason alone.
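The kind of bitmask trick that speeds up a sudoku solver can be sketched as follows. This is a hypothetical fragment, not the author's actual solver: each cell's remaining candidate digits are packed into the bits of a single int, so set operations become single instructions.

```java
public class CandidateMask {
    // Bits 1..9 of the mask mark which digits are still possible for a cell.
    static final int ALL = 0b11_1111_1110;  // digits 1..9 all possible

    static int remove(int mask, int digit) {
        return mask & ~(1 << digit);        // clear that digit's bit
    }

    static boolean allowed(int mask, int digit) {
        return (mask & (1 << digit)) != 0;  // test that digit's bit
    }

    static int count(int mask) {
        return Integer.bitCount(mask);      // remaining candidates
    }

    public static void main(String[] args) {
        int mask = ALL;
        mask = remove(mask, 4);
        mask = remove(mask, 7);
        System.out.println(allowed(mask, 4));  // false
        System.out.println(allowed(mask, 5));  // true
        System.out.println(count(mask));       // 7 candidates left
    }
}
```

In a backtracking solver, eliminating a digit from a row, column, or box is then a single `&` per cell, which is where the large speedup over set- or array-based bookkeeping comes from.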

+1

Source: https://habr.com/ru/post/918191/
