There are some values of p for which the condition only becomes true at large values of n. For example, with p = 3 the condition first becomes true at n = 50_331_648. In that case the limit of 5000 will certainly win in terms of performance, but the two computations will not return the same result.
For a fair comparison I picked a value of p (3002) for which the condition returns true at an n below 5000, so both versions return the same result, and the timings are very close (although the version with limit is slightly slower, probably due to the additional n < 5000 check).
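As a sanity check, here is a small standalone sketch (the class FirstMatch and its main method are mine, not part of the original benchmark) that reuses the same predicate as the benchmark below and prints the first matching n for a given p. Per the figures above, p = 3 should report 50_331_648 and p = 3002 something below 5000. Note that n * n * n silently overflows long well before n reaches fifty million, so for p = 3 the cube test is actually applied to the wrapped value.

import java.util.stream.LongStream;

public class FirstMatch {

    public static void main(String[] args) {
        System.out.println(firstMatch(3));    // expected: 50331648 (per the text above)
        System.out.println(firstMatch(3002)); // expected: some n < 5000
    }

    // First n at which the benchmark's predicate fires for the given p.
    // For large n, (n * n * n) overflows long, so the cube test runs on
    // the wrapped value rather than the true n^3 + p*n^2.
    static long firstMatch(long p) {
        return LongStream.iterate(1, n -> n + 1)
                .filter(n -> isPerfectCube((n * n * n) + ((n * n) * p)))
                .findFirst()
                .getAsLong();
    }

    // Same rounding-based cube test as in the benchmark below.
    private static boolean isPerfectCube(long n) {
        long tst = (long) (Math.cbrt(n) + 0.5);
        return tst * tst * tst == n;
    }
}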
Test results (in microseconds per call to anyMatch ):
Benchmark              Mode  Samples     Mean  Mean error  Units
capSO24003674.limit    avgt        5  130.165       2.663  us/op
capSO24003674.noLimit  avgt        5  126.876       2.440  us/op
Benchmark code (using JMH):
import java.util.concurrent.TimeUnit;
import java.util.stream.LongStream;

import org.openjdk.jmh.annotations.*;

@BenchmarkMode(Mode.AverageTime)
@OutputTimeUnit(TimeUnit.MICROSECONDS)
@State(Scope.Thread)
@Warmup(iterations = 5, time = 500, timeUnit = TimeUnit.MILLISECONDS)
@Measurement(iterations = 5, time = 1000, timeUnit = TimeUnit.MILLISECONDS)
@Fork(1)
public class SO24003674 {

    private int p = 3002;

    // @GenerateMicroBenchmark is the pre-1.0 JMH annotation (renamed @Benchmark in JMH 1.x).
    @GenerateMicroBenchmark
    public boolean limit() {
        return LongStream.iterate(1, n -> n + 1).limit(5000)
                .anyMatch(n -> isPerfectCube((n * n * n) + ((n * n) * p)));
    }

    @GenerateMicroBenchmark
    public boolean noLimit() {
        return LongStream.iterate(1, n -> n + 1)
                .anyMatch(n -> isPerfectCube((n * n * n) + ((n * n) * p)));
    }

    // Rounds cbrt(n) to the nearest long and checks that cubing it gives back n.
    private static boolean isPerfectCube(long n) {
        long tst = (long) (Math.cbrt(n) + 0.5);
        return tst * tst * tst == n;
    }
}
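The answer doesn't show how the benchmark was launched. One conventional way is via the JMH runner API, sketched below (assuming JMH is on the classpath and its annotation processor has generated the benchmark harness; the class name BenchmarkRunner is mine):

import org.openjdk.jmh.runner.Runner;
import org.openjdk.jmh.runner.RunnerException;
import org.openjdk.jmh.runner.options.Options;
import org.openjdk.jmh.runner.options.OptionsBuilder;

public class BenchmarkRunner {
    public static void main(String[] args) throws RunnerException {
        // Run every benchmark method whose name matches the given regex.
        Options opts = new OptionsBuilder()
                .include(SO24003674.class.getSimpleName())
                .build();
        new Runner(opts).run();
    }
}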