While trying to develop an algorithm, I came across this question. This is not homework.
Let P_i = an array of the first i primes. Now I need the smallest i such that

Sum over n = 0..i-1 of 1 / (P_i[n] * P_i[n]) >= 1

(if such an i exists).
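For concreteness, a direct (but slow) way to compute these partial sums over the actual primes would look something like the sketch below; the trial-division isPrime helper and the cap on the number of primes are only there for illustration, not part of the actual problem.

public class PrimeReciprocalSquares {
    // Trial-division primality test, purely illustrative.
    static boolean isPrime(long n) {
        if (n < 2) return false;
        for (long d = 2; d * d <= n; d++) {
            if (n % d == 0) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        double sum = 0.0;
        int primesUsed = 0;      // how many primes have been added so far
        long candidate = 2;
        // Cap the number of primes so the loop terminates either way.
        while (sum < 1.0 && primesUsed < 100_000) {
            if (isPrime(candidate)) {
                sum += 1.0 / ((double) candidate * candidate);
                primesUsed++;
            }
            candidate++;
        }
        System.out.println(primesUsed + " primes, partial sum = " + sum);
    }
}

(Casting to double before squaring is just to avoid overflowing a long for large primes.)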
An approximation for the i-th prime is i*log(i), so I tried this in Java:
public static void main(String[] args) {
    double sum = 0.0;
    long i = 2;
    while (sum < 1.0) {
        // approximate the i-th prime by i*log(i) and add 1/p^2
        sum += 1.0 / (i * Math.log(i) * i * Math.log(i));
        i++;
    }
    System.out.println(i + ": " + sum);
}
However, the loop above never terminates: the sum converges to about 0.7. I suspect that a term like 1/100000000^2 is rounded to 0.0 in Java, and that this is why it doesn't work. For the same reason it doesn't even terminate if I replace the line inside the loop with
sum += 1.0 / (i * i);
even though this version should reach 1, if I'm not mistaken, because the sum must exceed the sum of 1/2^i, and the latter converges to 1. In other words, this suggests to me that Java's rounding is what keeps the sum from reaching 1, and I think the minimum i in my problem should exist.
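To probe the rounding behaviour I'm describing, a minimal check could look like this (the particular constants 100000000 and 0.7 are only meant as examples):

public class RoundingCheck {
    public static void main(String[] args) {
        // The term I suspect gets rounded to 0.0 as a double.
        double term = 1.0 / (100000000L * 100000000L);   // 1 / 10^16
        System.out.println("term = " + term);

        // And whether adding such a term to a sum around 0.7 changes it at all.
        double sum = 0.7;
        System.out.println("sum + term == sum ? " + (sum + term == sum));
    }
}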