Calculating pi to arbitrary precision in Python

I am trying to calculate pi to arbitrary precision in Python using one of Ramanujan's formulas: http://en.wikipedia.org/wiki/Approximations_of_%CF%80#20th_century . This mostly comes down to computing large factorials and performing high-precision division.
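For reference, the series meant here is presumably Ramanujan's 1914 formula (the linked page lists several; which one the question uses is my assumption):

    1/pi = (2*sqrt(2)/9801) * sum_{k=0..inf} (4k)! * (1103 + 26390*k) / ((k!)^4 * 396^(4k))

Each term of this series contributes roughly eight more correct decimal digits.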

Here is my code: http://pastie.org/private/pa6ijmoowiwiw4xwiqmq

The result goes wrong around the fifteenth digit of pi (I get 3.1415926535897930, but it should be 3.1415926535897932). Can you suggest why this is happening? I am using the decimal type, and the docs say it supports arbitrary-precision arithmetic on both integers and floating-point numbers.

PS: This is homework, so I can't use a different formula. PPS: I am using Python 2.7.

Thanks :)

+6
1 answer

Use Decimal(2).sqrt() instead of Decimal(sqrt(2)). math.sqrt works in ordinary 64-bit floats, so Decimal(sqrt(2)) just wraps a value that has already been rounded to about 16 significant digits, while Decimal(2).sqrt() computes the root at the current context precision.
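A quick way to see the difference (a minimal demonstration; the 50-digit context is arbitrary):

    from decimal import Decimal, getcontext
    from math import sqrt

    getcontext().prec = 50

    # Wraps a float that was already rounded to 64-bit precision:
    print(Decimal(sqrt(2)))   # 1.4142135623730951454746218587388284504413604736328125
    # Computes the square root at the current 50-digit precision:
    print(Decimal(2).sqrt())  # 1.4142135623730950488016887242096980785696718753769

The two values agree only through the 16th significant digit, which is exactly where your pi output starts to go wrong.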

I checked the first 1000 digits and it seems to work fine. By the way, for some reason, your code outputs 1007 decimal places instead of 1000.
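Since the pastie link may go stale, here is a minimal self-contained sketch of the whole computation with that fix applied (the function name, the guard-digit count, and the term count are my choices, not taken from your original code). Working with a few guard digits and slicing the string afterwards also explains the extra trailing digits you saw:

    from decimal import Decimal, getcontext

    def ramanujan_pi(digits):
        # Work at slightly higher precision than requested; the guard
        # digits absorb rounding errors in the intermediate arithmetic.
        getcontext().prec = digits + 10

        total = Decimal(0)
        fact_k = 1     # running k!
        fact_4k = 1    # running (4k)!
        # Each term adds roughly 8 correct digits, so this many suffice:
        for k in range(digits // 8 + 2):
            if k > 0:
                fact_k *= k
                for i in range(4 * k - 3, 4 * k + 1):
                    fact_4k *= i
            total += (Decimal(fact_4k) * (1103 + 26390 * k)
                      / (Decimal(fact_k) ** 4 * Decimal(396) ** (4 * k)))

        # The fix: take the square root inside Decimal, not with math.sqrt.
        pi = 9801 / (2 * Decimal(2).sqrt() * total)
        # str(pi) would also show the guard digits, which is one way to
        # end up with more decimal places than asked for; slice instead.
        return str(pi)[:digits + 2]   # "3." plus `digits` digits

    print(ramanujan_pi(50))

This runs on both Python 2.7 and 3; the slicing at the end truncates rather than rounds, which is safe here because the guard digits keep the first `digits` decimal places exact.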

+3
