Consider the code below:
class Foo():
    def __init__(self, b):
        self.a = 0.0
        self.b = b

    def count_a(self):
        self.a += 0.1

foo = Foo(1)
for i in range(0, 15):
    foo.count_a()
    print "a =", foo.a, "b =", foo.b, '"a == b" ->', foo.a == foo.b
Output:
a = 0.2 b = 1 "a == b" -> False
a = 0.4 b = 1 "a == b" -> False
a = 0.6 b = 1 "a == b" -> False
a = 0.8 b = 1 "a == b" -> False
a = 1.0 b = 1 "a == b" -> True
a = 1.2 b = 1 "a == b" -> False
a = 1.4 b = 1 "a == b" -> False
a = 1.6 b = 1 "a == b" -> False
a = 1.8 b = 1 "a == b" -> False
a = 2.0 b = 1 "a == b" -> False
a = 2.2 b = 1 "a == b" -> False
a = 2.4 b = 1 "a == b" -> False
a = 2.6 b = 1 "a == b" -> False
a = 2.8 b = 1 "a == b" -> False
a = 3.0 b = 1 "a == b" -> False
But if I change foo = Foo(1) to foo = Foo(2), the output looks like this:
a = 0.2 b = 2 "a == b" -> False
a = 0.4 b = 2 "a == b" -> False
a = 0.6 b = 2 "a == b" -> False
a = 0.8 b = 2 "a == b" -> False
a = 1.0 b = 2 "a == b" -> False
a = 1.2 b = 2 "a == b" -> False
a = 1.4 b = 2 "a == b" -> False
a = 1.6 b = 2 "a == b" -> False
a = 1.8 b = 2 "a == b" -> False
a = 2.0 b = 2 "a == b" -> False *
a = 2.2 b = 2 "a == b" -> False
a = 2.4 b = 2 "a == b" -> False
a = 2.6 b = 2 "a == b" -> False
a = 2.8 b = 2 "a == b" -> False
a = 3.0 b = 2 "a == b" -> False
The line a = 2.0 b = 2 "a == b" -> False (marked with *) looks absolutely strange to me. I think I might be misunderstanding some concept of OOP in Python. Please explain why this unexpected output occurs and how I can solve the problem.
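In case it helps narrow things down, here is a stripped-down version of the loop I used while experimenting (a minimal sketch, assuming CPython 2.7 like the code above; the tolerance value 1e-9 is just my own guess):

a = 0.0
for i in range(20):
    a += 0.1
print repr(a)            # the exact stored value (print/str round floats for display in Python 2)
print a == 2             # prints False, same as in the output above
print abs(a - 2) < 1e-9  # comparing with a small tolerance prints True

With repr() the accumulated value shows up as something very close to, but not exactly, 2.0, which seems related to my problem.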