I ran into a strange problem that I would like to understand:
func printIntervals(_ colors: Int) {
    let start = 0.4 - (Double(colors) - 1) / 10
    print("colors: \(colors) start: \(start)")
    for i in stride(from: start, through: 0.4, by: 0.1) { print("i:\(i)") }
}
printIntervals(1)
printIntervals(2)
printIntervals(3)
printIntervals(4)
printIntervals(5)
printIntervals(6)
printIntervals(7)
printIntervals(8)
Gives me this result:
colors: 1 start: 0.4
i:0.4
colors: 2 start: 0.3
i:0.3
i:0.4
colors: 3 start: 0.2
i:0.2
i:0.3
i:0.4
colors: 4 start: 0.1
i:0.1
i:0.2
i:0.3
colors: 5 start: 0.0
i:0.0
i:0.1
i:0.2
i:0.3
i:0.4
colors: 6 start: -0.1
i:-0.1
i:2.77555756156289e-17
i:0.1
i:0.2
i:0.3
i:0.4
colors: 7 start: -0.2
i:-0.2
i:-0.1
i:5.55111512312578e-17
i:0.1
i:0.2
i:0.3
colors: 8 start: -0.3
i:-0.3
i:-0.2
i:-0.0999999999999999
i:1.11022302462516e-16
i:0.1
i:0.2
i:0.3
As you can see, some of the calls include the value 0.4 and others do not. My quick workaround is to use 0.41 as the upper bound instead, so this is definitely an accuracy problem.
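For reference, a minimal sketch of that workaround (the function name is mine; the only change from the original is the through bound):

func printIntervalsWorkaround(_ colors: Int) {
    let start = 0.4 - (Double(colors) - 1) / 10
    // Overshoot the upper bound slightly so a value that lands just
    // above 0.4 due to rounding error is still included.
    for i in stride(from: start, through: 0.41, by: 0.1) { print("i:\(i)") }
}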
But if the previous value prints as 0.3 (not 0.300001), why does adding 0.1 to it skip 0.4? And why does this happen only in some of the cases, yet in a consistent pattern?
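In case the default description is hiding digits, here is a small diagnostic I could run (the function name is mine; String(format:) requires Foundation) to print the intermediate values with more precision:

import Foundation

func dumpIntervals(_ colors: Int) {
    let start = 0.4 - (Double(colors) - 1) / 10
    for i in stride(from: start, through: 0.4, by: 0.1) {
        // %.17g prints enough significant digits to expose any rounding
        // error that the default "0.3"-style output would hide.
        print(String(format: "%.17g", i))
    }
}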