In C (and therefore also in Objective-C), expressions are almost always evaluated without regard to the context in which they appear.
In the expression 1/120, both operands have type int, so the division is an integer division and the result has type int. Integer division truncates, so 1/120 yields 0. The fact that the result is used to initialize a float object does not change how 1/120 itself is evaluated.
This can be counterintuitive at times, especially if you are used to the way calculators usually work (they typically carry out every calculation in floating point).
As the other answers say, to get a result close to 0.00833 (which cannot be represented exactly, by the way), you need to perform floating-point division rather than integer division, by making one or both operands floating point. If one operand is floating point and the other is an integer, the integer operand is converted to floating point first; there is no operation that directly divides a floating-point value by an integer.
Note that, as @0x8badf00d's comment says, the result should be 0. Something else must be wrong for it to print inf. If you can show us more code, preferably a small complete program, we can help you figure it out.
(There are languages in which integer division yields a floating-point result. Even in those languages, the evaluation of an expression does not generally depend on its context. Python version 3 is one such language; C, Objective-C, and Python version 2 are not.)