Recently, I was involved in a discussion about the use of pseudo-code in computer science exams. One question used integer division. I said that a DIV b is the same as INT(a / b), but another participant said that the behavior of INT() depends on the language implementation, and that it may sometimes round.
My understanding (and 36 years of experience) is that int() always truncates, i.e. rounds toward zero. Do you know of any programming language in which this is not the case by design, i.e. where int(1.7) would give 2?
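To make the distinction concrete, here is a quick Python sketch (Python chosen only for illustration; the pseudo-code in the exam was not tied to any language). Note that Python's int() truncates toward zero, but its floor-division operator // rounds toward negative infinity, so the two disagree on negative operands:

```python
import math

# int() truncates toward zero -- it never rounds
print(int(1.7))            # 1, not 2
print(int(-1.7))           # -1, not -2

# Floor division (//) rounds toward negative infinity instead,
# so INT(a / b) and a // b differ for negative results
print(7 // 2)              # 3
print(-7 // 2)             # -4
print(math.trunc(-7 / 2))  # -3
```

So even within a single language, "integer division" and "truncate the quotient" can already be two different operations, which is presumably what fuels the disagreement.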