Computers just don't work that way, at least not unless they are programmed to. The assumption is that what you give them is exactly what you meant. If you create the number 2/3 as 0.666666666666666667, then all operations treat it as exactly that value. The error in the least significant digit may eventually propagate into large errors in later calculations, but that is what good code should handle, using algorithms that minimize such problems where possible.
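To see that propagation in action, here is a short Python illustration (the recurrence is contrived, invented for this example precisely because it magnifies the initial error by a factor of 4 at every step):

```python
from fractions import Fraction

x = 2 / 3                    # stored as the nearest double, not exactly 2/3
exact = Fraction(2, 3)       # exact rational value for comparison

# The representation error starts out tiny, about one part in 10^16.
print(float(Fraction(x) - exact))

# A contrived recurrence with fixed point 2/3: any initial error
# is multiplied by 4 at every step, so it eventually dominates.
y = x
for _ in range(50):
    y = 4 * y - 2
print(y)                     # nowhere near 0.666...; the error has exploded
```

If y held exactly 2/3, the loop would return 2/3 forever; the one-ulp error in the stored value is enough to blow the result up completely after 50 steps.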
As I said, computers do what they are told. To that end, packages have been written that use so-called interval arithmetic. There, a number is described as an interval, so we could create 2/3 as the interval [0.6666666666666666, 0.6666666666666667]. We can then operate on intervals: add, subtract, multiply, and so on. These operations will often grow the width of the intervals as we work with them.
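Here is a minimal sketch of the idea in Python (a toy Interval class invented for this illustration, not any real package; production interval libraries also round the endpoints outward at every operation, which this toy omits):

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        # Interval sum: the widths of the operands add together.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        # Interval difference: subtract the other endpoints, swapped.
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        # Interval product: take extremes over all endpoint products.
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def width(self):
        return self.hi - self.lo

two_thirds = Interval(0.6666666666666666, 0.6666666666666667)
total = two_thirds
for _ in range(5):
    total = total + two_thirds   # the enclosing interval widens each time
    print(total.width())
```

Each addition produces an interval whose width is the sum of the operand widths, so the printed widths grow step by step, exactly the behavior described above.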
However, even if you use interval arithmetic tools, it is still you who must know up front how many significant digits your numbers carry. If you create the number 2.2 and store it as a double, the computer will try to store it as 2.200000000000000..., assuming every digit is exactly correct. In fact, since floating point arithmetic is used, the number is actually stored internally as a binary number. Thus, 2.2 will effectively be stored as:
2.20000000000000017763568394002504646778106689453125
because most decimal fractions cannot be represented exactly in binary form. Again, care should be taken, both within the software itself and by the person using these tools, to understand what the numbers really mean.
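You can verify this yourself. In Python, for example, converting a float to Decimal exposes the exact binary value the double actually holds:

```python
from decimal import Decimal

# Converting the float 2.2 to Decimal reveals the exact value
# that the underlying double really stores.
print(Decimal(2.2))
# 2.20000000000000017763568394002504646778106689453125

# The same effect shows up in ordinary arithmetic.
print(0.1 + 0.2 == 0.3)    # False
print(Decimal(0.1 + 0.2))
# 0.3000000000000000444089209850062616169452667236328125
```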
This last point is important. Many people treat any number generated by a computer as truth, as if handed down by the computer god on a stone tablet. If the computer prints 1.4523656535725, they believe every digit they see. In fact, common sense should tell you that this number may have been computed from data with only 3 significant digits, so only the first few significant digits of the result can be relied upon. That, of course, is why you are taught this concept in school: so you know which digits to trust and which to ignore. Remember, however, that computers are trusting to a fault. You must apply the filter yourself.
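As a final illustration (the measurements here are made up for the example), suppose the inputs carry only 3 significant digits:

```python
from math import floor, log10

# Hypothetical measurements, each known to only 3 significant digits.
length = 1.23   # uncertain in the third decimal place
width = 1.18

ratio = length / width
print(ratio)             # prints something like 1.0423728813559323

# Only about the first 3 digits are meaningful. A rough filter:
def round_sig(x, sig=3):
    """Round x to sig significant digits."""
    return round(x, sig - 1 - floor(log10(abs(x))))

print(round_sig(ratio))  # 1.04 -- all the data actually supports
```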