Why are covariance and contravariance so named?

I was wondering why the terms covariance and contravariance are so named in the field of programming. These terms are usually heard in probability theory or statistics, where they describe how measured quantities vary relative to their mean.

  • What is the idea behind borrowing these terms from statistics?
  • How do these terms describe an analogous kind of "spread" in programming?
  • What would the "mean" be in programming?

I know that covariance is the ability to assign an expression of a more specific type to a variable of a less specific type, but is there another interpretation?
Examples from both of these domains would be helpful.
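
To make the programming side concrete, this is roughly what I have in mind (a minimal Scala sketch; the Producer, Consumer, Animal and Cat names are made up purely for illustration):

```scala
object VarianceDemo {
  class Animal
  class Cat extends Animal

  // Covariant in A: the subtype relation of Producer varies in the same direction as A ("co-").
  trait Producer[+A] { def produce(): A }

  // Contravariant in A: the subtype relation of Consumer varies in the opposite direction ("contra-").
  trait Consumer[-A] { def consume(a: A): Unit }

  def main(args: Array[String]): Unit = {
    val catProducer: Producer[Cat] = new Producer[Cat] { def produce(): Cat = new Cat }
    // OK: Cat <: Animal, therefore Producer[Cat] <: Producer[Animal].
    val animalProducer: Producer[Animal] = catProducer

    val animalConsumer: Consumer[Animal] = new Consumer[Animal] { def consume(a: Animal): Unit = () }
    // OK: Cat <: Animal, therefore Consumer[Animal] <: Consumer[Cat] (the direction is reversed).
    val catConsumer: Consumer[Cat] = animalConsumer

    println(animalProducer.produce())
    catConsumer.consume(new Cat)
  }
}
```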

1 answer

I am not a mathematician, so I won't try to answer this myself, but you can find a wonderful explanation of your question on Tomas Petricek's blog.

He explains in detail how covariance and contravariance in programming relate to category theory in pure mathematics.
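
As a rough sketch of that connection (my own illustration in Scala, not code taken from the blog post): a covariant functor carries an arrow A => B to F[A] => F[B], keeping its direction, while a contravariant functor carries it to F[B] => F[A], reversing it.

```scala
import scala.language.higherKinds

object FunctorSketch {
  // Covariant functor: lifts f: A => B to F[A] => F[B], preserving the arrow's direction.
  trait Functor[F[_]] {
    def map[A, B](fa: F[A])(f: A => B): F[B]
  }

  // Contravariant functor: lifts f: B => A to F[A] => F[B], i.e. it reverses arrows.
  trait Contravariant[F[_]] {
    def contramap[A, B](fa: F[A])(f: B => A): F[B]
  }

  // Option produces values of A, so it is naturally covariant.
  val optionFunctor: Functor[Option] = new Functor[Option] {
    def map[A, B](fa: Option[A])(f: A => B): Option[B] = fa.map(f)
  }

  // A predicate consumes values of A, so it is naturally contravariant.
  type Predicate[A] = A => Boolean
  val predicateContravariant: Contravariant[Predicate] = new Contravariant[Predicate] {
    def contramap[A, B](fa: Predicate[A])(f: B => A): Predicate[B] = b => fa(f(b))
  }
}
```

In both cases the prefix simply records whether the direction of arrows (or of the subtyping relation) is preserved or reversed, which is where the names come from.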


Source: https://habr.com/ru/post/1201234/

