I was wondering why the terms covariance and contravariance are named the way they are in programming. These terms are usually heard in probability theory or statistics, where covariance describes how measured quantities vary together relative to their averages.
- What is the idea behind borrowing these terms from statistics?
- How do these terms describe that kind of variation in programming?
- What would the "average" be in programming?
I know that covariance is the ability to assign an expression of a more specific type to a variable of a less specific (more general) type, but is there any other interpretation of it?
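For example, this is roughly what I have in mind (a minimal Java sketch; the types used are just illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class CovarianceExample {
    public static void main(String[] args) {
        // Plain assignment compatibility: an Integer expression
        // can be assigned to a more general Number variable.
        Number n = Integer.valueOf(42);

        // Generic covariance via a wildcard: a List<Integer> (more specific)
        // is assignable to a List<? extends Number> (less specific) reference.
        List<Integer> ints = new ArrayList<>();
        ints.add(1);
        List<? extends Number> nums = ints;

        System.out.println(n + " " + nums);
    }
}
```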
Examples for both of these domains would be helpful.