For the linear function y = a*x + b (where a and b are known constants), it is easy to compute the sum of squared distances between the line and a window of samples (1, Y1), (2, Y2), ..., (n, Yn) (where Y1 is the oldest sample and Yn is the newest):
sum((Yx - (a*x + b))^2 for x in 1,...,n)
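As a baseline, here is a direct (non-rolling) computation of that sum. The function name `window_sse` is just an illustrative choice, not from the original question:

```python
def window_sse(samples, a, b):
    # samples[0] is Y1 (oldest), samples[-1] is Yn (newest);
    # index x runs from 1 to n, matching the formula above.
    return sum((y - (a * x + b)) ** 2 for x, y in enumerate(samples, start=1))
```

This costs O(n) per new sample, which is exactly what the rolling version needs to avoid.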
I need a fast algorithm to maintain this value over a rolling window (of length n). I cannot afford to re-scan all the samples in the window every time a new sample arrives.
Obviously, some state needs to be kept and updated as each new sample enters the window and each old sample leaves it.
Note that when a sample leaves the window, the indices of the remaining samples also change: each Yx becomes Y(x-1). Therefore, when a sample leaves the window, every other sample in the window contributes a new term to the sum: (Yx - (a*(x-1) + b))^2 instead of (Yx - (a*x + b))^2.
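To make the index-shift concrete (a minimal numeric check, with arbitrary example values not from the original question): when a sample's index drops from x to x-1, its residual against the line changes by exactly +a, since Yx - (a*(x-1) + b) = (Yx - (a*x + b)) + a.

```python
a, b = 2.0, 1.0
samples = [3.5, 4.0, 8.0]  # an arbitrary window Y1..Y3

# residuals before and after the index shift x -> x-1
old = [y - (a * x + b) for x, y in enumerate(samples, start=1)]
new = [y - (a * (x - 1) + b) for x, y in enumerate(samples, start=1)]

shifts = [n_ - o for o, n_ in zip(old, new)]
# every residual shifts by the slope a
assert all(abs(s - a) < 1e-12 for s in shifts)
```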
Is there a known algorithm for this? If not, can you think of one? (Some error due to first-order linear approximation is acceptable.)