I have an algorithm problem. I'm not sure Stack Overflow is the right place to post, but since I'm working in MATLAB and want to solve this with it, I'm posting it here. My problem is this: I have a data set about which I know very little, except that towards the end the points should be fairly linear. I want to compute a linear fit using only the points that are linearly distributed, without including the part that is not.
(an image makes this clearer):
As you can see, the blue data is not linear, but at the end there is a linear part (the red part). I would like to find an algorithm that tells me where the data stop behaving linearly.
I hope that is clear.
I tried taking a few points on the right and computing a linear fit of just those points, then adding a few more points to the set and checking whether they lie close enough to the fit, then performing the linear fit again with the added points, and so on. But I don't think this is the best solution, because the "first" points have a lot of noise (not shown here)...
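For reference, the iterative approach I tried looks roughly like this in MATLAB (a sketch only: the function name `tailLinearFit`, the initial window size `nStart`, and the residual tolerance `tol` are all my own choices, not established values, and `tol` would need tuning against the noise level):

```matlab
% Sketch: grow a linear fit from the right end of the data until the
% residuals exceed a tolerance. x, y are column vectors of equal length.
function [p, idx] = tailLinearFit(x, y, nStart, tol)
    n = numel(x);
    idx = n - nStart + 1 : n;          % start with the last nStart points
    while idx(1) > 1
        cand = [idx(1) - 1, idx];      % try adding one more point on the left
        p = polyfit(x(cand), y(cand), 1);
        res = y(cand) - polyval(p, x(cand));
        if max(abs(res)) > tol         % new point breaks linearity: stop
            break
        end
        idx = cand;
    end
    p = polyfit(x(idx), y(idx), 1);    % final fit on the accepted points
end
```

Because of the noise, comparing the RMS residual (rather than the maximum) against a tolerance estimated from the last few points might make the stopping criterion less fragile, but that is exactly the kind of tweak I'm unsure about.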
Do you have any ideas, suggestions, or links?
Thanks!
mwoua