How to use least squares with a weight matrix in Python?

I know how to solve AX = B by least squares using Python:

Example:

    import numpy

    A = [[1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 0, 0]]
    B = [1, 1, 1, 1, 1]
    X = numpy.linalg.lstsq(A, B, rcond=None)
    print(X[0])
    # [ 5.00000000e-01  5.00000000e-01 -1.66533454e-16 -1.11022302e-16]

But what about solving the same system by weighted least squares, with a weight matrix W that is not the identity:

    AX = B  (weights W)

Example:

    A = [[1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 0, 0]]
    B = [1, 1, 1, 1, 1]
    W = [1, 2, 3, 4, 5]

Thanks in advance,

+6
3 answers

I found a different approach (using W as a diagonal matrix and matrix products):

    import numpy as np

    A = [[1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 0, 0]]
    B = [1, 1, 1, 1, 1]
    W = [1, 2, 3, 4, 5]
    W = np.sqrt(np.diag(W))            # W^(1/2) as a diagonal matrix
    Aw = np.dot(W, A)
    Bw = np.dot(B, W)                  # W is diagonal, so this equals np.dot(W, B)
    X = np.linalg.lstsq(Aw, Bw, rcond=None)

It gives the same values and the same results as the row-scaling approach in the other answer.
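
A quick sanity check (my addition, a minimal sketch using the arrays above, not part of the original answer): the diagonal-matrix form and the row-scaling form build the same weighted system, so lstsq returns the same solution for both:

    import numpy as np

    A = np.array([[1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1],
                  [1, 1, 1, 1], [1, 1, 0, 0]], dtype=float)
    B = np.array([1, 1, 1, 1, 1], dtype=float)
    w = np.array([1, 2, 3, 4, 5], dtype=float)

    # diagonal-matrix form: W^(1/2) A, W^(1/2) B
    Ws = np.sqrt(np.diag(w))
    X1 = np.linalg.lstsq(Ws @ A, Ws @ B, rcond=None)[0]

    # row-scaling form: multiply each row of A and entry of B by sqrt(w_i)
    X2 = np.linalg.lstsq(A * np.sqrt(w)[:, None], B * np.sqrt(w), rcond=None)[0]

    print(np.allclose(X1, X2))  # True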

+5

I don't know how you determined your weights, but you could try this if appropriate:

    import numpy as np

    A = np.array([[1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1],
                  [1, 1, 1, 1], [1, 1, 0, 0]])
    B = np.array([1, 1, 1, 1, 1])
    W = np.array([1, 2, 3, 4, 5])
    Aw = A * np.sqrt(W[:, np.newaxis])   # scale each row of A by sqrt(w_i)
    Bw = B * np.sqrt(W)                  # scale each entry of B by sqrt(w_i)
    X = np.linalg.lstsq(Aw, Bw, rcond=None)
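
Note that with the example data from the question the weights make no difference: rows 1-4 of A are identical and share the same B entry, so the system can be satisfied exactly and there is no residual for the weights to trade off. A perturbed B (a hypothetical variant, my addition) shows the weights taking effect:

    import numpy as np

    A = np.array([[1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1],
                  [1, 1, 1, 1], [1, 1, 0, 0]], dtype=float)
    sw = np.sqrt(np.array([1, 2, 3, 4, 5], dtype=float))  # sqrt of the weights

    def wls(B):
        # weighted least squares via row scaling (minimum-norm solution)
        return np.linalg.lstsq(A * sw[:, None],
                               np.asarray(B, dtype=float) * sw,
                               rcond=None)[0]

    print(wls([1, 1, 1, 1, 1]))  # [0.5 0.5 0.  0. ]  same as the unweighted fit
    print(wls([1, 1, 1, 2, 1]))  # [0.5 0.5 0.2 0.2]  weights now pull the solution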
+8

The scikit-learn package offers weighted regression directly, via the sample_weight argument of LinearRegression.fit: https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LinearRegression.html#sklearn.linear_model.LinearRegression.fit

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from matplotlib import pyplot as plt

    # generate random data
    N = 25
    xp = [-5.0, 5.0]
    x = np.random.uniform(xp[0], xp[1], (N, 1))
    e = 2 * np.random.randn(N, 1)
    y = 2 * x + e
    w = np.ones(N)

    # make the 3rd point an outlier and give it zero weight
    y[2] += 30.0
    w[2] = 0.0

    # fit WLS using sample_weight
    WLS = LinearRegression()
    WLS.fit(x, y, sample_weight=w)

    plt.plot(x, y, '.')
    plt.plot(xp, xp * WLS.coef_[0])
    plt.show()

(figure: weighted regression line, unaffected by the zero-weighted outlier)
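
For completeness (my own check, not part of this answer): the sample_weight fit agrees with the manual sqrt-weight row scaling from the earlier answers, once an intercept column is added:

    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    x = rng.uniform(-5.0, 5.0, (25, 1))
    y = 2 * x + 2 * rng.standard_normal((25, 1))
    w = np.ones(25)
    y[2] += 30.0   # outlier
    w[2] = 0.0     # zero weight drops it from the fit

    # scikit-learn weighted fit
    model = LinearRegression().fit(x, y, sample_weight=w)

    # manual WLS: append an intercept column, scale rows by sqrt(w_i)
    A = np.hstack([x, np.ones_like(x)])
    sw = np.sqrt(w)[:, None]
    slope, intercept = np.linalg.lstsq(A * sw, y * sw, rcond=None)[0].ravel()

    print(np.allclose([slope, intercept],
                      [model.coef_[0, 0], model.intercept_[0]]))  # True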

0
