The problem is this:
features.transpose().dot(features) may not be invertible (it can be singular), and numpy.linalg.inv only works for non-singular (full-rank) matrices, per the docs. However, a non-zero regularization term always makes the matrix non-singular.
By the way, your implementation is correct, but it is not efficient. A more efficient way to solve this equation is the least squares method.
np.linalg.lstsq(features, labels) can do the job of np.linalg.pinv(features).dot(labels).
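As a quick sanity check, here is a minimal sketch with made-up data (the array shapes, the coefficients, and the rng seed are arbitrary) showing that the two calls recover the same solution:

import numpy as np

# made-up example data: 100 samples, 3 features
rng = np.random.default_rng(0)
features = rng.normal(size=(100, 3))
labels = features @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

# pseudo-inverse solution
w_pinv = np.linalg.pinv(features).dot(labels)

# least-squares solution; lstsq returns (solution, residuals, rank, singular values)
w_lstsq = np.linalg.lstsq(features, labels, rcond=None)[0]

print(np.allclose(w_pinv, w_lstsq))  # expected: True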
In general, you can do this:

import numpy as np

def get_model(A, y, lamb=0):
    # solve the regularized normal equations (A^T A + lamb * I) w = A^T y
    n_col = A.shape[1]
    return np.linalg.lstsq(A.T.dot(A) + lamb * np.identity(n_col), A.T.dot(y))
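For illustration, a call might look like this (it reuses the features and labels arrays from the snippet above; note that np.linalg.lstsq returns a tuple, so the fitted coefficients are the first element):

# fit with a small ridge penalty and take the coefficients
result = get_model(features, labels, lamb=0.1)
coefficients = result[0]  # (solution, residuals, rank, singular values)[0]
print(coefficients)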