In [`ridge_regression.py`](https://github.com/ctgk/PRML/blob/master/prml/linear/ridge_regression.py#L9), the objective is written as `w* = argmin |t - X @ w| + alpha * |w|_2^2`. But according to https://en.wikipedia.org/wiki/Tikhonov_regularization, the data-fidelity term should also be squared, i.e. `w* = argmin |t - X @ w|^2 + alpha * |w|_2^2`. Did I miss something?
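
For reference, here is a minimal sketch (my own, not the repo's code) of the usual closed-form ridge solution. The function name `ridge_fit` and the `alpha` default are just for illustration; the point is that the normal-equations solution below minimizes the *squared* data term `||t - X @ w||^2 + alpha * ||w||_2^2`, which suggests the docstring is only missing the square on the first term.

```python
import numpy as np

def ridge_fit(X, t, alpha=1.0):
    """Closed-form ridge regression: w = (alpha * I + X^T X)^{-1} X^T t.

    This minimizes ||t - X @ w||^2 + alpha * ||w||_2^2 (squared data term).
    """
    n_features = X.shape[1]
    return np.linalg.solve(alpha * np.eye(n_features) + X.T @ X, X.T @ t)

# Tiny usage example with synthetic data
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
t = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=20)
print(ridge_fit(X, t, alpha=0.1))
```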