Linear regression model: $$ h_\theta(x)=\theta^Tx $$ (by convention $x_0=1$, so $\theta_0$ is the intercept term).
Cost function: $$ J(\theta)=\frac{1}{2m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)})-y^{(i)}\right)^2 $$
Three equivalent implementations (each later method overwrites `J`, so any one alone suffices):

```matlab
function J = computeCost(X, y, theta)
  m = length(y);   % number of training examples

  % Method 1: explicit loop (J must start at 0)
  J = 0;
  for i = 1:m
    J = J + (X(i,:) * theta - y(i))^2 / (2*m);
  end

  % Method 2: vectorized, element-wise square then sum
  J = sum((X * theta - y).^2) / (2*m);

  % Method 3: fully vectorized; works for both univariate and multivariate regression
  J = ((X * theta - y)' * (X * theta - y)) / (2*m);
end
```
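A minimal usage sketch; the numbers are made up for illustration, and `X` already includes the bias column $x_0=1$:

```matlab
X = [1 1; 1 2; 1 3];    % hypothetical design matrix: bias column plus one feature
y = [2; 4; 6];          % hypothetical targets (exactly y = 2*x)
theta = [0; 2];         % candidate parameters

J = computeCost(X, y, theta)   % -> 0, since the fit is exact
```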
Gradient descent update rule (applied simultaneously for every $j$): $$ \theta_j:=\theta_j-\alpha\frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)})-y^{(i)}\right)x_j^{(i)} $$
```matlab
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
  m = length(y);                     % number of training examples
  J_history = zeros(num_iters, 1);   % cost at each iteration, for convergence checks

  for iter = 1:num_iters
    % Vectorized form of the update rule above: all theta_j updated simultaneously
    theta = theta - alpha / m * X' * (X * theta - y);
    J_history(iter) = computeCost(X, y, theta);
  end
end
```
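A usage sketch reusing the toy `X` and `y` above; `alpha` and `num_iters` are illustrative values that need tuning per dataset:

```matlab
alpha = 0.01;                    % learning rate (assumed value)
num_iters = 1500;                % iteration budget (assumed value)
theta0 = zeros(size(X, 2), 1);   % start from the zero vector

[theta, J_history] = gradientDescent(X, y, theta0, alpha, num_iters);

% If alpha is small enough, J_history decreases monotonically;
% plotting it is the usual convergence check.
plot(1:num_iters, J_history);
xlabel('Iteration'); ylabel('J(\theta)');
```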
Normal equation: the closed-form solution for theta: $$ \theta=(X^TX)^{-1}X^Ty $$
```matlab
function [theta] = normalEqn(X, y)
  % pinv (pseudo-inverse) still returns a sensible theta when X'X is singular,
  % e.g. with redundant features or more features than examples
  theta = pinv(X' * X) * X' * y;
end
```
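Unlike gradient descent, the normal equation needs no learning rate, no iteration, and no feature scaling. A quick sketch, reusing the `X` and `y` above:

```matlab
theta_ne = normalEqn(X, y);   % closed-form fit in one line
% Should agree (up to convergence error) with the gradient-descent theta.
```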
Feature normalization (mean normalization): $$ x_i:=\frac{x_i-\mu_i}{s_i} $$ where $\mu_i$ is the mean of feature $i$ and $s_i$ is its standard deviation (or range).
```matlab
function [X_norm, mu, sigma] = featureNormalize(X)
  X_norm = X;
  mu = mean(X);     % 1 x n row vector: per-feature means
  sigma = std(X);   % 1 x n row vector: per-feature standard deviations

  % Scale each feature column to roughly zero mean and unit variance
  for i = 1:size(X, 2)
    X_norm(:, i) = (X(:, i) - mu(i)) / sigma(i);
  end
end
```
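An end-to-end sketch combining the pieces above. The file name and the new example are hypothetical; the key point is that new inputs must be scaled with the same `mu` and `sigma` learned from the training set:

```matlab
data = load('ex1data2.txt');   % hypothetical data file: feature columns, then y
X = data(:, 1:end-1);
y = data(:, end);

[X_norm, mu, sigma] = featureNormalize(X);
X_norm = [ones(size(X_norm, 1), 1), X_norm];   % prepend bias column AFTER normalizing

theta = gradientDescent(X_norm, y, zeros(size(X_norm, 2), 1), 0.01, 400);

% Normalize a new example with the stored mu/sigma before predicting
x_new = [1650 3];                     % hypothetical raw features
x_new = [1, (x_new - mu) ./ sigma];
prediction = x_new * theta;
```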