Machine Learning (Andrew Ng) Week 2 Coding Assignment (Required Exercises)

乐百川 2022-05-27

1. warmUpExercise:

function A = warmUpExercise()
%WARMUPEXERCISE Example function in octave
% A = WARMUPEXERCISE() is an example function that returns the 5x5 identity matrix

A = [];
% ============= YOUR CODE HERE ==============
% Instructions: Return the 5x5 identity matrix
% In octave, we return values by defining which variables
% represent the return values (at the top of the file)
% and then set them accordingly.

A=eye(5);





% ===========================================


end
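
A quick sanity check at the Octave prompt (a minimal sketch; it only assumes warmUpExercise.m is on the current path):

% Verify that the function really returns the 5x5 identity matrix
A = warmUpExercise();
assert(isequal(A, eye(5)));   % passes silently when A is correct
disp(A);                      % prints the identity matrix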

 

2. Computing Cost:

function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
% J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
% parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
% You should set J to the cost.

pred = X*theta;              % hypothesis h_theta(x) for every training example (m x 1)
errors = (pred - y).^2;      % squared error of each prediction
% You need to return the following variables correctly
J = 1/(2*m)*sum(errors);     % average the squared errors (with the conventional 1/2 factor)




% =========================================================================

end
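
This implements the squared-error cost J(theta) = 1/(2m) * sum((X*theta - y).^2). A quick toy check (a minimal sketch with made-up numbers, not the ex1 data set):

% m = 3 examples; the first column of X is the intercept term
X = [1 1; 1 2; 1 3];
y = [1; 2; 3];
computeCost(X, y, [0; 1])   % perfect fit: predictions equal y, so J = 0
computeCost(X, y, [0; 0])   % predictions are all 0, so J = (1+4+9)/6 ≈ 2.3333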

3. Gradient Descent:

The gradient step is written in matrix (vectorized) form, updating all parameters at once: theta := theta - (alpha/m) * X' * (X*theta - y);

function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
% theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
% taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

% ====================== YOUR CODE HERE ======================
% Instructions: Perform a single gradient step on the parameter vector
% theta.
%
% Hint: While debugging, it can be useful to print out the values
% of the cost function (computeCost) and gradient here.
%

tmp = X'*(X*theta-y);        % gradient of the cost (up to the 1/m factor)
theta = theta - alpha/m*tmp; % simultaneous update of every component of theta





% ============================================================

% Save the cost J in every iteration
J_history(iter) = computeCost(X, y, theta);

end

end
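
An end-to-end toy run (a minimal sketch: the data, alpha and num_iters below are made up for illustration, not the values used by the assignment's ex1.m script):

% Data lying exactly on y = 2*x, so gradient descent should drive theta towards [0; 2]
X = [1 1; 1 2; 1 3];
y = [2; 4; 6];
theta = zeros(2, 1);
alpha = 0.1;
num_iters = 1500;
[theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters);
theta              % close to [0; 2]
J_history(end)     % close to 0
% plot(1:num_iters, J_history) should show a monotonically decreasing cost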

 

EPFL - Fighting
