Andrew Ng, Coursera Machine Learning Specialization, Advanced Learning Algorithms, Week 1: all Jupyter notebook (Python) programming exercise files.
This assignment
Exercise 1
# Imports used throughout these exercises
import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# UNQ_C1
# GRADED CELL: Sequential model
model = Sequential(
    [
        tf.keras.Input(shape=(400,)),    # specify input size
        ### START CODE HERE ###
        Dense(25, activation='sigmoid', name='layer1'),
        Dense(15, activation='sigmoid', name='layer2'),
        Dense(1, activation='sigmoid', name='layer3')
        ### END CODE HERE ###
    ], name="my_model"
)
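As a quick sanity check (a minimal sketch, not part of the graded cell, and it assumes the cell above has already run), the three layers should hold 400×25+25 = 10,025, 25×15+15 = 390, and 15×1+1 = 16 parameters; you can confirm the shapes directly:

model.summary()

[layer1, layer2, layer3] = model.layers          # the Input spec is not listed as a layer
W1, b1 = layer1.get_weights()
W2, b2 = layer2.get_weights()
W3, b3 = layer3.get_weights()
print(f"W1 shape = {W1.shape}, b1 shape = {b1.shape}")   # (400, 25), (25,)
print(f"W2 shape = {W2.shape}, b2 shape = {b2.shape}")   # (25, 15), (15,)
print(f"W3 shape = {W3.shape}, b3 shape = {b3.shape}")   # (15, 1),  (1,)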
Exercise 2
# UNQ_C2
# GRADED FUNCTION: my_dense
def my_dense(a_in, W, b, g):
    """
    Computes dense layer
    Args:
      a_in (ndarray (n, )) : Data, 1 example
      W    (ndarray (n,j)) : Weight matrix, n features per unit, j units
      b    (ndarray (j, )) : bias vector, j units
      g                    : activation function (e.g. sigmoid, relu, ...)
    Returns
      a_out (ndarray (j,)) : j units
    """
    units = W.shape[1]
    a_out = np.zeros(units)
    ### START CODE HERE ###
    for j in range(units):               # loop over the j units (columns of W)
        w = W[:, j]                      # weights of unit j
        z = np.dot(w, a_in) + b[j]       # linear part: z_j = w_j . a_in + b_j
        a_out[j] = g(z)                  # apply the activation
    ### END CODE HERE ###
    return a_out
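A quick way to exercise my_dense (a minimal sketch; the tiny weights, biases, and the sigmoid helper below are made-up test values, not part of the graded code):

import numpy as np

def sigmoid(z):
    # element-wise sigmoid for this small test
    return 1 / (1 + np.exp(-z))

# one example with 2 features, a layer with 3 units
x_tst = np.array([1.0, 2.0])                 # (2,)
W_tst = np.array([[1.0, -3.0,  5.0],
                  [2.0,  4.0, -6.0]])        # (2, 3)
b_tst = np.array([-1.0, 1.0, 2.0])           # (3,)

a_tst = my_dense(x_tst, W_tst, b_tst, sigmoid)
print(a_tst)        # 3 activations, one per unit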
Exercise 3
# UNQ_C3
# GRADED FUNCTION: my_dense_v
def my_dense_v(A_in, W, b, g):
    """
    Computes dense layer (vectorized over all m examples)
    Args:
      A_in (ndarray (m,n)) : Data, m examples, n features each
      W    (ndarray (n,j)) : Weight matrix, n features per unit, j units
      b    (ndarray (1,j)) : bias vector, j units
      g                    : activation function (e.g. sigmoid, relu, ...)
    Returns
      A_out (ndarray (m,j)) : m examples, j units
    """
    ### START CODE HERE ###
    z = np.matmul(A_in, W) + b   # (m,n) @ (n,j) -> (m,j); b broadcasts across the m rows
    A_out = g(z)                 # apply the activation element-wise
    ### END CODE HERE ###
    return A_out
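To convince yourself that the vectorized my_dense_v agrees with the per-example my_dense, you can compare the two on a small random batch (a minimal sketch; the shapes, the random seed, and the sigmoid helper are illustrative assumptions, not part of the graded code):

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

rng = np.random.default_rng(0)
A_in = rng.standard_normal((4, 5))          # m=4 examples, n=5 features
W    = rng.standard_normal((5, 3))          # j=3 units
b    = rng.standard_normal((1, 3))          # bias as a (1,j) row vector

A_vec  = my_dense_v(A_in, W, b, sigmoid)                            # vectorized: (4, 3)
A_loop = np.array([my_dense(x, W, b[0], sigmoid) for x in A_in])    # one example at a time

print(np.allclose(A_vec, A_loop))   # True: both compute the same activations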
Author: 楚千羽
The copyright of this article belongs to the author. Reposting is welcome, but unless the author agrees otherwise, a link to the original article must be provided on the reposted page; the author reserves the right to pursue legal liability otherwise.