Tensorflow (19): Introduction to Gradient Descent

1. Auto Grad

# ****************** Auto Grad
import tensorflow as tf

# general pattern: run the forward pass inside the tape context,
# then ask the tape for the gradient of the loss w.r.t. the parameters
with tf.GradientTape() as tape:
    # build the computation graph here
    loss = f(x)    # f stands for any differentiable function
[w_grad] = tape.gradient(loss,[w])
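A minimal runnable sketch of this pattern (reusing the import above; the quadratic loss and the target value 3.0 are my own illustration, not from the course):

w = tf.Variable(1.0)    # a tf.Variable is watched by the tape automatically
x = tf.constant(2.0)

with tf.GradientTape() as tape:
    loss = (x*w - 3.0)**2    # toy loss: f(w) = (x*w - 3)^2

[w_grad] = tape.gradient(loss,[w])    # d(loss)/dw = 2*(x*w - 3)*x = -4.0
print(w_grad.numpy())                 # -4.0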

2. First-Order Derivatives

# ****************** first-order derivatives
w = tf.constant(1.)
x = tf.constant(2.)

y = x*w    # computed outside any tape, so no gradient information is recorded

with tf.GradientTape() as tape:
    tape.watch([w])    # w is a plain tensor, so it must be watched explicitly
    y1 = x*w
try:
    # y was built outside the tape, so the tape has no ops linking y to w;
    # in recent TF2 this typically returns None rather than raising
    [w_grad] = tape.gradient(y,[w])
    print("w_grad:",w_grad)
except Exception as error:
    print("error:",error)

with tf.GradientTape() as tape:
    tape.watch([w])
    y1 = x*w
[w_grad1] = tape.gradient(y1,[w])    # y1 was recorded by the tape, so this works
print("w_grad1:",w_grad1.numpy())    # dy1/dw = x = 2.0

w = tf.Variable(w)    # a tf.Variable is watched by the tape automatically
with tf.GradientTape() as tape:
    y1 = x*w
[w_grad2] = tape.gradient(y1,[w])    # no tape.watch needed this time
print("w_grad2:",w_grad2.numpy())    # 2.0

3. GradientTape(persistent=True)

# ********************** GradientTape(persistent=True)
"""
After one .gradient() call, the tape releases the information it holds,
so gradients can normally be computed only once per tape.
To compute gradients several times, set persistent=True,
and delete the tape once you are done to free its resources.
"""
x = tf.Variable(tf.constant(2.))
w = tf.Variable(tf.constant(5.))

with tf.GradientTape(persistent=True) as tape:
    y = x*w

[grad1] = tape.gradient(y,[x])    # dy/dx = w = 5.0
print("grad1:",grad1.numpy())

[grad2] = tape.gradient(y,[w])    # dy/dw = x = 2.0; allowed because persistent=True
print("grad2:",grad2.numpy())

del tape    # release the resources held by the persistent tape

4. Computing Second-Order Derivatives

# ******************** computing second-order derivatives
x = tf.Variable(tf.constant(5.))
y = tf.Variable(tf.constant(2.))

with tf.GradientTape() as tape1:
    with tf.GradientTape() as tape2:
        z = x**2 + y**2 + x*y
    # the inner tape gives the first derivative; computing it inside
    # tape1's context lets the outer tape record it as well
    [dz_dy] = tape2.gradient(z,[y])    # dz/dy = 2y + x
    print("dz_dy:",dz_dy.numpy())      # 9.0
[d2z_dy2] = tape1.gradient(dz_dy,[y])  # d2z/dy2 = 2
print("d2z_dy2:",d2z_dy2.numpy())      # 2.0

This post is a set of study notes written while following teacher 龙龙's course "深度学习与TensorFlow 2入门实战" (Deep Learning and TensorFlow 2: A Hands-On Introduction).

by CyrusMay 2022 04 17
