3 - Linear Regression: Computing Gradients with backward() and grad

墨香子儿 · 2022-05-03
'''
The linear model is y = w*x + b
'''

import torch
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
torch.__version__
data = pd.read_csv('Income1.csv')
print(data.Education.to_numpy())

X = torch.from_numpy(np.array(data.Education)).type(torch.FloatTensor)   # independent variable, used to compute the prediction y_pred
Y = torch.from_numpy(np.array(data.Income)).type(torch.FloatTensor)      # ground-truth y, used to compute the loss against y_pred


# Initialize the parameters
w = torch.randn(1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
learning_rate = 0.0001

for epoch in range(500):   # number of training epochs
    for x, y in zip(X, Y):
        y_pred = x*w + b
        # Loss: squared error between the true value and the prediction
        loss = (y - y_pred)**2
        # Zero the gradients first, because backward() accumulates them
        if w.grad is not None:
            w.grad.data.zero_()
        if b.grad is not None:
            b.grad.data.zero_()
        # Compute the gradients
        loss.backward()
        # Gradient descent: update the parameters
        w.data = w.data - w.grad.data * learning_rate
        b.data = b.data - b.grad.data * learning_rate

plt.scatter(data.Education, data.Income)
plt.plot(np.array(X), np.array(X*w.data + b.data))
plt.show()
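As a sanity check on what loss.backward() is doing in the loop above, here is a minimal sketch comparing autograd's result against the hand-derived derivatives of the squared-error loss. The single data point (x=15, y=60) and the parameter values are hypothetical, chosen only for easy arithmetic; for loss = (y - (w*x + b))**2 the analytic gradients are dloss/dw = -2*x*(y - w*x - b) and dloss/db = -2*(y - w*x - b).

```python
import torch

# Hypothetical single data point and parameters (not from Income1.csv)
x = torch.tensor(15.0)
y = torch.tensor(60.0)
w = torch.tensor(2.0, requires_grad=True)
b = torch.tensor(1.0, requires_grad=True)

# Forward pass and backward pass
loss = (y - (w * x + b)) ** 2
loss.backward()

# Residual y - (w*x + b) = 60 - 31 = 29, so the analytic gradients are:
#   dloss/dw = -2 * 15 * 29 = -870
#   dloss/db = -2 * 29      = -58
print(w.grad.item())  # -870.0
print(b.grad.item())  # -58.0
```

This also shows why the training loop zeroes w.grad and b.grad before each backward() call: a second backward pass would add a new gradient onto these values rather than replace them.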

(figure: scatter plot of Education vs. Income with the fitted regression line)

Data source: reference code and datasets for the basics section, Chapter 2, Income1.CSV; access code 1234: https://pan.baidu.com/s/1xzxiQyHwNKiDFxFJi-jD9w
