Neural Networks -- the Linear Layer

Linear Layer

Linear layer: every neuron is connected to all neurons in the previous layer (a fully connected layer).
 torch.nn.Linear(in_features, out_features, bias=True, device=None, dtype=None)
 in_features: size of each input sample (number of input features)
 out_features: size of each output sample (number of output features)
 bias: if False, the layer will not learn an additive bias; if True, a learnable bias is added. Default: True
 Two ways to reshape the input, i.e. to flatten the input tensor into the shape the linear layer expects (see the short sketch after this list):
 torch.reshape()
 torch.flatten()
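A minimal sketch (not from the original post) of what in_features, out_features, and bias mean, and how torch.reshape and torch.flatten differ in the shapes they produce; the tensor sizes here are made up purely for illustration:

import torch
from torch import nn

# 8 input features -> 3 output features, with a learnable bias (bias=True is the default)
linear = nn.Linear(in_features=8, out_features=3, bias=True)

x2 = torch.randn(5, 8)                         # 5 samples, 8 features each
print(linear(x2).shape)                        # torch.Size([5, 3]); Linear acts on the last dimension

x = torch.randn(4, 2, 4)                       # arbitrary tensor with 4 * 2 * 4 = 32 elements
print(torch.reshape(x, (1, 1, 1, -1)).shape)   # torch.Size([1, 1, 1, 32]) -- keeps 4 dimensions
print(torch.flatten(x).shape)                  # torch.Size([32]) -- collapses to 1-D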
 Full code example:

import torch
import torchvision
from torch import nn
from torch.nn import Linear
from torch.utils.data import DataLoader

dataset = torchvision.datasets.CIFAR10("../data", train=False, transform=torchvision.transforms.ToTensor(), download=True)
# drop_last=True: the final incomplete batch (16 images) would flatten to fewer than
# 196608 elements and would not match the linear layer's in_features
dataloader = DataLoader(dataset, batch_size=64, drop_last=True)

class Test(nn.Module):
    def __init__(self):
        super(Test, self).__init__()
        # 196608 input features (64 * 3 * 32 * 32), 10 output features
        self.linear1 = Linear(196608, 10)

    def forward(self, input):
        output = self.linear1(input)
        return output

test1 = Test()

for data in dataloader:
    imgs, targets = data
    print(imgs.shape)                              # torch.Size([64, 3, 32, 32])

    # Two ways to flatten the whole batch into one long vector
    output1 = torch.reshape(imgs, (1, 1, 1, -1))   # keeps 4 dimensions: [1, 1, 1, 196608]
    output2 = torch.flatten(imgs)                  # collapses to 1-D: [196608]
    print(output1.shape)
    print(output2.shape)

    output = test1(output1)
    print(output.shape)                            # torch.Size([1, 1, 1, 10])

Output:

torch.Size([64, 3, 32, 32])
torch.Size([1, 1, 1, 196608])
torch.Size([196608])
torch.Size([1, 1, 1, 10])
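A sketch of a common variant (not from the original post, and not what the code above does): keep the batch dimension and flatten only the per-image dimensions, so every image becomes its own sample and the linear layer produces one 10-dimensional output per image. It assumes the dataloader defined above is still in scope; linear2 is a new, illustrative layer.

import torch
from torch import nn

# Each image has 3 * 32 * 32 = 3072 features when the batch dimension is kept
linear2 = nn.Linear(3072, 10)

for data in dataloader:
    imgs, targets = data                        # imgs: [64, 3, 32, 32]
    flat = torch.flatten(imgs, start_dim=1)     # [64, 3072] -- batch dimension preserved
    print(linear2(flat).shape)                  # torch.Size([64, 10]): one 10-dim output per image
    break                                       # one batch is enough for the shape check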