
When using the torch.nn.functional module, the package needs to be imported first:

from torch.nn import functional  # also commonly imported as: import torch.nn.functional as F

The common activation functions are introduced below, each with a corresponding code example:

tanh (hyperbolic tangent)

import torch
x = torch.tensor([-2.0, -1.0, 0.0, 1.0, 2.0])
y = torch.nn.functional.tanh(x)
print(y)  # Output: tensor([-0.9640, -0.7616,  0.0000,  0.7616,  0.9640])
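
As a quick sanity check (a minimal sketch, not part of the original snippet), tanh can be reproduced directly from its definition using only standard torch ops:

# tanh(x) = (e^x - e^-x) / (e^x + e^-x)
manual = (torch.exp(x) - torch.exp(-x)) / (torch.exp(x) + torch.exp(-x))
print(torch.allclose(manual, torch.nn.functional.tanh(x)))  # True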

sigmoid (S-shaped / logistic function)

y = torch.nn.functional.sigmoid(x)
print(y)  # Output: tensor([0.1192, 0.2689, 0.5000, 0.7311, 0.8808])
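
Sigmoid is simply 1 / (1 + e^(-x)); a hand-rolled version (sketch only) matches the library call:

manual = 1.0 / (1.0 + torch.exp(-x))
print(torch.allclose(manual, torch.nn.functional.sigmoid(x)))  # True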

SiLU (Sigmoid Linear Unit, also known as Swish)

y = torch.nn.functional.silu(x)
print(y)  # Output: tensor([-0.2384, -0.2689,  0.0000,  0.7311,  1.7616])
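
SiLU multiplies the input by its own sigmoid, so it is easy to verify element-wise (sketch only, `manual` is just an illustrative name):

manual = x * torch.sigmoid(x)  # SiLU(x) = x * sigmoid(x)
print(torch.allclose(manual, torch.nn.functional.silu(x)))  # True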

GELU (Gaussian Error Linear Unit)

y = torch.nn.functional.gelu(x)
print(y)  # Output: tensor([-0.0454, -0.1588,  0.0000,  0.8413,  1.9546])
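
GELU weights the input by the standard normal CDF. As a minimal sketch (assuming the default, non-approximate variant of gelu), it can be reproduced with torch.erf:

import math
manual = 0.5 * x * (1.0 + torch.erf(x / math.sqrt(2.0)))  # GELU(x) = x * Phi(x)
print(torch.allclose(manual, torch.nn.functional.gelu(x)))  # True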

ReLU (Rectified Linear Unit)

y = torch.nn.functional.relu(x)
print(y)  # Output: tensor([0., 0., 0., 1., 2.])
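
ReLU clips negative values to zero; as a sketch, the same result can be obtained with torch.clamp:

manual = torch.clamp(x, min=0.0)  # ReLU(x) = max(0, x)
print(torch.equal(manual, torch.nn.functional.relu(x)))  # True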

ReLU_ (In-place ReLU)

x.relu_()  # Note: this modifies x itself in place
print(x)  # x is now: tensor([0., 0., 0., 1., 2.])
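
The same in-place behaviour is also available through the functional API's inplace flag. The sketch below (x2 and y2 are just illustrative names) additionally checks that the returned tensor shares the input's storage:

x2 = torch.tensor([-2.0, -1.0, 0.0, 1.0, 2.0])
y2 = torch.nn.functional.relu(x2, inplace=True)
print(x2)  # tensor([0., 0., 0., 1., 2.]) -- x2 itself was modified
print(y2.data_ptr() == x2.data_ptr())  # True -- y2 shares x2's storage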

Leaky ReLU

x = torch.tensor([-2.0, -1.0, 0.0, 1.0, 2.0])
y = torch.nn.functional.leaky_relu(x, negative_slope=0.01)
print(y)  # Output: tensor([-0.0200, -0.0100,  0.0000,  1.0000,  2.0000])
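
Leaky ReLU keeps positive values unchanged and multiplies negative values by negative_slope; a quick sketch with torch.where confirms this:

manual = torch.where(x > 0, x, 0.01 * x)
print(torch.allclose(manual, torch.nn.functional.leaky_relu(x, negative_slope=0.01)))  # True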

Leaky ReLU_ (In-place Leaky ReLU)

x.leaky_relu_(negative_slope=0.01)
print(x)  # x itself is modified: tensor([-0.0200, -0.0100,  0.0000,  1.0000,  2.0000])

Softmax

x = torch.tensor([1.0, 2.0, 3.0])
y = torch.nn.functional.softmax(x, dim=0)
print(y)  # Output: tensor([0.0900, 0.2447, 0.6652])
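
Softmax exponentiates the inputs and renormalizes them along dim, so the result is a probability distribution. A small hand-rolled check (sketch only; it skips the usual max-subtraction trick, so it is numerically naive):

manual = torch.exp(x) / torch.exp(x).sum(dim=0)
print(torch.allclose(manual, torch.nn.functional.softmax(x, dim=0)))  # True
print(torch.nn.functional.softmax(x, dim=0).sum())  # sums to 1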

Threshold

x = torch.tensor([-1.0, 0.0, 1.0, 2.0])
y = torch.nn.functional.threshold(x, threshold=0.5, value=0.0)
print(y)  # Output: tensor([0., 0., 1., 2.]) -- elements <= 0.5 are replaced by 0.0
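
threshold keeps elements strictly greater than threshold and replaces the rest with value; as a sketch, the same result can be obtained with torch.where:

manual = torch.where(x > 0.5, x, torch.zeros_like(x))
print(torch.equal(manual, torch.nn.functional.threshold(x, threshold=0.5, value=0.0)))  # True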

Normalize

x = torch.tensor([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
y = torch.nn.functional.normalize(x, p=2, dim=1)
print(y)  # Output: tensor([[0.2673, 0.5345, 0.8018], [0.4558, 0.5698, 0.6838]]) -- each row scaled to unit L2 norm
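
normalize divides each vector along dim by its Lp norm (internally clamped by a small eps to avoid division by zero); a minimal equivalent, assuming the rows are non-zero:

manual = x / x.norm(p=2, dim=1, keepdim=True)
print(torch.allclose(manual, y))  # True
print(y.norm(p=2, dim=1))  # each row now has (approximately) unit L2 norm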