[PyTorch] The difference between nn.ReLU() and F.relu()


Problem

When implementing CNN convolution layers in PyTorch, you will often come across code like the following:

from typing import Any

import torch.nn.functional as F
from torch import nn, Tensor


class BasicConv2d(nn.Module):
    def __init__(self, in_channels: int, out_channels: int, **kwargs: Any) -> None:
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, bias=False, **kwargs)
        self.bn = nn.BatchNorm2d(out_channels, eps=0.001)

    def forward(self, x: Tensor) -> Tensor:
        x = self.conv(x)
        x = self.bn(x)
        # Activation applied via the functional API, not a registered module
        return F.relu(x, inplace=True)
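Here the ReLU is applied with the functional call F.relu() inside forward. For comparison, below is a minimal sketch of the same block written with nn.ReLU registered as a submodule in __init__ (the class name BasicConv2dModule is hypothetical, used only for this example). ReLU has no learnable parameters, so the two variants compute the same result; the module form simply makes the activation visible in the module tree.

import torch
from torch import nn, Tensor


class BasicConv2dModule(nn.Module):
    # Same layer as above, but the activation is declared as an nn.ReLU module.
    def __init__(self, in_channels: int, out_channels: int, **kwargs) -> None:
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, bias=False, **kwargs)
        self.bn = nn.BatchNorm2d(out_channels, eps=0.001)
        # nn.ReLU is a stateless module; registering it here means it shows up
        # in print(model) and can be placed inside nn.Sequential, unlike F.relu.
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: Tensor) -> Tensor:
        return self.relu(self.bn(self.conv(x)))


# Quick check: the output shape matches the functional version above.
x = torch.randn(1, 3, 32, 32)
layer = BasicConv2dModule(3, 16, kernel_size=3)
print(layer(x).shape)  # torch.Size([1, 16, 30, 30])

Which style to use is largely a design choice: registering nn.ReLU keeps the whole computation visible as modules, while F.relu keeps __init__ shorter and avoids storing an extra attribute for a parameter-free operation.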

Method

Conclusion

