Deep Recurrent Neural Networks

半秋L 2022-02-19


import torch
from torch import nn
from d2l import torch as d2l

batch_size, num_steps = 32, 35
# Mini-batches of character indices from "The Time Machine"
train_iter, vocab = d2l.load_data_time_machine(batch_size, num_steps)

# Deep RNN: 2 stacked LSTM layers with 256 hidden units each;
# RNNModel feeds one-hot inputs, so num_inputs equals the vocabulary size
vocab_size, num_hiddens, num_layers = len(vocab), 256, 2
num_inputs = vocab_size
device = torch.device('cuda')  # training below assumes a CUDA GPU is available
lstm_layer = nn.LSTM(num_inputs, num_hiddens, num_layers)
model = d2l.RNNModel(lstm_layer, len(vocab))
model = model.to(device)

# Train with the book's RNN training loop and plot the perplexity curve
num_epochs, lr = 500, 2
d2l.train_ch8(model, train_iter, vocab, lr, num_epochs, device)
d2l.plt.show()
perplexity 1.0, 159965.3 tokens/sec on cuda
time travelleryou can show black is white by argument said filby
travelleryou can show black is white by argument said filby
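To make the two-layer stacking concrete, a quick shape check helps (a minimal sketch with random tensors on CPU; 28 stands in for len(vocab), the character-level vocabulary size that load_data_time_machine typically produces):

import torch
from torch import nn

num_inputs, num_hiddens, num_layers = 28, 256, 2
num_steps, batch_size = 35, 32

lstm = nn.LSTM(num_inputs, num_hiddens, num_layers)
X = torch.rand(num_steps, batch_size, num_inputs)           # (time steps, batch, features)
state = (torch.zeros(num_layers, batch_size, num_hiddens),  # one hidden state per layer
         torch.zeros(num_layers, batch_size, num_hiddens))  # one cell state per layer

Y, (H_n, C_n) = lstm(X, state)
print(Y.shape)    # torch.Size([35, 32, 256]): outputs of the top layer at every time step
print(H_n.shape)  # torch.Size([2, 32, 256]):  final hidden state of each of the 2 layers

Note that nn.LSTM only returns the top layer's per-step outputs; the lower layer's per-step outputs are consumed internally as inputs to the layer above.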


Summary:

  • In a deep recurrent neural network, hidden-state information is passed both to the next time step of the current layer and to the current time step of the next layer (see the sketch after this list).
  • Deep RNNs stack multiple hidden layers to obtain more nonlinearity.
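The first point can be seen in a bare-bones sketch of a stacked RNN forward pass (hypothetical parameters and a plain tanh cell, not the d2l implementation): at each time step, layer l reads the output of layer l-1 at the same step together with its own hidden state from the previous step.

import torch

def rnn_cell(X, H, W_xh, W_hh, b_h):
    # plain RNN update: H_t = tanh(X_t @ W_xh + H_{t-1} @ W_hh + b_h)
    return torch.tanh(X @ W_xh + H @ W_hh + b_h)

num_steps, batch_size, num_inputs, num_hiddens, num_layers = 35, 32, 28, 256, 2
X = torch.rand(num_steps, batch_size, num_inputs)

# one parameter set and one running hidden state per layer
params, H = [], []
for l in range(num_layers):
    in_dim = num_inputs if l == 0 else num_hiddens
    params.append((torch.randn(in_dim, num_hiddens) * 0.01,
                   torch.randn(num_hiddens, num_hiddens) * 0.01,
                   torch.zeros(num_hiddens)))
    H.append(torch.zeros(batch_size, num_hiddens))

outputs = []
for t in range(num_steps):            # walk along the time axis
    inp = X[t]                        # input to layer 0 at time step t
    for l in range(num_layers):       # walk up the stack of layers
        W_xh, W_hh, b_h = params[l]
        H[l] = rnn_cell(inp, H[l], W_xh, W_hh, b_h)  # H[l] still holds step t-1 here
        inp = H[l]                    # passed up: input of layer l+1 at the same step t
    outputs.append(inp)               # the top layer's hidden state is the step's output

nn.LSTM(num_inputs, num_hiddens, num_layers) performs the same two nested loops internally, only with LSTM cells in place of the plain tanh update.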