Code implementations of various attention mechanisms

飞空之羽 2022-08-08


base attention
dot attention
mlp attention
multihead attention
no attention
pooling attention
https://github.com/pytorch/translate/tree/master/pytorch_translate/attention
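
For orientation, here is a minimal sketch of the dot attention from this list, in the Luong style: each encoder state is scored by its dot product with the current decoder state, and the softmaxed scores weight the encoder states into a context vector. The class name, argument names, and shapes are illustrative assumptions, not the exact pytorch_translate API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DotAttention(nn.Module):
    """Luong-style dot attention: score(s_t, h_i) = s_t . h_i (sketch)."""

    def __init__(self, decoder_dim: int, context_dim: int):
        super().__init__()
        # Project the decoder state only if its size differs from the encoder's.
        self.proj = (
            nn.Linear(decoder_dim, context_dim, bias=False)
            if decoder_dim != context_dim
            else nn.Identity()
        )

    def forward(self, decoder_state, source_hids, src_mask=None):
        # decoder_state: (batch, decoder_dim)
        # source_hids:   (batch, src_len, context_dim)
        # src_mask:      (batch, src_len), 1 for real tokens, 0 for padding
        query = self.proj(decoder_state).unsqueeze(2)        # (batch, context_dim, 1)
        scores = torch.bmm(source_hids, query).squeeze(2)    # (batch, src_len)
        if src_mask is not None:
            scores = scores.masked_fill(src_mask == 0, float("-inf"))
        attn_weights = F.softmax(scores, dim=-1)
        # Weighted sum of encoder states -> context vector.
        context = torch.bmm(attn_weights.unsqueeze(1), source_hids).squeeze(1)
        return context, attn_weights
```

The mlp variant in the same list replaces the dot product with a small feed-forward scorer over the two states (Bahdanau-style additive attention).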

attention
bilinear attention
cosine attention
dot product attention
legacy attention
linear attention
https://github.com/allenai/allennlp/tree/master/allennlp/modules/attention
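
These AllenNLP attention modules all score one query vector against the rows of a matrix and return normalized weights; the entries in this list differ only in the scoring function. Below is a hedged sketch of the bilinear variant, score(x, y) = xᵀWy; the class and argument names are illustrative, not the library's exact API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class BilinearAttention(nn.Module):
    """Scores a query vector against each row of a matrix via x^T W y (sketch)."""

    def __init__(self, vector_dim: int, matrix_dim: int):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(vector_dim, matrix_dim))
        nn.init.xavier_uniform_(self.weight)

    def forward(self, vector, matrix, matrix_mask=None):
        # vector: (batch, vector_dim), e.g. a decoder state
        # matrix: (batch, num_rows, matrix_dim), e.g. encoder outputs
        intermediate = vector @ self.weight                                # (batch, matrix_dim)
        scores = torch.bmm(matrix, intermediate.unsqueeze(2)).squeeze(2)   # (batch, num_rows)
        if matrix_mask is not None:
            scores = scores.masked_fill(matrix_mask == 0, float("-inf"))
        return F.softmax(scores, dim=-1)                                   # attention over rows
```

Swapping the scorer gives the other variants: cosine similarity of the normalized vectors (cosine), a plain dot product (dot product), or a learned combination of the two inputs (linear, legacy).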

intra sentence attention
multi head self attention
stacked self attention
https://github.com/allenai/allennlp/tree/master/allennlp/modules/seq2seq_encoders
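
Of these seq2seq encoders, multi-head self-attention is the core building block (stacked self attention layers it with feed-forward blocks). A minimal sketch of the standard scaled dot-product formulation with a joint Q/K/V projection follows; names and shapes are illustrative assumptions.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHeadSelfAttention(nn.Module):
    """Scaled dot-product self-attention split across several heads (sketch)."""

    def __init__(self, hidden_dim: int, num_heads: int):
        super().__init__()
        assert hidden_dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = hidden_dim // num_heads
        self.qkv = nn.Linear(hidden_dim, 3 * hidden_dim)   # joint Q, K, V projection
        self.out = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, x, mask=None):
        # x: (batch, seq_len, hidden_dim); mask: (batch, seq_len), 1 = keep
        batch, seq_len, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)

        def split_heads(t):
            # (batch, seq_len, hidden) -> (batch, num_heads, seq_len, head_dim)
            return t.view(batch, seq_len, self.num_heads, self.head_dim).transpose(1, 2)

        q, k, v = split_heads(q), split_heads(k), split_heads(v)
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.head_dim)
        if mask is not None:
            scores = scores.masked_fill(mask[:, None, None, :] == 0, float("-inf"))
        weights = F.softmax(scores, dim=-1)
        context = (weights @ v).transpose(1, 2).reshape(batch, seq_len, -1)
        return self.out(context)
```

For example, `MultiHeadSelfAttention(64, 8)(torch.randn(2, 5, 64))` returns a tensor of shape (2, 5, 64), the same shape as the input sequence.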

bilinear matrix attention
cosine matrix attention
dot product matrix attention
legacy matrix attention
linear matrix attention
matrix attention
https://github.com/allenai/allennlp/tree/master/allennlp/modules/matrix_attention
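
Unlike the vector-vs-matrix attention modules above, matrix attention compares every row of one sequence with every row of another and returns a (batch, rows_1, rows_2) similarity matrix, typically left unnormalized so the caller can softmax along whichever axis it needs. A sketch of the dot-product and cosine variants, with illustrative class names:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DotProductMatrixAttention(nn.Module):
    """Pairwise dot products between the rows of two sequences (sketch)."""

    def forward(self, matrix_1, matrix_2):
        # matrix_1: (batch, rows_1, dim), matrix_2: (batch, rows_2, dim)
        return torch.bmm(matrix_1, matrix_2.transpose(1, 2))   # (batch, rows_1, rows_2)


class CosineMatrixAttention(nn.Module):
    """Pairwise cosine similarities: L2-normalize the rows, then dot products (sketch)."""

    def forward(self, matrix_1, matrix_2):
        a = F.normalize(matrix_1, p=2, dim=-1)
        b = F.normalize(matrix_2, p=2, dim=-1)
        return torch.bmm(a, b.transpose(1, 2))
```

The bilinear and linear variants again differ only in the pairwise scoring function, with learned parameters in place of the raw dot product.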

