Code implementations of various attention mechanisms


2022-08-08


base attention
dot attention
mlp attention
multihead attention
no attention
pooling attention
https://github.com/pytorch/translate/tree/master/pytorch_translate/attention
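
The pytorch_translate modules above are tied to that codebase, but the core ideas are small. Below is a minimal, self-contained sketch (my own, not the pytorch_translate code) of two of the listed variants, dot attention and MLP (additive) attention; the shapes and masking convention are assumptions for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DotAttention(nn.Module):
    """Scores each source state by its dot product with the decoder state."""
    def forward(self, decoder_state, source_hids, src_mask=None):
        # decoder_state: (batch, dim); source_hids: (batch, src_len, dim)
        scores = torch.bmm(source_hids, decoder_state.unsqueeze(2)).squeeze(2)  # (batch, src_len)
        if src_mask is not None:
            scores = scores.masked_fill(~src_mask, float("-inf"))
        attn = F.softmax(scores, dim=-1)
        context = torch.bmm(attn.unsqueeze(1), source_hids).squeeze(1)  # (batch, dim)
        return context, attn

class MLPAttention(nn.Module):
    """Additive (Bahdanau-style) attention: score = v^T tanh(W_q q + W_k k)."""
    def __init__(self, dim, hidden_dim):
        super().__init__()
        self.proj_q = nn.Linear(dim, hidden_dim, bias=False)
        self.proj_k = nn.Linear(dim, hidden_dim, bias=False)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, decoder_state, source_hids, src_mask=None):
        # broadcast add: (batch, 1, hidden) + (batch, src_len, hidden)
        energy = torch.tanh(self.proj_q(decoder_state).unsqueeze(1) + self.proj_k(source_hids))
        scores = self.v(energy).squeeze(2)  # (batch, src_len)
        if src_mask is not None:
            scores = scores.masked_fill(~src_mask, float("-inf"))
        attn = F.softmax(scores, dim=-1)
        context = torch.bmm(attn.unsqueeze(1), source_hids).squeeze(1)
        return context, attn
```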

attention
bilinear attention
cosine attention
dot product attention
legacy attention
linear attention
https://github.com/allenai/allennlp/tree/master/allennlp/modules/attention
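
These AllenNLP attention modules all answer the same question: given one query vector and a matrix of candidate vectors, how much weight should each row get? As a rough sketch (not the AllenNLP classes themselves; class names and arguments here are made up for illustration), the bilinear and cosine variants compute roughly the following:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BilinearAttention(nn.Module):
    """score_i = x^T W y_i, followed by a softmax over rows."""
    def __init__(self, vector_dim, matrix_dim):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(vector_dim, matrix_dim))
        nn.init.xavier_uniform_(self.weight)

    def forward(self, vector, matrix):
        # vector: (batch, vector_dim); matrix: (batch, num_rows, matrix_dim)
        intermediate = vector @ self.weight                     # (batch, matrix_dim)
        scores = torch.bmm(matrix, intermediate.unsqueeze(2)).squeeze(2)
        return F.softmax(scores, dim=-1)                        # (batch, num_rows)

class CosineAttention(nn.Module):
    """score_i = cos(x, y_i), followed by a softmax over rows."""
    def forward(self, vector, matrix):
        v = F.normalize(vector, dim=-1).unsqueeze(1)            # (batch, 1, dim)
        m = F.normalize(matrix, dim=-1)                         # (batch, num_rows, dim)
        scores = (v * m).sum(dim=-1)                            # (batch, num_rows)
        return F.softmax(scores, dim=-1)
```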

intra sentence attention
multi head self attention
stacked self attention
https://github.com/allenai/allennlp/tree/master/allennlp/modules/seq2seq_encoders
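
The seq2seq encoders listed above are built on self-attention, where every position in a sequence attends over every other position of the same sequence. Here is a minimal multi-head self-attention sketch (again my own, not the AllenNLP implementation); the mask convention (True for real tokens) is an assumption:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadSelfAttention(nn.Module):
    def __init__(self, dim, num_heads):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.qkv = nn.Linear(dim, 3 * dim)   # joint projection for queries, keys, values
        self.out = nn.Linear(dim, dim)

    def forward(self, x, mask=None):
        # x: (batch, seq_len, dim); mask: (batch, seq_len), True for real tokens
        batch, seq_len, dim = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)

        def split(t):  # (batch, seq_len, dim) -> (batch, num_heads, seq_len, head_dim)
            return t.view(batch, seq_len, self.num_heads, self.head_dim).transpose(1, 2)

        q, k, v = split(q), split(k), split(v)
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5   # (batch, heads, seq, seq)
        if mask is not None:
            scores = scores.masked_fill(~mask[:, None, None, :], float("-inf"))
        attn = F.softmax(scores, dim=-1)
        out = attn @ v                                            # (batch, heads, seq, head_dim)
        out = out.transpose(1, 2).reshape(batch, seq_len, dim)
        return self.out(out)
```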

bilinear matrix attention
cosine matrix attention
dot product matrix attention
legacy matrix attention
linear matrix attention
matrix attention
https://github.com/allenai/allennlp/tree/master/allennlp/modules/matrix_attention
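
Matrix attention differs from the vector attention above in that it scores every pair of rows drawn from two matrices, producing a (rows_1, rows_2) similarity matrix instead of a single weight vector. A rough sketch of the dot-product and cosine variants (function names are mine, not AllenNLP's):

```python
import torch
import torch.nn.functional as F

def dot_product_matrix_attention(matrix_1, matrix_2):
    # matrix_1: (batch, rows_1, dim); matrix_2: (batch, rows_2, dim)
    # returns (batch, rows_1, rows_2) of unnormalized similarity scores
    return torch.bmm(matrix_1, matrix_2.transpose(1, 2))

def cosine_matrix_attention(matrix_1, matrix_2):
    # same shapes as above, but scores are cosine similarities in [-1, 1]
    m1 = F.normalize(matrix_1, dim=-1)
    m2 = F.normalize(matrix_2, dim=-1)
    return torch.bmm(m1, m2.transpose(1, 2))
```

As I understand it, these matrix-attention modules return unnormalized similarities and leave it to the model to apply a masked softmax over whichever dimension it needs, whereas the vector attention modules normalize their scores by default.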

