GRAPH ATTENTION NETWORKS

驚鴻飛雪 · 2022-07-18 · 71 reads

The key multi-head detail: in the first (hidden) GAT layer the outputs of the K attention heads are concatenated, while in the final (prediction) layer they are averaged instead of concatenated.
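This concat-vs-average behavior can be sketched as follows. This is a minimal NumPy sketch of multi-head graph attention, not the authors' code; the function name, shapes, and the tiny random graph are illustrative assumptions:

```python
import numpy as np

def gat_layer(h, adj, W_heads, a_heads, concat=True):
    """One multi-head graph attention layer (illustrative sketch).

    h: (N, F) node features; adj: (N, N) adjacency with self-loops;
    W_heads: K weight matrices (F, Fp); a_heads: K attention vectors (2*Fp,).
    concat=True -> concatenate head outputs (hidden layers);
    concat=False -> average head outputs (final layer).
    """
    def leaky_relu(x, slope=0.2):
        return np.where(x > 0, x, slope * x)

    outs = []
    for W, a in zip(W_heads, a_heads):
        z = h @ W                      # (N, Fp) projected features
        Fp = z.shape[1]
        src = z @ a[:Fp]               # (N,) source-side attention term
        dst = z @ a[Fp:]               # (N,) target-side attention term
        e = leaky_relu(src[:, None] + dst[None, :])  # (N, N) logits
        e = np.where(adj > 0, e, -1e9)               # mask non-neighbors
        alpha = np.exp(e - e.max(axis=1, keepdims=True))
        alpha /= alpha.sum(axis=1, keepdims=True)    # softmax over neighbors
        outs.append(alpha @ z)         # (N, Fp) per-head aggregation
    if concat:
        return np.concatenate(outs, axis=1)   # hidden layer: (N, K*Fp)
    return np.mean(np.stack(outs), axis=0)    # output layer: (N, Fp)

# Tiny 4-node random graph with self-loops, K=2 heads (hypothetical sizes)
rng = np.random.default_rng(0)
N, F, Fp, K = 4, 3, 5, 2
adj = np.eye(N) + (rng.random((N, N)) > 0.5)
h = rng.standard_normal((N, F))
h1 = gat_layer(h, adj,
               [rng.standard_normal((F, Fp)) for _ in range(K)],
               [rng.standard_normal(2 * Fp) for _ in range(K)],
               concat=True)            # first layer: (4, 10), heads concatenated
out = gat_layer(h1, adj,
                [rng.standard_normal((K * Fp, Fp)) for _ in range(K)],
                [rng.standard_normal(2 * Fp) for _ in range(K)],
                concat=False)          # second layer: (4, 5), heads averaged
```

Note how the feature dimension grows by a factor of K after the concatenating layer but not after the averaging one, which is why the final layer averages: its output must match the number of classes.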

Related papers:

Semi-Supervised Classification with Graph Convolutional Networks

Geometric deep learning on graphs and manifolds using mixture model CNNs

Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering

黄世宇/Shiyu Huang's Personal Page: https://huangshiyu13.github.io/


