In the book, Chapter 12, page 209, where a "Hierarchical self-Attention Network" (HAN) model was introduced to handle heterogeneous graphs, the reference [5] (J. Liu, Y. Wang, S. Xiang, and C. Pan. HAN: An Efficient Hierarchical Self-Attention Network for Skeleton-Based Gesture Recognition. arXiv, 2021. DOI: 10.48550/ARXIV.2106.13391. Available: https://arxiv.org/abs/2106.13391) points to a different architecture than the one actually implemented in Chapter 12.
Instead, all of the mathematical presentation, down to the diagram illustrating the three-level architecture (node level, semantic level, and prediction), seems to be borrowed directly from the HAN described in "Heterogeneous Graph Attention Network" by Xiao Wang, Houye Ji, Chuan Shi, Bai Wang, Yanfang Ye, Peng Cui, and Philip S. Yu (2019).
Here is the paper that actually matches the implementation: https://par.nsf.gov/servlets/purl/10135600
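For context, here is a minimal sketch of that three-level architecture (node-level attention, semantic-level attention, then a prediction head), assuming PyTorch Geometric's `HANConv` layer, which implements Wang et al. (2019). The node types, edge types, and dimensions below are made-up placeholders, not the book's actual code:

```python
import torch
from torch_geometric.nn import HANConv

# Hypothetical heterogeneous graph metadata (node types and meta-path
# based edge types); in practice this comes from the dataset object.
metadata = (
    ['movie', 'actor', 'director'],
    [('movie', 'to', 'actor'), ('actor', 'to', 'movie'),
     ('movie', 'to', 'director'), ('director', 'to', 'movie')],
)

class HAN(torch.nn.Module):
    def __init__(self, hidden_channels=64, out_channels=3, heads=8):
        super().__init__()
        # Node-level and semantic-level attention (Wang et al., 2019);
        # in_channels=-1 lets the layer infer input sizes lazily.
        self.han = HANConv(-1, hidden_channels, metadata, heads=heads)
        # Prediction level: classify the 'movie' nodes.
        self.lin = torch.nn.Linear(hidden_channels, out_channels)

    def forward(self, x_dict, edge_index_dict):
        out = self.han(x_dict, edge_index_dict)  # dict of per-type embeddings
        return self.lin(out['movie'])
```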
Hope this helps.
Thank you for all this work!