(1)
Amir Khan. Self-Attention Mechanisms in Transformer Architectures: Studying Self-Attention Mechanisms in Transformer Architectures and Their Role in Capturing Long-Range Dependencies in Sequential Data. Journal of AI in Healthcare and Medicine 2021, 1 (1), 11-21.