[1]
Amir Khan, “Self-attention Mechanisms in Transformer Architectures: Studying self-attention mechanisms in transformer architectures and their role in capturing long-range dependencies in sequential data”, Journal of AI in Healthcare and Medicine, vol. 1, no. 1, pp. 11–21, May 2021, Accessed: Sep. 19, 2024. [Online]. Available: https://healthsciencepub.com/index.php/jaihm/article/view/24