1.
Amir Khan. Self-attention Mechanisms in Transformer Architectures: Studying self-attention mechanisms in transformer architectures and their role in capturing long-range dependencies in sequential data. Journal of AI in Healthcare and Medicine [Internet]. 2021 May 30 [cited 2024 Sep. 19];1(1):11-2. Available from: https://healthsciencepub.com/index.php/jaihm/article/view/24