Khan, A. (2021). Self-attention Mechanisms in Transformer Architectures: Studying self-attention mechanisms in transformer architectures and their role in capturing long-range dependencies in sequential data. Journal of AI in Healthcare and Medicine, 1(1), 11–21. https://healthsciencepub.com/index.php/jaihm/article/view/24