[1] Amir Khan. 2021. Self-attention Mechanisms in Transformer Architectures: Studying self-attention mechanisms in transformer architectures and their role in capturing long-range dependencies in sequential data. Journal of AI in Healthcare and Medicine 1, 1 (May 2021), 11–21.