Why do some things grab our attention while we ignore others? What is going on in our brains?
A recent article by AI/ML researcher Ekrem Aksoy attempts to describe the latest thinking in both neuroscience and machine learning.
Attention is a concept that has been used with great success in machine learning (see the seminal paper "Attention is all you need"). In natural language processing it is used to emphasise the parts of the input text that are most salient for understanding it, and it has also been applied successfully in computer vision.
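To make the idea concrete, here is a minimal NumPy sketch of the scaled dot-product attention described in "Attention is all you need": each query is compared against every key, the resulting scores are turned into weights with a softmax, and those weights select a blend of the values. The function and variable names here are illustrative, not from any particular library.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Q, K, V: (num_tokens, d_k) matrices of queries, keys, and values.
    d_k = Q.shape[-1]
    # Similarity of each query to each key, scaled to keep gradients stable.
    scores = Q @ K.T / np.sqrt(d_k)
    # Each row of weights is a distribution over the input positions:
    # this is the "attention" — how much each token looks at every other.
    weights = softmax(scores, axis=-1)
    return weights @ V, weights

# Toy example: 3 tokens with 4-dimensional representations.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)       # (3, 4): one blended value per token
print(w.sum(axis=-1))  # each row of weights sums to 1
```

The softmax is what makes this a *selection* mechanism: weights near 1 amplify a few positions while the rest are suppressed, loosely analogous to the brain attending to a subset of its input.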
In neuroscience, attention is how the brain selects the subset of perceived information it treats as particularly important. The mechanism is still not fully understood, but clues about how it works could inspire the next generation of ML models.
🛎️ Why this matters: Attention is a key concept in ML, inspired by the brain.