
Attention in Natural Language Processing

IEEE Transactions on Neural Networks and Learning Systems, 2021-10, Vol. 32 (10), p. 4291-4308

ISSN: 2162-237X; EISSN: 2162-2388; DOI: 10.1109/TNNLS.2020.3019893; PMID: 32915750; CODEN: ITNNAL


  • Title:
    Attention in Natural Language Processing
  • Author: Galassi, Andrea ; Lippi, Marco ; Torroni, Paolo
  • Subjects: Computational modeling ; Computer architecture ; Natural language processing ; Natural language processing (NLP) ; neural attention ; Neural networks ; review ; survey ; Task analysis ; Taxonomy ; Visualization
  • Is Part Of: IEEE Transactions on Neural Networks and Learning Systems, 2021-10, Vol. 32 (10), p. 4291-4308
  • Description: Attention is an increasingly popular mechanism used in a wide range of neural architectures. The mechanism itself has been realized in a variety of formats. However, because of the fast-paced advances in this domain, a systematic overview of attention is still missing. In this article, we define a unified model for attention architectures in natural language processing, with a focus on those designed to work with vector representations of the textual data. We propose a taxonomy of attention models according to four dimensions: the representation of the input, the compatibility function, the distribution function, and the multiplicity of the input and/or output. We present examples of how prior information can be exploited in attention models and discuss ongoing research efforts and open challenges in the area, providing the first extensive categorization of the vast body of literature in this exciting domain.
  • Publisher: IEEE
  • Language: English
  • Identifier: ISSN: 2162-237X
    EISSN: 2162-2388
    DOI: 10.1109/TNNLS.2020.3019893
    PMID: 32915750
    CODEN: ITNNAL
  • Source: IEEE Xplore Open Access Journals
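The description above organizes attention models along four dimensions, three of which (input representation, compatibility function, distribution function) compose into the familiar weighted-sum pattern. As an illustrative sketch only, not the paper's own formulation, the pipeline might look like this in NumPy, using dot-product compatibility and softmax as the distribution function (both are just one choice among the many the survey catalogs):

```python
import numpy as np

def softmax(x):
    # Distribution function: map compatibility scores to a
    # probability distribution over the input elements.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention(query, keys, values):
    # Compatibility function: here a simple dot product between
    # the query and each key vector (one of several options the
    # survey's taxonomy covers, e.g. additive or multiplicative).
    scores = keys @ query
    # Distribution function applied to the scores.
    weights = softmax(scores)
    # Context vector: attention-weighted sum of the values.
    context = weights @ values
    return context, weights

# Toy input representation: 4 elements with 8-dimensional vectors.
rng = np.random.default_rng(0)
keys = rng.normal(size=(4, 8))
values = rng.normal(size=(4, 8))
query = rng.normal(size=8)

context, weights = attention(query, keys, values)
```

The fourth dimension of the taxonomy, multiplicity of input and/or output, would vary this pattern (e.g. several queries or several attended sequences) rather than change the core weighted-sum computation.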
