LOTERRE

Concept information

Preferred term

transformer  

Definition(s)

  • "sequence transduction model based entirely on attention, replacing the recurrent layers most commonly used in encoder-decoder architectures with multi-headed self-attention." (Vaswani et al., 2017, p. 10).

Synonym(s)

  • self-attention model

Bibliographic citation(s)

  • Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, Ł., & Polosukhin, I. (2017). Attention is all you need. arXiv:1706.03762 [cs]. http://arxiv.org/abs/1706.03762

In other languages

  • French

  • modèle auto-attentif
  • modèle d'auto-attention

URI

http://data.loterre.fr/ark:/67375/LTK-PFDPNVQ7-3

Last modified: 6/22/23