LOTERRE


Concept information

Preferred term

mBART  

Definition(s)

  • "a sequence-to-sequence denoising auto-encoder pre-trained on large-scale monolingual corpora in many languages using the BART objective" (Liu et al., 2020).
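The definition above describes mBART's pre-training as denoising: corrupt a monolingual text, then train the sequence-to-sequence model to reconstruct the original. A central piece of the BART-style noise function is text infilling, where contiguous token spans are each replaced by a single mask token. The sketch below is purely illustrative (the function name, span format, and mask string are assumptions, not Liu et al.'s implementation):

```python
def text_infill(tokens, spans, mask="<mask>"):
    """Illustrative BART-style text infilling: replace each
    (start, length) span of tokens with a single mask token.

    Assumes spans are non-overlapping and sorted; this is a sketch
    of the noising idea, not the original training code.
    """
    span_at = dict(spans)  # map span start index -> span length
    out, i = [], 0
    while i < len(tokens):
        if i in span_at:
            out.append(mask)              # whole span collapses to one mask
            i += max(1, span_at[i])       # skip the masked-out tokens
        else:
            out.append(tokens[i])
            i += 1
    return out
```

During pre-training the model would receive the corrupted sequence (e.g. `["The", "<mask>", "fox", "jumps"]`) as encoder input and learn to emit the uncorrupted original as the decoder target.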

Broader concept(s)

Bibliographic citation(s)

  • Liu, Y., Gu, J., Goyal, N., Li, X., Edunov, S., Ghazvininejad, M., Lewis, M., & Zettlemoyer, L. (2020). Multilingual denoising pre-training for neural machine translation. arXiv:2001.08210 [cs]. http://arxiv.org/abs/2001.08210

based on

has application field

has design country

  • United States

is executed in

In other languages

URI

http://data.loterre.fr/ark:/67375/LTK-CX7MNQZF-F

Last modified 6/20/24