LOTERRE
Concept information

Preferred term

M-BERT  

Definition(s)

  • A multilingual language model based on BERT.
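
A minimal usage sketch (not part of the Loterre record), assuming the Hugging Face Transformers library and the publicly released "bert-base-multilingual-cased" checkpoint; the example sentences are illustrative only.

# Sketch: encoding sentences in several languages with Multilingual BERT,
# assuming Hugging Face Transformers and the "bert-base-multilingual-cased"
# checkpoint are available.
from transformers import AutoTokenizer, AutoModel
import torch

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained("bert-base-multilingual-cased")

# One shared vocabulary and one set of weights handle many languages.
sentences = ["The cat sleeps.", "Le chat dort.", "El gato duerme."]
batch = tokenizer(sentences, padding=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**batch)

# Keep the [CLS] token vector as a simple sentence-level representation.
cls_embeddings = outputs.last_hidden_state[:, 0, :]
print(cls_embeddings.shape)  # (3, 768) for the base-size model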

Synonym(s)

  • Multilingual BERT

Bibliographic citation(s)

  • Devlin, J., Chang, M.-W., Lee, K., & Toutanova, K. (2018). BERT: Pre-training of deep bidirectional transformers for language understanding. https://arxiv.org/abs/1810.04805v2

URI

http://data.loterre.fr/ark:/67375/LTK-DP8XXP9W-H

Last modified 6/20/24