LOTERRE
Concept information

Preferred term

BERT  

Definition(s)

  • "BERT is designed to pretrain deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers." (Devlin et al., 2019).

Synonym(s)

  • Bidirectional Encoder Representations from Transformers

Bibliographic citation(s)

  • Devlin, J., Chang, M.-W., Lee, K., & Toutanova, K. (2019). BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv:1810.04805 [cs]. http://arxiv.org/abs/1810.04805

has design country

  • United States

URI

http://data.loterre.fr/ark:/67375/LTK-SSWGBD85-7

Last modified: 6/22/23